Notebook Metadata¶
- Notebook Name: 02_model.ipynb
- Title: MoneyLion DS Assessment
- Author: Khoon Ching Wong
- Created: 2024-09-24
- Last Modified: 2025-09-25
- Description:
Continuation of 01_eda.ipynb.
This notebook focuses on model training: Optuna is applied for hyperparameter optimization, model selection, and performance reporting, with the goal of minimizing institutional financial losses.
- Inputs:
- temp/Loan-level/clean_df.parquet
- Outputs:
- temp/Loan-level/best_trials_ClassifierX.json
- temp/Loan-level/all_metrics_ClassifierX.csv
- temp/Loan-level/cm_final_ClassifierX.csv
- temp/Loan-level/df_metrics_ClassifierX.csv
- temp/Loan-level/exec_time_ClassifierX.csv
- temp/Loan-level/lloss_auc_test_ClassifierX.csv
- temp/Loan-level/lloss_auc_train_ClassifierX.csv
- temp/Loan-level/cm_ClassifierX_all.pkl
- temp/Loan-level/cm_labels_ClassifierX_all.pkl
- Repository/Project Link: https://github.com/wongkhoon/DS-Assessment/tree/main/MoneyLion/notebooks
Import libraries¶
import gc
import os
import platform
from IPython.display import display, Markdown, HTML
import json
import pickle
import pandas as pd
import numpy as np
from collections import Counter
import matplotlib.pyplot as plt
import plotly.express as px
import plotly.graph_objects as go
import seaborn as sns
from tqdm import tqdm
import time
import logging
import optuna
import optunahub
from optuna.visualization import plot_pareto_front
import lightgbm as lgb
import sklearn.datasets
from sklearn.model_selection import GroupShuffleSplit, StratifiedGroupKFold
from sklearn.utils.class_weight import compute_sample_weight
from catboost import Pool
from catboost import CatBoostClassifier
import xgboost as xgb
from xgboost import XGBClassifier
from sklearn.ensemble import HistGradientBoostingClassifier
from sklearn.metrics import log_loss, average_precision_score, roc_curve, roc_auc_score, precision_recall_curve
from sklearn.metrics import accuracy_score, confusion_matrix, classification_report
import session_info
Display settings configuration¶
- Configure display settings for enhanced output in Jupyter notebook
# Display full output in output cell, not only last result
from IPython.core.interactiveshell import InteractiveShell
InteractiveShell.ast_node_interactivity = "all"
# Inspect the current maximum rows/columns display settings of Pandas DataFrames
# pd.options.display.max_rows
pd.options.display.max_columns
# Print all the contents of a Pandas DataFrame
#pd.set_option('display.max_rows', None) # Print unlimited number of rows by setting to None, default is 10
pd.set_option("display.max_columns", None) # Do not truncate cols to display all of them by setting to None
pd.set_option("display.width", None) # Auto-detect the width of df to display all columns in single line by setting to None
pd.set_option("display.max_colwidth", None) # Auto detect the max size of col and print contents of that col without truncation
Create Temporary Directory for Intermediate Files¶
- Create a temp directory to store intermediate files (e.g. model metrics, Optuna trials).
- These files will be reused for generating plots and tables, as well as for reloading intermediate data, i.e. clean_df.parquet.
# Create the directory path if it doesn't exist, raising no error if it already exists
os.makedirs("temp/Loan-level", exist_ok = True)
temp_dir = "temp/Loan-level"
Function¶
def save_results(clf_name, best_trials, all_metrics,
                 exec_time, lloss_auc_train, lloss_auc_test,
                 df_metrics, cm_final, cm_all, cm_labels):
    """
    Save results for a given classifier.

    Parameters:
        clf_name (str): Name identifier for the algorithm.
        best_trials (dict): Parameters of the best Optuna trial(s), keyed by trial number.
        all_metrics (list of dict): Per-trial classification metrics.
        exec_time (list of dict): Execution time data.
        lloss_auc_train (list of dict): Training log loss/AUC metrics.
        lloss_auc_test (list of dict): Test log loss/AUC metrics.
        df_metrics (pd.DataFrame): Additional metrics in Pandas DataFrame form.
        cm_final (array-like): The final confusion matrix.
        cm_all (list): List of confusion matrices for trials.
        cm_labels (list): Corresponding labels for the confusion matrices.
    """
# Save tabular data as CSV
pd.DataFrame(exec_time).to_csv(f'{temp_dir}/exec_time_{clf_name}.csv', index = False)
pd.DataFrame(lloss_auc_train).to_csv(f'{temp_dir}/lloss_auc_train_{clf_name}.csv', index = False)
pd.DataFrame(lloss_auc_test).to_csv(f'{temp_dir}/lloss_auc_test_{clf_name}.csv', index = False)
pd.DataFrame(all_metrics).to_csv(f'{temp_dir}/all_metrics_{clf_name}.csv', index = False)
df_metrics.to_csv(f'{temp_dir}/df_metrics_{clf_name}.csv', index = False)
pd.DataFrame(cm_final).to_csv(f'{temp_dir}/cm_final_{clf_name}.csv', index = False)
# Saves all the parameters of the best trial(s) into a JSON file
with open(f'{temp_dir}/best_trials_{clf_name}.json', "w") as f:
json.dump(best_trials, f, indent = 4)
# Pickle complex objects (like lists, matrices, etc)
with open(f'{temp_dir}/cm_{clf_name}_all.pkl', "wb") as f:
pickle.dump(cm_all, f)
with open(f'{temp_dir}/cm_labels_{clf_name}_all.pkl', "wb") as f:
pickle.dump(cm_labels, f)
Load Parquet Data File¶
- Load the processed Parquet data file generated in 01_eda.ipynb
- Drop columns not of interest
# Load aggregated loan level parquet data file
clean_df = pd.read_parquet(f'{temp_dir}/clean_df.parquet', engine = "pyarrow")
#clean_df.columns.tolist()
# Drop columns not of interest
cols_to_drop = ["underwritingid", "loanId", "clarityFraudId",
"applicationDate", "originatedDate",
"loanStatus", "fpymtDate", "fpymtAmt", "fpymtStatus", "yr_mth", "mth",
"principal_tot", "fees_tot", "paymentAmount_tot"]
clean_df.drop(columns = cols_to_drop, inplace = True)
clean_df.info(verbose = "all")
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 32312 entries, 0 to 32311
Data columns (total 270 columns): Clarity fraud inquiry counts (cfinq.*), fraud indicators (cfind.*), identity-verification match results (cfindvrfy.*), clearfraudscore, loan attributes (anon_ssn, payFrequency, apr, originated, nPaidOff, approved, isFunded, loanAmount, originallyScheduledPaymentAmount, state, leadType, leadCost, fpStatus, hasCF), payment-gap statistics (sum/mean/med/std/cnt/min/max_days_btw_pymts), per-status payment aggregates (sum/mean/med/std/min/max of fees, principal, and pymtAmt by status: Cancelled, Checked, Complete, None, Pending, Rejected, Rejected Awaiting Retry, Returned, Skipped), payment-status counts (cnt_pymtStatus_*), return-code counts (cnt_pymtRCode_*), and target
dtypes: Int32(55), Int8(1), boolean(31), category(14), float64(168), object(1)
memory usage: 52.5+ MB
[full 270-row column listing truncated]
"""
# Print counts of unique values for all columns
for col in clean_df.columns:
print(f'Value counts for column: {col}')
print(clean_df[col].value_counts(dropna = False)) # Count occurrences of each unique value
print("-" * 40)
""";
Models¶
Identity and hashing assumption
- Assume a consistent, deterministic hashing algorithm that yields a unique hash for every loan applicant (anon_ssn)

Data splitting
- Train–test (80/20) with GroupShuffleSplit so that no anon_ssn appears in both train and test

Cross-validation on the training split
- StratifiedGroupKFold (5 folds) to:
  - Keep all samples from the same anon_ssn in the same fold (prevent leakage)
  - Preserve class balance
  - Use shuffling and a fixed random_state for robustness and reproducibility
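The no-leakage guarantee of grouped splitting can be sanity-checked on toy data (hypothetical borrower IDs, not the loan dataset):

```python
import numpy as np
from sklearn.model_selection import GroupShuffleSplit

# 10 hypothetical borrowers with 3 loans each
groups = np.repeat(np.arange(10), 3)
X = np.zeros((30, 2))
y = np.tile([0, 1, 0], 10)

gss = GroupShuffleSplit(n_splits = 1, test_size = 0.2, random_state = 42)
train_idx, test_idx = next(gss.split(X, y, groups = groups))

# Every borrower lands entirely in train or entirely in test
overlap = set(groups[train_idx]) & set(groups[test_idx])
print(overlap)  # set()
```

Note that test_size = 0.2 here applies to groups, so the row-level split ratio only approximates 80/20 when group sizes vary.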
Models (tree-based gradient boosting)
- LightGBM: Leaf-wise growth — expands the single leaf whose split reduces the loss the most
- HistGradientBoostingClassifier: Histogram-based growth — bins continuous features into buckets and tests split candidates per bucket for speed
- XGBoost: Level-wise (depth-wise) growth — expands trees evenly level by level, keeping the structure balanced
- CatBoostClassifier: Oblivious (symmetric) trees — applies the same split condition to all nodes at each depth
Advantages of these classifiers:
- Require minimal preprocessing (e.g. no need for extensive encoding or scaling)
- Less sensitive to outliers in the data
- Less sensitive to multicollinearity
- Capable of capturing non-linear relationships effectively
- Support categorical features natively
Hyperparameter optimization
- Optuna multi-objective hyperparameter optimization with stratified 5-fold CV over 100 sequential trials:
  - Calibration: Log loss (minimize) — evaluates how closely the predicted probabilities match the true outcomes. Lower values indicate better probability calibration.
  - Discrimination: PR-AUC and ROC-AUC (maximize)
    - ROC-AUC: Measures how well the model separates positives from negatives across all decision thresholds.
    - PR-AUC: Focuses on precision–recall trade-offs, particularly valuable for imbalanced datasets where one class is much rarer.
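All three objectives are computed from predicted probabilities with scikit-learn. On a toy example with perfectly separated classes (hypothetical probabilities), both AUCs reach 1.0 while log loss stays strictly positive:

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

y_true = np.array([0, 0, 1, 1, 0, 1])
y_prob = np.array([0.1, 0.4, 0.8, 0.7, 0.3, 0.9])  # every positive outranks every negative

lloss = log_loss(y_true, y_prob)                  # calibration: lower is better
pr_auc = average_precision_score(y_true, y_prob)  # discrimination, robust to imbalance
roc_auc = roc_auc_score(y_true, y_prob)           # ranking quality across thresholds

print(roc_auc, pr_auc)  # 1.0 1.0
```

This is why the study below minimizes log loss while maximizing the two AUCs: perfect ranking does not imply perfect calibration, so the objectives genuinely trade off.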
Evaluation and reporting
- Report objective metrics above (on CV and hold-out test)
- Confusion matrix and classification report
- Elapsed time
# Runs the objective() function 100 times, each with a different set of hyperparameters suggested by Optuna
n_trials = 100
# The best sampler is not obvious a priori, so load the AutoSampler plugin from OptunaHub,
# which automatically selects the most suitable built-in sampler for the search space during optimization
module = optunahub.load_module(package = "samplers/auto_sampler")
seed = 42
Optuna with LightGBM¶
# Aggregated loan level data
X_df = clean_df.drop(columns = ["target", "anon_ssn"])
cat_cols = X_df.select_dtypes(include = "category").columns
cat_indices = [X_df.columns.get_loc(col) for col in cat_cols]
# Normalize missing values: map pandas NA markers (pd.NA, NaT) to np.nan for a consistent NumPy conversion
X_df = X_df.apply(lambda x: x.map(lambda z: np.nan if pd.isna(z) else z))
feature_name = list(X_df.columns)
y_df = clean_df["target"]
# Extract the grouping variable, anon_ssn
anon_ssn = clean_df["anon_ssn"];
# Convert categorical columns to numerical codes for Optuna with LightGBM, with NaN as -1
for col in X_df.columns:
if isinstance(X_df[col].dtype, pd.CategoricalDtype):
#print(col+": ordered?", {X_df[col].cat.ordered})
X_df[col] = X_df[col].cat.codes
X = X_df.to_numpy()
y = y_df.to_numpy()
# A single train-test split (80%/20%) using GroupShuffleSplit, ensuring no data leakage: no anon_ssn group appears in both sets,
# and use a fixed random seed for reproducibility
gss = GroupShuffleSplit(n_splits = 1, test_size = 0.2, random_state = seed)
# Generate the train and test indices from X, y, while ensuring all samples from the same anon_ssn stay together in train or test sets
train_idx, test_idx = next(gss.split(X, y, groups = anon_ssn))
# Use the indices to slice out the training and testing subsets of our feature matrix
X_train, X_test = X[train_idx], X[test_idx]
# Likewise, slice out the corresponding labels for training and testing
y_train, y_test = y[train_idx], y[test_idx]
# Keep track of anon_ssn for cross-validation
anon_ssn_train = anon_ssn[train_idx]
display(Markdown(f"<span style = 'font-size: 18px; font-weight: bold;'> Overview of training and test sets:</span>"))
print(f'Training set:\n- {X_train.shape[0]} rows and {X_train.shape[1]} features')
print(f'- Target proportion:\n{pd.Series(y_train).value_counts(normalize = True)}\n')
print(f'Test set:\n- {X_test.shape[0]} rows and {X_test.shape[1]} features')
print(f'- Target proportion:\n{pd.Series(y_test).value_counts(normalize = True)}\n')
print(f'Training set for anon_ssn: {anon_ssn_train.shape[0]} row entries with {anon_ssn_train.nunique()} unique anon_ssn values')
Overview of training and test sets:
Training set:
- 25838 rows and 268 features
- Target proportion:
0    0.507973
1    0.492027
Name: proportion, dtype: float64

Test set:
- 6474 rows and 268 features
- Target proportion:
0    0.505715
1    0.494285
Name: proportion, dtype: float64

Training set for anon_ssn: 25838 row entries with 24008 unique anon_ssn values
The target distribution between safe and risky loans does not suffer from severe class imbalance and is well balanced across both the training and test sets.
del X_df, y_df, X, y, gss, train_idx, test_idx;
# Define the Optuna objective function
def objective(trial):
# https://lightgbm.readthedocs.io/en/latest/Parameters.html
# https://lightgbm.readthedocs.io/en/stable/Parameters.html
# https://lightgbm.readthedocs.io/en/latest/Parameters-Tuning.html
# Define parameter search space
param = {"objective": "binary",
"metric": ["binary_logloss", "auc", "average_precision"],
"device_type": "cpu",
#"device_type": "gpu", # Uses OpenCL backend on Windows
#"gpu_platform_id": 0,
#"gpu_device_id": 0,
"verbosity": 2,
"boosting_type": trial.suggest_categorical("boosting_type", ["gbdt", "rf", "dart"]),
"lambda_l1": trial.suggest_float("lambda_l1", 1e-8, 1e1, log = True),
"lambda_l2": trial.suggest_float("lambda_l2", 1e-8, 1e1, log = True),
"num_leaves": trial.suggest_int("num_leaves", 2, 256),
"feature_fraction": trial.suggest_float("feature_fraction", 1e-1, 1e0),
"bagging_fraction": trial.suggest_float("bagging_fraction", 1e-1, 1e0),
"bagging_freq": trial.suggest_int("bagging_freq", 1, 10),
"min_child_samples": trial.suggest_int("min_child_samples", 5, 200),
"learning_rate": trial.suggest_float("learning_rate", 1e-2, 1e-1, log = True),
"random_state": seed,
"deterministic": True,
"bagging_seed": seed,
"feature_fraction_seed": seed,
"drop_seed": seed,
"force_col_wise": True,
"num_threads": -1,
"is_unbalance": trial.suggest_categorical("is_unbalance", [True, False]),
"max_depth": -1, # <= 0, No tree depth limit
"max_bin": trial.suggest_int("max_bin", 40, 255),
"min_sum_hessian_in_leaf": trial.suggest_float("min_sum_hessian_in_leaf", 1e-10, 1e1, log = True),
"max_delta_step": trial.suggest_float("max_delta_step", 1, 100),
"feature_fraction_bynode": trial.suggest_float("feature_fraction_bynode", 1e-1, 1e0),
}
# Define how to split using StratifiedGroupKFold (5 folds, stratified, anon_ssn-safe)
sgkf = StratifiedGroupKFold(n_splits = 5, shuffle = True, random_state = seed)
# Lists to store the performance metrics from each fold
lloss_scores, pr_auc_scores, roc_auc_scores = [], [], []
# Iterate over each fold of the stratified, anon_ssn-aware split, numbering folds starting from 1
for fold_idx, (train_index, valid_index) in enumerate(sgkf.split(X_train, y_train, groups = anon_ssn_train), start = 1):
# Split into training and validation sets for this fold
X_train_fold, X_valid_fold = X_train[train_index], X_train[valid_index]
y_train_fold, y_valid_fold = y_train[train_index], y_train[valid_index]
# Summarize the composition of classes in the train and validation sets
train_0, train_1 = len(y_train_fold[y_train_fold == 0]), len(y_train_fold[y_train_fold == 1])
valid_0, valid_1 = len(y_valid_fold[y_valid_fold == 0]), len(y_valid_fold[y_valid_fold == 1])
print(f'Trial {trial.number}, Fold {fold_idx}: Train size = {len(train_index)} where 0 = {train_0}, 1 = {train_1}, 0/1 = {train_0/train_1}')
print(f'Trial {trial.number}, Fold {fold_idx}: Validation size = {len(valid_index)} where 0 = {valid_0}, 1 = {valid_1}, 0/1 = {valid_0/valid_1}')
# Create LightGBM datasets for efficient training
dtrain_fold = lgb.Dataset(X_train_fold, label = y_train_fold, categorical_feature = cat_indices)
dvalid_fold = lgb.Dataset(X_valid_fold, label = y_valid_fold)
# Suggest the number of boosting rounds (trees) for the LightGBM model
# This value is tuned via Optuna's suggest_int method, which picks an integer between 5 and 100
# The number of boosting rounds defines how many trees are built during training
num_round = trial.suggest_int("num_boost_round", 5, 100)
# https://lightgbm.readthedocs.io/en/stable/pythonapi/lightgbm.train.html
# Train the LightGBM model using the provided parameters and training data
# The num_boost_round parameter is specified separately rather than included in the param dictionary
# This is because num_boost_round is not a model hyperparameter that affects the tree-building process,
# but rather controls the number of boosting iterations during training.
# Including it as a separate argument in lgb.train allows for more flexible and clear tuning using Optuna
start_fold = time.perf_counter()
clf = lgb.train(params = param, train_set = dtrain_fold,
num_boost_round = num_round, # Get num_rounds from Optuna
valid_sets = [dvalid_fold]
)
end_fold = time.perf_counter()
# Predict probabilities and convert to binary labels at 0.5 threshold
y_prob_fold = clf.predict(X_valid_fold)
y_pred_fold = np.rint(y_prob_fold) # Round probabilities to 0/1 labels (effective threshold 0.5; note np.rint maps an exact 0.5 to 0, the nearest even integer)
print(f'Trial {trial.number}, Fold {fold_idx}: '
f'Log loss = {log_loss(y_valid_fold, y_prob_fold)}, '
f'Average precision = {average_precision_score(y_valid_fold, y_prob_fold)}, '
f'ROC-AUC = {roc_auc_score(y_valid_fold, y_prob_fold)}, '
f'Elapsed Time = {end_fold - start_fold} seconds')
# Calculate and store the evaluation metrics for this fold
lloss_scores.append(log_loss(y_valid_fold, y_prob_fold))
pr_auc_scores.append(average_precision_score(y_valid_fold, y_prob_fold))
roc_auc_scores.append(roc_auc_score(y_valid_fold, y_prob_fold))
del X_train_fold, X_valid_fold, y_train_fold, y_valid_fold, dtrain_fold, dvalid_fold, clf, start_fold, end_fold
gc.collect()
# Calculate average metrics across all folds for Optuna to optimize
mean_lloss = np.mean(lloss_scores)
mean_pr_auc = np.mean(pr_auc_scores)
mean_roc_auc = np.mean(roc_auc_scores)
del lloss_scores, pr_auc_scores, roc_auc_scores
gc.collect()
# Return the metrics to Optuna for optimization
return mean_lloss, mean_pr_auc, mean_roc_auc
# Initialize a progress bar for visual feedback during optimization
# https://tqdm.github.io/docs/tqdm/
trial_progress = tqdm(total = n_trials, desc = "Optimization Progress", leave = True,
ascii = True, # Plain text mode
dynamic_ncols = True # Auto-fit width
)
# Callback for Optuna to update the progress bar after each trial
def update_progress(study_lgbm, trial):
trial_progress.update(1)
# Disable Optuna's stdout handler so notebook isn’t spammed
optuna.logging.disable_default_handler()
# Enable propagation to Python’s logging
optuna.logging.enable_propagation()
optuna.logging.set_verbosity(optuna.logging.DEBUG)
# Configure Python logging
logging.basicConfig(filename = "optuna_debug_LGBM.log", filemode = "w", level = logging.DEBUG, format = "%(asctime)s %(levelname)s %(message)s")
study_lgbm = optuna.create_study(study_name = "Optuna for LGBM",
directions = ["minimize", "maximize", "maximize"],
sampler = module.AutoSampler(seed = seed)
)
start_optuna = time.perf_counter()
study_lgbm.optimize(objective, n_trials = n_trials, n_jobs = 1, callbacks = [update_progress])
end_optuna = time.perf_counter()
print(f'Optuna Optimization Elapsed Time: {end_optuna - start_optuna} seconds')
# Create a Pareto-front plot (3D scatter plot) to show trade-off between log-loss, PR_AUC and ROC-AUC for each trial
fig = plot_pareto_front(study_lgbm, target_names = ["Log loss", "PR-AUC", "ROC-AUC"])
fig.update_layout(width = 900, height = 400)
fig.show()
trial_progress.close()
# Plot the optimization history for each objective
metrics = ["Log loss", "PR-AUC", "ROC-AUC"]
for i, obj in enumerate(metrics):
    optuna.visualization.plot_optimization_history(study_lgbm,
        target = lambda t, i = i: t.values[i], # Bind i at definition time; a plain lambda captures the loop variable late and would plot the last objective three times
        target_name = obj).show()
best_trials = study_lgbm.best_trials
best_trials_lgbm = {}
# Lists to store the performance metrics from best trial(s)
exec_time_lgbm, lloss_auc_train_lgbm, lloss_auc_test_lgbm, all_metrics = [], [], [], []
# List to store confusion matrices and their labels
cm_lgbm_all, cm_labels_lgbm_all = [], []
for i, trial in enumerate(best_trials):
display(Markdown(f"<span style = 'font-size: 18px; font-weight: bold;'> Training with Best Trial {trial.number} </span>"))
best_params = trial.params
# Non-optimized and best Optuna optimized parameters
full_params = {"objective": "binary",
"metric": ["binary_logloss", "auc", "average_precision"],
"device_type": "cpu",
"verbosity": 2,
"random_state": seed,
"deterministic": True,
"bagging_seed": seed,
"feature_fraction_seed": seed,
"drop_seed": seed,
"force_col_wise": True,
"num_threads": -1,
"max_depth": -1, # <= 0, No tree depth limit
**best_params
}
# Prepare the data - 80% training set and 20% test set
dtrain_all = lgb.Dataset(X_train, label = y_train, categorical_feature = cat_indices, feature_name = feature_name)
dtest_all = lgb.Dataset(X_test, label = y_test, categorical_feature = cat_indices,
feature_name = feature_name)
#display(HTML(str(trial.params)))
display(HTML(full_params.__repr__()))
# To be able to store num_boost_round as well
best_trials_lgbm[trial.number] = full_params.copy()
# Extract `num_boost_round` separately (default to 100 if not found)
num_boost_round = full_params.pop("num_boost_round", 100) # Remove from dictionary and set default
start_train = time.perf_counter()
# https://lightgbm.readthedocs.io/en/stable/pythonapi/lightgbm.train.html
final_lgbm = lgb.train(params = full_params,
train_set = dtrain_all,
num_boost_round = num_boost_round # Pass explicitly
)
end_train = time.perf_counter()
print(f'Training Elapsed Time: {end_train - start_train} seconds')
# Save trained final_lgbm from the best trial
final_lgbm.save_model(f'{temp_dir}/booster_LGBM_best_trial_{trial.number}.txt')
y_prob_all = final_lgbm.predict(X_test)
y_pred_all = np.rint(y_prob_all)
print(f'Log loss: (Train) {trial.values[0]} vs (Test) {log_loss(y_test, y_prob_all)}')
print(f'PR-AUC: (Train) {trial.values[1]} vs (Test) {average_precision_score(y_test, y_prob_all)}')
print(f'ROC-AUC: (Train) {trial.values[2]} vs (Test) {roc_auc_score(y_test, y_prob_all)}')
exec_time_lgbm.append({"Classifier": "LGBM",
"Best Trial": trial.number,
"Optimization Elapsed Time (s)": end_optuna - start_optuna,
"Training Elapsed Time (s)": end_train - start_train})
lloss_auc_train_lgbm.append({"Classifier": "LGBM",
"Best Trial": trial.number,
"Set": "Training",
"Log loss": trial.values[0],
"PR-AUC": trial.values[1],
"ROC-AUC": trial.values[2]})
lloss_auc_test_lgbm.append({"Classifier": "LGBM",
"Best Trial": trial.number,
"Set": "Test",
"Log loss": log_loss(y_test, y_prob_all),
"PR-AUC": average_precision_score(y_test, y_prob_all),
"ROC-AUC": roc_auc_score(y_test, y_prob_all)})
report = classification_report(y_test, y_pred_all, target_names = ["Safe", "Risky"], output_dict = True)
all_metrics.append({"Classifier": "LGBM",
"Trial": trial.number,
"Accuracy": accuracy_score(y_test, y_pred_all),
"Precision (Safe)": report["Safe"]["precision"],
"Recall (Safe)": report["Safe"]["recall"],
"F1-score (Safe)": report["Safe"]["f1-score"],
"Precision (Risky)": report["Risky"]["precision"],
"Recall (Risky)": report["Risky"]["recall"],
"F1-score (Risky)": report["Risky"]["f1-score"],
"Precision (Macro avg)": report["macro avg"]["precision"],
"Recall (Macro avg)": report["macro avg"]["recall"],
"F1-score (Macro avg)": report["macro avg"]["f1-score"],
"Precision (Weighted avg)": report["weighted avg"]["precision"],
"Recall (Weighted avg)": report["weighted avg"]["recall"],
"F1-score (Weighted avg)": report["weighted avg"]["f1-score"]})
# Store confusion matrix
cm_final_lgbm = confusion_matrix(y_test, y_pred_all)
cm_lgbm_all.append(cm_final_lgbm)
cm_labels_lgbm_all.append(f'LGBM Confusion Matrix for Best Trial {trial.number}') # Store label for subplots
df_metrics_lgbm = pd.DataFrame(all_metrics)
gc.collect();
Optimization Progress: 0%| | 0/100 [00:00<?, ?it/s]
Trial 0, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 0, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 11600
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[... verbose per-tree LightGBM debug output truncated ...]
12 [LightGBM] [Debug] Re-bagging, using 5172 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5087 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 8 [LightGBM] [Debug] Re-bagging, using 5206 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5239 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5076 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 12 [LightGBM] [Debug] Re-bagging, using 5165 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with 
leaves = 41 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5163 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5106 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5240 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5208 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5188 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5208 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] 
[Debug] Trained a tree with leaves = 44 and depth = 8 [LightGBM] [Debug] Re-bagging, using 5129 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5176 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5092 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5208 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5088 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5221 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best 
gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5138 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5141 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5169 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5239 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5075 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5242 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 11 [LightGBM] [Warning] No further 
splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5151 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5164 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 8 [LightGBM] [Debug] Re-bagging, using 5038 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5202 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5189 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5091 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 11 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5172 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5010 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5203 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 Trial 0, Fold 1: Log loss = 0.32964674084575407, Average precision = 0.9635670908320297, ROC-AUC = 0.9569285821826472, Elapsed Time = 1.5621259999998074 seconds Trial 0, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 0, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986 [LightGBM] [Info] Number of positive: 10230, number of negative: 10471 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.795629 [LightGBM] [Info] Total Bins 11614 [LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 258 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285 [LightGBM] [Info] Start training from score -0.023285 [LightGBM] [Debug] Re-bagging, using 5245 data to train [LightGBM] [Warning] No further splits 
with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5191 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5207 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5044 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5262 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5190 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Re-bagging, using 5139 data to train [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 12 [LightGBM] [Debug] Re-bagging, using 5257 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5173 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 8 [LightGBM] [Debug] Re-bagging, using 5043 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5046 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5091 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 8 [LightGBM] [Debug] Re-bagging, using 5108 
data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5196 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5249 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 8 [LightGBM] [Debug] Re-bagging, using 5205 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5183 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5106 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 12 [LightGBM] [Debug] 
Re-bagging, using 5224 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 8 [LightGBM] [Debug] Re-bagging, using 5242 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5086 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5173 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5179 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5120 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 
9 [LightGBM] [Debug] Re-bagging, using 5248 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5212 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5190 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5206 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5136 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5181 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree 
with leaves = 38 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5105 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5221 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5102 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 8 [LightGBM] [Debug] Re-bagging, using 5237 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5155 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5151 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] 
[Debug] Trained a tree with leaves = 42 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5182 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5246 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5071 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5237 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5169 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5167 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, 
best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5054 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5207 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5205 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5098 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 13 [LightGBM] [Debug] Re-bagging, using 5188 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5023 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 9 [LightGBM] [Warning] No further 
[… verbose LightGBM [Debug]/[Warning] tree-growth messages omitted …]
Trial 0, Fold 2: Log loss = 0.32852376972252656, Average precision = 0.9609631282739565, ROC-AUC = 0.9569558301650926, Elapsed Time = 1.8901175999999396 seconds
Trial 0, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 0, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[LightGBM] [Info] Number of positive: 10165, number of negative: 10517
[LightGBM] [Info] Total Bins 11619
[LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043
[LightGBM] [Info] Start training from score -0.034043
[… verbose tree-growth messages omitted …]
Trial 0, Fold 3: Log loss = 0.32251907251604217, Average precision = 0.9651567134011153, ROC-AUC = 0.9593725320472691, Elapsed Time = 1.500842399999783 seconds
Trial 0, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 0, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[LightGBM] [Info] Number of positive: 10177, number of negative: 10479
[LightGBM] [Info] Total Bins 11600
[LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243
[LightGBM] [Info] Start training from score -0.029243
[… verbose tree-growth messages omitted …]
Trial 0, Fold 4: Log loss = 0.3300627228082963, Average precision = 0.9630914115390379, ROC-AUC = 0.9564496496109836, Elapsed Time = 1.4581862999993973 seconds
Trial 0, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 0, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[LightGBM] [Info] Number of positive: 10150, number of negative: 10500
[LightGBM] [Info] Total Bins 11602
[LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902
[LightGBM] [Info] Start training from score -0.033902
[… verbose tree-growth messages omitted …]
depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5236 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 8 [LightGBM] [Debug] Re-bagging, using 5150 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5023 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5035 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 8 [LightGBM] [Debug] Re-bagging, using 5083 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5107 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a 
tree with leaves = 39 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5191 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5233 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5197 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5166 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Re-bagging, using 5082 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 8 [LightGBM] [Debug] Re-bagging, using 5200 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf 
[LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5232 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 8 [LightGBM] [Debug] Re-bagging, using 5077 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5152 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5158 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 12 [LightGBM] [Debug] Re-bagging, using 5120 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5232 data to train [LightGBM] [Warning] No further splits with positive 
gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5201 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5175 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5194 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 8 [LightGBM] [Debug] Re-bagging, using 5142 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5181 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5093 data to train [LightGBM] [Warning] No further 
splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5202 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5085 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5212 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5143 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5138 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 8 [LightGBM] [Debug] Re-bagging, using 5161 data to train [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5234 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5061 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5241 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5146 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5162 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 8 [LightGBM] [Debug] Re-bagging, using 5038 data 
to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5204 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 12 [LightGBM] [Debug] Re-bagging, using 5193 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5082 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5175 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5020 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 10 [LightGBM] [Debug] 
Re-bagging, using 5202 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 8 Trial 0, Fold 5: Log loss = 0.3261443116609128, Average precision = 0.9628277217413703, ROC-AUC = 0.956578860338517, Elapsed Time = 1.429886599999918 seconds
Optimization Progress: 1%|1 | 1/100 [00:15<25:26, 15.42s/it]
Trial 1, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 1, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 21036
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[LightGBM per-tree [Debug]/[Warning] output omitted]
Trial 1, Fold 1: Log loss = 0.3089323124480566, Average precision = 0.9724404048219716, ROC-AUC = 0.9668692316332334, Elapsed Time = 1.6683110999993005 seconds
Trial 1, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 1, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Info] Total Bins 21050
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
[LightGBM per-tree [Debug]/[Warning] output omitted]
gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 12 [LightGBM] [Debug] Re-bagging, using 15724 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15800 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 12 [LightGBM] [Debug] Re-bagging, using 15725 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, 
best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 12 [LightGBM] [Debug] Re-bagging, using 15924 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 12 [LightGBM] [Debug] Re-bagging, using 15860 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 11 [LightGBM] [Debug] Re-bagging, using 15886 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 12 [LightGBM] [Warning] No further splits with positive 
gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 15 [LightGBM] [Debug] Re-bagging, using 15792 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 12 [LightGBM] [Debug] Re-bagging, using 15886 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 12 [LightGBM] [Debug] Re-bagging, using 15962 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 12 Trial 1, Fold 2: Log loss = 0.3117475916409317, 
Average precision = 0.970029552862639, ROC-AUC = 0.9663060734623168, Elapsed Time = 2.4145504999996774 seconds Trial 1, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 1, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 [LightGBM] [Info] Number of positive: 10165, number of negative: 10517 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791401 [LightGBM] [Info] Total Bins 21049 [LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043 [LightGBM] [Info] Start training from score -0.034043 [LightGBM] [Debug] Re-bagging, using 15814 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 13 [LightGBM] [Debug] Re-bagging, using 15958 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 12 [LightGBM] [Debug] Re-bagging, using 15865 data to 
train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 12 [LightGBM] [Debug] Re-bagging, using 15773 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 13 [LightGBM] [Debug] Re-bagging, using 15811 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 13 [LightGBM] [Debug] Re-bagging, using 15795 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and 
depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15810 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 11 [LightGBM] [Debug] Re-bagging, using 15828 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 11 [LightGBM] [Debug] Re-bagging, using 15875 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 
and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15797 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 84 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 12 [LightGBM] [Debug] Re-bagging, using 15800 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 12 [LightGBM] [Debug] Re-bagging, using 15715 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves 
= 71 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15788 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15713 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15906 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with 
leaves = 65 and depth = 11 [LightGBM] [Debug] Re-bagging, using 15828 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 12 [LightGBM] [Debug] Re-bagging, using 15873 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15779 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 13 [LightGBM] [Debug] Re-bagging, using 15865 data to train [LightGBM] [Warning] No further splits with positive 
gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 11 [LightGBM] [Debug] Re-bagging, using 15972 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 13 Trial 1, Fold 3: Log loss = 0.31072578688090474, Average precision = 0.972312781336129, ROC-AUC = 0.9677071394862806, Elapsed Time = 2.2389018999992913 seconds Trial 1, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 1, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791090 [LightGBM] [Info] Total Bins 21033 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 15793 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 11 [LightGBM] [Warning] No further splits with positive 
gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 11 [LightGBM] [Debug] Re-bagging, using 15936 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 12 [LightGBM] [Debug] Re-bagging, using 15848 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 12 [LightGBM] [Debug] Re-bagging, using 15752 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 12 [LightGBM] [Warning] No further splits with 
positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 14 [LightGBM] [Debug] Re-bagging, using 15794 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 12 [LightGBM] [Debug] Re-bagging, using 15777 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 13 [LightGBM] [Debug] Re-bagging, using 15798 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 11 [LightGBM] [Warning] No further splits 
with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 13 [LightGBM] [Debug] Re-bagging, using 15799 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 12 [LightGBM] [Debug] Re-bagging, using 15857 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15772 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 11 [LightGBM] [Debug] Re-bagging, 
using 15777 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 12 [LightGBM] [Debug] Re-bagging, using 15698 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15779 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 13 [LightGBM] [Debug] Re-bagging, using 15696 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree 
[... per-tree LightGBM Debug/Warning output trimmed ...]
Trial 1, Fold 4: Log loss = 0.31275412815430426, Average precision = 0.9708694402491778, ROC-AUC = 0.9654651029707362, Elapsed Time = 2.2350790000000416 seconds
Trial 1, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 1, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[LightGBM] [Info] Number of positive: 10150, number of negative: 10500
[LightGBM] [Info] Total Bins 21038
[LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902
[LightGBM] [Info] Start training from score -0.033902
[... per-tree LightGBM Debug/Warning output trimmed ...]
Trial 1, Fold 5: Log loss = 0.3118220763524803, Average precision = 0.9697970675961709, ROC-AUC = 0.9645898593538078, Elapsed Time = 2.185679899999741 seconds
Optimization Progress: 2%|2 | 2/100 [00:33<27:54, 17.08s/it]
Trial 2, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 2, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 10006
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[... per-tree LightGBM Debug/Warning output trimmed ...]
Trial 2, Fold 1: Log loss = 0.31630301819399675, Average precision = 0.9665739919306886, ROC-AUC = 0.9600088780870172, Elapsed Time = 1.318768000000091 seconds
Trial 2, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 2, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Info] Total Bins 10022
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
[... per-tree LightGBM Debug/Warning output trimmed ...]
depth = 13 [LightGBM] [Debug] Re-bagging, using 15082 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 11 [LightGBM] [Debug] Re-bagging, using 15159 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15068 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 11 [LightGBM] [Debug] Re-bagging, using 15315 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 11 [LightGBM] [Debug] Re-bagging, using 15254 data to train [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15279 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Debug] Re-bagging, using 15207 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Debug] Re-bagging, using 15228 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15336 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 16 [LightGBM] [Debug] Re-bagging, using 15089 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 11 [LightGBM] [Debug] Re-bagging, using 15186 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15171 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15235 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 9 [LightGBM] [Warning] No further 
splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15358 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15154 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15231 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15290 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree 
with leaves = 47 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 14 [LightGBM] [Debug] Re-bagging, using 15293 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Debug] Re-bagging, using 15133 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 12 [LightGBM] [Debug] Re-bagging, using 15172 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 12 [LightGBM] [Debug] Re-bagging, using 15218 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 11 [LightGBM] [Warning] No further splits with 
positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 11 Trial 2, Fold 2: Log loss = 0.31566260066916224, Average precision = 0.9642587196322059, ROC-AUC = 0.9608000568143709, Elapsed Time = 1.4685500999994474 seconds Trial 2, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 2, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 [LightGBM] [Info] Number of positive: 10165, number of negative: 10517 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791401 [LightGBM] [Info] Total Bins 10024 [LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043 [LightGBM] [Info] Start training from score -0.034043 [LightGBM] [Debug] Re-bagging, using 15163 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15300 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 14 [LightGBM] [Debug] Re-bagging, using 15264 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and 
depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15146 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 12 [LightGBM] [Debug] Re-bagging, using 15219 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 12 [LightGBM] [Debug] Re-bagging, using 15181 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 14 [LightGBM] [Debug] Re-bagging, using 15183 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: 
-inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 9 [LightGBM] [Debug] Re-bagging, using 15207 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 11 [LightGBM] [Debug] Re-bagging, using 15267 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 13 [LightGBM] [Debug] Re-bagging, using 15133 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 12 [LightGBM] [Debug] Re-bagging, using 15177 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 11 [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15074 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15151 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15053 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15298 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] 
[Debug] Trained a tree with leaves = 54 and depth = 12 [LightGBM] [Debug] Re-bagging, using 15222 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15265 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 12 [LightGBM] [Debug] Re-bagging, using 15195 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 11 [LightGBM] [Debug] Re-bagging, using 15205 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 11 [LightGBM] [Debug] 
Re-bagging, using 15348 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 13 [LightGBM] [Debug] Re-bagging, using 15080 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 11 [LightGBM] [Debug] Re-bagging, using 15163 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 12 [LightGBM] [Debug] Re-bagging, using 15156 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 12 [LightGBM] [Debug] Re-bagging, using 15206 data to train [LightGBM] [Warning] No further splits 
with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15353 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 14 [LightGBM] [Debug] Re-bagging, using 15145 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15209 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15276 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with 
leaves = 52 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Debug] Re-bagging, using 15277 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 13 [LightGBM] [Debug] Re-bagging, using 15115 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 9 [LightGBM] [Debug] Re-bagging, using 15164 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 11 [LightGBM] [Debug] Re-bagging, using 15216 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 12 [LightGBM] [Warning] No further splits with positive 
gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 11 Trial 2, Fold 3: Log loss = 0.31411749801159977, Average precision = 0.9676660962346362, ROC-AUC = 0.9623256481686586, Elapsed Time = 1.5098760999999286 seconds Trial 2, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 2, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791090 [LightGBM] [Info] Total Bins 10003 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 15143 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15279 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15247 data to 
[LightGBM per-tree Debug/Warning output omitted: repeated "[Warning] No further splits with positive gain, best gain: -inf", "[Debug] Trained a tree with leaves = …, depth = …", and "[Debug] Re-bagging, using … data to train" messages]
Trial 2, Fold 4: Log loss = 0.315470578568722, Average precision = 0.9666878301659132, ROC-AUC = 0.9606021737471715, Elapsed Time = 1.6039383999996062 seconds
Trial 2, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 2, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[LightGBM] [Info] Number of positive: 10150, number of negative: 10500
[LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.792012
[LightGBM] [Info] Total Bins 10007
[LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902
[LightGBM] [Info] Start training from score -0.033902
[LightGBM per-tree Debug/Warning output omitted]
Trial 2, Fold 5: Log loss = 0.31903414895241994, Average precision = 0.964618303671335, ROC-AUC = 0.9587533210709176, Elapsed Time = 1.4919348000003083 seconds
Optimization Progress: 3%|3 | 3/100 [00:49<26:41, 16.51s/it]
Trial 3, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 3, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.805817
[LightGBM] [Info] Total Bins 24071
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 267
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[LightGBM per-tree Debug output omitted: repeated "[Debug] Trained a tree with leaves = 17, depth = …" and "[Debug] Re-bagging, using … data to train" messages]
Trial 3, Fold 1: Log loss = 0.5597832453424669, Average precision = 0.9667125671758896, ROC-AUC = 0.9600614535686803, Elapsed Time = 0.5946309999999357 seconds
Trial 3, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 3, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.806987
[LightGBM] [Info] Total Bins 24079
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 267
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
[LightGBM per-tree Debug output omitted]
Trial 3, Fold 2: Log loss = 0.5610194828577617, Average precision = 0.9630685411421096, ROC-AUC = 0.9596014769308462, Elapsed Time = 0.6716820000001462 seconds
Trial 3, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 3, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[LightGBM] [Info] Number of positive: 10165, number of negative: 10517
[LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.806893
[LightGBM] [Info] Total Bins 24073
[LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 267
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043
[LightGBM] [Info] Start training from score -0.034043
[LightGBM per-tree Debug output omitted]
Trial 3, Fold 3: Log loss = 0.5581836889306623, Average precision = 0.9675225486549472, ROC-AUC = 0.9629256315551232, Elapsed Time = 0.6689807000002475 seconds
Trial 3, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 3, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[LightGBM] [Info] Number of positive: 10177, number of negative: 10479
[LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.804155
[LightGBM] [Info] Total Bins 24060
[LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 265
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243
[LightGBM] [Info] Start training from score -0.029243
[LightGBM per-tree Debug output omitted; log truncated]
a tree with leaves = 17 and depth = 7 [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 7 [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 7 [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Debug] Re-bagging, using 5492 data to train [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 7 [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 7 [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 7 Trial 3, Fold 4: Log loss = 0.5640538981012556, Average precision = 0.9650158233172024, ROC-AUC = 0.9589714609994016, Elapsed Time = 0.713144000000284 seconds Trial 3, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 3, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.807356 [LightGBM] [Info] Total Bins 24068 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 267 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 5509 data to train [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 7 [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 7 [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 7 [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 5 [LightGBM] [Debug] Trained a tree with leaves = 17 and 
depth = 6 [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Debug] Re-bagging, using 5489 data to train [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 7 [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 7 [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5492 data to train [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 7 Trial 3, Fold 5: Log loss = 0.5609692072882091, Average precision = 0.962104120896841, ROC-AUC = 0.956462776136596, Elapsed Time = 0.6953654999997525 seconds
Optimization Progress:   4%|▍         | 4/100 [01:00<22:48, 14.26s/it]
Trial 4 (LightGBM, same 5-fold splits as Trial 3; ~8.5k total bins, 255 used features; bagging re-samples ~4.0k rows; trees of 13–18 leaves, depth 5–6; repeated "No further splits with positive gain" warnings):
  Fold 1: Log loss = 0.3602, Average precision = 0.9516, ROC-AUC = 0.9443, Elapsed = 0.47 s
  Fold 2: Log loss = 0.3391, Average precision = 0.9522, ROC-AUC = 0.9491, Elapsed = 0.50 s
  Fold 3: Log loss = 0.3551, Average precision = 0.9568, ROC-AUC = 0.9500, Elapsed = 0.50 s
  Fold 4: Log loss = 0.3407, Average precision = 0.9555, ROC-AUC = 0.9495, Elapsed = 0.53 s
  Fold 5: Log loss = 0.3415, Average precision = 0.9558, ROC-AUC = 0.9473, Elapsed = 0.51 s
Optimization Progress:   5%|▌         | 5/100 [01:10<20:19, 12.84s/it]
Trial 5 (LightGBM, same 5-fold splits; ~27.7k total bins, 255 used features; bagging re-samples ~8.1k rows; trees of 26–39 leaves, depth 7–12):
  Fold 1: Log loss = 0.5627, Average precision = 0.9616, ROC-AUC = 0.9552, Elapsed = 0.50 s
  Fold 2: Log loss = 0.5603, Average precision = 0.9624, ROC-AUC = 0.9581, Elapsed = 0.51 s
  Fold 3: Log loss = 0.5582, Average precision = 0.9661, ROC-AUC = 0.9598, Elapsed = 0.53 s
  Fold 4: Log loss = 0.5550, Average precision = 0.9652, ROC-AUC = 0.9591, Elapsed = 0.53 s
  Fold 5: Log loss = 0.5596, Average precision = 0.9632, ROC-AUC = 0.9571, Elapsed = 0.54 s
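Each trial's fold scores are reduced to the scalar value(s) that Optuna optimizes; the most common reduction is the cross-validation mean, optionally reported alongside the standard deviation as a stability check. A small sketch of that aggregation, using Trial 5's fold values from the log above (whether this notebook also applies a std penalty is an assumption):

```python
from statistics import mean, stdev

# Fold-level scores for Trial 5, copied from the log above.
log_losses = [0.5627, 0.5603, 0.5582, 0.5550, 0.5596]
roc_aucs   = [0.9552, 0.9581, 0.9598, 0.9591, 0.9571]

# Reduce the five fold scores to one per-trial objective value each.
cv_log_loss = mean(log_losses)
cv_roc_auc = mean(roc_aucs)
print(f"Trial 5: CV log loss = {cv_log_loss:.4f} (±{stdev(log_losses):.4f}), "
      f"CV ROC-AUC = {cv_roc_auc:.4f} (±{stdev(roc_aucs):.4f})")
```

With log loss minimized and ROC-AUC maximized as separate objectives, Optuna's multi-objective mode keeps a Pareto front of trials rather than a single best, which matches the `plot_pareto_front` import in the header.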
Optimization Progress:   6%|▌         | 6/100 [01:21<19:00, 12.13s/it]
Trial 6 (LightGBM, same 5-fold splits; ~20.2k total bins, 256 used features; bagging re-samples ~12.6k rows; trees of 71–102 leaves, depth 11–19):
  Fold 1: Log loss = 0.5725, Average precision = 0.9548, ROC-AUC = 0.9498, Elapsed = 0.58 s
  Fold 2: Log loss = 0.5658, Average precision = 0.9606, ROC-AUC = 0.9565, Elapsed = 0.61 s
  Fold 3: Train = 20682 (0 = 10517, 1 = 10165, 0/1 = 1.0346), Validation = 5156 (0 = 2608, 1 = 2548, 0/1 = 1.0235) …
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 90 and depth = 14 [LightGBM] [Debug] Re-bagging, using 12664 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 95 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 97 and depth = 16 [LightGBM] [Debug] Re-bagging, using 12715 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 99 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 97 and depth = 14 [LightGBM] [Debug] Re-bagging, using 12511 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 99 and depth = 14 [LightGBM] [Debug] Re-bagging, using 12656 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 99 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 95 and depth = 15 [LightGBM] [Debug] Re-bagging, using 12502 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 91 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 95 and depth = 14 [LightGBM] [Debug] Re-bagging, 
using 12611 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 14 [LightGBM] [Debug] Re-bagging, using 12667 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 93 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 11 [LightGBM] [Debug] Re-bagging, using 12653 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 101 and depth = 17 Trial 6, Fold 3: Log loss = 0.5633294475258322, Average precision = 0.9644919097673798, ROC-AUC = 0.9585736075931082, Elapsed Time = 0.6343689000004815 seconds Trial 6, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 6, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.792454 [LightGBM] [Info] Total Bins 20218 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 256 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 12614 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 85 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 85 and depth = 14 [LightGBM] [Debug] Re-bagging, using 12651 data to train 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 94 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 97 and depth = 17 [LightGBM] [Debug] Re-bagging, using 12698 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 101 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 93 and depth = 18 [LightGBM] [Debug] Re-bagging, using 12485 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 98 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 100 and depth = 13 [LightGBM] [Debug] Re-bagging, using 12640 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 101 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 98 and depth = 16 [LightGBM] [Debug] Re-bagging, using 12500 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 86 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 100 and depth = 17 [LightGBM] [Debug] Re-bagging, using 12598 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 93 and depth = 12 [LightGBM] [Debug] 
Re-bagging, using 12649 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 15 [LightGBM] [Debug] Re-bagging, using 12637 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 93 and depth = 17 Trial 6, Fold 4: Log loss = 0.5648543846673204, Average precision = 0.9643431122624986, ROC-AUC = 0.9583624678402732, Elapsed Time = 0.7327912000000651 seconds Trial 6, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 6, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.793364 [LightGBM] [Info] Total Bins 20213 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 256 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 12611 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 88 and depth = 15 [LightGBM] [Debug] Re-bagging, using 12647 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 99 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 95 and depth = 20 [LightGBM] [Debug] Re-bagging, using 12698 data 
to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 94 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 99 and depth = 15 [LightGBM] [Debug] Re-bagging, using 12478 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 98 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 100 and depth = 13 [LightGBM] [Debug] Re-bagging, using 12635 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 94 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 98 and depth = 15 [LightGBM] [Debug] Re-bagging, using 12496 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 85 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 97 and depth = 19 [LightGBM] [Debug] Re-bagging, using 12599 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 101 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 91 and depth = 18 [LightGBM] [Debug] Re-bagging, using 12640 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 90 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 94 and depth = 16 [LightGBM] 
[Debug] Re-bagging, using 12636 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 17 Trial 6, Fold 5: Log loss = 0.5652957851302245, Average precision = 0.9619926890222837, ROC-AUC = 0.956429333184698, Elapsed Time = 0.6392338000005111 seconds
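Each fold summary above reports log loss, average precision, and ROC-AUC on the validation split. A minimal sketch of how such per-fold metrics can be computed with scikit-learn (the variable names `fold_metrics`, `y_val`, and `proba_val` are illustrative, not the notebook's exact code):

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

def fold_metrics(y_val, proba_val):
    """Compute the three validation metrics reported per fold."""
    return {
        "log_loss": log_loss(y_val, proba_val),
        "avg_precision": average_precision_score(y_val, proba_val),
        "roc_auc": roc_auc_score(y_val, proba_val),
    }

# Tiny illustration with synthetic labels and predicted probabilities.
y_val = np.array([0, 1, 1, 0, 1, 0])
proba_val = np.array([0.2, 0.8, 0.7, 0.4, 0.9, 0.1])
m = fold_metrics(y_val, proba_val)
print(m)
```

Because every positive here is ranked above every negative, ROC-AUC and average precision both reach 1.0 while log loss stays positive, which is why log loss remains informative even when ranking metrics saturate.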
Optimization Progress: 7% | 7/100 [01:31<17:55, 11.57s/it]
[LightGBM per-tree [Debug]/[Warning] output omitted; per-fold summaries retained]
Trial 7, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 7, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Trial 7, Fold 1: Log loss = 0.34242023144480144, Average precision = 0.9722076973965884, ROC-AUC = 0.966624128322412, Elapsed Time = 1.238731899999948 seconds
Trial 7, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 7, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Trial 7, Fold 2: Log loss = 0.3405479248799782, Average precision = 0.9708784536841033, ROC-AUC = 0.9671138269243669, Elapsed Time = 1.569465699999455 seconds
Trial 7, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 7, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[LightGBM training output for Trial 7, Fold 3 truncated in this excerpt]
depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 12 [LightGBM] [Debug] Re-bagging, using 9823 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 10 [LightGBM] [Debug] Re-bagging, using 9982 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 11 [LightGBM] [Warning] No further splits with 
positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 12 [LightGBM] [Debug] Re-bagging, using 9829 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 12 [LightGBM] [Debug] Re-bagging, using 9917 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
Trained a tree with leaves = 64 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 13 [LightGBM] [Debug] Re-bagging, using 9927 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 11 [LightGBM] [Debug] Re-bagging, using 9941 data to train 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 10 [LightGBM] [Debug] Re-bagging, using 9682 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 11 Trial 7, Fold 3: Log loss = 0.3415467230855122, Average precision = 0.9724346573249535, ROC-AUC = 0.9676446882433956, Elapsed Time = 1.8036743000002389 seconds Trial 7, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 7, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.795144 [LightGBM] [Info] Total Bins 11145 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 258 [LightGBM] [Info] [binary:BoostFromScore]: 
pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 9949 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 12 [LightGBM] [Debug] Re-bagging, using 9908 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a 
tree with leaves = 68 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 12 [LightGBM] [Debug] Re-bagging, using 9889 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 13 [LightGBM] [Debug] Re-bagging, using 9800 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 13 [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 13 [LightGBM] [Debug] Re-bagging, using 9963 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 11 [LightGBM] [Debug] Re-bagging, using 9832 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best 
gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 12 [LightGBM] [Debug] Re-bagging, using 9908 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 12 [LightGBM] [Debug] Re-bagging, using 9920 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with 
leaves = 68 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 11 [LightGBM] [Debug] Re-bagging, using 9932 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 11 [LightGBM] [Debug] Re-bagging, using 9658 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 13 Trial 7, Fold 4: Log loss = 0.3424819866368217, Average precision = 0.9721901674147935, ROC-AUC = 0.9669495470813634, Elapsed Time = 1.786351100000502 seconds Trial 7, 
Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 7, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.796026 [LightGBM] [Info] Total Bins 11147 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 258 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 9947 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 11 [LightGBM] [Debug] Re-bagging, using 9905 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 12 [LightGBM] [Warning] No further 
splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 12 [LightGBM] [Debug] Re-bagging, using 9889 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 11 [LightGBM] [Debug] Re-bagging, using 9793 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] 
[Debug] Trained a tree with leaves = 65 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 12 [LightGBM] [Debug] Re-bagging, using 9961 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 11 [LightGBM] [Debug] Re-bagging, using 9826 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 12 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 10 [LightGBM] [Debug] Re-bagging, using 9912 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 12 [LightGBM] [Debug] Re-bagging, using 9914 data to train [LightGBM] [Warning] No further splits with positive 
[LightGBM per-tree Debug/Warning/Info output elided for readability; per-fold summaries retained below]

Trial 7, Fold 5: Log loss = 0.3435956901704336, Average precision = 0.9697871855943496, ROC-AUC = 0.9651655240324767, Elapsed Time = 1.7220666000002893 seconds

Optimization Progress:   8%|8         | 8/100 [01:47<19:41, 12.85s/it]

Trial 8, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 8, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Trial 8, Fold 1: Log loss = 0.31846836571832476, Average precision = 0.9630925930307719, ROC-AUC = 0.9555496706863013, Elapsed Time = 1.1809659999998985 seconds

Trial 8, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 8, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Trial 8, Fold 2: Log loss = 0.3197585532648229, Average precision = 0.960242214402566, ROC-AUC = 0.9550711226695712, Elapsed Time = 1.3920828999998776 seconds

Trial 8, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 8, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Trial 8, Fold 3: Log loss = 0.31408900076580654, Average precision = 0.9649005536183861, ROC-AUC = 0.9582289971203205, Elapsed Time = 1.5456070000000182 seconds

Trial 8, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 8, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 9 [LightGBM] [Debug] Re-bagging, using 9406 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 8 [LightGBM] [Debug] Re-bagging, using 9539 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Debug] Re-bagging, using 9411 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf 
[LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 6 [LightGBM] [Debug] Re-bagging, using 9516 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 8 [LightGBM] [Debug] Re-bagging, using 9509 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 8 [LightGBM] [Debug] Re-bagging, using 9538 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] 
[Debug] Trained a tree with leaves = 25 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 8 [LightGBM] [Debug] Re-bagging, using 9242 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 7 [LightGBM] [Debug] Re-bagging, using 9333 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 6 [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Debug] Re-bagging, using 9364 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Debug] Re-bagging, using 9360 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 7 [LightGBM] [Debug] Re-bagging, using 9405 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] 
No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 7 Trial 8, Fold 4: Log loss = 0.3132359088595296, Average precision = 0.9633989236643054, ROC-AUC = 0.9558619671142203, Elapsed Time = 1.369531599999391 seconds Trial 8, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 8, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.792012 [LightGBM] [Info] Total Bins 7363 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 9516 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 7 [LightGBM] [Debug] Re-bagging, using 9510 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 7 [LightGBM] [Warning] No further 
splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 9 [LightGBM] [Debug] Re-bagging, using 9464 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 7 [LightGBM] [Debug] Re-bagging, using 9399 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 8 [LightGBM] [Debug] Re-bagging, using 9537 data to train [LightGBM] [Warning] No further splits with 
positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 7 [LightGBM] [Debug] Re-bagging, using 9405 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 6 [LightGBM] [Debug] Re-bagging, using 9520 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree 
with leaves = 17 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Debug] Re-bagging, using 9503 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 6 [LightGBM] [Debug] Re-bagging, using 9536 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 8 [LightGBM] [Debug] Re-bagging, using 9245 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves 
= 21 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 8 [LightGBM] [Debug] Re-bagging, using 9324 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 7 [LightGBM] [Debug] Re-bagging, using 9357 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 8 [LightGBM] [Debug] Re-bagging, using 9358 data 
to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Debug] Re-bagging, using 9416 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 8 Trial 8, Fold 5: Log loss = 0.3175104373823987, Average precision = 0.9616862841710087, ROC-AUC = 0.954297456477714, Elapsed Time = 1.3472390000006271 seconds
Optimization Progress:   9%|▉         | 9/100 [02:02<20:23, 13.45s/it]
Trial 9, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 9, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791640
[LightGBM] [Info] Total Bins 17568
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[… verbose LightGBM [Debug]/[Warning] tree-growth output omitted …]
Trial 9, Fold 1: Log loss = 0.5713470071668255, Average precision = 0.9584267499871053, ROC-AUC = 0.9509777695329864, Elapsed Time = 0.5892558999994435 seconds
Trial 9, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 9, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791600
[LightGBM] [Info] Total Bins 17582
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
[… verbose LightGBM [Debug]/[Warning] tree-growth output omitted …]
Trial 9, Fold 2: Log loss = 0.5698247483669396, Average precision = 0.9531280055612432, ROC-AUC = 0.9486416448731555, Elapsed Time = 0.7269937999999456 seconds
Trial 9, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 9, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[LightGBM] [Info] Number of positive: 10165, number of negative: 10517
[LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791401
[LightGBM] [Info] Total Bins 17585
[LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043
[LightGBM] [Info] Start training from score -0.034043
[… verbose LightGBM [Debug]/[Warning] tree-growth output omitted …]
and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 7 [LightGBM] [Debug] Re-bagging, using 3129 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 Trial 9, Fold 3: Log loss = 0.5710888801937607, Average precision = 0.9603293675144042, ROC-AUC = 0.9536839010025906, Elapsed Time = 0.6494276999992508 seconds Trial 9, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 9, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.792454 [LightGBM] [Info] Total Bins 17572 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 256 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 3223 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 6 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Debug] Re-bagging, using 3214 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 7 [LightGBM] [Debug] Re-bagging, using 3117 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Debug] Re-bagging, using 3109 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 6 [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Debug] Re-bagging, using 3214 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Debug] Re-bagging, using 3204 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Debug] Re-bagging, using 3122 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 7 [LightGBM] [Warning] 
No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 5 Trial 9, Fold 4: Log loss = 0.5738447768195227, Average precision = 0.958603345988887, ROC-AUC = 0.9520692653156602, Elapsed Time = 0.6557638000003863 seconds Trial 9, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 9, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.792012 [LightGBM] [Info] Total Bins 17567 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 3222 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Debug] Re-bagging, using 3214 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 6 [LightGBM] [Warning] No further 
splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Debug] Re-bagging, using 3117 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Debug] Re-bagging, using 3108 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 7 [LightGBM] [Debug] Re-bagging, using 3210 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Debug] Re-bagging, using 3201 
data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Debug] Re-bagging, using 3126 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 Trial 9, Fold 5: Log loss = 0.5738391657501708, Average precision = 0.9533676394410786, ROC-AUC = 0.9467562937777529, Elapsed Time = 0.6461589999998978 seconds
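The per-fold figures logged above for Trial 9 can be aggregated into a single cross-validation summary; a minimal sketch using only the fold metrics visible in this output (fold 1 of Trial 9 falls outside this excerpt, so only folds 2–5 are included):

```python
import numpy as np

# Validation metrics for Trial 9, folds 2-5, copied from the log output above.
log_losses = [0.5698247483669396, 0.5710888801937607,
              0.5738447768195227, 0.5738391657501708]
roc_aucs = [0.9486416448731555, 0.9536839010025906,
            0.9520692653156602, 0.9467562937777529]

# Mean +/- standard deviation across folds is the usual trial-level score
# that an Optuna objective would return.
print(f"Trial 9 mean log loss = {np.mean(log_losses):.4f} +/- {np.std(log_losses):.4f}")
print(f"Trial 9 mean ROC-AUC  = {np.mean(roc_aucs):.4f} +/- {np.std(roc_aucs):.4f}")
```

The small spread across folds suggests the split sizes and class ratios shown in the log are well balanced.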
Optimization Progress: 10%|# | 10/100 [02:13<19:19, 12.88s/it]
Trial 10, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 10, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 9564
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 259
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
(repetitive [LightGBM] [Debug]/[Warning] per-tree messages omitted)
Trial 10, Fold 1: Log loss = 0.30683541454586777, Average precision = 0.9681071884451107, ROC-AUC = 0.962354461507578, Elapsed Time = 1.1848104000000603 seconds
Trial 10, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 10, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Info] Total Bins 9556
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 259
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
(log truncated mid-training for Trial 10, Fold 2)
No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 14 [LightGBM] [Debug] Re-bagging, using 6136 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 13 [LightGBM] [Debug] Re-bagging, using 6011 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 16 Trial 10, Fold 2: Log loss = 0.3178055408882424, Average precision = 
0.9640378864927284, ROC-AUC = 0.9600655975327024, Elapsed Time = 1.279888199999732 seconds Trial 10, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 10, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 [LightGBM] [Info] Number of positive: 10165, number of negative: 10517 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.796784 [LightGBM] [Info] Total Bins 9563 [LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 259 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043 [LightGBM] [Info] Start training from score -0.034043 [LightGBM] [Debug] Re-bagging, using 6189 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 15 [LightGBM] [Debug] Re-bagging, using 6157 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf 
[LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 13 [LightGBM] [Debug] Re-bagging, using 6141 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 11 [LightGBM] [Warning] No further splits with 
positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 14 [LightGBM] [Debug] Re-bagging, using 6086 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 11 [LightGBM] [Debug] Re-bagging, using 6254 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
Trained a tree with leaves = 72 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 14 [LightGBM] [Debug] Re-bagging, using 6202 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 82 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6119 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 10 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 11 [LightGBM] [Debug] Re-bagging, using 6257 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree 
with leaves = 73 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 12 [LightGBM] [Debug] Re-bagging, using 6119 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 12 [LightGBM] [Debug] Re-bagging, using 6009 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 11 Trial 10, Fold 3: Log 
loss = 0.31589710878430993, Average precision = 0.9666835495238092, ROC-AUC = 0.9617884169949245, Elapsed Time = 1.5711670000000595 seconds Trial 10, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 10, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.796474 [LightGBM] [Info] Total Bins 9562 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 259 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 6181 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 12 [LightGBM] [Debug] Re-bagging, using 6146 data to train [LightGBM] [Warning] No 
further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6138 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and 
depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 12 [LightGBM] [Debug] Re-bagging, using 6077 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 82 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 13 [LightGBM] [Debug] Re-bagging, using 6235 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 12 [LightGBM] [Warning] No further splits with 
positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 12 [LightGBM] [Debug] Re-bagging, using 6206 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 11 [LightGBM] [Debug] Re-bagging, using 6111 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
Trained a tree with leaves = 47 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 12 [LightGBM] [Debug] Re-bagging, using 6254 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 94 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 84 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best 
gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 12 [LightGBM] [Debug] Re-bagging, using 6109 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5990 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with 
leaves = 61 and depth = 15 Trial 10, Fold 4: Log loss = 0.3124373915207655, Average precision = 0.967054943875468, ROC-AUC = 0.9605874947244933, Elapsed Time = 1.385622499999954 seconds Trial 10, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 10, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.797340 [LightGBM] [Info] Total Bins 9544 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 259 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 6179 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 15 [LightGBM] [Debug] Re-bagging, 
[... repeated LightGBM [Debug]/[Warning] tree-growth messages omitted ...]
Trial 10, Fold 5: Log loss = 0.31701231614868497, Average precision = 0.9634314803611199, ROC-AUC = 0.9594070936216859, Elapsed Time = 1.3575648000005458 seconds
Optimization Progress: 11%|#1 | 11/100 [02:27<19:42, 13.29s/it]
Trial 11, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 11, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 19014
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[... repeated [Debug]/[Warning] tree-growth messages omitted ...]
Trial 11, Fold 1: Log loss = 0.27839808071739647, Average precision = 0.9699116494901384, ROC-AUC = 0.9638715330054535, Elapsed Time = 0.9067294999995283 seconds
Trial 11, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 11, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[... repeated [Info]/[Debug]/[Warning] messages omitted ...]
Trial 11, Fold 2: Log loss = 0.2773252453732645, Average precision = 0.9679454703507686, ROC-AUC = 0.9644505774154984, Elapsed Time = 0.9466196999992462 seconds
Trial 11, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 11, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[... repeated [Info]/[Debug]/[Warning] messages omitted ...]
Trial 11, Fold 3: Log loss = 0.2747655322336973, Average precision = 0.9708889291563291, ROC-AUC = 0.9660925867515482, Elapsed Time = 0.9953179000003729 seconds
Trial 11, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 11, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[... repeated [Info]/[Debug]/[Warning] messages omitted; output continues ...]
depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10039 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 Trial 11, Fold 4: Log loss = 0.27565143365103895, Average precision = 0.9698695678849122, ROC-AUC = 0.9640579286393842, Elapsed Time = 1.3377421000004688 seconds Trial 11, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 11, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.792012 [LightGBM] [Info] Total Bins 19008 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 10160 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and 
depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10120 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained 
a tree with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 12 [LightGBM] [Debug] Re-bagging, using 10111 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 13 [LightGBM] [Debug] Re-bagging, using 9992 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 12 [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 13 [LightGBM] [Debug] Re-bagging, using 10169 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 12 [LightGBM] [Debug] Re-bagging, using 10032 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best 
gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 12 Trial 11, Fold 5: Log loss = 0.28328383290878856, Average precision = 0.9680872451642151, ROC-AUC = 0.9623242851569032, Elapsed Time = 1.3138933999998699 seconds
Optimization Progress: 12%|#2 | 12/100 [02:40<19:21, 13.20s/it]
Trial 12, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 12, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.795662
[LightGBM] [Info] Total Bins 8297
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
... [repetitive per-tree [Debug]/[Warning] messages trimmed] ...
Trial 12, Fold 1: Log loss = 0.2795864886375474, Average precision = 0.9725727838068232, ROC-AUC = 0.967581689154634, Elapsed Time = 0.7844225999997434 seconds
Trial 12, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 12, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.795629
[LightGBM] [Info] Total Bins 8313
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
... [repetitive per-tree [Debug]/[Warning] messages trimmed] ...
Trial 12, Fold 2: Log loss = 0.2794407935119923, Average precision = 0.9694707487668018, ROC-AUC = 0.9668603322487413, Elapsed Time = 0.7739711000003808 seconds
Trial 12, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 12, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[LightGBM] [Info] Number of positive: 10165, number of negative: 10517
[LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.795455
[LightGBM] [Info] Total Bins 8313
[LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043
[LightGBM] [Info] Start training from score -0.034043
... [repetitive per-tree [Debug]/[Warning] messages trimmed] ...
Trial 12, Fold 3: Log loss = 0.2764566261179014, Average precision = 0.9718120509134004, ROC-AUC = 0.9675455939218537, Elapsed Time = 0.7659664000002522 seconds
Trial 12, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 12, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[LightGBM] [Info] Number of positive: 10177, number of negative: 10479
[LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.795144
[LightGBM] [Info] Total Bins 8295
[LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243
[LightGBM] [Info] Start training from score -0.029243
... [repetitive per-tree [Debug]/[Warning] messages trimmed] ...
[LightGBM] [Debug] Re-bagging, using 6105 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 14 [LightGBM] [Debug] Re-bagging, using 6160 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 13 [LightGBM] [Debug] Re-bagging, using 6160 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 13 [LightGBM] [Debug] Re-bagging, using 6250 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 16 [LightGBM] [Debug] Re-bagging, using 6366 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 12 [LightGBM] [Debug] Re-bagging, using 6256 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 13 [LightGBM] [Debug] Re-bagging, using 6206 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 13 [LightGBM] [Debug] Re-bagging, using 6198 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 15 [LightGBM] [Debug] Re-bagging, using 6295 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 15 [LightGBM] [Debug] Re-bagging, using 6298 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 13 [LightGBM] [Debug] 
Re-bagging, using 6142 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 13 [LightGBM] [Debug] Re-bagging, using 6208 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 15 [LightGBM] [Debug] Re-bagging, using 6270 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 11 [LightGBM] [Debug] Re-bagging, using 6207 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 13 [LightGBM] [Debug] Re-bagging, using 6353 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 17 [LightGBM] [Debug] Re-bagging, using 6274 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 13 [LightGBM] [Debug] Re-bagging, using 6241 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 11 [LightGBM] [Debug] Re-bagging, using 6213 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 12 [LightGBM] [Debug] Re-bagging, using 6209 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 15 Trial 12, Fold 4: Log loss = 0.2785140783863654, Average precision = 0.9718978332825114, ROC-AUC = 0.9670789013116639, Elapsed Time = 0.7898058999999193 seconds Trial 12, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 
1.0344827586206897 Trial 12, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.796026 [LightGBM] [Info] Total Bins 8302 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 258 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 6268 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 12 [LightGBM] [Debug] Re-bagging, using 6243 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 11 [LightGBM] [Debug] Re-bagging, using 6225 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6155 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6324 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 11 [LightGBM] [Debug] Re-bagging, using 6293 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 12 [LightGBM] [Debug] Re-bagging, using 6204 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 12 [LightGBM] [Debug] Re-bagging, using 6336 data to train [LightGBM] [Warning] No further splits 
with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 14 [LightGBM] [Debug] Re-bagging, using 6206 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 12 [LightGBM] [Debug] Re-bagging, using 6083 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 13 [LightGBM] [Debug] Re-bagging, using 6099 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 13 [LightGBM] [Debug] Re-bagging, using 6151 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 13 [LightGBM] [Debug] Re-bagging, using 6163 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 14 [LightGBM] [Debug] Re-bagging, using 6260 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 14 [LightGBM] [Debug] Re-bagging, using 6361 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 13 [LightGBM] [Debug] Re-bagging, using 6257 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 14 [LightGBM] [Debug] Re-bagging, using 6205 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 14 [LightGBM] [Debug] Re-bagging, using 6184 data to train [LightGBM] [Warning] No further splits with positive gain, best 
gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 15 [LightGBM] [Debug] Re-bagging, using 6295 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 15 [LightGBM] [Debug] Re-bagging, using 6293 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 14 [LightGBM] [Debug] Re-bagging, using 6147 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 13 [LightGBM] [Debug] Re-bagging, using 6204 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 12 [LightGBM] [Debug] Re-bagging, using 6261 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 11 [LightGBM] [Debug] Re-bagging, using 6212 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 14 [LightGBM] [Debug] Re-bagging, using 6359 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 11 [LightGBM] [Debug] Re-bagging, using 6265 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 14 [LightGBM] [Debug] Re-bagging, using 6240 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 12 [LightGBM] [Debug] Re-bagging, using 6200 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
Trained a tree with leaves = 62 and depth = 12 [LightGBM] [Debug] Re-bagging, using 6212 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 13 Trial 12, Fold 5: Log loss = 0.281669785331137, Average precision = 0.9701642707511918, ROC-AUC = 0.9660935287888076, Elapsed Time = 0.8207968000006076 seconds
Optimization Progress: 13%|#3 | 13/100 [02:52<18:29, 12.76s/it]
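The verbose `[LightGBM] [Info]/[Debug]/[Warning]` lines above can be silenced so that only the tqdm progress bar and per-fold summaries remain. A sketch (the `verbosity` key is a documented LightGBM parameter; `optuna.logging.set_verbosity` is a documented Optuna call):

```python
# Suppress LightGBM console chatter via its params dict:
params = {
    "objective": "binary",
    "verbosity": -1,  # -1 silences [LightGBM] [Info]/[Debug]/[Warning] lines
}

# Optuna's own per-trial log lines can be reduced in the same spirit:
# import optuna
# optuna.logging.set_verbosity(optuna.logging.WARNING)
```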
Trial 13, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 13, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Total Bins 29002, number of data points in the train set: 20663, number of used features: 265
[LightGBM per-tree [Debug]/[Warning] output omitted]
Trial 13, Fold 1: Log loss = 0.3439567228683248, Average precision = 0.9642014557062354, ROC-AUC = 0.9580364013516679, Elapsed Time = 1.7321501999995235 seconds
Trial 13, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 13, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Total Bins 29006, number of data points in the train set: 20701, number of used features: 265
[LightGBM per-tree [Debug]/[Warning] output omitted]
Trial 13, Fold 2: Log loss = 0.3371119253550399, Average precision = 0.9651042928311725, ROC-AUC = 0.9614018885315396, Elapsed Time = 2.2189558999998553 seconds
Trial 13, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 13, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[LightGBM] [Info] Total Bins 29001, number of data points in the train set: 20682, number of used features: 265
[LightGBM per-tree [Debug]/[Warning] output omitted]
Trial 13, Fold 3: Log loss = 0.3337444841372523, Average precision = 0.9682855889749199, ROC-AUC = 0.963323212720671, Elapsed Time = 2.7470613999994384 seconds
Trial 13, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 13, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[LightGBM] [Info] Total Bins 28992, number of data points in the train set: 20656, number of used features: 265
[LightGBM per-tree [Debug]/[Warning] output omitted]
[LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 13 [LightGBM] [Debug] Re-bagging, using 19462 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 22 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 17 [LightGBM] [Debug] Re-bagging, using 19487 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 17 [LightGBM] [Debug] Re-bagging, using 19428 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 20 [LightGBM] [Debug] Re-bagging, using 19465 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 24 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 17 [LightGBM] [Debug] Re-bagging, using 19426 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Re-bagging, using 19444 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 16 [LightGBM] [Debug] Re-bagging, using 19483 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 15 [LightGBM] [Debug] Re-bagging, using 19451 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 22 
[LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 21 [LightGBM] [Debug] Re-bagging, using 19463 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 18 [LightGBM] [Debug] Re-bagging, using 19422 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 23 [LightGBM] [Debug] Re-bagging, using 19523 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 21 [LightGBM] [Debug] Re-bagging, using 19498 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 17 [LightGBM] [Debug] Re-bagging, using 19521 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 18 [LightGBM] [Debug] Re-bagging, using 19399 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 24 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 16 [LightGBM] [Debug] Re-bagging, using 19422 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 16 [LightGBM] [Debug] Re-bagging, using 19464 data to train 
[LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Re-bagging, using 19488 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 22 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 23 [LightGBM] [Debug] Re-bagging, using 19512 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 22 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 22 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 Trial 13, Fold 4: Log loss = 0.3345920525712018, Average precision = 0.9672589026802049, ROC-AUC = 0.961008492075414, Elapsed Time = 2.6501938000001246 seconds Trial 13, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 13, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.804939 [LightGBM] [Info] Total Bins 28998 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 265 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 19484 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 24 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 23 [LightGBM] [Debug] Re-bagging, using 19440 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 17 [LightGBM] 
[Debug] Re-bagging, using 19503 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 26 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 17 [LightGBM] [Debug] Re-bagging, using 19460 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Re-bagging, using 19463 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 21 [LightGBM] [Debug] Re-bagging, using 19465 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 16 [LightGBM] [Debug] Re-bagging, using 19433 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Re-bagging, using 19454 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 23 [LightGBM] [Debug] Re-bagging, using 19478 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 18 [LightGBM] [Debug] Re-bagging, using 19423 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] 
[Debug] Trained a tree with leaves = 209 and depth = 16 [LightGBM] [Debug] Re-bagging, using 19459 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 24 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 25 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 20 [LightGBM] [Debug] Re-bagging, using 19422 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 22 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 16 [LightGBM] [Debug] Re-bagging, using 19437 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Re-bagging, using 19477 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 27 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 18 [LightGBM] [Debug] Re-bagging, using 19449 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 21 [LightGBM] [Debug] Re-bagging, using 19453 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 24 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 22 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 20 [LightGBM] [Debug] Re-bagging, using 19419 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Re-bagging, using 19518 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 21 [LightGBM] 
[Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 16 [LightGBM] [Debug] Re-bagging, using 19489 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 24 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Re-bagging, using 19518 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 26 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Re-bagging, using 19393 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 22 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Re-bagging, using 19419 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 25 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 36 [LightGBM] [Debug] Re-bagging, using 19452 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 15 [LightGBM] [Debug] Re-bagging, using 19487 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 22 [LightGBM] [Debug] Re-bagging, using 19507 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 16 Trial 13, Fold 5: Log loss = 0.34868921272817555, Average precision = 
0.9628221406956591, ROC-AUC = 0.9572423685041711, Elapsed Time = 2.0537867000002734 seconds
Optimization Progress: 14%|#4 | 14/100 [03:11<21:02, 14.68s/it]
Trial 14, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 14, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 12063
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[LightGBM] [Debug] … (repeated per-tree [Debug] training, re-bagging, and "[Warning] No further splits with positive gain, best gain: -inf" messages omitted)
Trial 14, Fold 1: Log loss = 0.4712521550359508, Average precision = 0.9693039444042607, ROC-AUC = 0.9631592995272987, Elapsed Time = 0.7223698000007062 seconds
Trial 14, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 14, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Info] Total Bins 12084
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
[LightGBM] [Debug] … (repeated per-tree [Debug]/[Warning] messages omitted)
Trial 14, Fold 2: Log loss = 0.46625701603010106, Average precision = 0.9680161782461034, ROC-AUC = 0.9641126350972598, Elapsed Time = 0.6809782000000268 seconds
Trial 14, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 14, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[LightGBM] [Info] Number of positive: 10165, number of negative: 10517
[LightGBM] [Info] Total Bins 12088
[LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043
[LightGBM] [Info] Start training from score -0.034043
[LightGBM] [Debug] … (repeated per-tree [Debug]/[Warning] messages omitted)
Trial 14, Fold 3: Log loss = 0.4677295498601943, Average precision = 0.970775523157016, ROC-AUC = 0.9657632504984061, Elapsed Time = 0.7300248999999894 seconds
Trial 14, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 14, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[LightGBM] [Info] Number of positive: 10177, number of negative: 10479
[LightGBM] [Info] Total Bins 12066
[LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243
[LightGBM] [Info] Start training from score -0.029243
[LightGBM] [Debug] … (repeated per-tree [Debug]/[Warning] messages omitted)
Trial 14, Fold 4: Log loss = 0.4673188628179886, Average precision = 0.969554405860509, ROC-AUC = 0.9634035571817232, Elapsed Time = 0.6793779999998151 seconds
Trial 14, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 14, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[LightGBM] [Info] Number of positive: 10150, number of negative: 10500
[LightGBM] [Info] Total Bins 12068
[LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902
[LightGBM] [Info] Start training from score -0.033902
[LightGBM] [Debug] … (repeated per-tree [Debug]/[Warning] messages omitted)
Trial 14, Fold 5: Log loss = 0.46827598185049213, Average precision = 0.9680601591436406, ROC-AUC = 0.963749029225424, Elapsed Time = 0.6934823000001415 seconds
Optimization Progress: 15%|#5 | 15/100 [03:23<19:19, 13.65s/it]
Trial 15, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 15, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM verbose training output omitted]
Trial 15, Fold 1: Log loss = 0.3307535494949883, Average precision = 0.9624177756608816, ROC-AUC = 0.9556137470545782, Elapsed Time = 0.7917609000005541 seconds
Trial 15, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 15, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM verbose training output omitted]
Trial 15, Fold 2: Log loss = 0.32437769588339677, Average precision = 0.9611297020298436, ROC-AUC = 0.9558363108777972, Elapsed Time = 0.8626070999998774 seconds
Trial 15, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 15, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[LightGBM verbose training output omitted]
Trial 15, Fold 3: Log loss = 0.32260896629795294, Average precision = 0.9643834699614529, ROC-AUC = 0.958475641908486, Elapsed Time = 0.8613505000002988 seconds
Trial 15, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 15, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[LightGBM verbose training output omitted]
Trial 15, Fold 4: Log loss = 0.3249451621825596, Average precision = 0.963468797794335, ROC-AUC = 0.9561431039292688, Elapsed Time = 0.9371393000001262 seconds
Trial 15, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 15, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[LightGBM verbose training output omitted; log truncated mid-fold]
splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 9 [LightGBM] [Debug] Re-bagging, using 10605 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 9 [LightGBM] [Debug] Re-bagging, using 10802 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 9 [LightGBM] [Debug] Re-bagging, using 10639 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 9 [LightGBM] [Debug] Re-bagging, using 10787 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 7 [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Debug] Re-bagging, using 10713 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 8 Trial 15, Fold 5: Log loss = 0.3293120261072558, Average precision = 0.96140222102461, ROC-AUC = 0.9544851085966967, Elapsed Time = 0.9248790000001463 seconds
Optimization Progress: 16%|#6 | 16/100 [03:34<18:13, 13.02s/it]
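The per-fold summary lines in this output can be reproduced with a sketch like the one below. It is illustrative only: a `LogisticRegression` on synthetic data stands in for the tuned LightGBM classifier, and a plain `StratifiedKFold` replaces the notebook's group-aware splitter, so the printed numbers will differ from the logs above.

```python
import time
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import StratifiedKFold
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

# Synthetic stand-in for the loan-level training frame.
X, y = make_classification(n_samples=2000, n_features=20, random_state=42)
skf = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)

for fold, (tr_idx, va_idx) in enumerate(skf.split(X, y), start=1):
    # Report class balance of the training fold, mirroring the log format.
    counts = np.bincount(y[tr_idx])
    print(f"Fold {fold}: Train size = {len(tr_idx)} where 0 = {counts[0]}, "
          f"1 = {counts[1]}, 0/1 = {counts[0] / counts[1]}")

    # Fit on the training fold and score probabilities on the validation fold.
    start = time.perf_counter()
    model = LogisticRegression(max_iter=1000).fit(X[tr_idx], y[tr_idx])
    proba = model.predict_proba(X[va_idx])[:, 1]

    print(f"Fold {fold}: Log loss = {log_loss(y[va_idx], proba)}, "
          f"Average precision = {average_precision_score(y[va_idx], proba)}, "
          f"ROC-AUC = {roc_auc_score(y[va_idx], proba)}, "
          f"Elapsed Time = {time.perf_counter() - start} seconds")
```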
Trial 16, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 16, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 24804
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[LightGBM per-tree Debug/Warning output elided]
Trial 16, Fold 1: Log loss = 0.20747082145114132, Average precision = 0.972974517916921, ROC-AUC = 0.9679727192995033, Elapsed Time = 1.2898568999999043 seconds
Trial 16, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 16, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Info] Total Bins 24811
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
[LightGBM per-tree Debug/Warning output elided]
Trial 16, Fold 2: Log loss = 0.2027997548495244, Average precision = 0.9719972349009797, ROC-AUC = 0.9685757650895721, Elapsed Time = 1.3284340999998676 seconds
Trial 16, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 16, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[LightGBM] [Info] Number of positive: 10165, number of negative: 10517
[LightGBM] [Info] Total Bins 24804
[LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043
[LightGBM] [Info] Start training from score -0.034043
[LightGBM per-tree Debug/Warning output elided]
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Debug] Re-bagging, using 2982 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 9 [LightGBM] [Debug] Re-bagging, using 2925 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Debug] Re-bagging, using 2847 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Debug] 
Re-bagging, using 2986 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 10 [LightGBM] [Debug] Re-bagging, using 2901 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 12 [LightGBM] [Debug] Re-bagging, using 2879 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 10 [LightGBM] [Debug] Re-bagging, using 2782 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a 
tree with leaves = 36 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Re-bagging, using 2848 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 10 [LightGBM] [Debug] Re-bagging, using 2904 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 11 [LightGBM] [Debug] Re-bagging, using 2861 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a 
tree with leaves = 36 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 15 [LightGBM] [Debug] Re-bagging, using 2907 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 11 [LightGBM] [Debug] Re-bagging, using 2906 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 10 [LightGBM] [Debug] Re-bagging, using 2920 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained 
a tree with leaves = 39 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 13 [LightGBM] [Debug] Re-bagging, using 2795 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 11 [LightGBM] [Debug] Re-bagging, using 2971 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 14 Trial 16, Fold 3: Log loss = 0.20250456993786586, Average precision = 0.9731164233792264, ROC-AUC = 0.9689353974246613, Elapsed Time = 1.448365900000681 seconds Trial 16, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 16, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.795144 [LightGBM] [Info] Total Bins 24794 [LightGBM] [Info] Number of data points in the 
train set: 20656, number of used features: 258 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 2947 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 8 [LightGBM] [Debug] Re-bagging, using 2945 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 7 [LightGBM] [Debug] Re-bagging, using 2836 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree 
with leaves = 25 and depth = 8 [LightGBM] [Debug] Re-bagging, using 2875 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 7 [LightGBM] [Debug] Re-bagging, using 2976 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 10 [LightGBM] [Debug] Re-bagging, using 2924 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Debug] Re-bagging, using 2844 data to train [LightGBM] [Warning] No further splits with positive gain, 
best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Debug] Re-bagging, using 2986 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Debug] Re-bagging, using 2888 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 12 [LightGBM] [Debug] Re-bagging, using 2873 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, 
best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Debug] Re-bagging, using 2773 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 13 [LightGBM] [Debug] Re-bagging, using 2860 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 9 [LightGBM] [Debug] Re-bagging, using 2894 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, 
best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 13 [LightGBM] [Debug] Re-bagging, using 2858 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 19 [LightGBM] [Debug] Re-bagging, using 2911 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 18 [LightGBM] [Debug] Re-bagging, using 2899 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, 
best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 17 [LightGBM] [Debug] Re-bagging, using 2913 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 11 [LightGBM] [Debug] Re-bagging, using 2798 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 15 [LightGBM] [Debug] Re-bagging, using 2960 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 12 Trial 16, Fold 4: Log loss = 0.2061300286833578, Average 
precision = 0.9725030883078034, ROC-AUC = 0.9671212245851722, Elapsed Time = 1.462485399999423 seconds Trial 16, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 16, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.796026 [LightGBM] [Info] Total Bins 24798 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 258 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 2946 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 8 [LightGBM] [Debug] Re-bagging, using 2945 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Debug] Re-bagging, using 2836 data to train [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 7 [LightGBM] [Debug] Re-bagging, using 2874 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 6 [LightGBM] [Debug] Re-bagging, using 2972 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 10 [LightGBM] [Debug] Re-bagging, using 2922 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 8 [LightGBM] [Warning] 
[… LightGBM per-tree [Debug]/[Warning] output truncated …]
Trial 16, Fold 5: Log loss = 0.20904784892066716, Average precision = 0.9722563489089914, ROC-AUC = 0.9676975270795013, Elapsed Time = 1.4544285000001764 seconds
Optimization Progress: 17%|#7 | 17/100 [03:49<18:40, 13.51s/it]
Trial 17, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 17, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 10352
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 257
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[… LightGBM per-tree [Debug]/[Warning] output truncated …]
Trial 17, Fold 1: Log loss = 0.30963063875852925, Average precision = 0.9650738928081896, ROC-AUC = 0.958796878808735, Elapsed Time = 1.334805999999844 seconds
Trial 17, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 17, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Info] Total Bins 10364
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 256
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
[… LightGBM per-tree [Debug]/[Warning] output truncated …]
Trial 17, Fold 2: Log loss = 0.31070298288297354, Average precision = 0.9617001013894317, ROC-AUC = 0.9575550821699083, Elapsed Time = 1.725110099999256 seconds
Trial 17, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 17, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[LightGBM] [Info] Number of positive: 10165, number of negative: 10517
[LightGBM] [Info] Total Bins 10368
[LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 256
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043
[LightGBM] [Info] Start training from score -0.034043
[… LightGBM per-tree [Debug]/[Warning] output truncated …]
splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Debug] Re-bagging, using 3244 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Debug] Re-bagging, using 3136 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Debug] Re-bagging, using 3126 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 6 [LightGBM] [Warning] No further splits with 
positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 5 [LightGBM] [Debug] Re-bagging, using 3236 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Debug] Re-bagging, using 3218 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Debug] Re-bagging, using 3153 data to train [LightGBM] [Warning] No further splits with positive 
gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Debug] Re-bagging, using 3258 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Debug] Re-bagging, using 3176 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves 
= 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Debug] Re-bagging, using 3163 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Debug] Re-bagging, using 3041 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Debug] Re-bagging, using 3105 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and 
depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 7 [LightGBM] [Debug] Re-bagging, using 3180 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Debug] Re-bagging, using 3175 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Debug] Re-bagging, using 3187 data to train 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 7 [LightGBM] [Debug] Re-bagging, using 3175 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Debug] Re-bagging, using 3252 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best 
gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Debug] Re-bagging, using 3090 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 8 Trial 17, Fold 3: Log loss = 0.3052696325563928, Average precision = 0.9660969774617566, ROC-AUC = 0.9609569877974786, Elapsed Time = 1.5482718000002933 seconds Trial 17, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 17, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.792454 [LightGBM] [Info] Total Bins 10347 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 256 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 3248 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf 
[LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Debug] Re-bagging, using 3238 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 5 [LightGBM] [Debug] Re-bagging, using 3131 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 
[LightGBM] [Debug] Re-bagging, using 3121 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 5 [LightGBM] [Debug] Re-bagging, using 3230 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Debug] Re-bagging, using 3221 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 7 [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 7 [LightGBM] [Debug] Re-bagging, using 3146 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 5 [LightGBM] [Debug] Re-bagging, using 3259 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 7 [LightGBM] [Debug] Re-bagging, using 3163 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 5 [LightGBM] [Warning] 
No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 5 [LightGBM] [Debug] Re-bagging, using 3159 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Debug] Re-bagging, using 3028 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] 
[Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Debug] Re-bagging, using 3116 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Debug] Re-bagging, using 3174 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Debug] Re-bagging, using 3171 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Debug] Re-bagging, using 3189 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Debug] Re-bagging, using 3169 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Debug] Re-bagging, using 3245 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a 
tree with leaves = 16 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 5 [LightGBM] [Debug] Re-bagging, using 3092 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 Trial 17, Fold 4: Log loss = 0.30706864130283307, Average precision = 0.9650277959863939, ROC-AUC = 0.9583006222117307, Elapsed Time = 1.5326796999997896 seconds Trial 17, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 17, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.793364 [LightGBM] [Info] Total Bins 10351 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 256 [LightGBM] [Info] 
[binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 3247 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Debug] Re-bagging, using 3238 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 5 [LightGBM] [Debug] Re-bagging, using 3131 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further 
splits with positive gain, best gain: -inf
[... repeated LightGBM [Debug]/[Warning] tree-growth messages truncated ...]
Trial 17, Fold 5: Log loss = 0.31262231004306085, Average precision = 0.9626557154725396, ROC-AUC = 0.9569277074856475, Elapsed Time = 1.2901283999999578 seconds
Optimization Progress: 18%|#8 | 18/100 [04:05<19:29, 14.26s/it]
Trial 18, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 18, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 7940
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[... repeated LightGBM [Debug]/[Warning] tree-growth messages truncated ...]
Trial 18, Fold 1: Log loss = 0.20574575048184507, Average precision = 0.9739334509548979, ROC-AUC = 0.9692472266433423, Elapsed Time = 0.9175384999998641 seconds
Trial 18, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 18, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Info] Total Bins 7956
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
[... repeated LightGBM [Debug]/[Warning] tree-growth messages truncated ...]
Trial 18, Fold 2: Log loss = 0.2013883232521865, Average precision = 0.9735480108160133, ROC-AUC = 0.9702207110840527, Elapsed Time = 1.02073879999989 seconds
Trial 18, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 18, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[LightGBM] [Info] Number of positive: 10165, number of negative: 10517
[LightGBM] [Info] Total Bins 7963
[LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043
[LightGBM] [Info] Start training from score -0.034043
[... repeated LightGBM [Debug]/[Warning] tree-growth messages truncated ...]
[Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6806 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6627 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6673 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 8 [LightGBM] [Debug] Re-bagging, using 6767 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 12 [LightGBM] [Debug] Re-bagging, using 6737 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 16 [LightGBM] [Debug] Re-bagging, using 6888 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, 
best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 18 [LightGBM] [Debug] Re-bagging, using 6758 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 12 Trial 18, Fold 3: Log loss = 0.20044029078217634, Average precision = 0.9740716095881156, ROC-AUC = 0.9701093303059779, Elapsed Time = 1.2052254000000175 seconds Trial 18, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 18, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.792454 [LightGBM] [Info] Total Bins 7949 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 256 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 6757 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6775 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Debug] Re-bagging, using 6676 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 8 [LightGBM] [Debug] Re-bagging, using 6630 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 8 [LightGBM] [Debug] Re-bagging, using 6802 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6752 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6701 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6812 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6694 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf 
[LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 8 [LightGBM] [Debug] Re-bagging, using 6548 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6569 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6634 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6638 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 8 [LightGBM] [Debug] Re-bagging, using 6714 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6845 data to train [LightGBM] [Warning] No further splits with positive 
gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6725 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Debug] Re-bagging, using 6701 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 12 [LightGBM] [Debug] Re-bagging, using 6683 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6775 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 11 [LightGBM] [Debug] Re-bagging, using 6791 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 13 [LightGBM] [Debug] Re-bagging, using 6611 data to train [LightGBM] [Warning] No 
further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 12 [LightGBM] [Debug] Re-bagging, using 6670 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Debug] Re-bagging, using 6770 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 12 [LightGBM] [Debug] Re-bagging, using 6721 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 15 [LightGBM] [Debug] Re-bagging, using 6862 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 12 [LightGBM] [Debug] Re-bagging, using 6752 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 Trial 18, Fold 4: Log loss = 0.20460782943851485, 
Average precision = 0.9734286703235141, ROC-AUC = 0.9687859896850433, Elapsed Time = 1.236685199999556 seconds Trial 18, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 18, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.792012 [LightGBM] [Info] Total Bins 7944 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 6755 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6775 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Re-bagging, using 6674 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 8 [LightGBM] [Debug] Re-bagging, using 6625 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] 
[Debug] Trained a tree with leaves = 32 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6801 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Re-bagging, using 6745 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Re-bagging, using 6709 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 8 [LightGBM] [Debug] Re-bagging, using 6805 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6692 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6551 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best 
gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6559 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6629 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6639 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6724 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 8 [LightGBM] [Debug] Re-bagging, using 6842 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6726 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 11 [LightGBM] [Warning] No further splits 
with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6697 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 12 [LightGBM] [Debug] Re-bagging, using 6669 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 8 [LightGBM] [Debug] Re-bagging, using 6776 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 11 [LightGBM] [Debug] Re-bagging, using 6784 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6619 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 13 [LightGBM] [Debug] Re-bagging, using 6662 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 13 [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6764 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 11 [LightGBM] [Debug] Re-bagging, using 6726 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 14 [LightGBM] [Debug] Re-bagging, using 6863 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 15 [LightGBM] [Debug] Re-bagging, using 6749 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 13 Trial 18, Fold 5: Log loss = 0.209003953144028, Average precision = 0.971670356080389, ROC-AUC = 0.9682428701484496, Elapsed Time = 1.1931323999997403 seconds
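Each fold above reports log loss, average precision, and ROC-AUC on the validation split. These per-fold metric lines are consistent with scikit-learn's metric functions (which the notebook imports); a minimal sketch on synthetic data, where `y_val` and `proba` are illustrative stand-ins rather than the notebook's actual fold outputs:

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

# Synthetic stand-ins for one validation fold: true labels and predicted probabilities.
rng = np.random.default_rng(0)
y_val = rng.integers(0, 2, size=5188)  # fold size mirrors Trial 18, Fold 5
proba = np.clip(0.5 * rng.random(5188) + 0.5 * y_val, 1e-6, 1 - 1e-6)

lloss = log_loss(y_val, proba)              # penalizes confident wrong probabilities
ap = average_precision_score(y_val, proba)  # area under the precision-recall curve
auc = roc_auc_score(y_val, proba)           # ranking quality of the scores

print(f"Fold 5: Log loss = {lloss}, Average precision = {ap}, ROC-AUC = {auc}")
```

Log loss is the minimization target here, which is why it also drives the Optuna objective; average precision and ROC-AUC are threshold-free complements for the near-balanced folds shown above.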
Optimization Progress: 19%|#9 | 19/100 [04:18<18:56, 14.03s/it]
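The "Train size / Validation size" lines in each trial summarize the class balance of every `StratifiedGroupKFold` split. A hypothetical helper (`fold_summary` is not from the notebook) that formats the same counts with `collections.Counter`:

```python
from collections import Counter

def fold_summary(trial, fold, y_train, y_val):
    # Reproduce the "Trial t, Fold k: <split> size = N where 0 = a, 1 = b, 0/1 = r" log lines.
    lines = []
    for name, y in (("Train", y_train), ("Validation", y_val)):
        c = Counter(y)
        lines.append(
            f"Trial {trial}, Fold {fold}: {name} size = {len(y)} "
            f"where 0 = {c[0]}, 1 = {c[1]}, 0/1 = {c[0] / c[1]}"
        )
    return lines

# Counts taken from Trial 19, Fold 1 above.
print("\n".join(fold_summary(19, 1, [0] * 10533 + [1] * 10130, [0] * 2592 + [1] * 2583)))
```

The 0/1 ratios hovering near 1.0 confirm that stratification keeps the folds balanced while the grouping prevents the same borrower from appearing in both train and validation splits.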
Trial 19, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 19, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 21915
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 259
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[LightGBM per-tree Debug/Warning output omitted]
Trial 19, Fold 1: Log loss = 0.19924243631778799, Average precision = 0.9757228974576991, ROC-AUC = 0.9711200788154266, Elapsed Time = 2.895565999999235 seconds
Trial 19, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 19, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Info] Total Bins 21927
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 259
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
[LightGBM per-tree Debug/Warning output omitted]
[LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 23 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 30 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 15 [LightGBM] [Debug] Re-bagging, using 8730 data to train [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 14 [LightGBM] [Debug] Re-bagging, using 8804 data to train [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 23 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 17 [LightGBM] [Debug] Re-bagging, using 8789 data to train [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 22 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 24 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 23 [LightGBM] [Debug] Re-bagging, using 8564 data to train [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 22 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 92 and 
depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 22 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 21 [LightGBM] [Debug] Re-bagging, using 8547 data to train [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 23 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 27 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 26 [LightGBM] [Debug] Re-bagging, using 8565 data to train [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 23 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 24 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 22 [LightGBM] [Debug] Re-bagging, using 8676 data to train [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 13 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 18 [LightGBM] [Debug] Re-bagging, using 8687 data to train [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 22 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 21 [LightGBM] [Debug] Re-bagging, using 8906 data 
to train [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 23 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 19 Trial 19, Fold 2: Log loss = 0.18703719630522814, Average precision = 0.9756550682922964, ROC-AUC = 0.9730873784993419, Elapsed Time = 3.661527400000523 seconds Trial 19, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 19, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 [LightGBM] [Info] Number of positive: 10165, number of negative: 10517 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.796784 [LightGBM] [Info] Total Bins 21870 [LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 259 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043 [LightGBM] [Info] Start training from score -0.034043 [LightGBM] [Debug] Re-bagging, using 8740 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 15 [LightGBM] [Debug] Re-bagging, using 8803 data to train [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 82 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 85 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 87 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 84 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 13 [LightGBM] [Debug] Re-bagging, using 8720 data to train [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 85 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 91 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 14 [LightGBM] [Debug] Re-bagging, using 8682 data to train [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 13 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf 
[LightGBM] [Debug] Trained a tree with leaves = 90 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 91 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 14 [LightGBM] [Debug] Re-bagging, using 8789 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 86 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 87 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 87 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 86 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 91 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 87 and depth = 14 [LightGBM] [Debug] Re-bagging, using 8696 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 87 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 88 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 18 [LightGBM] [Debug] Re-bagging, using 8727 data to train [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 30 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 19 [LightGBM] [Debug] Trained a tree 
with leaves = 92 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 15 [LightGBM] [Debug] Re-bagging, using 8796 data to train [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 22 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 28 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 24 [LightGBM] [Debug] Re-bagging, using 8769 data to train [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 22 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 31 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 25 [LightGBM] [Debug] Re-bagging, using 8565 data to train [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 26 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 22 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 15 [LightGBM] [Debug] Re-bagging, using 8542 data to train [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 16 [LightGBM] [Debug] Trained 
a tree with leaves = 92 and depth = 20 [LightGBM] [Debug] Re-bagging, using 8565 data to train [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 23 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 27 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 27 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 31 [LightGBM] [Debug] Re-bagging, using 8665 data to train [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 13 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 20 [LightGBM] [Debug] Re-bagging, using 8676 data to train [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 24 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 22 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 28 [LightGBM] [Debug] Re-bagging, using 8886 data to train [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 30 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 23 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 23 Trial 19, Fold 3: Log loss = 0.1889680747041436, Average precision = 0.9750371189788504, ROC-AUC = 0.9728410831062014, Elapsed Time = 3.0296307999997225 seconds Trial 19, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 19, Fold 4: Validation size 
= 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.796474 [LightGBM] [Info] Total Bins 21910 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 259 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 8727 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 84 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 14 [LightGBM] [Debug] Re-bagging, using 8794 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 82 and depth = 13 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf 
[LightGBM] [Debug] Trained a tree with leaves = 87 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 15 [LightGBM] [Debug] Re-bagging, using 8711 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 84 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 86 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 86 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 86 and depth = 13 [LightGBM] [Debug] Re-bagging, using 8663 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 86 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 84 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 91 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 16 [LightGBM] [Debug] Re-bagging, using 8769 data to train 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 88 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 17 [LightGBM] [Debug] Re-bagging, using 8702 data to train [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 13 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 13 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 22 [LightGBM] [Debug] Re-bagging, using 8718 data to train [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 28 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 24 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 19 [LightGBM] [Debug] Re-bagging, using 8787 data to train [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 19 [LightGBM] [Debug] Re-bagging, using 8761 data to train [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 92 
and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 22 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 19 [LightGBM] [Debug] Re-bagging, using 8542 data to train [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 13 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 30 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 22 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 26 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 21 [LightGBM] [Debug] Re-bagging, using 8536 data to train [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 24 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 23 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 27 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 18 [LightGBM] [Debug] Re-bagging, using 8561 data to train [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 22 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 29 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 26 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 26 [LightGBM] [Debug] Re-bagging, using 8662 data to train [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 34 [LightGBM] [Debug] Trained a tree with 
leaves = 92 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 20 [LightGBM] [Debug] Re-bagging, using 8669 data to train [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 18 [LightGBM] [Debug] Re-bagging, using 8872 data to train [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 27 Trial 19, Fold 4: Log loss = 0.19223260795860275, Average precision = 0.9760122026562066, ROC-AUC = 0.9717124056071781, Elapsed Time = 3.7891062999997303 seconds Trial 19, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 19, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.797340 [LightGBM] [Info] Total Bins 21919 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 259 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 8725 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 84 and depth = 15 [LightGBM] [Warning] No further splits 
with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 14 [LightGBM] [Debug] Re-bagging, using 8792 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 12 [LightGBM] [Debug] Re-bagging, using 8710 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 85 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 84 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 88 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
[... repeated LightGBM [Debug] "Trained a tree ..." / "Re-bagging ..." and [Warning] "No further splits with positive gain" output omitted ...]
Trial 19, Fold 5: Log loss = 0.2012878534321624, Average precision = 0.9731398997533733, ROC-AUC = 0.9698096352860301, Elapsed Time = 3.564209299999675 seconds
Optimization Progress: 20%|## | 20/100 [04:44<23:14, 17.43s/it]
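The walls of `[LightGBM] [Debug]`/`[Warning]` messages in these trial logs come from LightGBM's native verbosity level. A minimal sketch of the parameter that silences them (the `lgb_params` dict name is an assumption; the notebook's own objective builds its parameter dict inside the Optuna trial):

```python
# Hypothetical parameter dict mirroring the notebook's binary objective.
# verbosity < 0 keeps only fatal messages, so the per-tree [Debug] and
# "No further splits" [Warning] lines above disappear from the output.
lgb_params = {
    "objective": "binary",
    "verbosity": -1,
}
# Optuna's own per-trial chatter can likewise be reduced with
# optuna.logging.set_verbosity(optuna.logging.WARNING).
print(lgb_params["verbosity"])
```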
Trial 20, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 20, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 14105
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[... repeated LightGBM [Debug]/[Warning] per-tree output omitted ...]
Trial 20, Fold 1: Log loss = 0.3012211491529579, Average precision = 0.9681789405533022, ROC-AUC = 0.9619023422377081, Elapsed Time = 1.2858525999999983 seconds
Trial 20, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 20, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Info] Total Bins 14121
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
[... repeated LightGBM [Debug]/[Warning] per-tree output omitted ...]
Trial 20, Fold 2: Log loss = 0.3005685009027957, Average precision = 0.9655611009212376, ROC-AUC = 0.9629260432887872, Elapsed Time = 1.4934327999999368 seconds
Trial 20, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 20, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[LightGBM] [Info] Number of positive: 10165, number of negative: 10517
[LightGBM] [Info] Total Bins 14124
[LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043
[LightGBM] [Info] Start training from score -0.034043
[... repeated LightGBM [Debug]/[Warning] per-tree output omitted ...]
Trial 20, Fold 3: Log loss = 0.2985064549274648, Average precision = 0.96914730997676, ROC-AUC =
0.9638280143935819, Elapsed Time = 1.4035541000002922 seconds Trial 20, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 20, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.792454 [LightGBM] [Info] Total Bins 14106 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 256 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 9536 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Debug] 
Re-bagging, using 9523 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Debug] Re-bagging, using 9480 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf 
[LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Debug] Re-bagging, using 9418 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 9 [LightGBM] [Debug] Re-bagging, using 9551 data to 
train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 9 [LightGBM] [Debug] Re-bagging, using 9420 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with 
leaves = 45 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 11 [LightGBM] [Debug] Re-bagging, using 9523 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 10 [LightGBM] [Debug] Re-bagging, using 9520 data to train [LightGBM] [Warning] No 
further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 Trial 20, Fold 4: Log loss = 0.29806484641027425, Average precision = 0.9684156480487836, ROC-AUC = 0.962071715296704, Elapsed Time = 1.396743599999354 seconds Trial 20, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 20, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.792012 [LightGBM] [Info] Total Bins 14104 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 
[LightGBM] [Debug] Re-bagging, using 9534 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 9 [LightGBM] [Debug] Re-bagging, using 9520 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, 
best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Debug] Re-bagging, using 9480 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Debug] Re-bagging, 
using 9411 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 9 [LightGBM] [Debug] Re-bagging, using 9549 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] 
[Debug] Trained a tree with leaves = 43 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 9 [LightGBM] [Debug] Re-bagging, using 9414 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 9 [LightGBM] [Debug] Re-bagging, using 9527 data to train 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 9 [LightGBM] [Debug] Re-bagging, using 9514 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with 
leaves = 42 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 Trial 20, Fold 5: Log loss = 0.3018067238368054, Average precision = 0.9668616390071544, ROC-AUC = 0.9616186388718577, Elapsed Time = 1.3381429000000935 seconds
Optimization Progress: 21%|##1 | 21/100 [04:59<22:04, 16.77s/it]
Trial 21, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 21, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791640
[LightGBM] [Info] Total Bins 15336
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[... per-tree [Warning]/[Debug] output trimmed ...]
Trial 21, Fold 1: Log loss = 0.4294722857380578, Average precision = 0.9663014875585593, ROC-AUC = 0.9613171412798784, Elapsed Time = 0.7071726000003764 seconds
Trial 21, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 21, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791600
[LightGBM] [Info] Total Bins 15352
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
[... per-tree [Warning]/[Debug] output trimmed ...]
Trial 21, Fold 2: Log loss = 0.4269684819092223, Average precision = 0.9661205485665564, ROC-AUC = 0.9627950090760351, Elapsed Time = 0.7049581999999646 seconds
Trial 21, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 21, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[LightGBM] [Info] Number of positive: 10165, number of negative: 10517
[LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791401
[LightGBM] [Info] Total Bins 15353
[LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043
[LightGBM] [Info] Start training from score -0.034043
[... per-tree [Warning]/[Debug] output trimmed ...]
Trained a tree with leaves = 59 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 17 [LightGBM] [Debug] Re-bagging, using 10541 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 17 Trial 21, Fold 3: Log loss = 0.4242086322595482, Average precision = 0.9697600374481333, ROC-AUC = 0.9651152624216275, Elapsed Time = 0.7312525000006644 seconds Trial 21, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 21, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791090 [LightGBM] [Info] Total Bins 15336 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 10552 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
Trained a tree with leaves = 60 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 16 [LightGBM] [Debug] Re-bagging, using 10528 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, 
best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 12 [LightGBM] [Debug] Re-bagging, using 10530 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 17 Trial 21, Fold 4: Log loss = 0.42668125159574394, Average precision = 0.9685943640991859, ROC-AUC = 0.9628020898159473, Elapsed Time = 0.7318037000004551 seconds Trial 21, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 21, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.792012 [LightGBM] [Info] Total Bins 15335 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 10549 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, 
best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 16 [LightGBM] [Debug] Re-bagging, using 10524 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 13 [LightGBM] [Debug] Re-bagging, using 10532 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 14 Trial 21, Fold 5: Log loss = 0.4274642890160897, Average precision = 0.9688491111364995, ROC-AUC = 
0.9643741151552311, Elapsed Time = 0.7462249999998676 seconds
Optimization Progress: 22%|##2 | 22/100 [05:10<19:37, 15.10s/it]
Trial 22, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371 Trial 22, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913 [LightGBM] [Info] Number of positive: 10130, number of negative: 10533 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.790295 [LightGBM] [Info] Total Bins 22226 [LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 254 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012 [LightGBM] [Info] Start training from score -0.039012 [LightGBM] [Debug] Re-bagging, using 14649 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 9 [LightGBM] [Debug] Re-bagging, using 14767 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 9 [LightGBM] [Debug] Re-bagging, using 14796 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree 
with leaves = 25 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 8 [LightGBM] [Debug] Re-bagging, using 14659 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 8 [LightGBM] [Debug] Re-bagging, using 14721 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Debug] Re-bagging, using 14652 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with 
leaves = 18 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Debug] Re-bagging, using 14702 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 8 Trial 22, Fold 1: Log loss = 0.5228777373379064, Average precision = 0.9647546789177879, ROC-AUC = 0.9574565027506537, Elapsed Time = 0.7402208000003156 seconds Trial 22, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 22, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986 [LightGBM] [Info] Number of positive: 10230, number of negative: 10471 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791600 [LightGBM] [Info] Total Bins 22241 [LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285 [LightGBM] [Info] Start training from score -0.023285 [LightGBM] [Debug] Re-bagging, using 14676 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 
24 and depth = 8 [LightGBM] [Debug] Re-bagging, using 14793 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 5 [LightGBM] [Debug] Re-bagging, using 14827 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 8 [LightGBM] [Debug] Re-bagging, using 14688 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Debug] Re-bagging, using 14750 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf 
[LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Debug] Re-bagging, using 14664 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Debug] Re-bagging, using 14723 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 9 Trial 22, Fold 2: Log loss = 0.5212665422613845, Average precision = 0.9636074131965384, ROC-AUC = 0.9593215022666567, Elapsed Time = 0.7811039999996865 seconds Trial 22, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 22, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 [LightGBM] [Info] Number of positive: 10165, number of negative: 10517 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.790045 [LightGBM] [Info] Total Bins 22123 [LightGBM] [Info] Number of 
data points in the train set: 20682, number of used features: 254 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043 [LightGBM] [Info] Start training from score -0.034043 [LightGBM] [Debug] Re-bagging, using 14660 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 8 [LightGBM] [Debug] Re-bagging, using 14780 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Debug] Re-bagging, using 14813 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] 
[Debug] Trained a tree with leaves = 22 and depth = 8 [LightGBM] [Debug] Re-bagging, using 14679 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 7 [LightGBM] [Debug] Re-bagging, using 14731 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 8 [LightGBM] [Debug] Re-bagging, using 14655 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 10 [LightGBM] [Debug] Re-bagging, using 14714 data to train [LightGBM] [Warning] No further 
splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 Trial 22, Fold 3: Log loss = 0.5236276273082059, Average precision = 0.9654209196951052, ROC-AUC = 0.9595892303358342, Elapsed Time = 0.7161302000004071 seconds Trial 22, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 22, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791090 [LightGBM] [Info] Total Bins 22222 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 14644 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 9 [LightGBM] [Debug] Re-bagging, using 14760 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 9 [LightGBM] [Warning] No further splits 
with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 8 [LightGBM] [Debug] Re-bagging, using 14791 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 7 [LightGBM] [Debug] Re-bagging, using 14658 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 7 [LightGBM] [Debug] Re-bagging, using 14714 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with 
Trial 22, Fold 4: Log loss = 0.5223469997818746, Average precision = 0.9648356841262216, ROC-AUC = 0.9584982301718444, Elapsed Time = 0.688500400000521 seconds
Trial 22, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 22, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
Trial 22, Fold 5: Log loss = 0.5231191120911644, Average precision = 0.963700468976446, ROC-AUC = 0.9570614792932389, Elapsed Time = 0.7112767999997232 seconds
Optimization Progress: 23%|##3 | 23/100 [05:22<18:10, 14.16s/it]
Trial 23, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 23, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Trial 23, Fold 1: Log loss = 0.5225291289612243, Average precision = 0.9675563224495939, ROC-AUC = 0.9608456049287123, Elapsed Time = 0.8706756999999925 seconds
Trial 23, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 23, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Trial 23, Fold 2: Log loss = 0.5232042123614699, Average precision = 0.9658566296699482, ROC-AUC = 0.9612999140197047, Elapsed Time = 0.9562309000002642 seconds
Trial 23, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 23, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Trial 23, Fold 3: Log loss = 0.5238859532622018, Average precision = 0.9673452313722839, ROC-AUC = 0.9617849558417042, Elapsed Time = 0.9833689000006416 seconds
Trial 23, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 23, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
Trial 23, Fold 4: Log loss = 0.5247934093888289, Average precision = 0.966912358126536, ROC-AUC = 0.9606433048157923, Elapsed Time = 1.0001123000001826 seconds
Trial 23, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 23, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
Trial 23, Fold 5: Log loss = 0.5265657072592534, Average precision = 0.9651006232741035, ROC-AUC = 0.9590224253571893, Elapsed Time = 0.924916399999347 seconds
Optimization Progress: 24%|##4 | 24/100 [05:34<17:10, 13.55s/it]
Trial 24, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 24, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2714 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 Trial 24, Fold 1: Log loss = 0.3751901903993324, Average precision = 0.9602343850255711, ROC-AUC = 0.9531562913733195, Elapsed Time = 0.8463048000003255 seconds Trial 24, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 24, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986 [LightGBM] [Info] Number of positive: 10230, number of negative: 10471 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791600 [LightGBM] [Info] Total Bins 27480 [LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285 [LightGBM] [Info] Start training from score -0.023285 [LightGBM] [Debug] Re-bagging, using 2811 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: 
-inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2804 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive 
gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2676 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2750 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves 
= 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2833 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a 
tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2763 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2718 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further 
splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 Trial 24, Fold 2: Log loss = 0.3735114996032227, Average precision = 0.9565893234197076, ROC-AUC = 0.9526023379477813, Elapsed Time = 0.9060810000000856 seconds Trial 24, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 24, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 [LightGBM] [Info] Number of positive: 10165, number of negative: 10517 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791401 [LightGBM] [Info] Total Bins 27477 [LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043 [LightGBM] [Info] Start training from score -0.034043 [LightGBM] [Debug] Re-bagging, using 2808 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
Trained a tree with leaves = 12 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2798 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2680 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] 
No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2742 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 6 [LightGBM] [Debug] Re-bagging, using 2829 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2768 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best 
gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2718 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 Trial 24, Fold 3: Log loss = 0.36761244261956777, Average precision = 0.9631109753164551, ROC-AUC = 0.9575659605512803, Elapsed Time = 0.9513429000007818 seconds Trial 24, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 24, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791090 [LightGBM] [Info] Total Bins 27467 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score 
-0.029243 [LightGBM] [Debug] Re-bagging, using 2806 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2794 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive 
gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2674 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Debug] Re-bagging, using 
2740 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2824 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a 
[verbose LightGBM [Debug]/[Warning] tree-training output omitted]
Trial 24, Fold 4: Log loss = 0.37014187852829117, Average precision = 0.961520701652486, ROC-AUC = 0.9546189444933249, Elapsed Time = 0.9763474999999744 seconds
Trial 24, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 24, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[LightGBM] [Info] Number of positive: 10150, number of negative: 10500
[LightGBM] [Info] Total Bins 27472
[LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 255
[LightGBM] [Info] Start training from score -0.033902
[verbose LightGBM [Debug]/[Warning] tree-training output omitted]
Trial 24, Fold 5: Log loss = 0.37331798813142764, Average precision = 0.9577758761061415, ROC-AUC = 0.9510315664307081, Elapsed Time = 0.9897025999998732 seconds
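The `[Info]`/`[Debug]`/`[Warning]` chatter above comes from LightGBM's default console logger. It can be silenced via the `verbosity` parameter (alias `verbose`); the sketch below is an illustrative parameter dict, not the notebook's actual configuration:

```python
# Illustrative LightGBM parameter dict: verbosity=-1 suppresses the
# [Info]/[Debug]/[Warning] console output seen in the raw cell output above.
lgb_params = {
    "objective": "binary",
    "metric": "binary_logloss",
    "verbosity": -1,  # <0 = fatal only; 0 = warnings; 1 = info; >1 = debug
}
```

The same effect is available on the scikit-learn wrapper via `lgb.LGBMClassifier(verbose=-1)`.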
Optimization Progress: 25%|##5 | 25/100 [05:46<16:26, 13.15s/it]
Trial 25, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 25, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 24366
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 255
[LightGBM] [Info] Start training from score -0.039012
[verbose LightGBM [Debug]/[Warning] tree-training output omitted]
Trial 25, Fold 1: Log loss = 0.3522762790360541, Average precision = 0.967843534468308, ROC-AUC = 0.9614237858648427, Elapsed Time = 1.9450667999999496 seconds
Trial 25, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 25, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Info] Total Bins 24373
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 255
[LightGBM] [Info] Start training from score -0.023285
[verbose LightGBM [Debug]/[Warning] tree-training output omitted]
depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10509 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 13 [LightGBM] [Debug] Re-bagging, using 10338 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10598 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10360 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10409 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10317 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
Trained a tree with leaves = 48 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10391 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10486 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 12 [LightGBM] [Debug] Re-bagging, using 10357 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10306 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10202 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 Trial 25, Fold 2: Log loss = 0.34525464550036206, Average precision = 0.9670754836092235, ROC-AUC = 
0.962763066167194, Elapsed Time = 2.3343526999997266 seconds Trial 25, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 25, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 [LightGBM] [Info] Number of positive: 10165, number of negative: 10517 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791401 [LightGBM] [Info] Total Bins 24368 [LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043 [LightGBM] [Info] Start training from score -0.034043 [LightGBM] [Debug] Re-bagging, using 10506 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 9 [LightGBM] [Debug] Re-bagging, using 10470 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10465 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10325 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and 
depth = 10 [LightGBM] [Debug] Re-bagging, using 10537 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 9 [LightGBM] [Debug] Re-bagging, using 10347 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 9 [LightGBM] [Debug] Re-bagging, using 10429 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10403 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 9 [LightGBM] [Debug] Re-bagging, using 10463 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 9 [LightGBM] [Debug] Re-bagging, using 10236 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained 
a tree with leaves = 44 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10305 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10295 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 9 [LightGBM] [Debug] Re-bagging, using 10280 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 9 [LightGBM] [Debug] Re-bagging, using 10340 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10491 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10356 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: 
-inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10422 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10424 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10400 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10548 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10271 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10329 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 10 [LightGBM] [Warning] No further 
splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 9 [LightGBM] [Debug] Re-bagging, using 10425 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 12 [LightGBM] [Debug] Re-bagging, using 10434 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10505 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10397 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10380 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10515 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 10 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10373 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 12 [LightGBM] [Debug] Re-bagging, using 10395 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10395 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10372 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 13 [LightGBM] [Debug] Re-bagging, using 10412 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 12 [LightGBM] [Debug] Re-bagging, using 10481 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a 
tree with leaves = 54 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10355 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 9 [LightGBM] [Debug] Re-bagging, using 10381 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 9 [LightGBM] [Debug] Re-bagging, using 10437 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 9 [LightGBM] [Debug] Re-bagging, using 10496 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10341 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10588 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: 
-inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10346 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10391 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10294 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 13 [LightGBM] [Debug] Re-bagging, using 10360 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10501 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10344 data to train [LightGBM] [Warning] No further 
splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10304 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 9 [LightGBM] [Debug] Re-bagging, using 10211 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 9 Trial 25, Fold 3: Log loss = 0.3457557444663702, Average precision = 0.9693219298702005, ROC-AUC = 0.9640517102310485, Elapsed Time = 1.9869880999995075 seconds Trial 25, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 25, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791090 [LightGBM] [Info] Total Bins 24356 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 10492 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves 
= 43 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10459 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 9 [LightGBM] [Debug] Re-bagging, using 10454 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10303 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 9 [LightGBM] [Debug] Re-bagging, using 10520 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10345 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 9 [LightGBM] [Debug] Re-bagging, using 10424 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10396 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 9 [LightGBM] [Debug] Re-bagging, using 10451 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 9 [LightGBM] [Debug] Re-bagging, using 10214 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10295 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10283 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10271 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, 
[LightGBM per-tree Debug/Warning output truncated — repeated "Trained a tree with leaves = … and depth = …", "Re-bagging, using … data to train", and "No further splits with positive gain, best gain: -inf" lines omitted.]
Trial 25, Fold 4: Log loss = 0.3457546230696314, Average precision = 0.9686800320577474, ROC-AUC = 0.962634808567661, Elapsed Time = 1.9513052000002062 seconds
Trial 25, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 25, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[LightGBM] [Info] Number of positive: 10150, number of negative: 10500
[LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.792012
[LightGBM] [Info] Total Bins 24360
[LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902
[LightGBM] [Info] Start training from score -0.033902
[LightGBM per-tree Debug/Warning output truncated.]
Trial 25, Fold 5: Log loss = 0.3514596909372222, Average precision = 0.9655090119026054, ROC-AUC = 0.9597230923582856, Elapsed Time = 1.796817899999951 seconds
Optimization Progress: 26% | 26/100 [06:06<18:29, 14.99s/it]
Trial 26, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 26, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791640
[LightGBM] [Info] Total Bins 29191
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[LightGBM per-tree Debug/Warning output truncated.]
tree with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 11 [LightGBM] [Debug] Re-bagging, using 16060 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 11 Trial 26, Fold 1: Log loss = 0.39860973853534337, Average precision = 0.9658180373732671, ROC-AUC = 0.9595059756814499, Elapsed Time = 1.4416620000001785 seconds Trial 26, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 26, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986 [LightGBM] [Info] Number of positive: 10230, number of negative: 10471 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791600 [LightGBM] [Info] Total Bins 29195 [LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285 [LightGBM] [Info] Start training from score -0.023285 [LightGBM] [Debug] Re-bagging, using 16039 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a 
tree with leaves = 34 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 7 [LightGBM] [Debug] Re-bagging, using 16169 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf 
[LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Debug] Re-bagging, using 16081 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15983 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and 
depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 7 [LightGBM] [Debug] Re-bagging, using 16021 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a 
tree with leaves = 45 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 9 [LightGBM] [Debug] Re-bagging, using 16021 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf 
[LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 9 [LightGBM] [Debug] Re-bagging, using 16020 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 8 [LightGBM] [Debug] Re-bagging, using 16052 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and 
depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 11 [LightGBM] [Debug] Re-bagging, using 16087 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 9 Trial 26, Fold 2: Log loss = 0.3992144562722906, Average precision = 0.9625837281038332, ROC-AUC = 0.9588862137440397, Elapsed Time = 1.9067242000000988 seconds Trial 26, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 26, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 [LightGBM] [Info] Number of positive: 10165, number of negative: 10517 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791401 [LightGBM] [Info] Total Bins 29190 [LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043 [LightGBM] [Info] Start training from score -0.034043 [LightGBM] [Debug] Re-bagging, using 16022 data to 
train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Debug] Re-bagging, using 16156 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a 
tree with leaves = 41 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Re-bagging, using 16065 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf 
[LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 9 [LightGBM] [Debug] Re-bagging, using 15969 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 7 [LightGBM] [Debug] Re-bagging, using 16005 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and 
depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 10 [LightGBM] [Debug] Re-bagging, using 16010 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained 
a tree with leaves = 35 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 9 [LightGBM] [Debug] Re-bagging, using 16007 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: 
-inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Debug] Re-bagging, using 16032 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 9 [LightGBM] [Debug] Re-bagging, using 16068 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 9 Trial 26, Fold 3: Log loss = 0.3947824437055664, Average precision = 0.9669910235944826, ROC-AUC = 0.9615008403078079, Elapsed 
Time = 1.8139153000001897 seconds Trial 26, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 26, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791090 [LightGBM] [Info] Total Bins 29181 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 16001 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 9 [LightGBM] [Warning] No further splits with 
[LightGBM per-tree [Debug]/[Warning] output omitted]
Trial 26, Fold 4: Log loss = 0.3947772666877869, Average precision = 0.9661043514929211, ROC-AUC = 0.9597404331518798, Elapsed Time = 1.62656380000044 seconds
Trial 26, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 26, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[LightGBM] [Info] Number of positive: 10150, number of negative: 10500
[LightGBM] [Info] Total Bins 29189
[LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902
[LightGBM] [Info] Start training from score -0.033902
[LightGBM per-tree [Debug]/[Warning] output omitted]
Trial 26, Fold 5: Log loss = 0.4027633484140435, Average precision = 0.9629441663634337, ROC-AUC = 0.9568721178678261, Elapsed Time = 1.5187667000000147 seconds
Optimization Progress: 27%|##7 | 27/100 [06:21<18:31, 15.22s/it]
Trial 27, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 27, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 11253
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[LightGBM per-tree [Debug]/[Warning] output omitted]
Trial 27, Fold 1: Log loss = 0.3770457259073332, Average precision = 0.9679646461912859, ROC-AUC = 0.9618513350587649, Elapsed Time = 0.6671121999997922 seconds
Trial 27, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 27, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Info] Total Bins 11266
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
[LightGBM per-tree [Debug]/[Warning] output omitted]
Trial 27, Fold 2: Log loss = 0.3768335963671184, Average precision = 0.9652852768593705, ROC-AUC = 0.9619789853596772, Elapsed Time = 0.766985599999316 seconds
Trial 27, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 27, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[LightGBM] [Info] Number of positive: 10165, number of negative: 10517
[LightGBM] [Info] Total Bins 11269
[LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043
[LightGBM] [Info] Start training from score -0.034043
[LightGBM per-tree [Debug]/[Warning] output omitted]
No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7825 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 10 Trial 27, Fold 3: Log loss = 0.3753375017459059, Average precision = 0.9697573184424777, ROC-AUC = 0.9644684029817684, Elapsed Time = 
0.8073904000002585 seconds Trial 27, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 27, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791090 [LightGBM] [Info] Total Bins 11250 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 7908 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7979 data to train 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7855 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with 
leaves = 39 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7810 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 11 Trial 27, Fold 4: Log loss = 0.37479868622985846, Average precision = 0.9683889358539486, ROC-AUC = 0.9620702995534002, Elapsed Time = 0.7829996000000392 seconds Trial 27, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 27, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.792012 [LightGBM] [Info] Total Bins 11253 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, 
using 7906 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7978 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7853 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7805 data to train [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 10 Trial 27, Fold 5: Log loss = 0.3770404917952498, Average precision = 0.9671860255854553, ROC-AUC = 0.9620701930401501, Elapsed Time = 0.7725092999999106 seconds
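The `[binary:BoostFromScore]: pavg=... -> initscore=...` lines in each fold's Info block are LightGBM reporting the log-odds of the training-set positive rate, which it uses as the boosting start score. A minimal check of that relationship (values copied from the Trial 27, Fold 3 log; the function name `init_score` is our own):

```python
import math

def init_score(pavg: float) -> float:
    """Log-odds (logit) of the positive rate, matching LightGBM's
    [binary:BoostFromScore] log line for binary objectives."""
    return math.log(pavg / (1.0 - pavg))

# pavg=0.491490 from the Fold 3 Info block; the log reports initscore=-0.034043.
print(f"{init_score(0.491490):.6f}")  # ≈ -0.034043, as logged
```

The slightly negative start score simply reflects that, after balancing, the folds still have marginally more negatives (class 0) than positives.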
Optimization Progress: 28%|##8 | 28/100 [06:33<17:04, 14.23s/it]
Trial 28, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 28, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 28226
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[LightGBM per-tree Debug/Warning output trimmed; fold-level summaries retained below.]
Trial 28, Fold 1: Log loss = 0.36599374427258163, Average precision = 0.9603317174096473, ROC-AUC = 0.9534624091280596, Elapsed Time = 0.48313599999983126 seconds
Trial 28, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 28, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Trial 28, Fold 2: Log loss = 0.3654086901441953, Average precision = 0.9566215972223977, ROC-AUC = 0.9521490673125862, Elapsed Time = 0.4869757999995272 seconds
Trial 28, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 28, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Trial 28, Fold 3: Log loss = 0.36509507073811576, Average precision = 0.9613520695863139, ROC-AUC = 0.9549487267771667, Elapsed Time = 0.5294050000002244 seconds
Trial 28, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 28, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
Trial 28, Fold 4: Log loss = 0.3631950589967323, Average precision = 0.9604381240848454, ROC-AUC = 0.9526484533525994, Elapsed Time = 0.5496585000000778 seconds
Trial 28, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 28, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
Trial 28, Fold 5: Log loss = 0.3715326470936202, Average precision = 0.9579557801661939, ROC-AUC = 0.9517082288241085, Elapsed Time = 0.5568551000005755 seconds
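A trial's objective value is typically an aggregate of its fold-level metrics. A minimal sketch (assuming a plain unweighted mean over the five folds, which is the usual choice for an Optuna objective; the variable names are our own), using the log losses from the Trial 28 lines above:

```python
# Fold-level log losses copied verbatim from the Trial 28 output above.
fold_log_losses = [
    0.36599374427258163,  # Fold 1
    0.3654086901441953,   # Fold 2
    0.36509507073811576,  # Fold 3
    0.3631950589967323,   # Fold 4
    0.3715326470936202,   # Fold 5
]

# Unweighted mean: the single scalar a cross-validated objective
# would return for the Optuna study to minimize (assumption).
mean_log_loss = sum(fold_log_losses) / len(fold_log_losses)
print(f"Trial 28 mean CV log loss: {mean_log_loss:.6f}")  # ≈ 0.366245
```

The small spread across folds (roughly 0.363–0.372) suggests the grouped, stratified splits produce stable validation estimates for this trial.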
Optimization Progress: 29%|##9 | 29/100 [06:43<15:19, 12.95s/it]
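The per-tree Debug/Warning chatter interleaved with these trials comes from LightGBM's verbosity setting; during long Optuna studies it is common to silence both libraries so only the progress bar and fold summaries remain. A sketch of typical settings (parameter names per the LightGBM and Optuna documentation; this is an assumption about configuration, not necessarily what this notebook used):

```python
import logging

# If optuna is available, this keeps only warnings from the study loop:
# optuna.logging.set_verbosity(optuna.logging.WARNING)

# LightGBM: verbosity < 0 shows only fatal errors; 0 = warnings,
# 1 = info, >1 = debug (the tree-by-tree output seen above).
lgb_params = {
    "objective": "binary",
    "verbosity": -1,  # suppress Info/Debug/Warning lines per booster
    # ... tuned hyperparameters supplied by the Optuna trial ...
}

# Recent LightGBM versions also route messages through Python logging,
# so the 'lightgbm' logger can be capped as a second line of defense.
logging.getLogger("lightgbm").setLevel(logging.WARNING)
```

With both in place, the output per trial reduces to the custom `Trial X, Fold Y` summary lines printed by the objective function.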
Trial 29, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 29, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 23584
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[LightGBM per-tree Debug/Warning output trimmed; fold-level summaries retained below.]
Trial 29, Fold 1: Log loss = 0.29683447983407174, Average precision = 0.9730766746375938, ROC-AUC = 0.9680085662188191, Elapsed Time = 1.040899100000388 seconds
Trial 29, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 29, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM per-tree output for Trial 29, Fold 2 continues; trimmed here.]
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 13 [LightGBM] [Debug] Re-bagging, using 13772 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 14 [LightGBM] [Debug] Re-bagging, using 13560 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 13 [LightGBM] [Warning] No further splits with positive 
gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 15 [LightGBM] [Debug] Re-bagging, using 13694 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 14 [LightGBM] [Debug] Re-bagging, using 13572 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 11 [LightGBM] [Debug] Re-bagging, using 13655 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 14 [LightGBM] [Warning] No further splits with 
positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 84 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 85 and depth = 13 [LightGBM] [Debug] Re-bagging, using 13686 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 14 Trial 29, Fold 2: Log loss = 0.2948225769523191, Average precision = 0.9701316593407077, ROC-AUC = 0.9676732451354972, Elapsed Time = 1.1777079000003141 seconds Trial 29, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 29, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 [LightGBM] [Info] Number of positive: 10165, number of negative: 10517 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791401 [LightGBM] [Info] Total Bins 23508 [LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043 [LightGBM] [Info] Start training from score -0.034043 [LightGBM] [Debug] Re-bagging, using 13606 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 14 [LightGBM] [Warning] No further splits with 
positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 13 [LightGBM] [Debug] Re-bagging, using 13680 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 12 [LightGBM] [Debug] Re-bagging, using 13760 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 88 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
Trained a tree with leaves = 78 and depth = 13 [LightGBM] [Debug] Re-bagging, using 13550 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 82 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 12 [LightGBM] [Debug] Re-bagging, using 13676 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 87 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 14 [LightGBM] [Debug] Re-bagging, using 13566 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] 
[Debug] Trained a tree with leaves = 75 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 14 [LightGBM] [Debug] Re-bagging, using 13646 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 15 [LightGBM] [Debug] Re-bagging, using 13672 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 15 Trial 29, Fold 3: Log loss = 0.29240503741367013, Average precision = 0.9738813449367276, ROC-AUC = 0.9697626130442739, Elapsed Time = 1.215357199999744 seconds Trial 29, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 29, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 
[LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.792454 [LightGBM] [Info] Total Bins 23522 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 256 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 13590 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 13 [LightGBM] [Debug] Re-bagging, using 13665 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 15 [LightGBM] [Debug] Re-bagging, using 13741 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf 
[LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 85 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 85 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 84 and depth = 17 [LightGBM] [Debug] Re-bagging, using 13524 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 13 [LightGBM] [Debug] Re-bagging, using 13661 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 
and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 17 [LightGBM] [Debug] Re-bagging, using 13559 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 85 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 13 [LightGBM] [Debug] Re-bagging, using 13634 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 82 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 85 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 14 [LightGBM] [Debug] Re-bagging, using 13651 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves 
= 87 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 12 Trial 29, Fold 4: Log loss = 0.2952701846010938, Average precision = 0.9724463513089439, ROC-AUC = 0.9672528141996373, Elapsed Time = 1.2341573000003336 seconds Trial 29, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 29, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.793364 [LightGBM] [Info] Total Bins 23533 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 256 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 13587 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 14 [LightGBM] [Debug] Re-bagging, using 13661 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves 
= 74 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 15 [LightGBM] [Debug] Re-bagging, using 13740 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 13 [LightGBM] [Debug] Re-bagging, using 13518 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 13 [LightGBM] [Debug] Re-bagging, 
using 13656 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 14 [LightGBM] [Debug] Re-bagging, using 13553 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 13 [LightGBM] [Debug] Re-bagging, using 13635 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 15 [LightGBM] [Warning] No 
further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 15 [LightGBM] [Debug] Re-bagging, using 13643 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 14 Trial 29, Fold 5: Log loss = 0.2996227692538168, Average precision = 0.9708768054126033, ROC-AUC = 0.9662428329896142, Elapsed Time = 1.2330916000000798 seconds
Optimization Progress: 30%|### | 30/100 [06:57<15:15, 13.08s/it]
Trial 30, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 30, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791640
[LightGBM] [Info] Total Bins 8631
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[... repeated per-tree [LightGBM] [Debug]/[Warning] messages omitted; output truncated ...]
-inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 5 [LightGBM] [Debug] Re-bagging, using 7584 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Debug] Re-bagging, using 7595 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Debug] Re-bagging, using 7645 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 Trial 30, Fold 1: Log loss = 0.3174825538284151, Average precision = 0.9605865459743074, ROC-AUC = 0.9548121203213796, Elapsed Time = 0.9692813999999998 seconds Trial 30, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 30, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986 [LightGBM] [Info] Number of positive: 10230, number of negative: 10471 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791600 [LightGBM] [Info] Total Bins 8643 [LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285 [LightGBM] [Info] Start training from score -0.023285 [LightGBM] [Debug] Re-bagging, using 7792 data to train [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Debug] Re-bagging, using 7860 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Debug] Re-bagging, using 7736 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] 
[Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Debug] Re-bagging, using 7705 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 6 [LightGBM] [Debug] Re-bagging, using 7875 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 5 [LightGBM] [Warning] No 
further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 5 [LightGBM] [Debug] Re-bagging, using 7775 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Debug] Re-bagging, using 7785 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained 
a tree with leaves = 10 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Debug] Re-bagging, using 7866 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Debug] Re-bagging, using 7828 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 6 [LightGBM] [Debug] Re-bagging, using 
7605 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Debug] Re-bagging, using 7598 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Debug] Re-bagging, using 7644 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, 
best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 Trial 30, Fold 2: Log loss = 0.31623480166398493, Average precision = 0.9587342409317563, ROC-AUC = 0.9555490523199048, Elapsed Time = 1.113342200000261 seconds Trial 30, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 30, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 [LightGBM] [Info] Number of positive: 10165, number of negative: 10517 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791401 [LightGBM] [Info] Total Bins 8652 [LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043 [LightGBM] [Info] Start training from score -0.034043 [LightGBM] [Debug] Re-bagging, using 7783 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth 
= 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 5 [LightGBM] [Debug] Re-bagging, using 7851 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Debug] Re-bagging, using 7728 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Debug] Re-bagging, using 7704 data to train [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 7 [LightGBM] [Debug] Re-bagging, using 7859 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 5 [LightGBM] [Debug] Re-bagging, using 7774 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] 
[Debug] Trained a tree with leaves = 9 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 5 [LightGBM] [Debug] Re-bagging, using 7781 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Debug] Re-bagging, using 7862 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 5 [LightGBM] [Warning] 
No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Debug] Re-bagging, using 7808 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Debug] Re-bagging, using 7606 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 4 [LightGBM] [Debug] Re-bagging, using 7595 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Debug] Re-bagging, using 7645 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 6 Trial 30, Fold 3: Log loss = 0.3125646760074285, Average precision = 0.9619599029006753, ROC-AUC = 0.9580770073484797, Elapsed Time = 1.2622389000007388 seconds Trial 30, 
Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 30, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791090 [LightGBM] [Info] Total Bins 8625 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 7772 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 5 [LightGBM] [Debug] Re-bagging, using 7843 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with 
positive gain, best gain: -inf
[... verbose LightGBM [Debug]/[Warning] per-tree output omitted ...]
Trial 30, Fold 4: Log loss = 0.3106067431192909, Average precision = 0.9648280060412078, ROC-AUC = 0.9583181327210168, Elapsed Time = 1.1553637999995772 seconds
Trial 30, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 30, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[LightGBM] [Info] Number of positive: 10150, number of negative: 10500
[LightGBM] [Info] Total Bins 8630
[LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902
[LightGBM] [Info] Start training from score -0.033902
[... verbose LightGBM [Debug]/[Warning] per-tree output omitted ...]
Trial 30, Fold 5: Log loss = 0.3168421825858889, Average precision = 0.9579488880533888, ROC-AUC = 0.9524936922876838, Elapsed Time = 1.0095891000000847 seconds
Optimization Progress: 31%|###1 | 31/100 [07:10<14:57, 13.01s/it]
Trial 31, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 31, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 9440
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[... verbose LightGBM [Debug]/[Warning] per-tree output omitted ...]
Trial 31, Fold 1: Log loss = 0.25510860932062523, Average precision = 0.9695418810095376, ROC-AUC = 0.9637145533712832, Elapsed Time = 1.2519100999998045 seconds
Trial 31, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 31, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Info] Total Bins 9431
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
[... verbose LightGBM [Debug]/[Warning] per-tree output omitted ...]
gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 85 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 13 [LightGBM] [Debug] Re-bagging, using 16637 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 14 [LightGBM] [Debug] Re-bagging, using 16621 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 85 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 97 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 13 [LightGBM] [Debug] Re-bagging, using 16553 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, 
best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 84 and depth = 15 [LightGBM] [Debug] Re-bagging, using 16610 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 97 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 13 [LightGBM] [Debug] Re-bagging, using 16663 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 88 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 85 and depth = 16 [LightGBM] [Debug] Re-bagging, using 16541 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 85 and depth = 16 [LightGBM] [Warning] No further splits with positive 
gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 12 [LightGBM] [Debug] Re-bagging, using 16557 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 13 [LightGBM] [Debug] Re-bagging, using 16577 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 11 [LightGBM] [Debug] Re-bagging, using 16588 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 101 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 91 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 93 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 15 [LightGBM] [Debug] Re-bagging, using 16540 data 
to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 94 and depth = 13 [LightGBM] [Debug] Re-bagging, using 16657 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 100 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 100 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 11 [LightGBM] [Debug] Re-bagging, using 16606 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 82 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 93 and depth = 16 [LightGBM] [Debug] Re-bagging, using 16654 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 
and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 105 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 82 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 13 [LightGBM] [Debug] Re-bagging, using 16548 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 91 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 91 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 97 and depth = 12 [LightGBM] [Debug] Re-bagging, using 16665 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 84 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 102 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 13 [LightGBM] [Debug] Re-bagging, using 16718 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 109 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with 
leaves = 83 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 88 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 94 and depth = 16 [LightGBM] [Debug] Re-bagging, using 16524 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 87 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 87 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 102 and depth = 12 [LightGBM] [Debug] Re-bagging, using 16549 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 86 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 105 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 110 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 16 [LightGBM] [Debug] Re-bagging, using 16598 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 82 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a 
tree with leaves = 78 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 88 and depth = 13 [LightGBM] [Debug] Re-bagging, using 16674 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 14 Trial 31, Fold 2: Log loss = 0.2563946312758171, Average precision = 0.9671244359838015, ROC-AUC = 0.9639688540705282, Elapsed Time = 1.475114300000314 seconds Trial 31, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 31, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 [LightGBM] [Info] Number of positive: 10165, number of negative: 10517 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791401 [LightGBM] [Info] Total Bins 9438 [LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043 [LightGBM] [Info] Start training from score -0.034043 [LightGBM] [Debug] Re-bagging, using 16601 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Re-bagging, using 16677 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 10 [LightGBM] [Warning] No further splits with 
positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 13 [LightGBM] [Debug] Re-bagging, using 16617 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 13 [LightGBM] [Debug] Re-bagging, using 16574 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 10 [LightGBM] [Debug] Re-bagging, using 16618 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 99 and depth = 14 [LightGBM] [Warning] No further splits 
with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 13 [LightGBM] [Debug] Re-bagging, using 16613 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 98 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 82 and depth = 12 [LightGBM] [Debug] Re-bagging, using 16538 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 82 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 15 [LightGBM] [Debug] Re-bagging, using 16588 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 93 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 84 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 12 [LightGBM] [Warning] No further 
splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 11 [LightGBM] [Debug] Re-bagging, using 16646 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 95 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 12 [LightGBM] [Debug] Re-bagging, using 16529 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 90 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 11 [LightGBM] [Debug] Re-bagging, using 16544 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 14 [LightGBM] [Debug] 
Re-bagging, using 16570 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 91 and depth = 17 [LightGBM] [Debug] Re-bagging, using 16570 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 82 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 14 [LightGBM] [Debug] Re-bagging, using 16531 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 86 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 16 [LightGBM] [Debug] Re-bagging, using 16638 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
Trained a tree with leaves = 102 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 105 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 90 and depth = 15 [LightGBM] [Debug] Re-bagging, using 16579 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 84 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 103 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 95 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 12 [LightGBM] [Debug] Re-bagging, using 16638 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 105 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 105 and depth = 13 [LightGBM] [Debug] Re-bagging, using 16536 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf 
[LightGBM] [Debug] Trained a tree with leaves = 84 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 91 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 15 [LightGBM] [Debug] Re-bagging, using 16646 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 86 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 108 and depth = 16 [LightGBM] [Debug] Re-bagging, using 16725 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 93 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 14 [LightGBM] [Debug] Re-bagging, using 16503 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 82 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 97 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: 
[Repeated LightGBM [Debug]/[Warning] per-tree output truncated: "No further splits with positive gain, best gain: -inf" warnings, per-tree leaf/depth counts, and "Re-bagging" messages.]

Trial 31, Fold 3: Log loss = 0.24453248742690947, Average precision = 0.9713019505932408, ROC-AUC = 0.9668311667517409, Elapsed Time = 1.4896720999995523 seconds

Trial 31, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 31, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[LightGBM] [Info] Number of positive: 10177, number of negative: 10479; Total Bins 9441; number of data points in the train set: 20656, number of used features: 256; start training from score -0.029243
[LightGBM per-tree [Debug]/[Warning] output truncated.]
Trial 31, Fold 4: Log loss = 0.25129239627683203, Average precision = 0.9696580915958111, ROC-AUC = 0.9637685954157338, Elapsed Time = 1.4521058000000266 seconds

Trial 31, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 31, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[LightGBM] [Info] Number of positive: 10150, number of negative: 10500; Total Bins 9420; number of data points in the train set: 20650, number of used features: 255; start training from score -0.033902
[LightGBM per-tree [Debug]/[Warning] output truncated.]
Trial 31, Fold 5: Log loss = 0.25288434539497634, Average precision = 0.9680446458583952, ROC-AUC = 0.96308552105977, Elapsed Time = 1.429005899999538 seconds

Optimization Progress: 32%|###2 | 32/100 [07:24<15:15, 13.47s/it]

Trial 32, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 32, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533; Total Bins 24150; number of data points in the train set: 20663, number of used features: 255; start training from score -0.039012
[LightGBM per-tree [Debug]/[Warning] output truncated.]
[LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7532 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 7 [LightGBM] [Debug] Re-bagging, using 7534 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 7 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7603 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves 
= 38 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7537 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7337 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 10 [LightGBM] [Warning] No further splits 
with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7348 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7421 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained 
a tree with leaves = 35 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 10 Trial 32, Fold 1: Log loss = 0.21780981303912247, Average precision = 0.9729802326851044, ROC-AUC = 0.9680215607270712, Elapsed Time = 1.631260299999667 seconds Trial 32, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 32, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986 [LightGBM] [Info] Number of positive: 10230, number of negative: 10471 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791600 [LightGBM] [Info] Total Bins 24158 [LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285 [LightGBM] [Info] Start training from score -0.023285 [LightGBM] [Debug] Re-bagging, using 7542 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 8 [LightGBM] [Warning] No 
further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7605 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7474 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] 
[Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 7 [LightGBM] [Debug] Re-bagging, using 7470 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7625 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7523 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and 
depth = 8 [LightGBM] [Debug] Re-bagging, using 7546 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7613 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 9 [LightGBM] [Warning] No further splits with positive 
gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 12 [LightGBM] [Debug] Re-bagging, using 7560 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7361 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree 
with leaves = 34 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7347 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7418 data to train [LightGBM] [Warning] No 
further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 Trial 32, Fold 2: Log loss = 0.2169047217816407, Average precision = 0.9718437056956949, ROC-AUC = 0.9688737977402326, Elapsed Time = 1.995632499999374 seconds Trial 32, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 32, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 [LightGBM] [Info] Number of positive: 10165, number of negative: 10517 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791401 [LightGBM] [Info] Total Bins 24152 [LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043 [LightGBM] [Info] Start training from score -0.034043 [LightGBM] [Debug] Re-bagging, using 7534 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] 
[Debug] Trained a tree with leaves = 27 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7595 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 7 [LightGBM] [Debug] Re-bagging, using 7467 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 8 [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7468 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and 
depth = 8 [LightGBM] [Debug] Re-bagging, using 7609 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7525 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive 
[LightGBM per-tree [Debug]/[Warning] output omitted: repeated "No further splits with positive gain, best gain: -inf" warnings, "Trained a tree with leaves = N and depth = D" reports, and "Re-bagging" messages]
Trial 32, Fold 3: Log loss = 0.21615489727185902, Average precision = 0.9730476703738112, ROC-AUC = 0.9688875432192698, Elapsed Time = 1.7993200999999317 seconds
Trial 32, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 32, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[LightGBM] [Info] Number of positive: 10177, number of negative: 10479
[LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791090
[LightGBM] [Info] Total Bins 24142
[LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243
[LightGBM] [Info] Start training from score -0.029243
[per-tree [Debug]/[Warning] output omitted]
Trial 32, Fold 4: Log loss = 0.21562484342864127, Average precision = 0.9726778588701402, ROC-AUC = 0.9675838000815469, Elapsed Time = 1.8426258999998026 seconds
Trial 32, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 32, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[LightGBM] [Info] Number of positive: 10150, number of negative: 10500
[LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.792012
[LightGBM] [Info] Total Bins 24146
[LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902
[LightGBM] [Info] Start training from score -0.033902
[per-tree [Debug]/[Warning] output omitted]
Trial 32, Fold 5: Log loss = 0.22144917844599368, Average precision = 0.9705417710797805, ROC-AUC = 0.9670558683090873, Elapsed Time = 1.8122117000002618 seconds
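The per-fold log lines above report log loss, average precision, and ROC-AUC on each validation fold. A minimal sketch of how those three metrics can be computed from validation labels and predicted probabilities (the variable names `y_val` and `proba` are assumptions, standing in for a fold's labels and the model's predicted P(y=1)):

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

# Stand-in data for illustration; in the notebook these come from the
# fold's validation split and the trained booster's predict output.
rng = np.random.default_rng(0)
y_val = rng.integers(0, 2, size=100)       # binary validation labels
proba = rng.uniform(0.01, 0.99, size=100)  # predicted probability of class 1

lloss = log_loss(y_val, proba)               # lower is better
ap = average_precision_score(y_val, proba)   # area under the PR curve
auc = roc_auc_score(y_val, proba)            # area under the ROC curve
print(f"Log loss = {lloss}, Average precision = {ap}, ROC-AUC = {auc}")
```

All three functions take the true labels first and the probability scores second; passing hard class predictions instead of probabilities would silently degrade log loss and AUC.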
Optimization Progress: 33%|###3 | 33/100 [07:41<16:15, 14.56s/it]
Trial 33, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371 Trial 33, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913 [LightGBM] [Info] Number of positive: 10130, number of negative: 10533 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.795662 [LightGBM] [Info] Total Bins 12184 [LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 258 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012 [LightGBM] [Info] Start training from score -0.039012 [LightGBM] [Debug] Re-bagging, using 5834 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5799 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 8 [LightGBM] [Warning] No 
further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 7 [LightGBM] [Debug] Re-bagging, using 5805 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 7 [LightGBM] [Debug] Re-bagging, using 5686 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] 
[Debug] Trained a tree with leaves = 25 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 7 [LightGBM] [Debug] Re-bagging, using 5875 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5845 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 7 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 8 [LightGBM] [Debug] Re-bagging, using 5727 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 7 Trial 33, Fold 1: Log loss = 0.314628043679088, Average precision = 0.9658566787223346, ROC-AUC = 0.9591776776453832, Elapsed Time = 0.796326699999554 seconds Trial 33, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 33, Fold 
2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986 [LightGBM] [Info] Number of positive: 10230, number of negative: 10471 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.795629 [LightGBM] [Info] Total Bins 12205 [LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 258 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285 [LightGBM] [Info] Start training from score -0.023285 [LightGBM] [Debug] Re-bagging, using 5846 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5812 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth 
= 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 8 [LightGBM] [Debug] Re-bagging, using 5809 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 7 [LightGBM] [Debug] Re-bagging, using 5704 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best 
gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5896 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 7 [LightGBM] [Debug] Re-bagging, using 5836 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 
and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 8 [LightGBM] [Debug] Re-bagging, using 5735 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 Trial 33, Fold 2: Log loss = 0.3131834756667966, Average precision = 0.9634034482236395, ROC-AUC = 0.959467635384063, Elapsed Time = 0.8349615999995876 seconds Trial 33, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 33, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 [LightGBM] [Info] Number of positive: 10165, 
number of negative: 10517 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.795455 [LightGBM] [Info] Total Bins 12209 [LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 258 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043 [LightGBM] [Info] Start training from score -0.034043 [LightGBM] [Debug] Re-bagging, using 5839 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 8 [LightGBM] [Debug] Re-bagging, using 5805 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with 
leaves = 27 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 8 [LightGBM] [Debug] Re-bagging, using 5807 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Debug] Re-bagging, using 5696 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 7 [LightGBM] [Warning] No further splits 
with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5888 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 6 [LightGBM] [Debug] Re-bagging, using 5837 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a 
tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5737 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 Trial 33, Fold 3: Log loss = 0.31472151021701855, Average precision = 0.9662963901788351, ROC-AUC = 0.960705377006867, Elapsed Time = 0.8537070000002132 seconds Trial 33, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 33, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.795144 
[LightGBM] [Info] Total Bins 12187 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 258 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 5831 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5797 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf 
[LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 8 [LightGBM] [Debug] Re-bagging, using 5801 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5688 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 9 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Debug] Re-bagging, using 5872 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Debug] Re-bagging, using 5842 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best 
[... repetitive [LightGBM] [Debug]/[Warning] tree-growth messages omitted for readability ...]
Trial 33, Fold 4: Log loss = 0.31352811928269, Average precision = 0.9659387169422231, ROC-AUC = 0.9593025213941166, Elapsed Time = 1.0476693999999043 seconds
Trial 33, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 33, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[LightGBM] [Info] Number of positive: 10150, number of negative: 10500
[LightGBM] [Info] Total Bins 12189
[LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902
[LightGBM] [Info] Start training from score -0.033902
Trial 33, Fold 5: Log loss = 0.3174828222684896, Average precision = 0.9638476806395812, ROC-AUC = 0.957613362317225, Elapsed Time = 1.1022882999996 seconds
Optimization Progress:  34%|###4      | 34/100 [07:54<15:23, 13.99s/it]
Trial 34, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 34, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 15448
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
Trial 34, Fold 1: Log loss = 0.2139692345325275, Average precision = 0.9721244320632372, ROC-AUC = 0.9677689893080589, Elapsed Time = 0.9208619000000908 seconds
Trial 34, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 34, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Info] Total Bins 15463
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
Trial 34, Fold 2: Log loss = 0.2093861632111387, Average precision = 0.971479734768855, ROC-AUC = 0.9680098975975593, Elapsed Time = 0.9724216999993587 seconds
Trial 34, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 34, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[LightGBM] [Info] Number of positive: 10165, number of negative: 10517
[LightGBM] [Info] Total Bins 15465
[LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043
[LightGBM] [Info] Start training from score -0.034043
Trial 34, Fold 3: Log loss = 0.20776298384101446, Average precision = 0.9728767115684074, ROC-AUC = 0.9690121447351948, Elapsed Time = 0.9004064999999173 seconds
Trial 34, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 34, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[LightGBM] [Info] Number of positive: 10177, number of negative: 10479
[LightGBM] [Info] Total Bins 15448
[LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243
[LightGBM] [Info] Start training from score -0.029243
and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7538 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7600 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7535 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7341 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7341 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7424 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree 
with leaves = 28 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 7 [LightGBM] [Debug] Re-bagging, using 7405 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7460 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7631 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7543 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7516 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7501 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] 
[Debug] Trained a tree with leaves = 32 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7577 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7646 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7392 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7470 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7561 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 11 Trial 34, Fold 4: Log loss = 0.21491292913763768, Average precision = 0.9721148252396932, ROC-AUC = 
0.9669598298485185, Elapsed Time = 0.8922624000006181 seconds Trial 34, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 34, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.792012 [LightGBM] [Info] Total Bins 15447 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 7524 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 7 [LightGBM] [Debug] Re-bagging, using 7588 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7460 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 7 [LightGBM] [Debug] Re-bagging, using 7456 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 
7 [LightGBM] [Debug] Re-bagging, using 7586 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7524 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7547 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7594 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7533 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 7 [LightGBM] [Debug] Re-bagging, using 7341 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves 
= 28 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7334 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7416 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7410 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7468 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7629 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7543 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
Trained a tree with leaves = 31 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7509 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 12 [LightGBM] [Debug] Re-bagging, using 7485 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 13 [LightGBM] [Debug] Re-bagging, using 7583 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7634 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7402 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 14 [LightGBM] [Debug] Re-bagging, using 7461 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best 
gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7556 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 13 Trial 34, Fold 5: Log loss = 0.213919976562517, Average precision = 0.9716793669447494, ROC-AUC = 0.9675710384036564, Elapsed Time = 0.8812234000006356 seconds
Optimization Progress: 35%|###5 | 35/100 [08:07<14:52, 13.72s/it]
Trial 35, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 35, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 7358
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
Trial 35, Fold 1: Log loss = 0.4108622812425437, Average precision = 0.9573498878587887, ROC-AUC = 0.9509579790462808, Elapsed Time = 0.8165712999998505 seconds
Trial 35, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 35, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Info] Total Bins 7374
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
[Debug] Trained a tree with leaves = 10 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2173 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2172 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2187 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2192 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2147 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 Trial 35, Fold 2: Log loss = 0.41126862578961465, Average precision = 0.9547552990346806, ROC-AUC = 0.950380902116305, Elapsed Time = 0.8838040999999066 seconds Trial 35, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 35, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 [LightGBM] [Info] Number of positive: 10165, number of negative: 10517 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791401 [LightGBM] [Info] Total Bins 7372 [LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043 [LightGBM] [Info] 
Start training from score -0.034043 [LightGBM] [Debug] Re-bagging, using 2243 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2205 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2115 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2135 data to train [LightGBM] [Warning] No further splits with positive gain, best 
gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2246 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2195 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2169 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf 
[LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2201 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2197 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2162 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] 
[Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2074 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2153 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2200 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained 
a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2175 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2165 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2190 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2191 data to train [LightGBM] [Warning] No further splits with positive 
gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2137 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 5 Trial 35, Fold 3: Log loss = 0.40891116404980826, Average precision = 0.9584163722175101, ROC-AUC = 0.9530043863345244, Elapsed Time = 0.9321852000002764 seconds Trial 35, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 35, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791090 [LightGBM] [Info] Total Bins 7355 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 2241 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best 
gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2202 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2111 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2132 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf 
[LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2243 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2193 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2170 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] 
[Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2201 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2185 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2156 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2069 data to train [LightGBM] [Warning] No further splits 
with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2158 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2192 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2178 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 5 [LightGBM] [Warning] No further splits with positive 
gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2165 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2184 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2187 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best 
[... repeated LightGBM [Debug] "Trained a tree with leaves = …, depth = …" and [Warning] "No further splits with positive gain, best gain: -inf" messages omitted ...]
Trial 35, Fold 4: Log loss = 0.41120225290133705, Average precision = 0.955421450053424, ROC-AUC = 0.9491549949808173, Elapsed Time = 0.9162712999996074 seconds
Trial 35, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 35, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[LightGBM] [Info] Number of positive: 10150, number of negative: 10500
[LightGBM] [Info] Total Bins 7363
[LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902
[LightGBM] [Info] Start training from score -0.033902
[... repeated LightGBM [Debug]/[Warning] per-tree messages omitted ...]
Trial 35, Fold 5: Log loss = 0.413217515676604, Average precision = 0.954621797764589, ROC-AUC = 0.9490674247069096, Elapsed Time = 0.9273424000002706 seconds
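The per-fold summary lines above report the class counts and a "0/1" majority-to-minority ratio for each train/validation split. A minimal sketch of how such a line can be produced, using `collections.Counter` on toy labels (the labels here are assumptions for illustration, not the notebook's data):

```python
# Hedged sketch: reproducing the per-fold class-balance summary format,
# e.g. "Train size = ... where 0 = ..., 1 = ..., 0/1 = ...", on toy labels.
from collections import Counter

y_train = [0, 0, 0, 1, 1]      # toy fold labels (assumption)
counts = Counter(y_train)
ratio = counts[0] / counts[1]  # negative-to-positive ratio, as logged per fold
print(f"Train size = {len(y_train)} where 0 = {counts[0]}, 1 = {counts[1]}, 0/1 = {ratio}")
```

A ratio close to 1.0, as seen in every fold above, indicates the stratified splits keep the two classes nearly balanced.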
Optimization Progress: 36%|###6 | 36/100 [08:19<14:00, 13.13s/it]
Trial 36, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 36, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 11033
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[... repeated LightGBM [Debug]/[Warning] per-tree messages omitted ...]
Trial 36, Fold 1: Log loss = 0.5434716286160565, Average precision = 0.9693808772361909, ROC-AUC = 0.9630761794831353, Elapsed Time = 1.2474445999996533 seconds
Trial 36, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 36, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Info] Total Bins 11045
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 257
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
[... repeated LightGBM [Debug]/[Warning] per-tree messages omitted ...]
Trial 36, Fold 2: Log loss = 0.5449643510258457, Average precision = 0.9666909822952459, ROC-AUC = 0.9626037309924519, Elapsed Time = 1.2437085000001389 seconds
Trial 36, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 36, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[LightGBM] [Info] Number of positive: 10165, number of negative: 10517
[LightGBM] [Info] Total Bins 11051
[LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043
[LightGBM] [Info] Start training from score -0.034043
[... repeated LightGBM [Debug]/[Warning] per-tree messages omitted ...]
Trial 36, Fold 3: Log loss = 0.5417627858781487, Average precision = 0.9701888548272047, ROC-AUC = 0.964835811920332, Elapsed Time = 1.0679775999997219 seconds
Trial 36, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 36, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[LightGBM] [Info] Number of positive: 10177, number of negative: 10479
[LightGBM] [Info] Total Bins 11032
[LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243
[LightGBM] [Info] Start training from score -0.029243
[... repeated LightGBM [Debug]/[Warning] per-tree messages omitted ...]
Trial 36, Fold 4: Log loss = 0.5420935465836577, Average precision = 0.9693905511387977, ROC-AUC = 0.963465030246238, Elapsed Time = 1.0401756999999634 seconds
Trial 36, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 36, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[LightGBM] [Info] Number of positive: 10150, number of negative: 10500
[LightGBM] [Info] Total Bins 11035
[LightGBM] [Info] Number of data
points in the train set: 20650, number of used features: 258 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 11675 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 85 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 82 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 12 [LightGBM] [Debug] Re-bagging, using 11721 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 82 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 91 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 
13 [LightGBM] [Debug] Re-bagging, using 11736 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 85 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 86 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 82 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 11 [LightGBM] [Debug] Re-bagging, using 11493 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 87 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 84 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 86 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 12 [LightGBM] [Debug] Re-bagging, using 11675 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 88 and 
depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 91 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 88 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 88 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 94 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 88 and depth = 13 [LightGBM] [Debug] Re-bagging, using 11585 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 91 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 87 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 87 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 88 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 12 [LightGBM] [Debug] Re-bagging, using 11707 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 86 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 93 and depth = 13 [LightGBM] [Debug] Trained a tree with 
leaves = 98 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 98 and depth = 12 [LightGBM] [Debug] Trained a tree with leaves = 98 and depth = 13 Trial 36, Fold 5: Log loss = 0.5418271649599954, Average precision = 0.9683225385236497, ROC-AUC = 0.9626778143172994, Elapsed Time = 1.0533833999998024 seconds
Optimization Progress: 37%|###7 | 37/100 [08:32<13:49, 13.16s/it]
Trial 37, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 37, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 23907
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[LightGBM per-tree Debug/Warning output omitted]
Trial 37, Fold 1: Log loss = 0.45498393800634146, Average precision = 0.9655085446117008, ROC-AUC = 0.9593522820148838, Elapsed Time = 0.8728379999993194 seconds
Trial 37, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 37, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Info] Total Bins 23835
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
[LightGBM per-tree Debug/Warning output omitted]
Trial 37, Fold 2: Log loss = 0.4584689056532689, Average precision = 0.9614054215271424, ROC-AUC = 0.9579318719212271, Elapsed Time = 0.9410419999994701 seconds
Trial 37, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 37, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[LightGBM] [Info] Number of positive: 10165, number of negative: 10517
[LightGBM] [Info] Total Bins 23830
[LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043
[LightGBM] [Info] Start training from score -0.034043
[LightGBM per-tree Debug/Warning output omitted]
Trial 37, Fold 3: Log loss = 0.45767112706573265, Average precision = 0.964278662255139, ROC-AUC = 0.9595124077828394, Elapsed Time = 0.9496364000005997 seconds
Trial 37, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 37, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[LightGBM] [Info] Number of positive: 10177, number of negative: 10479
[LightGBM] [Info] Total Bins 23819
[LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243
[LightGBM] [Info] Start training from score -0.029243
[LightGBM per-tree Debug/Warning output omitted]
splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 13 [LightGBM] [Debug] Re-bagging, using 17401 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 86 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 84 and depth = 12 [LightGBM] [Debug] Re-bagging, using 17354 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf 
[LightGBM] [Debug] Trained a tree with leaves = 84 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 82 and depth = 13 Trial 37, Fold 4: Log loss = 0.45301620955825905, Average precision = 0.9650677575600484, ROC-AUC = 0.9586500127565922, Elapsed Time = 1.0068817000001218 seconds Trial 37, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 37, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.792012 [LightGBM] [Info] Total Bins 23826 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 17449 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 11 [LightGBM] [Debug] Re-bagging, using 17467 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf 
[LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 13 [LightGBM] [Debug] Re-bagging, using 17470 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 14 [LightGBM] [Debug] Re-bagging, using 17405 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 
and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 14 [LightGBM] [Debug] Re-bagging, using 17398 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 12 [LightGBM] [Debug] Re-bagging, using 17395 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 82 and depth = 12 [LightGBM] [Warning] No further 
splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 13 [LightGBM] [Debug] Re-bagging, using 17356 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 14 Trial 37, Fold 5: Log loss = 0.4544535693194063, Average precision = 0.9638041944638919, ROC-AUC = 0.9578615833379782, Elapsed Time = 0.9900883999998769 seconds
Optimization Progress: 38%|###8 | 38/100 [08:44<13:16, 12.84s/it]
Trial 38, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 38, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 17915
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 260
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[… LightGBM per-tree [Debug]/[Warning] output omitted …]
Trial 38, Fold 1: Log loss = 0.28412838776738836, Average precision = 0.9727748937905127, ROC-AUC = 0.9688282657738395, Elapsed Time = 0.6541177000008247 seconds

Trial 38, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 38, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Info] Total Bins 17930
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 260
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
[… LightGBM per-tree [Debug]/[Warning] output omitted …]
Trial 38, Fold 2: Log loss = 0.2830484800282979, Average precision = 0.9732954845031008, ROC-AUC = 0.9701862643367514, Elapsed Time = 0.719259399999828 seconds

Trial 38, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 38, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[LightGBM] [Info] Number of positive: 10165, number of negative: 10517
[LightGBM] [Info] Total Bins 17934
[LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 260
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043
[LightGBM] [Info] Start training from score -0.034043
[… LightGBM per-tree [Debug]/[Warning] output omitted …]
Trial 38, Fold 3: Log loss = 0.2825482226615492, Average precision = 0.9733412577409921, ROC-AUC = 0.9699044450838382, Elapsed Time = 0.7281477999995332 seconds

Trial 38, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 38, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[LightGBM] [Info] Number of positive: 10177, number of negative: 10479
[LightGBM] [Info] Total Bins 17917
[LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 260
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243
[LightGBM] [Info] Start training from score -0.029243
[… LightGBM per-tree [Debug]/[Warning] output omitted …]
Trial 38, Fold 4: Log loss = 0.28471134543479953, Average precision = 0.9735303764738737, ROC-AUC = 0.9686139396172069, Elapsed Time = 0.7371172000002844 seconds

Trial 38, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 38, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[LightGBM] [Info] Number of positive: 10150, number of negative: 10500
[LightGBM] [Info] Total Bins 17916
[LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 260
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902
[LightGBM] [Info] Start training from score -0.033902
[… LightGBM per-tree [Debug]/[Warning] output omitted …]
Trial 38, Fold 5: Log loss = 0.2870246817623127, Average precision = 0.9718168184861367, ROC-AUC = 0.9678533712353454, Elapsed Time = 0.7391860000007 seconds
Optimization Progress: 39%|###9 | 39/100 [08:55<12:27, 12.25s/it]
Trial 39, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 39, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 20107
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[… LightGBM per-tree [Debug]/[Warning] output omitted …]
tree with leaves = 34 and depth = 12 [LightGBM] [Debug] Re-bagging, using 6721 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 11 [LightGBM] [Debug] Re-bagging, using 6658 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6812 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 11 [LightGBM] [Debug] Re-bagging, using 6707 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6689 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Re-bagging, using 6642 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf 
[LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6614 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6691 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6584 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 12 [LightGBM] [Debug] Re-bagging, using 6663 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6598 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6714 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Warning] No further splits with positive 
gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Debug] Re-bagging, using 6635 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6680 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 7 [LightGBM] [Debug] Re-bagging, using 6674 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6723 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6645 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6802 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 10 [LightGBM] [Warning] No 
further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6616 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Re-bagging, using 6637 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6520 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6595 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 10 Trial 39, Fold 1: Log loss = 0.4118523317409845, Average precision = 0.9640110933053028, ROC-AUC = 0.9570288041945675, Elapsed Time = 1.676568099999713 seconds Trial 39, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 39, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986 [LightGBM] [Info] Number of positive: 10230, number of negative: 10471 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791600 [LightGBM] [Info] Total Bins 20073 
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285 [LightGBM] [Info] Start training from score -0.023285 [LightGBM] [Debug] Re-bagging, using 6718 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6715 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6642 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 12 [LightGBM] [Debug] Re-bagging, using 6584 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6794 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6698 data to train [LightGBM] [Warning] No 
further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6662 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6776 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6666 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6515 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6521 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6577 data to 
train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 11 [LightGBM] [Debug] Re-bagging, using 6606 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 11 [LightGBM] [Debug] Re-bagging, using 6678 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 11 [LightGBM] [Debug] Re-bagging, using 6821 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6696 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 12 [LightGBM] [Debug] Re-bagging, using 6660 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 8 [LightGBM] [Debug] 
Re-bagging, using 6643 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6751 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6745 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 11 [LightGBM] [Debug] Re-bagging, using 6573 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6634 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6742 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth 
= 9 [LightGBM] [Debug] Re-bagging, using 6671 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6820 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 12 [LightGBM] [Debug] Re-bagging, using 6713 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6696 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6639 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 11 [LightGBM] [Debug] Re-bagging, using 6622 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree 
with leaves = 35 and depth = 11 [LightGBM] [Debug] Re-bagging, using 6701 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 12 [LightGBM] [Debug] Re-bagging, using 6601 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6669 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 11 [LightGBM] [Debug] Re-bagging, using 6610 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6734 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Debug] Re-bagging, using 6649 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] 
[Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6687 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6690 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Debug] Re-bagging, using 6736 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 11 [LightGBM] [Debug] Re-bagging, using 6645 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6808 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6636 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, 
best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 11 [LightGBM] [Debug] Re-bagging, using 6655 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6538 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6612 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 Trial 39, Fold 2: Log loss = 0.4136268071513568, Average precision = 0.9608336604594377, ROC-AUC = 0.9558583901805829, Elapsed Time = 1.9231171000001268 seconds Trial 39, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 39, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 [LightGBM] [Info] Number of positive: 10165, number of negative: 10517 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791401 [LightGBM] [Info] Total Bins 20072 [LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043 [LightGBM] [Info] Start training from score -0.034043 [LightGBM] [Debug] Re-bagging, using 6711 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
Trained a tree with leaves = 26 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6705 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6636 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6582 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6782 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6699 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6661 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: 
[... verbose LightGBM [Debug]/[Warning] per-tree output truncated ...]
Trial 39, Fold 3: Log loss = 0.4062084001497198, Average precision = 0.9655958077729394, ROC-AUC = 0.9596640213423737, Elapsed Time = 1.8035933000001023 seconds
Trial 39, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 39, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[LightGBM] [Info] Number of positive: 10177, number of negative: 10479
[LightGBM] [Info] Total Bins 20105
[LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243
[LightGBM] [Info] Start training from score -0.029243
[... verbose LightGBM [Debug]/[Warning] per-tree output truncated ...]
Trial 39, Fold 4: Log loss = 0.4129772817691646, Average precision = 0.9637340592536143, ROC-AUC = 0.9571041700942555, Elapsed Time = 1.6821828000001915 seconds
Trial 39, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 39, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[LightGBM] [Info] Number of positive: 10150, number of negative: 10500
[LightGBM] [Info] Total Bins 20102
[LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902
[LightGBM] [Info] Start training from score -0.033902
[... verbose LightGBM [Debug]/[Warning] per-tree output truncated ...]
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6800 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 11 [LightGBM] [Debug] Re-bagging, using 6609 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6638 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6518 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6596 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 Trial 39, Fold 5: Log loss = 
0.4146718211969893, Average precision = 0.9620377715252689, ROC-AUC = 0.9551693513925275, Elapsed Time = 1.58937909999986 seconds
Optimization Progress: 40%|#### | 40/100 [09:11<13:26, 13.44s/it]
Trial 40, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 40, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 27148
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 254
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[... per-tree Debug/Warning output truncated ...]
Trial 40, Fold 1: Log loss = 0.31388538222959717, Average precision = 0.9656748092626154, ROC-AUC = 0.9587817185491079, Elapsed Time = 1.5820659000000887 seconds
Trial 40, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 40, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Info] Total Bins 27157
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
[... per-tree Debug/Warning output truncated ...]
Trial 40, Fold 2: Log loss = 0.31499227675900127, Average precision = 0.9634456488372517, ROC-AUC = 0.9585552518239325, Elapsed Time = 1.7733633999996528 seconds
Trial 40, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 40, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[LightGBM] [Info] Number of positive: 10165, number of negative: 10517
[LightGBM] [Info] Total Bins 27151
[LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 254
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043
[LightGBM] [Info] Start training from score -0.034043
[... per-tree Debug/Warning output truncated ...]
with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 10 [LightGBM] [Debug] Re-bagging, using 18150 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 10 Trial 40, Fold 3: Log loss = 0.311584660495534, Average precision = 0.9669552582564993, ROC-AUC = 0.9613503553851933, Elapsed Time = 1.7890182999999524 seconds Trial 40, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 40, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791090 [LightGBM] [Info] Total Bins 27145 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 18192 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 12 [LightGBM] [Warning] No further splits 
with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 9 [LightGBM] [Debug] Re-bagging, using 18207 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 13 [LightGBM] 
[Debug] Re-bagging, using 18238 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 10 [LightGBM] [Debug] Re-bagging, using 18169 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best 
gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 11 [LightGBM] [Debug] Re-bagging, using 18147 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 11 [LightGBM] [Debug] Re-bagging, 
using 18142 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 10 [LightGBM] [Debug] Re-bagging, using 18126 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] 
[Debug] Trained a tree with leaves = 51 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 10 [LightGBM] [Debug] Re-bagging, using 18125 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 11 Trial 40, Fold 4: Log loss = 0.3075476142307136, Average precision = 0.9664963265197322, ROC-AUC = 0.9599498141352582, Elapsed Time = 1.762109899999814 seconds Trial 40, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 40, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.792012 [LightGBM] [Info] Total Bins 27151 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 18186 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] 
[Debug] Trained a tree with leaves = 55 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 12 [LightGBM] [Debug] Re-bagging, using 18203 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 11 [LightGBM] [Warning] No further splits with positive 
gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 11 [LightGBM] [Debug] Re-bagging, using 18233 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 11 [LightGBM] [Debug] Re-bagging, using 18164 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a 
tree with leaves = 47 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 11 [LightGBM] [Debug] Re-bagging, using 18141 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: 
-inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 10 [LightGBM] [Debug] Re-bagging, using 18135 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 9 [LightGBM] [Debug] Re-bagging, using 18126 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 
54 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 10 [LightGBM] [Debug] Re-bagging, using 18117 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 10 Trial 40, Fold 5: Log loss = 0.3169454622000901, Average precision = 0.9636389789724216, ROC-AUC = 0.9571915352172864, Elapsed Time = 1.74589079999987 seconds
Optimization Progress: 41%|####1 | 41/100 [09:28<14:13, 14.47s/it]
Trial 41, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 41, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.795662
[LightGBM] [Info] Total Bins 25876
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[LightGBM verbose [Debug]/[Warning] tree-growth output truncated]
positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 12 [LightGBM] [Debug] Re-bagging, using 5185 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5203 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 12 [LightGBM] [Debug] Re-bagging, using 5223 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5303 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5137 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 14 [LightGBM] [Debug] Re-bagging, using 5300 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5217 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5213 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5097 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 11 Trial 41, Fold 1: Log loss = 0.27293726745619906, Average precision = 0.9709746855113583, ROC-AUC = 0.9652606011289389, Elapsed Time = 1.5060511000001497 seconds Trial 41, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 41, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986 [LightGBM] [Info] Number of positive: 10230, number of negative: 10471 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.794298 [LightGBM] [Info] Total Bins 25878 [LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 257 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285 [LightGBM] [Info] Start training from score -0.023285 [LightGBM] [Debug] Re-bagging, using 5302 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 9 [LightGBM] [Warning] No further splits 
with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5242 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5269 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5112 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5322 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5259 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5194 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5311 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5221 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5085 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 12 [LightGBM] [Debug] Re-bagging, using 5099 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5143 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5156 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 
50 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 12 [LightGBM] [Debug] Re-bagging, using 5251 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5307 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 13 [LightGBM] [Debug] Re-bagging, using 5251 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 13 [LightGBM] [Debug] Re-bagging, using 5234 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5165 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5286 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5300 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 12 [LightGBM] [Debug] Re-bagging, using 5131 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5217 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 12 [LightGBM] [Debug] Re-bagging, using 5242 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5184 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 12 [LightGBM] [Debug] Re-bagging, using 5301 data to train [LightGBM] [Warning] No further splits with positive gain, best 
gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 13 [LightGBM] [Debug] Re-bagging, using 5262 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5233 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5255 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 12 [LightGBM] [Debug] Re-bagging, using 5204 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 13 [LightGBM] [Debug] Re-bagging, using 5249 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5159 data to train [LightGBM] [Warning] No further 
splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5282 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 12 [LightGBM] [Debug] Re-bagging, using 5157 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 13 [LightGBM] [Debug] Re-bagging, using 5285 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5203 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5213 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5237 data to train 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 13 [LightGBM] [Debug] Re-bagging, using 5310 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5133 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5295 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 12 [LightGBM] [Debug] Re-bagging, using 5234 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5216 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 10 [LightGBM] [Debug] 
Re-bagging, using 5114 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 10 Trial 41, Fold 2: Log loss = 0.2702378761409877, Average precision = 0.9704559815636802, ROC-AUC = 0.96690608420606, Elapsed Time = 1.9325926999999865 seconds Trial 41, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 41, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 [LightGBM] [Info] Number of positive: 10165, number of negative: 10517 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.795455 [LightGBM] [Info] Total Bins 25877 [LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 258 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043 [LightGBM] [Info] Start training from score -0.034043 [LightGBM] [Debug] Re-bagging, using 5297 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5234 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5266 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5108 data to train 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5313 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5260 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 12 [LightGBM] [Debug] Re-bagging, using 5194 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5301 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 12 [LightGBM] [Debug] Re-bagging, using 5209 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 13 [LightGBM] [Debug] 
Re-bagging, using 5078 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 12 [LightGBM] [Debug] Re-bagging, using 5107 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5135 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 13 [LightGBM] [Debug] Re-bagging, using 5152 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 13 [LightGBM] [Debug] Re-bagging, using 5252 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 12 [LightGBM] [Debug] Re-bagging, using 5295 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and 
depth = 12 [LightGBM] [Debug] Re-bagging, using 5247 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5226 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5155 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5273 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 12 [LightGBM] [Debug] Re-bagging, using 5312 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5129 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained 
a tree with leaves = 47 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5202 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 13 [LightGBM] [Debug] Re-bagging, using 5223 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 12 [LightGBM] [Debug] Re-bagging, using 5191 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5308 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5261 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5225 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: 
[... repetitive LightGBM per-iteration [Debug]/[Warning] output trimmed for readability ...]
Trial 41, Fold 3: Log loss = 0.2688744270662882, Average precision = 0.9722850054993755, ROC-AUC = 0.967731668528667, Elapsed Time = 1.771677599999748 seconds
Trial 41, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 41, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[LightGBM] [Info] Number of positive: 10177, number of negative: 10479
[LightGBM] [Info] Total Bins 25866
[LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243
[LightGBM] [Info] Start training from score -0.029243
[... repetitive LightGBM per-iteration [Debug]/[Warning] output trimmed ...]
Trial 41, Fold 4: Log loss = 0.2683222397204066, Average precision = 0.9710829752471616, ROC-AUC = 0.9657162111251791, Elapsed Time = 1.8824927999994543 seconds
Trial 41, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 41, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[LightGBM] [Info] Number of positive: 10150, number of negative: 10500
[LightGBM] [Info] Total Bins 25873
[LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902
[LightGBM] [Info] Start training from score -0.033902
[... repetitive LightGBM per-iteration [Debug]/[Warning] output trimmed ...]
Trial 41, Fold 5: Log loss = 0.276624281389057, Average precision = 0.9678688271291056, ROC-AUC = 0.9640171673819742, Elapsed Time = 1.8289052000000083 seconds
Optimization Progress: 42%|####2 | 42/100 [09:45<14:43, 15.23s/it]
[LightGBM per-tree debug output omitted; each fold trained on 267 used features]
Trial 42, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371; Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Trial 42, Fold 1: Log loss = 0.22486720866549162, Average precision = 0.9744573034457623, ROC-AUC = 0.9691979371292831, Elapsed Time = 1.8659290999994482 seconds
Trial 42, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396; Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Trial 42, Fold 2: Log loss = 0.22524210099664396, Average precision = 0.9739922954990727, ROC-AUC = 0.9709663238279531, Elapsed Time = 2.416602799999964 seconds
Trial 42, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876; Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Trial 42, Fold 3: Log loss = 0.22862562686622087, Average precision = 0.9740826150240826, ROC-AUC = 0.9698103920072041, Elapsed Time = 2.305730500000209 seconds
Trial 42, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592; Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
Trial 42, Fold 4: Log loss = 0.2264667894282577, Average precision = 0.9738043014455474, ROC-AUC = 0.9688653458228716, Elapsed Time = 2.3237183999999615 seconds
Trial 42, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897; Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
Trial 42, Fold 5: Log loss = 0.22583290083240043, Average precision = 0.9723513066476789, ROC-AUC = 0.9693319956152575, Elapsed Time = 1.9339842999997927 seconds
Optimization Progress: 43%|####3 | 43/100 [10:03<15:17, 16.10s/it]
[LightGBM per-tree debug output omitted; folds trained on 254-255 used features]
Trial 43, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371; Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Trial 43, Fold 1: Log loss = 0.5212313297273229, Average precision = 0.9627836292091663, ROC-AUC = 0.9549571509824445, Elapsed Time = 0.6328744000002189 seconds
Trial 43, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396; Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM per-tree debug output omitted]
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Debug] Re-bagging, using 15815 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15725 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 7 [LightGBM] [Debug] Re-bagging, using 15806 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Debug] Re-bagging, using 15727 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 9 [LightGBM] [Debug] Re-bagging, using 15927 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 7 [LightGBM] [Debug] Re-bagging, using 15862 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with 
leaves = 26 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 10 Trial 43, Fold 2: Log loss = 0.5209099033618703, Average precision = 0.9621351418743129, ROC-AUC = 0.9574910446044405, Elapsed Time = 0.6737821000006079 seconds Trial 43, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 43, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 [LightGBM] [Info] Number of positive: 10165, number of negative: 10517 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.790045 [LightGBM] [Info] Total Bins 20069 [LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 254 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043 [LightGBM] [Info] Start training from score -0.034043 [LightGBM] [Debug] Re-bagging, using 15816 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 8 [LightGBM] [Debug] Re-bagging, using 15959 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 8 [LightGBM] [Debug] Re-bagging, using 15871 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 9 [LightGBM] [Debug] Re-bagging, using 15774 data to train 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 12 [LightGBM] [Debug] Re-bagging, using 15813 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 9 [LightGBM] [Debug] Re-bagging, using 15800 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 7 [LightGBM] [Debug] Re-bagging, using 15814 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 11 [LightGBM] [Debug] Re-bagging, using 15829 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 7 [LightGBM] [Debug] Re-bagging, using 15879 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 7 [LightGBM] [Debug] Re-bagging, 
using 15797 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 8 [LightGBM] [Debug] Re-bagging, using 15804 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Debug] Re-bagging, using 15716 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 9 [LightGBM] [Debug] Re-bagging, using 15794 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 9 [LightGBM] [Debug] Re-bagging, using 15715 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 5 [LightGBM] [Debug] Re-bagging, using 15909 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 13 
[LightGBM] [Debug] Re-bagging, using 15830 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 7 Trial 43, Fold 3: Log loss = 0.5124468728151358, Average precision = 0.9650361000889902, ROC-AUC = 0.9592594426279242, Elapsed Time = 0.6866634000007252 seconds Trial 43, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 43, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791090 [LightGBM] [Info] Total Bins 20105 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 15795 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15937 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 8 [LightGBM] [Debug] Re-bagging, using 15854 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 7 [LightGBM] [Warning] No further 
splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Debug] Re-bagging, using 15753 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 11 [LightGBM] [Debug] Re-bagging, using 15796 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 8 [LightGBM] [Debug] Re-bagging, using 15782 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 7 [LightGBM] [Debug] Re-bagging, using 15802 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 8 [LightGBM] [Debug] Re-bagging, using 15800 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15861 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 7 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 9 [LightGBM] [Debug] Re-bagging, using 15772 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15781 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 11 [LightGBM] [Debug] Re-bagging, using 15699 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Debug] Re-bagging, using 15785 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Debug] Re-bagging, using 15698 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 8 [LightGBM] [Debug] Re-bagging, using 15895 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with 
leaves = 27 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 8 [LightGBM] [Debug] Re-bagging, using 15813 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 8 Trial 43, Fold 4: Log loss = 0.521958522934716, Average precision = 0.9634605752827095, ROC-AUC = 0.9565819843535032, Elapsed Time = 0.690412699999797 seconds Trial 43, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 43, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.792012 [LightGBM] [Info] Total Bins 20102 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 15789 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 8 [LightGBM] [Debug] Re-bagging, using 15936 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Debug] Re-bagging, using 15851 data to train 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 7 [LightGBM] [Debug] Re-bagging, using 15745 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15790 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 7 [LightGBM] [Debug] Re-bagging, using 15777 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 11 [LightGBM] [Debug] Re-bagging, using 15804 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 13 [LightGBM] [Debug] Re-bagging, using 15791 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Debug] Re-bagging, 
using 15856 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Debug] Re-bagging, using 15769 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 6 [LightGBM] [Debug] Re-bagging, using 15777 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 9 [LightGBM] [Debug] Re-bagging, using 15690 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 8 [LightGBM] [Debug] Re-bagging, using 15778 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 8 [LightGBM] [Debug] Re-bagging, using 15701 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 6 
[LightGBM] [Debug] Re-bagging, using 15892 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Debug] Re-bagging, using 15811 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 8 Trial 43, Fold 5: Log loss = 0.5191817461829943, Average precision = 0.9611701754237488, ROC-AUC = 0.9542330230570574, Elapsed Time = 0.7347669999999198 seconds
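The per-tree Debug/Warning lines in this output indicate the booster is running at debug-level verbosity. A minimal sketch of how that could be silenced in a LightGBM params dict; the other keys shown are illustrative, not the tuned Trial-43 values:

```python
# Illustrative params dict (hypothetical values, not the tuned trial's).
# In LightGBM, `verbosity` (alias: `verbose`) controls logging:
#   < 0 = Fatal only, 0 = Error/Warning, 1 = Info, > 1 = Debug.
# Setting it to -1 removes the per-tree Debug/Warning flood while the
# notebook's own "Trial N, Fold K" summary prints are unaffected.
params = {
    "objective": "binary",
    "metric": "binary_logloss",
    "verbosity": -1,
}
```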
Optimization Progress: 44%|####4 | 44/100 [10:14<13:28, 14.45s/it]
Trial 44, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 44, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[Repeated LightGBM per-tree Debug/Warning output omitted: "Re-bagging, using N data to train", "Trained a tree with leaves = … and depth = …", and "No further splits with positive gain, best gain: -inf" messages, along with the fold's dataset Info header.]
Trial 44, Fold 
1: Log loss = 0.48721062203998294, Average precision = 0.9535683851934116, ROC-AUC = 0.9451276269817372, Elapsed Time = 0.7687642999999298 seconds Trial 44, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 44, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986 [LightGBM] [Info] Number of positive: 10230, number of negative: 10471 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791600 [LightGBM] [Info] Total Bins 12982 [LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285 [LightGBM] [Info] Start training from score -0.023285 [LightGBM] [Debug] Re-bagging, using 3432 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 5 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Debug] Re-bagging, using 3444 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 4 [LightGBM] [Debug] Re-bagging, 
using 3339 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Debug] Re-bagging, using 3318 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 5 and depth = 3 [LightGBM] [Debug] Re-bagging, using 3421 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Debug] Re-bagging, using 3394 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 3 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Debug] Re-bagging, using 3356 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 3 [LightGBM] [Debug] Re-bagging, using 3478 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Debug] Re-bagging, using 3377 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 4 [LightGBM] [Warning] No 
further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 4 [LightGBM] [Debug] Re-bagging, using 3365 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 4 [LightGBM] [Debug] Re-bagging, using 3224 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 5 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Debug] Re-bagging, using 3297 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 5 and depth = 3 [LightGBM] [Warning] No further splits with 
positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 5 and depth = 3 [LightGBM] [Debug] Re-bagging, using 3370 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 4 [LightGBM] [Debug] Re-bagging, using 3383 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 5 [LightGBM] [Debug] Re-bagging, using 3385 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 5 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 5 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 4 [LightGBM] [Debug] Re-bagging, using 3357 data to train [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 4 Trial 44, Fold 2: Log loss = 0.4897891343912608, Average precision = 0.9531068724675854, ROC-AUC = 0.9482266905537914, Elapsed Time = 0.8361282999994728 seconds Trial 44, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 44, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 [LightGBM] [Info] Number of positive: 10165, number of negative: 10517 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791401 [LightGBM] [Info] Total Bins 12988 [LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043 [LightGBM] [Info] Start training from score -0.034043 [LightGBM] [Debug] Re-bagging, using 3427 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 5 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Debug] Re-bagging, using 3439 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 5 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 5 [LightGBM] [Warning] No 
further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 4 [LightGBM] [Debug] Re-bagging, using 3341 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 5 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 3 [LightGBM] [Debug] Re-bagging, using 3312 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 5 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 6 [LightGBM] [Debug] Re-bagging, using 3417 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 4 [LightGBM] [Debug] Re-bagging, using 3396 data 
to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 4 [LightGBM] [Debug] Re-bagging, using 3356 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Debug] Re-bagging, using 3467 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 4 [LightGBM] [Debug] Re-bagging, using 3375 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 4 [LightGBM] [Debug] Re-bagging, using 3348 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 5 [LightGBM] [Debug] Re-bagging, using 3239 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Debug] Re-bagging, using 3288 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 6 [LightGBM] [Warning] No further 
splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 5 and depth = 3 [LightGBM] [Debug] Re-bagging, using 3367 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Debug] Re-bagging, using 3389 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 4 [LightGBM] [Debug] Re-bagging, using 3373 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Warning] No further splits with positive 
gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 3 [LightGBM] [Debug] Re-bagging, using 3360 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 3 Trial 44, Fold 3: Log loss = 0.48084034590108415, Average precision = 0.961348036882991, ROC-AUC = 0.9536307798249077, Elapsed Time = 0.8631216000003405 seconds Trial 44, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 44, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791090 [LightGBM] [Info] Total Bins 12968 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 3425 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Debug] Re-bagging, using 3432 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 5 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 4 [LightGBM] [Debug] Re-bagging, using 3335 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 4 [LightGBM] [Debug] Re-bagging, using 3308 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 5 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 5 [LightGBM] [Debug] Re-bagging, using 3409 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 5 [LightGBM] [Warning] No further 
splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 4 [LightGBM] [Debug] Re-bagging, using 3399 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Debug] Re-bagging, using 3351 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Debug] Re-bagging, using 3463 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Debug] Re-bagging, using 3363 data to train 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 3349 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 7 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 5 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Debug] Re-bagging, using 3222 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 6 and depth = 3 [LightGBM] [Debug] Re-bagging, using 3298 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 8 and depth = 5 [LightGBM] [Warning] No 
[LightGBM] [Debug] … (repetitive per-tree training output omitted)
Trial 44, Fold 4: Log loss = 0.4804119373286586, Average precision = 0.9578869844868307, ROC-AUC = 0.9496273316547088, Elapsed Time = 0.8647602999999435 seconds
Trial 44, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 44, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[LightGBM] [Info] Number of positive: 10150, number of negative: 10500
[LightGBM] [Info] Total Bins 12970
[LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902
[LightGBM] [Info] Start training from score -0.033902
[LightGBM] [Debug] … (repetitive per-tree training output omitted)
Trial 44, Fold 5: Log loss = 0.48515789446581037, Average precision = 0.9568248681429414, ROC-AUC = 0.9481767274213626, Elapsed Time = 0.8975406999998086 seconds
Optimization Progress: 45%|####5 | 45/100 [10:26<12:38, 13.78s/it]
Trial 45, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 45, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 14564
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 259
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[LightGBM] [Debug] … (repetitive per-tree training output omitted)
Trial 45, Fold 1: Log loss = 0.3353843979777529, Average precision = 0.9716630024206406, ROC-AUC = 0.9657538696749401, Elapsed Time = 0.9208911999994598 seconds
Trial 45, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 45, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Info] Total Bins 14576
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
[LightGBM] [Debug] … (repetitive per-tree training output omitted)
Trial 45, Fold 2: Log loss = 0.3353204673392179, Average precision = 0.9697656759149734, ROC-AUC = 0.9668807423258869, Elapsed Time = 0.9826597999999649 seconds
Trial 45, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 45, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[LightGBM] [Info] Number of positive: 10165, number of negative: 10517
[LightGBM] [Info] Total Bins 14580
[LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043
[LightGBM] [Info] Start training from score -0.034043
[LightGBM] [Debug] … (repetitive per-tree training output omitted)
Trial 45, Fold 3: Log loss = 0.3302802993933815, Average precision = 0.9730979632333543, ROC-AUC = 0.9685356342277355, Elapsed Time = 1.0227054999995744 seconds
Trial 45, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 45, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[LightGBM] [Info] Number of positive: 10177, number of negative: 10479
[LightGBM] [Info] Total Bins 14561
[LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 259
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243
[LightGBM] [Info] Start training from score -0.029243
[LightGBM] [Debug] … (repetitive per-tree training output omitted)
[Debug] Trained a tree with leaves = 29 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 8 [LightGBM] [Debug] Re-bagging, using 9484 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best 
gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 7 [LightGBM] [Debug] Re-bagging, using 9613 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 10 [LightGBM] [Debug] Re-bagging, using 9495 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 
and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 Trial 45, Fold 4: Log loss = 0.33213599249562337, Average precision = 0.9722569957339893, ROC-AUC = 0.9670891840788189, Elapsed Time = 1.1327862000007372 seconds Trial 45, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 45, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.797340 [LightGBM] [Info] Total Bins 14563 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 259 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 9597 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 7 [LightGBM] [Warning] No further splits with 
positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 9 [LightGBM] [Debug] Re-bagging, using 9569 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 10 [LightGBM] [Debug] Re-bagging, using 9538 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree 
with leaves = 37 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Debug] Re-bagging, using 9477 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] 
[Debug] Trained a tree with leaves = 24 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Debug] Re-bagging, using 9611 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 11 [LightGBM] [Debug] Re-bagging, using 9489 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 8 Trial 45, Fold 5: Log loss = 0.3349875746116728, Average precision = 0.9701929619761038, ROC-AUC = 0.9653170577634097, Elapsed Time = 1.2616589999997814 seconds
Optimization Progress: 46%|####6 | 46/100 [10:39<12:05, 13.44s/it]
Trial 46, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 46, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 15571
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 259
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[... repetitive LightGBM per-tree Debug/Warning messages elided ...]
Trial 46, Fold 1: Log loss = 0.5045576873189126, Average precision = 0.9703136294974727, ROC-AUC = 0.9661611056145836, Elapsed Time = 0.6460338999995656 seconds
Trial 46, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 46, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Info] Total Bins 15586
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 259
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[... repetitive LightGBM per-tree Debug/Warning messages elided ...]
Trial 46, Fold 2: Log loss = 0.501208742468372, Average precision = 0.9713255478845191, ROC-AUC = 0.9676740797483173, Elapsed Time = 0.6990193999999974 seconds
Trial 46, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 46, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[LightGBM] [Info] Number of positive: 10165, number of negative: 10517
[LightGBM] [Info] Total Bins 15588
[LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 259
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043
[... repetitive LightGBM per-tree Debug/Warning messages elided ...]
Trial 46, Fold 3: Log loss = 0.4985493422929144, Average precision = 0.9706569359296852, ROC-AUC = 0.9679967477198524, Elapsed Time = 0.7081786000007924 seconds
Trial 46, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 46, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[LightGBM] [Info] Number of positive: 10177, number of negative: 10479
[LightGBM] [Info] Total Bins 15571
[LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 259
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243
[... repetitive LightGBM per-tree Debug/Warning messages elided ...]
Trial 46, Fold 4: Log loss = 0.5015279726399929, Average precision = 0.9721892005210573, ROC-AUC = 0.9666833128274094, Elapsed Time = 0.725632300000143 seconds
Trial 46, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 46, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[LightGBM] [Info] Number of positive: 10150, number of negative: 10500
[LightGBM] [Info] Total Bins 15573
[LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 260
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902
[... repetitive LightGBM per-tree Debug/Warning messages elided ...]
Trial 46, Fold 5: Log loss = 0.5064286734003713, Average precision = 0.9674327831857938, ROC-AUC = 0.9628119577132452, Elapsed Time = 0.7206710000000385 seconds
Optimization Progress: 47%|####6 | 47/100 [10:50<11:18, 12.80s/it]
[LightGBM per-tree Info/Debug log output trimmed; per-fold summaries retained below.]
Trial 47, Fold 1: Train size = 20663 (0 = 10533, 1 = 10130, 0/1 = 1.0398); Validation size = 5175 (0 = 2592, 1 = 2583, 0/1 = 1.0035)
Trial 47, Fold 1: Log loss = 0.3167, Average precision = 0.9662, ROC-AUC = 0.9596, Elapsed time = 1.34 s
Trial 47, Fold 2: Train size = 20701 (0 = 10471, 1 = 10230, 0/1 = 1.0236); Validation size = 5137 (0 = 2654, 1 = 2483, 0/1 = 1.0689)
Trial 47, Fold 2: Log loss = 0.3115, Average precision = 0.9639, ROC-AUC = 0.9598, Elapsed time = 1.30 s
Trial 47, Fold 3: Train size = 20682 (0 = 10517, 1 = 10165, 0/1 = 1.0346); Validation size = 5156 (0 = 2608, 1 = 2548, 0/1 = 1.0235)
Trial 47, Fold 3: Log loss = 0.3079, Average precision = 0.9678, ROC-AUC = 0.9624, Elapsed time = 1.20 s
Trial 47, Fold 4: Train size = 20656 (0 = 10479, 1 = 10177, 0/1 = 1.0297); Validation size = 5182 (0 = 2646, 1 = 2536, 0/1 = 1.0434)
Trial 47, Fold 4: Log loss = 0.3116, Average precision = 0.9666, ROC-AUC = 0.9600, Elapsed time = 1.18 s
Trial 47, Fold 5: Train size = 20650 (0 = 10500, 1 = 10150, 0/1 = 1.0345); Validation size = 5188 (0 = 2625, 1 = 2563, 0/1 = 1.0242)
Trial 47, Fold 5: Log loss = 0.3116, Average precision = 0.9643, ROC-AUC = 0.9583, Elapsed time = 1.28 s
Optimization Progress: 48%|####8 | 48/100 [11:04<11:26, 13.20s/it]
[LightGBM per-tree Info/Debug log output trimmed; per-fold summaries retained below.]
Trial 48, Fold 1: Train size = 20663 (0 = 10533, 1 = 10130, 0/1 = 1.0398); Validation size = 5175 (0 = 2592, 1 = 2583, 0/1 = 1.0035)
Trial 48, Fold 1: Log loss = 0.2677, Average precision = 0.9693, ROC-AUC = 0.9633, Elapsed time = 0.97 s
Trial 48, Fold 2: Train size = 20701 (0 = 10471, 1 = 10230, 0/1 = 1.0236); Validation size = 5137 (0 = 2654, 1 = 2483, 0/1 = 1.0689)
[LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5333 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 7 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5174 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5376 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 12 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth 
= 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Re-bagging, using 5324 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 13 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 Trial 48, Fold 2: Log loss = 0.26531075075348565, Average precision = 0.966748909372579, ROC-AUC = 0.9628439477368487, Elapsed Time = 1.041997000000265 seconds Trial 48, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 48, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 [LightGBM] [Info] Number of positive: 10165, number of negative: 10517 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.795455 [LightGBM] [Info] Total Bins 25984 [LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 258 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043 [LightGBM] [Info] Start training from score -0.034043 [LightGBM] [Debug] Re-bagging, using 5352 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5297 data to train [LightGBM] [Debug] Trained 
a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5330 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 7 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 12 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Re-bagging, using 5170 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 13 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5366 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Trained 
a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5325 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 Trial 48, Fold 3: Log loss = 0.2653291047390906, Average precision = 0.9690770258886212, ROC-AUC = 0.9640110793019425, Elapsed Time = 0.9497467000001052 seconds Trial 48, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 48, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.795144 [LightGBM] [Info] Total Bins 25973 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 258 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 5344 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 13 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves 
= 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5292 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Re-bagging, using 5324 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5162 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 
36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5351 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5328 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 12 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 Trial 48, Fold 4: Log loss = 0.2724567192845021, Average precision = 0.9676774602201403, ROC-AUC = 0.9611384424081586, Elapsed Time = 0.9449987999996665 seconds Trial 48, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 48, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.796026 [LightGBM] [Info] Total Bins 25979 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 258 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 5342 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 
[LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 13 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5292 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5324 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 12 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5157 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and 
depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 12 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Re-bagging, using 5349 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5324 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 Trial 48, Fold 5: Log loss = 0.28253181094529456, Average precision = 0.9636006730319187, ROC-AUC = 0.9580317708043029, Elapsed Time = 0.9368782000001374 seconds
Optimization Progress: 49%|####9 | 49/100 [11:17<11:16, 13.26s/it]
[LightGBM per-tree [Info]/[Debug]/[Warning] log lines omitted; per-fold summaries retained]
Trial 49, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 49, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Trial 49, Fold 1: Log loss = 0.3063735045663137, Average precision = 0.9679374832296199, ROC-AUC = 0.961229092284309, Elapsed Time = 0.7717340999997759 seconds
Trial 49, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 49, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Trial 49, Fold 2: Log loss = 0.307872180921878, Average precision = 0.9646038770080863, ROC-AUC = 0.9606868529664113, Elapsed Time = 0.8407884999996895 seconds
Trial 49, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 49, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Trial 49, Fold 3: Log loss = 0.29787310753822754, Average precision = 0.9689564130818498, ROC-AUC = 0.9635157581791565, Elapsed Time = 0.9720113000003039 seconds
Trial 49, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 49, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
Trial 49, Fold 4: Log loss = 0.31610381071939087, Average precision = 0.9654936821814473, ROC-AUC = 0.9590535741110323, Elapsed Time = 0.9608648000003086 seconds
Trial 49, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 49, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
sparse rate 0.792012 [LightGBM] [Info] Total Bins 24790 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 12446 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Debug] Re-bagging, using 12469 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Warning] No further splits with positive 
gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 11 [LightGBM] [Debug] Re-bagging, using 12532 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Debug] Re-bagging, using 12308 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with 
leaves = 29 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 Trial 49, Fold 5: Log loss = 0.31440501143902616, Average precision = 0.9644052616094297, ROC-AUC = 0.958472028686621, Elapsed Time = 0.9044985000000452 seconds
Optimization Progress: 50%|##### | 50/100 [11:32<11:18, 13.58s/it]
Trial 50, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 50, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 15120
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[... LightGBM per-tree [Debug]/[Warning] output trimmed ...]
Trial 50, Fold 1: Log loss = 0.2982694639889566, Average precision = 0.9688332355090242, ROC-AUC = 0.964010439817802, Elapsed Time = 0.5601753000000826 seconds
Trial 50, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 50, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Info] Total Bins 15133
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 257
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
[... LightGBM per-tree [Debug]/[Warning] output trimmed ...]
Trial 50, Fold 2: Log loss = 0.3062531891195165, Average precision = 0.9664051336574846, ROC-AUC = 0.9624566873883326, Elapsed Time = 0.5621974999994563 seconds
Trial 50, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 50, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[LightGBM] [Info] Number of positive: 10165, number of negative: 10517
[LightGBM] [Info] Total Bins 15138
[LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043
[LightGBM] [Info] Start training from score -0.034043
[... LightGBM per-tree [Debug]/[Warning] output trimmed ...]
Trial 50, Fold 3: Log loss = 0.3072467004888748, Average precision = 0.9687241074849424, ROC-AUC = 0.9633100452899424, Elapsed Time = 0.5728681999999026 seconds
Trial 50, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 50, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[LightGBM] [Info] Number of positive: 10177, number of negative: 10479
[LightGBM] [Info] Total Bins 15120
[LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243
[LightGBM] [Info] Start training from score -0.029243
[... LightGBM per-tree [Debug]/[Warning] output trimmed ...]
Trial 50, Fold 4: Log loss = 0.3113197561966934, Average precision = 0.9667452770254485, ROC-AUC = 0.9607278023371986, Elapsed Time = 0.5734934000001886 seconds
Trial 50, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 50, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[LightGBM] [Info] Number of positive: 10150, number of negative: 10500
[LightGBM] [Info] Total Bins 15119
[LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902
[LightGBM] [Info] Start training from score -0.033902
[... LightGBM per-tree [Debug]/[Warning] output trimmed ...]
Trial 50, Fold 5: Log loss = 0.32099173466049086, Average precision = 0.9641892476702298, ROC-AUC = 0.9583259201456626, Elapsed Time = 0.5571890000001076 seconds
Optimization Progress: 51%|#####1 | 51/100 [11:43<10:28, 12.83s/it]
Trial 51, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 51, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 24796
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[... LightGBM per-tree [Debug]/[Warning] output trimmed ...]
Trial 51, Fold 1: Log loss = 0.40943127585042277, Average precision = 0.9668415201170047, ROC-AUC = 0.96053000267657, Elapsed Time = 1.668200600000091 seconds
Trial 51, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 51, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Info] Total Bins 24803
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
[... LightGBM per-tree [Debug]/[Warning] output trimmed ...]
-inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 15 [LightGBM] [Debug] Re-bagging, using 12564 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 12 [LightGBM] [Debug] Re-bagging, using 
12346 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 14 [LightGBM] [Debug] Re-bagging, using 12508 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] 
[Debug] Trained a tree with leaves = 61 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 18 [LightGBM] [Debug] Re-bagging, using 12361 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 18 [LightGBM] [Debug] Re-bagging, using 12443 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 
16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 13 [LightGBM] [Debug] Re-bagging, using 12507 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 15 [LightGBM] [Debug] Re-bagging, using 12497 data to train [LightGBM] [Warning] No further splits with 
positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 15 [LightGBM] [Debug] Re-bagging, using 12372 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 12 [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 16 [LightGBM] [Debug] Re-bagging, using 12421 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 15 Trial 51, Fold 2: Log loss = 0.40561261665555776, Average precision = 0.9647727106963009, ROC-AUC = 0.9609372368124345, Elapsed Time = 1.935546699999577 seconds Trial 51, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 51, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 [LightGBM] [Info] Number of positive: 10165, number of negative: 10517 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791401 [LightGBM] [Info] Total Bins 24796 [LightGBM] 
[Info] Number of data points in the train set: 20682, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043 [LightGBM] [Info] Start training from score -0.034043 [LightGBM] [Debug] Re-bagging, using 12464 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 15 [LightGBM] [Debug] Re-bagging, using 12485 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with 
leaves = 63 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 17 [LightGBM] [Debug] Re-bagging, using 12550 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 14 [LightGBM] [Debug] Re-bagging, using 12338 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 16 [LightGBM] [Warning] No 
further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 16 [LightGBM] [Debug] Re-bagging, using 12496 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and 
depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 15 [LightGBM] [Debug] Re-bagging, using 12352 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 15 [LightGBM] [Debug] Re-bagging, using 12438 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 14 [LightGBM] [Warning] No further splits 
with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 13 [LightGBM] [Debug] Re-bagging, using 12493 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 17 [LightGBM] [Debug] Re-bagging, using 12476 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] 
[Debug] Trained a tree with leaves = 62 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 18 [LightGBM] [Debug] Re-bagging, using 12367 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 16 [LightGBM] [Warning] No further splits with positive 
gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 15 [LightGBM] [Debug] Re-bagging, using 12420 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 15 Trial 51, Fold 3: Log loss = 0.40089889042129667, Average precision = 0.9691804405867582, ROC-AUC = 0.9640356083443289, Elapsed Time = 1.9567003000001932 seconds Trial 51, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 51, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791090 [LightGBM] [Info] Total Bins 24786 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] 
Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 12449 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 14 [LightGBM] [Debug] Re-bagging, using 12473 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 15 [LightGBM] 
[... LightGBM [Debug]/[Warning] per-tree training output truncated ...]
Trial 51, Fold 4: Log loss = 0.4101429593770919, Average precision = 0.9659798685954358, ROC-AUC = 0.9598668068699614, Elapsed Time = 1.9267745999995896 seconds
Trial 51, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 51, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[LightGBM] [Info] Number of positive: 10150, number of negative: 10500
[LightGBM] [Info] Total Bins 24790
[LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902
[LightGBM] [Info] Start training from score -0.033902
[... LightGBM [Debug]/[Warning] per-tree training output truncated ...]
Trial 51, Fold 5: Log loss = 0.4074947757547993, Average precision = 0.9650135359660748, ROC-AUC = 0.958990617394051, Elapsed Time = 1.938223399999515 seconds
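The per-fold "Train size ... 0/1 = ..." lines above report the class balance of each StratifiedGroupKFold split. A minimal sketch of how such a line is derived (the helper name `fold_summary` is hypothetical, not code from this notebook):

```python
from collections import Counter

def fold_summary(trial, fold, split, y):
    """Build a class-balance line like the ones logged per fold above.

    Hypothetical helper for illustration only: counts the 0/1 labels in
    a split and reports the majority/minority ratio.
    """
    counts = Counter(y)
    ratio = counts[0] / counts[1]  # 0/1 ratio as reported in the logs
    return (f"Trial {trial}, Fold {fold}: {split} size = {len(y)} "
            f"where 0 = {counts[0]}, 1 = {counts[1]}, 0/1 = {ratio}")

# Example with the Trial 51, Fold 5 training-split counts from the log
y_train = [0] * 10500 + [1] * 10150
print(fold_summary(51, 5, "Train", y_train))
```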
Optimization Progress: 52%|#####2 | 52/100 [12:00<11:15, 14.07s/it]
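The `[Debug]`/`[Warning]` spam in the cells above comes from LightGBM's default logging; it can be silenced without affecting training. A sketch of the relevant settings (the `params` dict name is an assumption, standing in for whatever parameter dict the notebook passes to LightGBM):

```python
# Lower LightGBM's verbosity so only fatal errors are printed.
params = {
    "objective": "binary",
    "verbosity": -1,  # < 0 suppresses [Info]/[Debug]/[Warning] messages
}

# When using lgb.train / LGBMClassifier.fit with callbacks, per-iteration
# evaluation printing can also be disabled:
# callbacks=[lgb.log_evaluation(period=0)]
```

This keeps the trial/fold summary lines readable while dropping the per-tree output.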
Trial 52, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 52, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 25876
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[... LightGBM [Debug] per-tree training output truncated ...]
Trial 52, Fold 1: Log loss = 0.21649941651454954, Average precision = 0.972431254470847, ROC-AUC = 0.966919716044603, Elapsed Time = 0.8327139999992141 seconds
Trial 52, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 52, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Info] Total Bins 25878
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 257
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
[... LightGBM [Debug] per-tree training output truncated ...]
Trial 52, Fold 2: Log loss = 0.21555834957559505, Average precision = 0.9703827467001086, ROC-AUC = 0.9668916681664406, Elapsed Time = 0.844282199999725 seconds
Trial 52, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 52, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[LightGBM] [Info] Number of positive: 10165, number of negative: 10517
[LightGBM] [Info] Total Bins 25877
[LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043
[LightGBM] [Info] Start training from score -0.034043
[... LightGBM [Debug] per-tree training output truncated ...]
a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5366 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5325 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5243 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5369 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5265 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 12 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5142 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5168 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Re-bagging, using 5195 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5211 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 13 [LightGBM] [Debug] Re-bagging, using 5316 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] 
Re-bagging, using 5356 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 13 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5296 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5292 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 13 [LightGBM] [Debug] Re-bagging, using 5215 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 13 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5337 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 12 [LightGBM] [Debug] Re-bagging, using 5381 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5196 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 16 [LightGBM] [Debug] Re-bagging, using 5256 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 15 [LightGBM] [Debug] Re-bagging, using 5280 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 13 [LightGBM] [Debug] Re-bagging, using 5254 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5363 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 15 Trial 52, Fold 3: 
Log loss = 0.21758157949179519, Average precision = 0.9725401558795804, ROC-AUC = 0.9687837086226657, Elapsed Time = 0.8842640000002575 seconds Trial 52, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 52, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.795144 [LightGBM] [Info] Total Bins 25866 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 258 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 5344 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5292 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Re-bagging, using 5324 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5162 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5351 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5328 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5235 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] 
[Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5366 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5255 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5130 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5153 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5206 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 12 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5206 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 12 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Re-bagging, using 5305 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 13 [LightGBM] [Debug] Re-bagging, using 5357 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5287 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5282 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 13 [LightGBM] [Debug] Re-bagging, using 5215 data to train [LightGBM] [Debug] 
Trained a tree with leaves = 36 and depth = 12 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5320 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 12 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5365 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5185 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 13 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5255 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5284 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 14 [LightGBM] [Debug] Re-bagging, using 5245 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 12 [LightGBM] [Debug] Re-bagging, using 5339 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 15 Trial 52, Fold 4: Log loss = 0.2224244821480468, Average precision = 0.9709196358797433, ROC-AUC = 0.9655868568948784, Elapsed Time = 1.0510160999992877 seconds Trial 52, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 52, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.796026 [LightGBM] [Info] Total Bins 25873 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 258 [LightGBM] 
[Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 5342 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5292 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5324 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Re-bagging, using 5157 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5349 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5324 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5240 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Re-bagging, using 5361 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Re-bagging, using 5255 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5130 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 7 
[LightGBM] [Debug] Re-bagging, using 5148 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5195 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5214 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5311 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5351 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 12 [LightGBM] [Debug] Re-bagging, using 5292 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 13 [LightGBM] [Debug] Re-bagging, using 5282 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 14 [LightGBM] [Debug] Re-bagging, using 5200 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 13 [LightGBM] [Debug] Re-bagging, using 5325 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 12 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5356 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 12 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5187 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 12 
[LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5250 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5278 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 14 [LightGBM] [Debug] Re-bagging, using 5251 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5342 data to train [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 13 Trial 52, Fold 5: Log loss = 0.22496569853134174, Average precision = 0.9695400144057988, ROC-AUC = 0.9648006242684355, Elapsed Time = 1.025408299999981 seconds
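A minimal sketch aggregating the per-fold validation metrics that the log above reports for Trial 52 (only folds 2-5 are visible in this excerpt; fold 1 appears earlier in the output). The `trial_52` dict is a hand-copied illustration, not a variable from the notebook.

```python
# Aggregate the Trial 52 fold metrics printed in the log above.
# Values are transcribed from the log output; fold 1 is not shown in this excerpt.
from statistics import mean, stdev

trial_52 = {
    2: {"log_loss": 0.21555834957559505, "avg_precision": 0.9703827467001086, "roc_auc": 0.9668916681664406},
    3: {"log_loss": 0.21758157949179519, "avg_precision": 0.9725401558795804, "roc_auc": 0.9687837086226657},
    4: {"log_loss": 0.2224244821480468,  "avg_precision": 0.9709196358797433, "roc_auc": 0.9655868568948784},
    5: {"log_loss": 0.22496569853134174, "avg_precision": 0.9695400144057988, "roc_auc": 0.9648006242684355},
}

for metric in ("log_loss", "avg_precision", "roc_auc"):
    vals = [fold[metric] for fold in trial_52.values()]
    print(f"{metric}: mean = {mean(vals):.4f}, std = {stdev(vals):.4f}")
```

Per-fold spread is small here (log loss ~0.216-0.225, ROC-AUC ~0.965-0.969), which suggests this trial's hyperparameters generalize consistently across the StratifiedGroupKFold splits.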
Optimization Progress: 53%|#####3 | 53/100 [12:12<10:34, 13.51s/it]
Trial 53, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 53, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.805817
[LightGBM] [Info] Total Bins 16928
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 267
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Debug] ... (repeated "Re-bagging", "No further splits with positive gain" and "Trained a tree" debug lines truncated; trees had roughly 140-210 leaves at depths 16-29)
Trial 53, Fold 1: Log loss = 0.3327160762671582, Average precision = 0.9716258069958176, ROC-AUC = 0.9672362891508104, Elapsed Time = 2.8530601000002207 seconds
Trial 53, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 53, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.806987
[LightGBM] [Info] Total Bins 16941
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 267
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Debug] ... (per-tree debug lines truncated)
Trial 53, Fold 2: Log loss = 0.32865200527720656, Average precision = 0.9737266722440454, ROC-AUC =
0.9695671333720391, Elapsed Time = 3.666922400000658 seconds Trial 53, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 53, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 [LightGBM] [Info] Number of positive: 10165, number of negative: 10517 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.806893 [LightGBM] [Info] Total Bins 16947 [LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 267 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043 [LightGBM] [Info] Start training from score -0.034043 [LightGBM] [Debug] Re-bagging, using 5613 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 147 and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 146 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 166 and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 169 and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 178 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 181 and depth = 23 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 178 and depth = 21 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 188 and depth = 23 [LightGBM] [Debug] Re-bagging, using 5571 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
Trained a tree with leaves = 200 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 198 and depth = 21 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 197 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 197 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 195 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 189 and depth = 21 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 190 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 192 and depth = 20 [LightGBM] [Debug] Re-bagging, using 5578 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 197 and depth = 21 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 191 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 188 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 190 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 196 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 189 and depth = 19 [LightGBM] [Warning] No further splits with 
positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 199 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 195 and depth = 19 [LightGBM] [Debug] Re-bagging, using 5442 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 195 and depth = 21 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 198 and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 192 and depth = 21 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 198 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 188 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 194 and depth = 25 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 193 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 188 and depth = 17 [LightGBM] [Debug] Re-bagging, using 5655 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 197 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 197 and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 194 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] 
[Debug] Trained a tree with leaves = 191 and depth = 21 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 191 and depth = 22 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 191 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 189 and depth = 21 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 188 and depth = 19 [LightGBM] [Debug] Re-bagging, using 5580 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 185 and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 197 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 193 and depth = 21 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 176 and depth = 23 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 181 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 180 and depth = 22 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 182 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 178 and depth = 20 [LightGBM] [Debug] Re-bagging, using 5501 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 202 
and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 197 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 186 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 184 and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 187 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 179 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 174 and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 182 and depth = 19 [LightGBM] [Debug] Re-bagging, using 5646 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 184 and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 187 and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 186 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 182 and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 176 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 177 and depth = 19 Trial 53, Fold 3: Log loss = 0.32894213973091035, Average precision = 
0.9721211712269816, ROC-AUC = 0.9693382455625006, Elapsed Time = 3.4834272999996756 seconds Trial 53, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 53, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.805384 [LightGBM] [Info] Total Bins 16929 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 267 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 5605 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 144 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 144 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 152 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 163 and depth = 21 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 171 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 166 and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 172 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 174 and depth = 20 [LightGBM] [Debug] Re-bagging, using 5565 data to train [LightGBM] [Warning] No further splits with positive gain, best 
gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 196 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 196 and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 192 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 193 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 196 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 193 and depth = 21 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 200 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 194 and depth = 25 [LightGBM] [Debug] Re-bagging, using 5572 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 204 and depth = 23 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 202 and depth = 21 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 199 and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 199 and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 200 and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 208 and depth = 18 [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 202 and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 201 and depth = 17 [LightGBM] [Debug] Re-bagging, using 5432 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 202 and depth = 21 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 196 and depth = 22 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 204 and depth = 21 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 205 and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 191 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 193 and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 190 and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 192 and depth = 21 [LightGBM] [Debug] Re-bagging, using 5642 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 205 and depth = 24 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 208 and depth = 22 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 20 [LightGBM] [Warning] No further splits with positive 
gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 207 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 207 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 21 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 201 and depth = 22 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 205 and depth = 20 [LightGBM] [Debug] Re-bagging, using 5583 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 208 and depth = 23 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 23 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 203 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 204 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 206 and depth = 25 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 202 and depth = 21 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 199 and depth = 21 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 198 and depth = 19 [LightGBM] [Debug] Re-bagging, using 5492 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
Trained a tree with leaves = 194 and depth = 24 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 193 and depth = 22 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 194 and depth = 24 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 190 and depth = 22 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 190 and depth = 22 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 190 and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 191 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 187 and depth = 18 [LightGBM] [Debug] Re-bagging, using 5642 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 176 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 181 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 178 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 175 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 176 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 171 and depth = 16 Trial 53, Fold 4: Log loss = 0.32957811689794386, 
Average precision = 0.9740590844027639, ROC-AUC = 0.9695006420023321, Elapsed Time = 4.050504700000602 seconds Trial 53, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 53, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.807356 [LightGBM] [Info] Total Bins 16928 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 267 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 5603 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 141 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 150 and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 156 and depth = 21 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 156 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 162 and depth = 21 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 184 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 186 and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 183 and depth = 19 [LightGBM] [Debug] Re-bagging, using 5565 data to train [LightGBM] [Warning] No further splits with 
positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 200 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 198 and depth = 21 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 194 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 197 and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 196 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 194 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 196 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 196 and depth = 19 [LightGBM] [Debug] Re-bagging, using 5572 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 201 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 199 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 194 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 198 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 199 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 195 and depth = 21 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 192 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 193 and depth = 18 [LightGBM] [Debug] Re-bagging, using 5427 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 195 and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 193 and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 195 and depth = 22 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 196 and depth = 22 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 188 and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 192 and depth = 21 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 185 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 187 and depth = 19 [LightGBM] [Debug] Re-bagging, using 5640 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 204 and depth = 26 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 200 and depth = 25 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 196 and depth = 24 [LightGBM] [Warning] No further splits 
[… repeated "[LightGBM] [Warning] No further splits with positive gain, best gain: -inf", "[LightGBM] [Debug] Trained a tree with leaves = … and depth = …" and "[LightGBM] [Debug] Re-bagging …" messages omitted …]
Trial 53, Fold 5: Log loss = 0.3317355849042862, Average precision = 0.9704764770963913, ROC-AUC = 0.9674112554112553, Elapsed Time = 3.419964799999434 seconds
Optimization Progress: 54%|#####4 | 54/100 [12:38<13:16, 17.32s/it]
Trial 54, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 54, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 7940
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[… per-tree [Debug]/[Warning] messages omitted …]
Trial 54, Fold 1: Log loss = 0.2752874065988546, Average precision = 0.9677937124264382, ROC-AUC = 0.9611638956998035, Elapsed Time = 0.7653597999997146 seconds
Trial 54, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 54, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Info] Total Bins 7956
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
[… per-tree [Debug]/[Warning] messages omitted …]
Trial 54, Fold 2: Log loss = 0.27866620105733475, Average precision = 0.9643420683093281, ROC-AUC = 0.9599818327551237, Elapsed Time = 0.8094517999998061 seconds
Trial 54, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 54, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[LightGBM] [Info] Number of positive: 10165, number of negative: 10517
[LightGBM] [Info] Total Bins 7963
[LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043
[LightGBM] [Info] Start training from score -0.034043
[… per-tree [Debug]/[Warning] messages omitted …]
Trial 54, Fold 3: Log loss = 0.2700365962158037, Average precision = 0.9682625609104769, ROC-AUC = 0.9626263170440428, Elapsed Time = 0.8366236999991088 seconds
Trial 54, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 54, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[LightGBM] [Info] Number of positive: 10177, number of negative: 10479
[LightGBM] [Info] Total Bins 7949
[LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 256
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243
[LightGBM] [Info] Start training from score -0.029243
[… per-tree [Debug]/[Warning] messages omitted …]
Trial 54, Fold 4: Log loss = 0.27234685971359907, Average precision = 0.9677491230315155, ROC-AUC = 0.961371816514899, Elapsed Time = 0.8441063999998732 seconds
Trial 54, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 54, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[LightGBM] [Info] Number of positive: 10150, number of negative: 10500
[LightGBM] [Info] Total Bins 7944
[LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902
[LightGBM] [Info] Start training from score -0.033902
[… training continues; per-tree [Debug]/[Warning] messages omitted …]
[Debug] Re-bagging, using 5157 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5349 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 9 [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Debug] Re-bagging, using 5324 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 8 Trial 54, Fold 5: Log loss = 0.27751199575788493, Average precision = 0.9634017326562361, ROC-AUC = 0.9582249967486018, Elapsed Time = 0.8501927999996042 seconds
Optimization Progress: 55%|#####5 | 55/100 [12:50<11:43, 15.64s/it]
Trial 55, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 55, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM per-tree debug output omitted]
Trial 55, Fold 1: Log loss = 0.319710227830236, Average precision = 0.9637692004130884, ROC-AUC = 0.9577537334566466, Elapsed Time = 0.46974260000024515 seconds
Trial 55, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 55, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM per-tree debug output omitted]
Trial 55, Fold 2: Log loss = 0.31212313747011217, Average precision = 0.9633393334570692, ROC-AUC = 0.958773920382793, Elapsed Time = 0.5101971000003687 seconds
Trial 55, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 55, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[LightGBM per-tree debug output omitted]
Trial 55, Fold 3: Log loss = 0.31374128541596613, Average precision = 0.9651192620136533, ROC-AUC = 0.9601149494129884, Elapsed Time = 0.5250800000003437 seconds
Trial 55, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 55, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[LightGBM per-tree debug output omitted]
Trial 55, Fold 4: Log loss = 0.3170786943123466, Average precision = 0.9644478627146373, ROC-AUC = 0.9579533180254225, Elapsed Time = 0.5307753999995839 seconds
Trial 55, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 55, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[LightGBM per-tree debug output omitted]
Trial 55, Fold 5: Log loss = 0.31820035782452255, Average precision = 0.9625577655712161, ROC-AUC = 0.9560946807127064, Elapsed Time = 0.5251232000000527 seconds
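Each per-fold summary line reports the three sklearn metrics the study optimizes and monitors: log loss, average precision, and ROC-AUC, all computed on the held-out fold's predicted probabilities. A minimal sketch of how such a line can be produced (the helper name `fold_report` and the toy arrays are illustrative, not taken from the notebook):

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

def fold_report(trial, fold, y_val, proba, elapsed):
    """One 'Trial t, Fold k: Log loss = ...' line, as in the log above."""
    ll = log_loss(y_val, proba)                 # penalizes confident mistakes
    ap = average_precision_score(y_val, proba)  # area under the PR curve
    auc = roc_auc_score(y_val, proba)           # ranking quality
    return (f"Trial {trial}, Fold {fold}: Log loss = {ll}, "
            f"Average precision = {ap}, ROC-AUC = {auc}, "
            f"Elapsed Time = {elapsed} seconds")

# Toy validation labels and predicted probabilities of class 1.
y_val = np.array([0, 0, 1, 1])
proba = np.array([0.1, 0.4, 0.35, 0.8])
print(fold_report(0, 1, y_val, proba, 0.01))
```

All three metrics take the raw probabilities rather than thresholded labels, which is why no decision threshold appears in the fold summaries; thresholding happens later, when confusion matrices are built for the loss-minimization analysis.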
Optimization Progress: 56%|#####6 | 56/100 [13:00<10:12, 13.92s/it]
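The `[Debug]` and `[Warning] No further splits with positive gain` lines interleaved above are governed by LightGBM's logging level. A hedged configuration sketch (not necessarily the settings this notebook used): the native `verbosity` parameter controls what LightGBM prints, and setting it to -1 keeps trial output down to the summary lines.

```python
# Hypothetical LightGBM parameter dict; `verbosity` is the native logging
# control: < 0 fatal only, 0 warnings, 1 info (default), > 1 debug.
params = {
    "objective": "binary",
    "verbosity": -1,  # suppress the [Debug]/[Info]/[Warning] stream
}
print(params["verbosity"])
```

The debug-level output seen here is useful for confirming bagging and tree growth behave as intended, but it dominates the notebook output once Optuna runs dozens of trials.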
Trial 56, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 56, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM per-tree debug output omitted]
Trial 56, Fold 1: Log loss = 0.20832250070227745, Average precision = 0.9728527881784961, ROC-AUC = 0.9678535282927785, Elapsed Time = 0.7831826000001456 seconds
Trial 56, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 56, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM per-tree debug output omitted]
Trial 56, Fold 2: Log loss = 0.20397070967147965, Average precision = 0.9727620209508783, ROC-AUC = 0.9690584747951481, Elapsed Time = 0.8534793999997419 seconds
Trial 56, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 56, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[LightGBM per-tree debug output omitted]
data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 7 [LightGBM] [Debug] Re-bagging, using 7471 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 7 [LightGBM] [Debug] Re-bagging, using 7473 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 7 [LightGBM] [Debug] Re-bagging, using 7609 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 7 [LightGBM] [Debug] Re-bagging, using 7528 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 7 [LightGBM] [Debug] Re-bagging, using 7544 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 8 [LightGBM] [Debug] 
Re-bagging, using 7609 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 7 [LightGBM] [Debug] Re-bagging, using 7541 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7363 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 7 [LightGBM] [Debug] Re-bagging, using 7351 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 7 [LightGBM] [Debug] Re-bagging, using 7419 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7405 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 10 
[LightGBM] [Debug] Re-bagging, using 7472 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7644 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7562 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7525 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7509 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7589 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with 
leaves = 34 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7659 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7404 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7477 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7563 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 11 Trial 56, Fold 3: Log loss = 0.2031745484476355, Average precision = 0.9729414594836683, ROC-AUC = 0.9691502898941549, Elapsed Time = 0.8719132999995054 seconds Trial 56, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 56, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791090 [LightGBM] [Info] Total Bins 7946 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 255 
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 7526 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 7 [LightGBM] [Debug] Re-bagging, using 7589 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 7 [LightGBM] [Debug] Re-bagging, using 7461 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7462 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 7 [LightGBM] [Debug] Re-bagging, using 7587 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7533 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and 
depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7538 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7600 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 7 [LightGBM] [Debug] Re-bagging, using 7535 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7341 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7341 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7424 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with 
leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7405 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7460 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7631 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7543 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7516 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7501 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7577 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7646 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 12 [LightGBM] [Debug] Re-bagging, using 7392 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7470 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7561 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 10 Trial 56, Fold 4: Log loss = 0.2059442149093668, Average precision = 0.9734037759849372, ROC-AUC = 0.9688341249573786, 
Elapsed Time = 0.9462315000000672 seconds Trial 56, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 56, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.792012 [LightGBM] [Info] Total Bins 7944 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 7524 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 7 [LightGBM] [Debug] Re-bagging, using 7588 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7460 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 7 [LightGBM] [Debug] Re-bagging, using 7456 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Debug] 
Re-bagging, using 7586 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 7 [LightGBM] [Debug] Re-bagging, using 7524 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7547 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7594 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7533 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7341 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 9 
[LightGBM] [Debug] Re-bagging, using 7334 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7416 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7410 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7468 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7629 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7543 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with 
leaves = 35 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7509 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7485 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7583 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7634 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7402 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7461 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] 
[Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7556 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 Trial 56, Fold 5: Log loss = 0.21099260244594162, Average precision = 0.9715680379175089, ROC-AUC = 0.9680349293053157, Elapsed Time = 0.9077146999998149 seconds
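The per-fold metric lines above pair log loss, average precision, and ROC-AUC on each validation fold. A minimal sketch of how such lines can be produced with scikit-learn's metric functions (the `y_val`/`proba` arrays here are synthetic stand-ins, not the notebook's data; the `verbosity` note at the end shows one way to suppress the verbose LightGBM tree-growth output seen above):

```python
# Sketch: computing the per-fold metric line shown in the logs.
# y_val / proba are hypothetical stand-ins for a fold's validation
# labels and a trained model's predicted positive-class probabilities.
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

rng = np.random.default_rng(42)
y_val = rng.integers(0, 2, size=1000)                      # fold validation labels
# Synthetic probabilities that rank positives above negatives
proba = np.clip(y_val * 0.7 + rng.random(1000) * 0.3, 1e-6, 1 - 1e-6)

lloss = log_loss(y_val, proba)
ap = average_precision_score(y_val, proba)
auc = roc_auc_score(y_val, proba)
print(f"Fold: Log loss = {lloss}, Average precision = {ap}, ROC-AUC = {auc}")

# The [Debug]/[Warning] tree-growth messages above can be silenced by
# passing a negative verbosity in the LightGBM parameter dict, e.g.:
# params = {"objective": "binary", "verbosity": -1}
```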
Optimization Progress: 57%|#####6 | 57/100 [13:12<09:31, 13.30s/it]
Trial 57, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 57, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533; Total Bins 7358; Number of data points in the train set: 20663, number of used features: 255; Start training from score -0.039012
[LightGBM per-tree [Debug]/[Warning] messages omitted]
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2161 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2066 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2161 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 5 [LightGBM] 
[Debug] Re-bagging, using 2196 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2177 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2165 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2183 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree 
with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2189 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2139 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2249 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 
and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2186 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2113 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2218 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 
5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2215 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 Trial 57, Fold 1: Log loss = 0.37892899961134713, Average precision = 0.9575913146504536, ROC-AUC = 0.9511021882154448, Elapsed Time = 0.9399035000005824 seconds Trial 57, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 57, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986 [LightGBM] [Info] Number of positive: 10230, number of negative: 10471 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791600 [LightGBM] [Info] Total Bins 7374 [LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285 [LightGBM] [Info] Start training from score -0.023285 [LightGBM] [Debug] Re-bagging, using 2245 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2209 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2110 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2142 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2251 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a 
tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2192 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2168 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2210 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with 
leaves = 11 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2202 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2169 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2066 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2158 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: 
-inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2201 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2173 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2172 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf 
[LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2187 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2192 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2147 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] 
[Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2253 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2190 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2115 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2226 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2229 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 Trial 57, Fold 2: Log loss = 0.3760319081535994, Average precision = 0.9562582062183459, ROC-AUC = 0.9522469446342134, Elapsed Time = 1.015178399999968 seconds Trial 57, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 57, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 [LightGBM] [Info] Number of positive: 10165, number of negative: 10517 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791401 [LightGBM] [Info] Total Bins 7372 [LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043 [LightGBM] [Info] Start training from score -0.034043 [LightGBM] [Debug] Re-bagging, using 2243 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive 
gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2205 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2115 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2135 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best 
gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2246 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2195 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2169 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2201 data to train [LightGBM] [Warning] No 
further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2197 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2162 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2074 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further 
splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2153 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2200 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2175 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with 
positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2165 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2190 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2191 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive 
gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2137 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2253 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2196 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2109 data to train 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2216 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2216 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 Trial 57, Fold 3: Log loss = 0.3728635305151177, Average precision = 0.9598185775067866, ROC-AUC = 0.9544234591547803, Elapsed Time = 1.0134969999999157 seconds Trial 57, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 57, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791090 [LightGBM] [Info] Total Bins 7355 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 255 
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 2241 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2202 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2111 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 5 [LightGBM] 
[Debug] Re-bagging, using 2132 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2243 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2193 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2170 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a 
tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2201 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2185 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2156 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with 
leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2069 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2158 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2192 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and 
depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2178 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2165 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2184 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 
[LightGBM] [Debug] Re-bagging, using 2187 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2144 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2247 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2186 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2108 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2216 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2218 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 Trial 57, Fold 4: Log loss = 0.3768339494526506, Average precision = 0.9570589877268126, ROC-AUC = 
0.9501171788378864, Elapsed Time = 1.086921899999652 seconds Trial 57, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 57, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.792012 [LightGBM] [Info] Total Bins 7363 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 2241 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2201 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2111 data to train [LightGBM] [Warning] No further splits with positive gain, 
best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2131 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2240 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2193 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: 
-inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2171 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2199 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2184 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf 
[LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2157 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2064 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2150 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] 
[Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2206 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2173 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 9 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2167 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 10 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2183 data to train [LightGBM] [Warning] No further splits with 
[LightGBM per-tree Debug/Warning output omitted]
Trial 57, Fold 5: Log loss = 0.37952485796871727, Average precision = 0.9561166398751471, ROC-AUC = 0.9498930940304331, Elapsed Time = 1.0671511999998984 seconds
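The per-fold metric lines logged above (log loss, average precision, ROC-AUC) correspond to standard scikit-learn metrics evaluated on the validation fold. A minimal self-contained sketch on synthetic data, using illustrative names (`y_val`, `proba`) rather than the notebook's actual fold arrays:

```python
# Hedged sketch: reproduce the style of the per-fold metric lines above.
# y_val and proba are synthetic stand-ins, not the notebook's real fold data.
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

rng = np.random.default_rng(42)

# Toy validation fold: binary labels with well-separated probability scores.
y_val = rng.integers(0, 2, size=1000)
proba = np.clip(y_val * 0.7 + rng.normal(0.15, 0.1, size=1000), 0.001, 0.999)

lloss = log_loss(y_val, proba)              # lower is better
ap = average_precision_score(y_val, proba)  # area under the PR curve
auc = roc_auc_score(y_val, proba)           # area under the ROC curve

print(f"Fold metrics: Log loss = {lloss}, Average precision = {ap}, ROC-AUC = {auc}")
```

All three metrics are threshold-free, which is why the notebook can report them directly from predicted probabilities without choosing a classification cutoff.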
Optimization Progress: 58%|#####8 | 58/100 [13:24<09:08, 13.05s/it]
Trial 58, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 58, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 11147
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[LightGBM per-tree Debug/Warning output omitted]
Trial 58, Fold 1: Log loss = 0.4291124910075804, Average precision = 0.9672170501464812, ROC-AUC = 0.9607728655549341, Elapsed Time = 1.0771875000000364 seconds
Trial 58, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 58, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Info] Total Bins 11158
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 257
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
[LightGBM per-tree Debug/Warning output omitted]
Trial 58, Fold 2: Log loss = 0.43217372256804676, Average precision = 0.9640188943857604, ROC-AUC = 0.9597874438419383, Elapsed Time = 1.1487709999992148 seconds
Trial 58, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 58, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[LightGBM] [Info] Number of positive: 10165, number of negative: 10517
[LightGBM] [Info] Total Bins 11164
[LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043
[LightGBM] [Info] Start training from score -0.034043
[LightGBM per-tree Debug/Warning output omitted]
Trial 58, Fold 3: Log loss = 0.4258938337035939, Average precision = 0.9683770395765154, ROC-AUC = 0.9628013310090435, Elapsed Time = 1.3509405000004335 seconds
Trial 58, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 58, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[LightGBM] [Info] Number of positive: 10177, number of negative: 10479
[LightGBM] [Info] Total Bins 11145
[LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243
[LightGBM] [Info] Start training from score -0.029243
[LightGBM per-tree Debug/Warning output omitted]
with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5298 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5264 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
Trained a tree with leaves = 48 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5186 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 12 [LightGBM] [Debug] Re-bagging, using 5297 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5199 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 12 [LightGBM] [Debug] Re-bagging, using 5066 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 12 Trial 58, Fold 4: Log loss = 0.4284234880137364, Average precision = 0.96641092870218, ROC-AUC = 0.9596838034197204, Elapsed Time = 1.3038344999995388 seconds Trial 58, Fold 5: Train size = 20650 
where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 58, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.796026 [LightGBM] [Info] Total Bins 11147 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 258 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 5289 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 12 [LightGBM] [Debug] Re-bagging, using 5227 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best 
gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5260 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 12 [LightGBM] [Debug] Re-bagging, using 5095 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves 
= 43 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 12 [LightGBM] [Debug] Re-bagging, using 5296 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5260 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 12 [LightGBM] [Warning] No further 
splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 13 [LightGBM] [Debug] Re-bagging, using 5191 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 14 [LightGBM] [Debug] Re-bagging, using 5292 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] 
[Debug] Trained a tree with leaves = 44 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5199 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5066 data to train 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 Trial 58, Fold 5: Log loss = 0.4294260241321941, Average precision = 0.9645658906559297, ROC-AUC = 0.9585099306987719, Elapsed Time = 1.2730691999995543 seconds
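The per-fold metric lines above report log loss, average precision, and ROC-AUC on each validation fold. A minimal sketch of how such a line can be produced with the `sklearn.metrics` functions imported at the top of the notebook; `y_val` and `proba` here are synthetic stand-ins, not the notebook's actual fold data or objective function:

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

# Synthetic stand-ins for one fold's validation labels and predicted P(y=1)
rng = np.random.default_rng(42)
y_val = rng.integers(0, 2, size=1000)
proba = np.clip(0.2 + 0.6 * y_val + rng.normal(0.0, 0.2, size=1000), 0.01, 0.99)

# The three validation metrics reported per fold in the log
lloss = log_loss(y_val, proba)
ap = average_precision_score(y_val, proba)
auc = roc_auc_score(y_val, proba)
print(f"Trial 58, Fold 3: Log loss = {lloss}, Average precision = {ap}, ROC-AUC = {auc}")
```

Log loss is the Optuna minimization target here, while average precision and ROC-AUC are tracked as threshold-free diagnostics of ranking quality.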
Optimization Progress: 59%|#####8 | 59/100 [13:38<09:03, 13.26s/it]
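The "Train size = ... where 0 = ..., 1 = ..., 0/1 = ..." lines interleaved in the log are simple per-fold class-balance reports. A sketch reproducing one such line; the helper name `fold_summary` and its signature are illustrative, not taken from the notebook:

```python
from collections import Counter

def fold_summary(trial: int, fold: int, split: str, labels) -> str:
    """Format a class-balance line for one CV split (hypothetical helper)."""
    counts = Counter(labels)
    ratio = counts[0] / counts[1]  # negative/positive class ratio
    return (f"Trial {trial}, Fold {fold}: {split} size = {len(labels)} "
            f"where 0 = {counts[0]}, 1 = {counts[1]}, 0/1 = {ratio}")

# Counts taken from the Trial 58, Fold 4 log line above
labels = [0] * 10479 + [1] * 10177
print(fold_summary(58, 4, "Train", labels))
```

The near-1.0 ratios throughout the log show that each StratifiedGroupKFold split keeps the two classes roughly balanced.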
Trial 59, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 59, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM dataset Info lines (Total Bins 7940, 255 used features, initial score) and per-tree Debug/Warning output omitted]
Trial 59, Fold 1: Log loss = 0.22675598811748662, Average precision = 0.9723914444623574, ROC-AUC = 0.9673419628817099, Elapsed Time = 0.8552437999996982 seconds
Trial 59, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 59, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM verbose training output for Trial 59, Fold 2 omitted]
Trained a tree with leaves = 38 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6800 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6799 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 8 [LightGBM] [Debug] Re-bagging, using 6626 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 8 [LightGBM] [Debug] Re-bagging, using 6693 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6790 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6734 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: 
-inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 12 [LightGBM] [Debug] Re-bagging, using 6876 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 12 [LightGBM] [Debug] Re-bagging, using 6764 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 13 Trial 59, Fold 2: Log loss = 0.2252146869930471, Average precision = 0.9707824030321003, ROC-AUC = 0.9679105027980773, Elapsed Time = 0.9205776999997397 seconds Trial 59, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 59, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 [LightGBM] [Info] Number of positive: 10165, number of negative: 10517 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791401 [LightGBM] [Info] Total Bins 7963 [LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043 [LightGBM] [Info] Start training from score -0.034043 [LightGBM] [Debug] Re-bagging, using 6766 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] 
Re-bagging, using 6785 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6681 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6641 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 11 [LightGBM] [Debug] Re-bagging, using 6823 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Debug] Re-bagging, using 6745 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 7 [LightGBM] [Debug] Re-bagging, using 6710 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 
9 [LightGBM] [Debug] Re-bagging, using 6820 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 8 [LightGBM] [Debug] Re-bagging, using 6699 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 8 [LightGBM] [Debug] Re-bagging, using 6574 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6572 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Debug] Re-bagging, using 6629 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6640 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with 
leaves = 43 and depth = 8 [LightGBM] [Debug] Re-bagging, using 6727 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6851 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 8 [LightGBM] [Debug] Re-bagging, using 6737 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6714 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 11 [LightGBM] [Debug] Re-bagging, using 6687 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6785 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] 
[Debug] Trained a tree with leaves = 37 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6806 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 11 [LightGBM] [Debug] Re-bagging, using 6627 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6673 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6767 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 12 [LightGBM] [Debug] Re-bagging, using 6737 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 8 [LightGBM] [Debug] Re-bagging, using 6888 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best 
gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 14 [LightGBM] [Debug] Re-bagging, using 6758 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 Trial 59, Fold 3: Log loss = 0.2235571018816946, Average precision = 0.972709413979505, ROC-AUC = 0.9680875653706502, Elapsed Time = 0.9554589000008491 seconds Trial 59, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 59, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.792454 [LightGBM] [Info] Total Bins 7949 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 256 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 6757 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6775 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Debug] Re-bagging, using 6676 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained 
a tree with leaves = 34 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6630 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Debug] Re-bagging, using 6802 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 8 [LightGBM] [Debug] Re-bagging, using 6752 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6701 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6812 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6694 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf 
[LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6548 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 7 [LightGBM] [Debug] Re-bagging, using 6569 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6634 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6638 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6714 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6845 data to train [LightGBM] [Warning] No further splits with positive 
gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6725 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 8 [LightGBM] [Debug] Re-bagging, using 6701 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6683 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6775 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 8 [LightGBM] [Debug] Re-bagging, using 6791 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6611 data to train [LightGBM] [Warning] No further 
splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 8 [LightGBM] [Debug] Re-bagging, using 6670 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6770 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6721 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6862 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6752 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 Trial 59, Fold 4: Log loss = 0.22644173638755147, Average 
precision = 0.9724764906199391, ROC-AUC = 0.9675457985507556, Elapsed Time = 0.9750905000000785 seconds Trial 59, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 59, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.792012 [LightGBM] [Info] Total Bins 7944 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 6755 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6775 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Debug] Re-bagging, using 6674 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 7 [LightGBM] [Debug] Re-bagging, using 6625 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Debug] Re-bagging, using 6801 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Debug] Re-bagging, using 6745 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6709 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6805 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6692 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Debug] Re-bagging, using 6551 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: 
-inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 11 [LightGBM] [Debug] Re-bagging, using 6559 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6629 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6639 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6724 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Re-bagging, using 6842 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 8 [LightGBM] [Debug] Re-bagging, using 6726 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 10 [LightGBM] [Warning] No further splits with 
[LightGBM per-iteration training log omitted]
Trial 59, Fold 5: Log loss = 0.2278, Average precision = 0.9717, ROC-AUC = 0.9674, Elapsed Time = 1.07 seconds
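The repeated `[LightGBM] [Debug] Trained a tree ...` and `[Warning] No further splits with positive gain` lines are LightGBM's per-iteration logging. They can be silenced through the parameter dict; a minimal sketch (the `verbosity` key is a standard LightGBM option, the other values here are illustrative, not the notebook's actual search space):

```python
# Sketch: quieting LightGBM's per-iteration output.
# "verbosity" is a real LightGBM parameter; -1 keeps only fatal errors,
# suppressing the Info/Debug/Warning chatter seen in the logs above.
params = {
    "objective": "binary",
    "metric": "binary_logloss",
    "verbosity": -1,
}
```

With recent LightGBM versions, per-round evaluation printing can additionally be controlled by passing `lgb.log_evaluation(period=0)` in the `callbacks` argument of `lgb.train`.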
Optimization Progress: 60% (60/100) [13:50<08:36, 12.91s/it]
[LightGBM per-iteration training log omitted]
Trial 60, Fold 1: Train size = 20663 (0 = 10533, 1 = 10130), Validation size = 5175 (0 = 2592, 1 = 2583)
Trial 60, Fold 1: Log loss = 0.2740, Average precision = 0.9722, ROC-AUC = 0.9668, Elapsed Time = 0.66 seconds
Trial 60, Fold 2: Train size = 20701 (0 = 10471, 1 = 10230), Validation size = 5137 (0 = 2654, 1 = 2483)
Trial 60, Fold 2: Log loss = 0.2656, Average precision = 0.9728, ROC-AUC = 0.9691, Elapsed Time = 0.72 seconds
Trial 60, Fold 3: Train size = 20682 (0 = 10517, 1 = 10165), Validation size = 5156 (0 = 2608, 1 = 2548)
Trial 60, Fold 3: Log loss = 0.2702, Average precision = 0.9714, ROC-AUC = 0.9684, Elapsed Time = 0.74 seconds
Trial 60, Fold 4: Train size = 20656 (0 = 10479, 1 = 10177), Validation size = 5182 (0 = 2646, 1 = 2536)
Trial 60, Fold 4: Log loss = 0.2723, Average precision = 0.9720, ROC-AUC = 0.9667, Elapsed Time = 0.76 seconds
Trial 60, Fold 5: Train size = 20650 (0 = 10500, 1 = 10150), Validation size = 5188 (0 = 2625, 1 = 2563)
Trial 60, Fold 5: Log loss = 0.2777, Average precision = 0.9694, ROC-AUC = 0.9651, Elapsed Time = 0.76 seconds
Optimization Progress: 61% (61/100) [14:01<08:01, 12.34s/it]
[LightGBM per-iteration training log omitted]
Trial 61, Fold 1: Train size = 20663 (0 = 10533, 1 = 10130), Validation size = 5175 (0 = 2592, 1 = 2583)
Trial 61, Fold 1: Log loss = 0.4869, Average precision = 0.9570, ROC-AUC = 0.9484, Elapsed Time = 0.49 seconds
Trial 61, Fold 2: Train size = 20701 (0 = 10471, 1 = 10230), Validation size = 5137 (0 = 2654, 1 = 2483)
Trial 61, Fold 2: Log loss = 0.4883, Average precision = 0.9496, ROC-AUC = 0.9438, Elapsed Time = 0.49 seconds
Trial 61, Fold 3: Train size = 20682 (0 = 10517, 1 = 10165), Validation size = 5156 (0 = 2608, 1 = 2548)
Trial 61, Fold 3: Log loss = 0.4870, Average precision = 0.9596, ROC-AUC = 0.9529, Elapsed Time = 0.52 seconds
Trial 61, Fold 4: Train size = 20656 (0 = 10479, 1 = 10177), Validation size = 5182 (0 = 2646, 1 = 2536)
Trial 61, Fold 4: Log loss = 0.4887, Average precision = 0.9537, ROC-AUC = 0.9452, Elapsed Time = 0.55 seconds
Trial 61, Fold 5: Train size = 20650 (0 = 10500, 1 = 10150), Validation size = 5188 (0 = 2625, 1 = 2563)
Trial 61, Fold 5: Log loss = 0.4808, Average precision = 0.9529, ROC-AUC = 0.9442, Elapsed Time = 0.57 seconds
Optimization Progress: 62%|######2 | 62/100 [14:11<07:22, 11.66s/it]
Trial 62, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 62, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 23584
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[LightGBM per-tree warning/debug output truncated]
Trial 62, Fold 1: Log loss = 0.2946442850084093, Average precision = 0.9735029491011513, ROC-AUC = 0.9683388059630155, Elapsed Time = 1.019398600000386 seconds
Trial 62, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 62, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Info] Total Bins 23515
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 256
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
[LightGBM per-tree warning/debug output truncated]
Trial 62, Fold 2: Log loss = 0.2933031130971272, Average precision = 0.972143220909115, ROC-AUC = 0.9683382798053137, Elapsed Time = 1.161121699999967 seconds
Trial 62, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 62, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[LightGBM] [Info] Number of positive: 10165, number of negative: 10517
[LightGBM] [Info] Total Bins 23508
[LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043
[LightGBM] [Info] Start training from score -0.034043
[LightGBM per-tree warning/debug output truncated]
Trial 62, Fold 3: Log loss = 0.29255958868463744, Average precision = 0.9741501584594792, ROC-AUC = 0.9695752593156188, Elapsed Time = 1.2140692000002673 seconds
Trial 62, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 62, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[LightGBM] [Info] Number of positive: 10177, number of negative: 10479
[LightGBM] [Info] Total Bins 23522
[LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 256
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243
[LightGBM] [Info] Start training from score -0.029243
[LightGBM per-tree warning/debug output truncated]
Trial 62, Fold 4: Log loss = 0.2928080487627608, Average precision = 0.9730172557150623, ROC-AUC = 0.968042724450453, Elapsed Time = 1.2142837999999756 seconds
Trial 62, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 62, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[LightGBM] [Info] Number of positive: 10150, number of negative: 10500
[LightGBM] [Info] Total Bins 23533
[LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 256
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902
[LightGBM] [Info] Start training from score -0.033902
[LightGBM per-tree warning/debug output truncated]
Trial 62, Fold 5: Log loss = 0.29784747770938497, Average precision = 0.9711117089106877, ROC-AUC = 0.9667426936439811, Elapsed Time = 1.2246425000003 seconds
Optimization Progress: 63%|######3 | 63/100 [14:24<07:29, 12.14s/it]
Trial 63, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 63, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[… LightGBM per-iteration [Info]/[Debug]/[Warning] output truncated …]
Trial 63, Fold 1: Log loss = 0.3871478465053016, Average precision = 0.9701212026243069, ROC-AUC = 0.964097219234979, Elapsed Time = 0.6317716000003202 seconds
Trial 63, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 63, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[… LightGBM per-iteration [Info]/[Debug]/[Warning] output truncated …]
Trial 63, Fold 2: Log loss = 0.3901727806816126, Average precision = 0.9682952301589332, ROC-AUC = 0.9648423901975787, Elapsed Time = 0.8101253000004363 seconds
Trial 63, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 63, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[… LightGBM per-iteration [Info]/[Debug]/[Warning] output truncated …]
Trial 63, Fold 3: Log loss = 0.38395415345069356, Average precision = 0.9714498437181125, ROC-AUC = 0.967068707202088, Elapsed Time = 0.7840771999999561 seconds
Trial 63, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 63, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[… LightGBM per-iteration [Info]/[Debug]/[Warning] output truncated …]
Trial 63, Fold 4: Log loss = 0.3813460075386185, Average precision = 0.9717720706462006, ROC-AUC = 0.9665523193153882, Elapsed Time = 0.7806607000002259 seconds
Trial 63, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 63, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[… LightGBM per-iteration [Info]/[Debug]/[Warning] output truncated …]
Trial 63, Fold 5: Log loss = 0.39025751567850453, Average precision = 0.9689818062249139, ROC-AUC = 0.9639260539174703, Elapsed Time = 0.7991799999999785 seconds
Optimization Progress: 64%|######4 | 64/100 [14:36<07:06, 11.85s/it]
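The walls of `[Debug]`/`[Warning]` lines above are LightGBM's default console logging, emitted once per tree inside every Optuna trial. A minimal configuration sketch for silencing them (the `params` dict name is illustrative; whatever dict this notebook passes to LightGBM would take the same key):

```python
# LightGBM's "verbosity" parameter controls console logging:
# < 0 = fatal only, 0 = errors/warnings, 1 = info, > 1 = debug.
# Setting it to -1 suppresses the per-tree [Debug]/[Warning] spam above.
params = {
    "objective": "binary",
    "verbosity": -1,  # silence per-iteration LightGBM output
}

# Optuna's own per-trial log lines can be reduced the same way:
import optuna
optuna.logging.set_verbosity(optuna.logging.WARNING)
```

With both settings applied, the tqdm progress bar and any explicitly printed per-fold metric lines remain the only output, which keeps the notebook readable and noticeably shrinks its saved size.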
Trial 64, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 64, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[… LightGBM per-iteration [Info]/[Debug]/[Warning] output truncated …]
Trial 64, Fold 1: Log loss 
= 0.31630301819399675, Average precision = 0.9665739919306886, ROC-AUC = 0.9600088780870172, Elapsed Time = 1.321477500000583 seconds Trial 64, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 64, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986 [LightGBM] [Info] Number of positive: 10230, number of negative: 10471 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791600 [LightGBM] [Info] Total Bins 10022 [LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285 [LightGBM] [Info] Start training from score -0.023285 [LightGBM] [Debug] Re-bagging, using 15180 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15312 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 14 [LightGBM] [Debug] Re-bagging, using 15279 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree 
with leaves = 39 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15158 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 12 [LightGBM] [Debug] Re-bagging, using 15235 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 12 [LightGBM] [Debug] Re-bagging, using 15192 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 11 [LightGBM] [Debug] Re-bagging, using 15194 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 12 [LightGBM] [Warning] No further splits with 
positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15225 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 11 [LightGBM] [Debug] Re-bagging, using 15288 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 9 [LightGBM] [Debug] Re-bagging, using 15142 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 11 [LightGBM] [Debug] Re-bagging, using 15187 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 
and depth = 13 [LightGBM] [Debug] Re-bagging, using 15082 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 11 [LightGBM] [Debug] Re-bagging, using 15159 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15068 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 11 [LightGBM] [Debug] Re-bagging, using 15315 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 11 [LightGBM] [Debug] Re-bagging, using 15254 data to train 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15279 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Debug] Re-bagging, using 15207 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Debug] Re-bagging, using 15228 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15336 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf 
[LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 16 [LightGBM] [Debug] Re-bagging, using 15089 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 11 [LightGBM] [Debug] Re-bagging, using 15186 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15171 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15235 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 9 [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15358 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15154 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15231 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15290 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] 
[Debug] Trained a tree with leaves = 47 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 14 [LightGBM] [Debug] Re-bagging, using 15293 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Debug] Re-bagging, using 15133 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 12 [LightGBM] [Debug] Re-bagging, using 15172 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 12 [LightGBM] [Debug] Re-bagging, using 15218 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 11 [LightGBM] [Warning] No 
further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 11 Trial 64, Fold 2: Log loss = 0.31566260066916224, Average precision = 0.9642587196322059, ROC-AUC = 0.9608000568143709, Elapsed Time = 1.3881407999997464 seconds Trial 64, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 64, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 [LightGBM] [Info] Number of positive: 10165, number of negative: 10517 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791401 [LightGBM] [Info] Total Bins 10024 [LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043 [LightGBM] [Info] Start training from score -0.034043 [LightGBM] [Debug] Re-bagging, using 15163 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15300 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 14 [LightGBM] [Debug] Re-bagging, using 15264 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree 
with leaves = 58 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15146 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 12 [LightGBM] [Debug] Re-bagging, using 15219 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 12 [LightGBM] [Debug] Re-bagging, using 15181 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 14 [LightGBM] [Debug] Re-bagging, using 15183 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 9 [LightGBM] [Warning] No further splits with 
positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 9 [LightGBM] [Debug] Re-bagging, using 15207 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 11 [LightGBM] [Debug] Re-bagging, using 15267 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 13 [LightGBM] [Debug] Re-bagging, using 15133 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 12 [LightGBM] [Debug] Re-bagging, using 15177 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 
and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15074 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15151 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15053 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15298 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best 
gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 12 [LightGBM] [Debug] Re-bagging, using 15222 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15265 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 12 [LightGBM] [Debug] Re-bagging, using 15195 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 11 [LightGBM] [Debug] Re-bagging, using 15205 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 11 
[LightGBM] [Debug] Re-bagging, using 15348 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 13 [LightGBM] [Debug] Re-bagging, using 15080 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 11 [LightGBM] [Debug] Re-bagging, using 15163 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 12 [LightGBM] [Debug] Re-bagging, using 15156 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 12 [LightGBM] [Debug] Re-bagging, using 15206 data to train [LightGBM] [Warning] No 
further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 14
[... per-tree "[Warning] No further splits with positive gain" / "[Debug] Trained a tree" / "[Debug] Re-bagging" lines repeat for every boosting round; trimmed ...]
Trial 64, Fold 3: Log loss = 0.31411749801159977, Average precision = 0.9676660962346362, ROC-AUC = 0.9623256481686586, Elapsed Time = 1.5844845999999961 seconds
Trial 64, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 64, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[LightGBM] [Info] Number of positive: 10177, number of negative: 10479
[LightGBM] [Info] Total Bins 10003
[LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243
[LightGBM] [Info] Start training from score -0.029243
[... per-tree Debug/Warning lines trimmed ...]
Trial 64, Fold 4: Log loss = 0.315470578568722, Average precision = 0.9666878301659132, ROC-AUC = 0.9606021737471715, Elapsed Time = 1.5104277000000366 seconds
Trial 64, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 64, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[LightGBM] [Info] Number of positive: 10150, number of negative: 10500
[LightGBM] [Info] Total Bins 10007
[LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902
[LightGBM] [Info] Start training from score -0.033902
[... per-tree Debug/Warning lines trimmed ...]
Trial 64, Fold 5: Log loss = 0.31903414895241994, Average precision = 0.964618303671335, ROC-AUC = 0.9587533210709176, Elapsed Time = 1.6158471999997346 seconds
Optimization Progress: 65%|######5 | 65/100 [14:51<07:30, 12.88s/it]
Trial 65, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 65, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 24150
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[... LightGBM per-tree Debug/Warning messages trimmed ...]
Trial 65, Fold 1: Log loss = 0.387947778516032, Average precision = 0.9633361701294318, ROC-AUC = 0.9562298659803177, Elapsed Time = 1.4673411999992823 seconds
Trial 65, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 65, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Info] Total Bins 24158
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
[... LightGBM per-tree Debug/Warning messages trimmed ...]
Trial 65, Fold 2: Log loss = 0.38556459991614683, Average precision = 0.9615695436743681, ROC-AUC = 0.9566611359657123, Elapsed Time = 1.68528249999963 seconds
Trial 65, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 65, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[LightGBM] [Info] Number of positive: 10165, number of negative: 10517
[LightGBM] [Info] Total Bins 24152
[LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043
[LightGBM] [Info] Start training from score -0.034043
[... LightGBM per-tree Debug/Warning messages trimmed ...]
[Debug] Trained a tree with leaves = 27 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7543 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7606 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 7 [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7540 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7361 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf 
[LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7349 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, 
best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7416 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 Trial 65, Fold 3: Log loss = 0.3833619629188997, Average precision = 0.9651936736622466, ROC-AUC = 0.9589117472142232, Elapsed Time = 1.7250509000004968 seconds Trial 65, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 65, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791090 [LightGBM] [Info] Total Bins 24142 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 7523 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best 
gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 7 [LightGBM] [Debug] Re-bagging, using 7587 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7457 data 
to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7457 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with 
leaves = 27 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7587 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 7 [LightGBM] [Debug] Re-bagging, using 7530 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Warning] No further splits 
with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7537 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7597 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a 
tree with leaves = 28 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 7 [LightGBM] [Debug] Re-bagging, using 7534 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] 
[Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 7 [LightGBM] [Debug] Re-bagging, using 7339 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 7 [LightGBM] [Debug] Re-bagging, using 7339 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 9 [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7421 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 Trial 65, Fold 4: Log loss = 0.38938473344963875, Average precision = 0.9621895753988847, ROC-AUC = 0.9551340515175576, Elapsed Time = 1.6923073999996632 seconds Trial 65, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 65, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.792012 [LightGBM] [Info] Total Bins 24146 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 7521 data to train [LightGBM] [Warning] 
No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7586 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 9 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 7 [LightGBM] [Debug] Re-bagging, using 7456 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 7 [LightGBM] [Debug] Re-bagging, using 7451 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best 
[LightGBM verbose training output truncated: repeated "[Warning] No further splits with positive gain, best gain: -inf", "[Debug] Trained a tree with leaves = … and depth = …", "[Debug] Re-bagging, using … data to train", and per-fold "[Info]" dataset messages omitted; only the trial/fold summaries are retained below.]

Trial 65, Fold 5: Log loss = 0.39631969132039463, Average precision = 0.9600423777986307, ROC-AUC = 0.9534142652769263, Elapsed Time = 1.6810988000006546 seconds
Optimization Progress: 66%|######6 | 66/100 [15:07<07:48, 13.77s/it]
Trial 66, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 66, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Trial 66, Fold 1: Log loss = 0.3087517333061513, Average precision = 0.9687092433911032, ROC-AUC = 0.9625075577254891, Elapsed Time = 1.0490066000002116 seconds
Trial 66, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 66, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Trial 66, Fold 2: Log loss = 0.30928391734393224, Average precision = 0.966970769585805, ROC-AUC = 0.9628082870072635, Elapsed Time = 1.1502757999996902 seconds
Trial 66, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 66, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 13 [LightGBM] [Debug] Re-bagging, using 14391 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 13 Trial 66, Fold 3: Log loss = 0.3057466631568812, Average precision = 0.9694848143497176, ROC-AUC = 0.9645859317063306, Elapsed Time = 1.2300295000004553 seconds Trial 66, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 66, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.795144 [LightGBM] [Info] Total Bins 11145 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 258 [LightGBM] [Info] 
[binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 14375 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 13 [LightGBM] [Debug] Re-bagging, using 14493 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf 
[LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 11 [LightGBM] [Debug] Re-bagging, using 14514 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 12 [LightGBM] [Debug] Re-bagging, using 14343 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 
and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 11 [LightGBM] [Debug] Re-bagging, using 14425 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 12 [LightGBM] [Debug] Re-bagging, using 14387 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 13 [LightGBM] [Warning] No further 
splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 14 [LightGBM] [Debug] Re-bagging, using 14430 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 12 [LightGBM] [Debug] Re-bagging, using 14429 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf 
[LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 13 [LightGBM] [Debug] Re-bagging, using 14498 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 15 [LightGBM] [Debug] Re-bagging, using 14360 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 Trial 66, Fold 4: Log loss = 0.3149989869422477, Average precision = 0.9668836789137704, ROC-AUC = 0.9608261592404225, Elapsed 
Time = 1.2065956000005826 seconds Trial 66, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 66, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.796026 [LightGBM] [Info] Total Bins 11147 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 258 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 14371 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 11 [LightGBM] [Debug] Re-bagging, using 14490 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 
and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 14 [LightGBM] [Debug] Re-bagging, using 14512 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 11 [LightGBM] [Debug] Re-bagging, using 14337 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 11 [LightGBM] [Warning] No further 
splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 11 [LightGBM] [Debug] Re-bagging, using 14420 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 11 [LightGBM] [Debug] Re-bagging, using 14380 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf 
[LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 14 [LightGBM] [Debug] Re-bagging, using 14433 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 15 [LightGBM] [Debug] Re-bagging, using 14420 
data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 19 [LightGBM] [Debug] Re-bagging, using 14493 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
Trained a tree with leaves = 62 and depth = 12 [LightGBM] [Debug] Re-bagging, using 14359 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 11 Trial 66, Fold 5: Log loss = 0.3105819669547541, Average precision = 0.9669574604331346, ROC-AUC = 0.9615033721643164, Elapsed Time = 1.2555413000000044 seconds
Optimization Progress: 67%|######7 | 67/100 [15:20<07:29, 13.63s/it]
Trial 67, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 67, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 9448
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[LightGBM verbose [Debug]/[Warning] tree-growth output truncated]
leaves = 95 and depth = 11 [LightGBM] [Debug] Re-bagging, using 16508 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 91 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 93 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 97 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 99 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 97 and depth = 15 [LightGBM] [Debug] Re-bagging, using 16530 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 102 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 104 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 91 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 109 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 110 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 100 and depth = 14 [LightGBM] [Warning] 
No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 117 and depth = 13 [LightGBM] [Debug] Re-bagging, using 16556 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 99 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 100 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 104 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 94 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 101 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 98 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 99 and depth = 15 [LightGBM] [Debug] Re-bagging, using 16561 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 93 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 99 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 103 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 94 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 91 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best 
gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 95 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 106 and depth = 14 [LightGBM] [Debug] Re-bagging, using 16519 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 97 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 107 and depth = 13 Trial 67, Fold 1: Log loss = 0.28741825917483693, Average precision = 0.9739333585759847, ROC-AUC = 0.9687488051026895, Elapsed Time = 1.8393102 seconds Trial 67, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 67, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986 [LightGBM] [Info] Number of positive: 10230, number of negative: 10471 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.794298 [LightGBM] [Info] Total Bins 9436 [LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 257 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285 [LightGBM] [Info] Start training from score -0.023285 [LightGBM] [Debug] Re-bagging, using 16618 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 91 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: 
-inf [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 14 [LightGBM] [Debug] Re-bagging, using 16693 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 97 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 84 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 93 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 95 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 100 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 12 [LightGBM] [Debug] Re-bagging, using 16630 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 86 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 93 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves 
= 95 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 86 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 105 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 90 and depth = 12 [LightGBM] [Debug] Re-bagging, using 16589 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 101 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 86 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 101 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 93 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 84 and depth = 13 [LightGBM] [Debug] Re-bagging, using 16637 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 101 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 94 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 108 and depth = 14 [LightGBM] [Warning] No 
further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 87 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 105 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 94 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 91 and depth = 14 [LightGBM] [Debug] Re-bagging, using 16621 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 88 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 111 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 94 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 97 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 88 and depth = 15 [LightGBM] [Debug] Re-bagging, using 16553 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 105 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 93 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: 
-inf [LightGBM] [Debug] Trained a tree with leaves = 111 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 107 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 101 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 107 and depth = 13 [LightGBM] [Debug] Re-bagging, using 16610 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 103 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 97 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 101 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 100 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 100 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 111 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 105 and depth = 13 [LightGBM] [Debug] Re-bagging, using 16663 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 91 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree 
with leaves = 97 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 100 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 110 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 99 and depth = 17 [LightGBM] [Debug] Re-bagging, using 16541 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 102 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 94 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 95 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 107 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 98 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 106 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 101 and depth = 14 [LightGBM] [Debug] Re-bagging, using 16557 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 17 [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 110 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 99 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 103 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 108 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 93 and depth = 14 [LightGBM] [Debug] Re-bagging, using 16577 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 90 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 101 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 99 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 102 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 105 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 95 and depth = 14 [LightGBM] [Debug] Re-bagging, using 16588 data to train [LightGBM] [Warning] No further splits with positive 
gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 104 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 93 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 103 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 95 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 100 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 105 and depth = 16 [LightGBM] [Debug] Re-bagging, using 16540 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 107 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 108 and depth = 16 Trial 67, Fold 2: Log loss = 0.28359683000303204, Average precision = 0.9717995056127583, ROC-AUC = 0.9694156890821415, Elapsed Time = 2.248905299999933 seconds Trial 67, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 67, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 [LightGBM] [Info] Number of positive: 10165, number of negative: 10517 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.795455 [LightGBM] [Info] Total Bins 9446 [LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 258 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043 [LightGBM] [Info] 
Start training from score -0.034043 [LightGBM] [Debug] Re-bagging, using 16601 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 88 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 94 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 93 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 88 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 86 and depth = 14 [LightGBM] [Debug] Re-bagging, using 16677 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 93 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 95 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 88 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 95 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 103 and depth = 13 [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 85 and depth = 15 [LightGBM] [Debug] Re-bagging, using 16617 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 107 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 86 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 87 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 94 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 90 and depth = 15 [LightGBM] [Debug] Re-bagging, using 16574 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 86 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 95 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 93 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, 
best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 99 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 94 and depth = 14 [LightGBM] [Debug] Re-bagging, using 16618 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 105 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 95 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 93 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 100 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 102 and depth = 16 [LightGBM] [Debug] Re-bagging, using 16613 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 91 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 99 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 91 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree 
with leaves = 95 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 91 and depth = 12 [LightGBM] [Debug] Re-bagging, using 16538 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 102 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 84 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 103 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 99 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 98 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 103 and depth = 14 [LightGBM] [Debug] Re-bagging, using 16588 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 106 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 101 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 105 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 95 and depth = 16 [LightGBM] 
[… trimmed: repeated `[LightGBM] [Debug] Trained a tree …` / `[Warning] No further splits with positive gain, best gain: -inf` and `Re-bagging` messages …]
Trial 67, Fold 3: Log loss = 0.2814484218320297, Average precision = 0.9743597628338133, ROC-AUC = 0.9699769035740771, Elapsed Time = 2.2512329000001046 seconds
Trial 67, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 67, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[LightGBM] [Info] Number of positive: 10177, number of negative: 10479
[LightGBM] [Info] Total Bins 9446
[LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243
[LightGBM] [Info] Start training from score -0.029243
[… trimmed: repeated `[LightGBM] [Debug]`/`[Warning]` tree-growth messages …]
Trial 67, Fold 4: Log loss = 0.28250526105517854, Average precision = 0.973973155793459, ROC-AUC = 0.9691755426320545, Elapsed Time = 2.24522160000015 seconds
Trial 67, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 67, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[LightGBM] [Info] Number of positive: 10150, number of negative: 10500
[LightGBM] [Info] Total Bins 9428
[LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902
[LightGBM] [Info] Start training from score -0.033902
[… trimmed: repeated `[LightGBM] [Debug]`/`[Warning]` tree-growth messages …]
Trial 67, Fold 5: Log loss = 0.2847991314597355, Average precision = 0.9727540451579517, ROC-AUC = 0.9683353213310295, Elapsed Time = 2.201694000000316 seconds
Optimization Progress: 68%|######8 | 68/100 [15:38<08:00, 15.03s/it]
Trial 68 (LightGBM per-tree [Debug] output condensed: 260 used features, ~21.9k total bins per fold; all trees 81 leaves, depths 11–39):
Trial 68, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0398 | Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0035
Trial 68, Fold 1: Log loss = 0.19256, Average precision = 0.97714, ROC-AUC = 0.97293, Elapsed Time = 3.00 s
Trial 68, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0236 | Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0689
Trial 68, Fold 2: Log loss = 0.18108, Average precision = 0.97684, ROC-AUC = 0.97411, Elapsed Time = 3.47 s
Trial 68, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.0346 | Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235
Trial 68, Fold 3: Log loss = 0.18653, Average precision = 0.97710, ROC-AUC = 0.97389, Elapsed Time = 3.28 s
Trial 68, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0297 | Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0434
Trial 68, Fold 4: Log loss = 0.18664, Average precision = 0.97692, ROC-AUC = 0.97332, Elapsed Time = 3.39 s
Trial 68, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0345 | Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0242
Trial 68, Fold 5: Log loss = 0.18777, Average precision = 0.97692, ROC-AUC = 0.97356, Elapsed Time = 3.33 s
Optimization Progress: 69%|######9 | 69/100 [16:02<09:10, 17.77s/it]
Trial 69, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 69, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 255
[LightGBM debug output truncated: repeated "No further splits with positive gain", "Trained a tree", and "Re-bagging" messages]
Trial 69, Fold 1: Log loss = 0.2022036418702006, Average precision = 0.9745308643163686, ROC-AUC = 0.970485438981374, Elapsed Time = 0.9094229000002088 seconds
Trial 69, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 69, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 255
[LightGBM debug output truncated]
Trial 69, Fold 2: Log loss = 0.1989488805167814, Average precision = 0.9738621553061394, ROC-AUC = 0.9705636610792121, Elapsed Time = 0.955718700000034 seconds
Trial 69, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 69, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 255
[LightGBM debug output truncated]
Trial 69, Fold 3: Log loss = 0.1985734875654708, Average precision = 0.9746141114328595, ROC-AUC = 0.9702187328447189, Elapsed Time = 1.015855300000112 seconds
Trial 69, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 69, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 256
[LightGBM debug output truncated]
No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 12 [LightGBM] [Debug] Re-bagging, using 7801 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7759 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7745 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7808 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7912 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7654 data to 
train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 15 [LightGBM] [Debug] Re-bagging, using 7714 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 14 [LightGBM] [Debug] Re-bagging, using 7842 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 15 [LightGBM] [Debug] Re-bagging, using 7822 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7906 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 12 [LightGBM] [Debug] Re-bagging, using 7765 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 11 Trial 69, Fold 4: Log 
loss = 0.20015244545998195, Average precision = 0.9746741557231141, ROC-AUC = 0.9703810107989919, Elapsed Time = 1.0393052999997963 seconds Trial 69, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 69, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.792012 [LightGBM] [Info] Total Bins 7944 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 7770 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7842 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7717 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7684 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, 
best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7835 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7772 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7782 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7846 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7801 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7585 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 11 [LightGBM] [Warning] No further 
splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7580 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7638 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7682 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7734 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7928 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7800 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 13 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7753 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 12 [LightGBM] [Debug] Re-bagging, using 7728 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 13 [LightGBM] [Debug] Re-bagging, using 7815 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7900 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 16 [LightGBM] [Debug] Re-bagging, using 7661 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 13 [LightGBM] [Debug] Re-bagging, using 7707 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with 
leaves = 55 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7838 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 14 [LightGBM] [Debug] Re-bagging, using 7827 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 14 [LightGBM] [Debug] Re-bagging, using 7900 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 12 [LightGBM] [Debug] Re-bagging, using 7765 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 12 Trial 69, Fold 5: Log loss = 0.20355979520113743, Average precision = 0.9736788859168499, ROC-AUC = 0.9702772420712333, Elapsed Time = 1.0688019000008353 seconds
Optimization Progress: 70%|####### | 70/100 [16:15<08:03, 16.13s/it]
Trial 70, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 70, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 20104
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 254
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[… repetitive per-tree [LightGBM] [Debug]/[Warning] output trimmed …]
Trial 70, Fold 1: Log loss = 0.5212313297273229, Average precision = 0.9627836292091663, ROC-AUC = 0.9549571509824445, Elapsed Time = 0.6239304999999149 seconds
Trial 70, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 70, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Info] Total Bins 20073
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
[… repetitive per-tree [LightGBM] [Debug]/[Warning] output trimmed …]
Trial 70, Fold 2: Log loss = 0.5209099033618703, Average precision = 0.9621351418743129, ROC-AUC = 0.9574910446044405, Elapsed Time = 0.6758497000000716 seconds
Trial 70, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 70, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[LightGBM] [Info] Number of positive: 10165, number of negative: 10517
[LightGBM] [Info] Total Bins 20069
[LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 254
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043
[LightGBM] [Info] Start training from score -0.034043
[… repetitive per-tree [LightGBM] [Debug]/[Warning] output trimmed …]
Trial 70, Fold 3: Log loss = 0.5124468728151358, Average precision = 0.9650361000889902, ROC-AUC = 0.9592594426279242, Elapsed Time = 0.7345519999998942 seconds
Trial 70, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 70, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[LightGBM] [Info] Number of positive: 10177, number of negative: 10479
[LightGBM] [Info] Total Bins 20105
[LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243
[LightGBM] [Info] Start training from score -0.029243
[LightGBM] [Warning] No further
splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Debug] Re-bagging, using 15753 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 11 [LightGBM] [Debug] Re-bagging, using 15796 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 8 [LightGBM] [Debug] Re-bagging, using 15782 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 7 [LightGBM] [Debug] Re-bagging, using 15802 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 8 [LightGBM] [Debug] Re-bagging, using 15800 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15861 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 7 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 9 [LightGBM] [Debug] Re-bagging, using 15772 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15781 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 11 [LightGBM] [Debug] Re-bagging, using 15699 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Debug] Re-bagging, using 15785 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Debug] Re-bagging, using 15698 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 8 [LightGBM] [Debug] Re-bagging, using 15895 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with 
leaves = 27 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 8 [LightGBM] [Debug] Re-bagging, using 15813 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 8 Trial 70, Fold 4: Log loss = 0.521958522934716, Average precision = 0.9634605752827095, ROC-AUC = 0.9565819843535032, Elapsed Time = 0.705152999999882 seconds Trial 70, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 70, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.792012 [LightGBM] [Info] Total Bins 20102 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 15789 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 8 [LightGBM] [Debug] Re-bagging, using 15936 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Debug] Re-bagging, using 15851 data to train 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 7 [LightGBM] [Debug] Re-bagging, using 15745 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 10 [LightGBM] [Debug] Re-bagging, using 15790 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 7 [LightGBM] [Debug] Re-bagging, using 15777 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 11 [LightGBM] [Debug] Re-bagging, using 15804 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 13 [LightGBM] [Debug] Re-bagging, using 15791 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Debug] Re-bagging, 
using 15856 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Debug] Re-bagging, using 15769 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 6 [LightGBM] [Debug] Re-bagging, using 15777 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 9 [LightGBM] [Debug] Re-bagging, using 15690 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 8 [LightGBM] [Debug] Re-bagging, using 15778 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 8 [LightGBM] [Debug] Re-bagging, using 15701 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 6 
[LightGBM] [Debug] Re-bagging, using 15892 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Debug] Re-bagging, using 15811 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 8 Trial 70, Fold 5: Log loss = 0.5191817461829943, Average precision = 0.9611701754237488, ROC-AUC = 0.9542330230570574, Elapsed Time = 0.7133186000000933 seconds
Optimization Progress: 71%|#######1 | 71/100 [16:26<07:01, 14.54s/it]
Trial 71, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 71, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 11147
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[LightGBM verbose [Warning]/[Debug] tree-growth messages omitted]
Trial 71, Fold 1: Log loss = 0.33959284495604364, Average precision = 0.9724411600723212, ROC-AUC = 0.9668345796112281, Elapsed Time = 1.8166848999999274 seconds
Trial 71, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 71, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Info] Total Bins 11158
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 257
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
[LightGBM verbose [Warning]/[Debug] tree-growth messages omitted]
[LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 15 [LightGBM] [Debug] Re-bagging, using 9963 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 12 Trial 71, Fold 2: Log loss = 0.34029115995704734, Average precision = 0.9706272485357673, ROC-AUC = 0.9671575302865818, Elapsed Time = 2.0769625000002634 seconds Trial 71, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 71, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 [LightGBM] [Info] Number of positive: 10165, number of negative: 10517 [LightGBM] [Debug] 
Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.795455 [LightGBM] [Info] Total Bins 11164 [LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 258 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043 [LightGBM] [Info] Start training from score -0.034043 [LightGBM] [Debug] Re-bagging, using 9963 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 10 [LightGBM] [Debug] Re-bagging, using 9919 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 13 [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 12 [LightGBM] [Debug] Re-bagging, using 9897 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves 
= 63 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 12 [LightGBM] [Debug] Re-bagging, using 9823 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 11 [LightGBM] [Debug] Re-bagging, using 9982 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 85 and depth = 11 [LightGBM] [Warning] No further 
splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 82 and depth = 11 [LightGBM] [Debug] Re-bagging, using 9829 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 14 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 11 [LightGBM] [Debug] Re-bagging, using 9917 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 12 [LightGBM] [Debug] Re-bagging, using 9927 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 82 and depth = 13 [LightGBM] [Warning] No further splits with positive 
gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 15 [LightGBM] [Debug] Re-bagging, using 9941 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 13 Trial 71, Fold 3: Log loss = 0.34119889364049394, Average precision = 0.9716527135534025, ROC-AUC = 0.9667977590989204, Elapsed Time = 2.086481299999832 seconds Trial 71, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 71, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.795144 [LightGBM] [Info] Total Bins 11145 [LightGBM] [Info] Number of data points in the train set: 
20656, number of used features: 258 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 9949 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 12 [LightGBM] [Debug] Re-bagging, using 9908 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 13 [LightGBM] [Warning] No 
further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 11 [LightGBM] [Debug] Re-bagging, using 9889 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and 
depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 12 [LightGBM] [Debug] Re-bagging, using 9800 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 84 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 14 [LightGBM] [Debug] Re-bagging, using 9963 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 12 [LightGBM] [Warning] No further splits with 
positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 13 [LightGBM] [Debug] Re-bagging, using 9832 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 88 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 84 and depth = 14 [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 12 [LightGBM] [Debug] Re-bagging, using 9908 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 14 [LightGBM] [Debug] Re-bagging, using 9920 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best 
gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 14 [LightGBM] [Debug] Re-bagging, using 9932 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 12 Trial 71, Fold 4: Log loss = 0.33957611513155844, Average precision = 0.9720091270541313, ROC-AUC = 0.9665419620354276, Elapsed Time = 2.2675779999999577 seconds Trial 71, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 71, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Warning] Met negative value in categorical features, will convert it to NaN [LightGBM] [Warning] Met negative value in categorical features, will convert it to NaN [LightGBM] [Warning] Met negative value in categorical features, will convert it to NaN [LightGBM] [Warning] Met negative value in categorical features, will convert it to NaN [LightGBM] [Warning] Met negative value in 
categorical features, will convert it to NaN [LightGBM] [Warning] Met negative value in categorical features, will convert it to NaN [LightGBM] [Warning] Met negative value in categorical features, will convert it to NaN [LightGBM] [Warning] Met negative value in categorical features, will convert it to NaN [LightGBM] [Warning] Met negative value in categorical features, will convert it to NaN [LightGBM] [Warning] Met negative value in categorical features, will convert it to NaN [LightGBM] [Warning] Met negative value in categorical features, will convert it to NaN [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.796026 [LightGBM] [Info] Total Bins 11147 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 258 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 9947 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree 
[… verbose LightGBM [Debug]/[Warning] per-tree training output omitted …]
Trial 71, Fold 5: Log loss = 0.34555205055908644, Average precision = 0.969191136142906, ROC-AUC = 0.964253497575386, Elapsed Time = 1.7241317999996681 seconds
Optimization Progress: 72%|#######2 | 72/100 [16:43<07:12, 15.43s/it]
Trial 72, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 72, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[… verbose LightGBM [Info]/[Debug]/[Warning] training output omitted …]
Trial 72, Fold 1: Log loss = 0.27839808071739647, Average precision = 0.9699116494901384, ROC-AUC = 0.9638715330054535, Elapsed Time = 0.9093064999997296 seconds
Trial 72, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 72, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[… verbose LightGBM training output omitted …]
Trial 72, Fold 2: Log loss = 0.2773252453732645, Average precision = 0.9679454703507686, ROC-AUC = 0.9644505774154984, Elapsed Time = 0.9629867000003287 seconds
Trial 72, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 72, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[… verbose LightGBM training output omitted …]
Trial 72, Fold 3: Log loss = 0.2747655322336973, Average precision = 0.9708889291563291, ROC-AUC = 0.9660925867515482, Elapsed Time = 1.03429369999958 seconds
Trial 72, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 72, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[… verbose LightGBM training output omitted …]
Trial 72, Fold 4: Log loss = 0.27565143365103895, Average precision = 0.9698695678849122, ROC-AUC = 0.9640579286393842, Elapsed Time = 1.0720799000000625 seconds
Trial 72, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 72, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[… verbose LightGBM training output omitted …]
depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10120 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained 
a tree with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 12 [LightGBM] [Debug] Re-bagging, using 10111 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 13 [LightGBM] [Debug] Re-bagging, using 9992 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 12 [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 13 [LightGBM] [Debug] Re-bagging, using 10169 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 12 [LightGBM] [Debug] Re-bagging, using 10032 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best 
gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 12 Trial 72, Fold 5: Log loss = 0.28328383290878856, Average precision = 0.9680872451642151, ROC-AUC = 0.9623242851569032, Elapsed Time = 1.0523961999997482 seconds
Optimization Progress: 73%|#######3 | 73/100 [16:55<06:32, 14.54s/it]
Trial 73, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 73, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 21915
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 259
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[... repetitive per-tree [Debug]/[Warning] messages omitted ...]
Trial 73, Fold 1: Log loss = 0.19207670774348418, Average precision = 0.9764999102293743, ROC-AUC = 0.972403846613422, Elapsed Time = 2.5400171000001137 seconds
Trial 73, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 73, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Info] Total Bins 21927
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 259
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
[... repetitive per-tree [Debug]/[Warning] messages omitted ...]
Trial 73, Fold 2: Log loss = 0.18436194570607786, Average precision = 0.9759969547269485, ROC-AUC = 0.9731626454009343, Elapsed Time = 3.1395682000002125 seconds
Trial 73, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 73, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[LightGBM] [Info] Number of positive: 10165, number of negative: 10517
[LightGBM] [Info] Total Bins 21870
[LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 259
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043
[LightGBM] [Info] Start training from score -0.034043
[... repetitive per-tree [Debug]/[Warning] messages omitted ...]
using 8769 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 21 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 86 and depth = 16 [LightGBM] [Debug] Re-bagging, using 8565 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 86 and depth = 16 [LightGBM] [Debug] Re-bagging, using 8542 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 15 [LightGBM] [Debug] Re-bagging, using 8565 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 85 and depth = 13 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Re-bagging, using 8665 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Re-bagging, using 8676 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 14 [LightGBM] [Debug] Re-bagging, using 8886 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 17 [LightGBM] [Debug] Re-bagging, using 8740 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Re-bagging, using 8697 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 16 [LightGBM] [Warning] No 
further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 86 and depth = 14 [LightGBM] [Debug] Re-bagging, using 8699 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 88 and depth = 14 [LightGBM] [Debug] Re-bagging, using 8759 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 13 [LightGBM] [Debug] Re-bagging, using 8886 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 17 [LightGBM] [Debug] Re-bagging, using 8565 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 16 [LightGBM] [Debug] Re-bagging, using 8667 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 22 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Re-bagging, using 8744 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 19 [LightGBM] [Debug] Re-bagging, using 8800 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 16 [LightGBM] [Debug] Re-bagging, using 8845 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 25 [LightGBM] [Debug] Re-bagging, using 8751 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 22 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Re-bagging, using 8688 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and 
depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 22 [LightGBM] [Debug] Re-bagging, using 8797 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Re-bagging, using 8667 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 20 [LightGBM] [Debug] Re-bagging, using 8766 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Re-bagging, using 8682 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Re-bagging, using 8751 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 26 [LightGBM] [Debug] Re-bagging, using 8686 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 21 [LightGBM] [Debug] Re-bagging, using 8800 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 20 [LightGBM] [Debug] Re-bagging, using 8683 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 17 [LightGBM] [Debug] Re-bagging, using 8717 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 22 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 20 [LightGBM] [Debug] Re-bagging, using 8690 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 19 [LightGBM] [Debug] Re-bagging, using 8790 
data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 17 [LightGBM] [Debug] Re-bagging, using 8667 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 24 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Re-bagging, using 8930 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 16 [LightGBM] [Debug] Re-bagging, using 8670 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 24 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 20 [LightGBM] [Debug] Re-bagging, using 8697 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 19 [LightGBM] [Debug] Re-bagging, using 8597 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 17 [LightGBM] [Debug] Re-bagging, using 8623 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 17 Trial 73, Fold 3: Log loss = 0.19152865274959918, Average precision = 0.97518386567535, ROC-AUC = 0.9725112201558301, Elapsed Time = 3.040055299999949 seconds Trial 73, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 73, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.796474 [LightGBM] [Info] Total Bins 21910 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 259 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start 
training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 8727 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 14 [LightGBM] [Debug] Re-bagging, using 8794 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 15 [LightGBM] [Debug] Re-bagging, using 8711 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 13 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 14 [LightGBM] [Debug] Re-bagging, using 8663 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 82 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 13 [LightGBM] [Debug] Re-bagging, using 8769 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 88 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 84 and depth = 14 [LightGBM] [Debug] Re-bagging, using 8702 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 15 [LightGBM] [Debug] Re-bagging, using 8718 data to train [LightGBM] [Warning] No further 
splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 87 and depth = 12 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Re-bagging, using 8787 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 88 and depth = 16 [LightGBM] [Debug] Re-bagging, using 8761 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 87 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 13 [LightGBM] [Debug] Re-bagging, using 8542 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 86 and depth = 15 [LightGBM] [Debug] Re-bagging, using 8536 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 14 [LightGBM] [Debug] Re-bagging, using 8561 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 82 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 14 [LightGBM] [Debug] Re-bagging, using 8662 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 84 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 86 and depth = 16 [LightGBM] [Debug] 
Re-bagging, using 8669 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 13 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 14 [LightGBM] [Debug] Re-bagging, using 8872 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Re-bagging, using 8714 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Re-bagging, using 8681 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 12 [LightGBM] [Debug] Re-bagging, using 8684 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Re-bagging, using 8747 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 12 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Re-bagging, using 8879 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Re-bagging, using 8559 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 22 [LightGBM] [Debug] Re-bagging, using 8656 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Re-bagging, using 8742 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 21 [LightGBM] [Debug] Re-bagging, using 8782 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] 
[Debug] Trained a tree with leaves = 89 and depth = 19 [LightGBM] [Debug] Re-bagging, using 8823 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 16 [LightGBM] [Debug] Re-bagging, using 8743 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 19 [LightGBM] [Debug] Re-bagging, using 8679 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Re-bagging, using 8801 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 23 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Re-bagging, using 8659 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 12 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 35 [LightGBM] [Debug] Re-bagging, using 8744 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 26 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 20 [LightGBM] [Debug] Re-bagging, using 8667 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 23 [LightGBM] [Debug] Re-bagging, using 8740 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Re-bagging, using 8680 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 28 [LightGBM] [Debug] Re-bagging, using 8801 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 21 [LightGBM] [Debug] Re-bagging, using 8671 data to train 
[LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Re-bagging, using 8706 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 23 [LightGBM] [Debug] Re-bagging, using 8665 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 25 [LightGBM] [Debug] Re-bagging, using 8789 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 19 [LightGBM] [Debug] Re-bagging, using 8663 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 28 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 20 [LightGBM] [Debug] Re-bagging, using 8924 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 21 [LightGBM] [Debug] Re-bagging, using 8662 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 16 [LightGBM] [Debug] Re-bagging, using 8695 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 21 [LightGBM] [Debug] Re-bagging, using 8600 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 25 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Re-bagging, using 8609 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 19 Trial 73, Fold 4: Log loss = 0.19165113364949743, Average precision = 0.9757806427128366, ROC-AUC = 0.9716902007911471, Elapsed Time = 3.2279449000006935 seconds Trial 73, 
Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 73, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.797340 [LightGBM] [Info] Total Bins 21919 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 259 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 8725 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 86 and depth = 14 [LightGBM] [Debug] Re-bagging, using 8792 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 13 [LightGBM] [Debug] Re-bagging, using 8710 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 82 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 82 and depth = 13 [LightGBM] [Debug] Re-bagging, using 8656 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 17 [LightGBM] [Debug] Re-bagging, using 8769 data to train 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 84 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 16 [LightGBM] [Debug] Re-bagging, using 8694 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 86 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 16 [LightGBM] [Debug] Re-bagging, using 8725 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 85 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 15 [LightGBM] [Debug] Re-bagging, using 8781 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 15 [LightGBM] [Debug] Re-bagging, using 8758 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 87 and depth = 12 [LightGBM] [Debug] Re-bagging, using 8544 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 85 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 16 [LightGBM] [Debug] Re-bagging, using 8526 data to train [LightGBM] [Warning] No further splits with 
positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 84 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 15 [LightGBM] [Debug] Re-bagging, using 8554 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 17 [LightGBM] [Debug] Re-bagging, using 8664 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 12 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Re-bagging, using 8678 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 16 [LightGBM] [Debug] Re-bagging, using 8870 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 19 [LightGBM] [Debug] Re-bagging, using 8715 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 12 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 17 [LightGBM] [Debug] Re-bagging, using 8673 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Re-bagging, using 8665 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 17 [LightGBM] [Debug] Re-bagging, using 8751 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Re-bagging, using 8867 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 23 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 19 [LightGBM] [Debug] Re-bagging, using 8567 data to train [LightGBM] 
[Debug] Trained a tree with leaves = 89 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 17 [LightGBM] [Debug] Re-bagging, using 8655 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Re-bagging, using 8731 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 12 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Re-bagging, using 8790 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 21 [LightGBM] [Debug] Re-bagging, using 8818 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Re-bagging, using 8739 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Re-bagging, using 8681 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 20 [LightGBM] [Debug] Re-bagging, using 8787 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 21 [LightGBM] [Debug] Re-bagging, using 8659 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Re-bagging, using 8749 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 13 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 23 [LightGBM] [Debug] Re-bagging, using 8671 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 
18 [LightGBM] [Debug] Re-bagging, using 8735 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Re-bagging, using 8672 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 22 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 28 [LightGBM] [Debug] Re-bagging, using 8785 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 16 [LightGBM] [Debug] Re-bagging, using 8680 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 17 [LightGBM] [Debug] Re-bagging, using 8706 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 20 [LightGBM] [Debug] Re-bagging, using 8667 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 24 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Re-bagging, using 8784 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Re-bagging, using 8657 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 20 [LightGBM] [Debug] Re-bagging, using 8924 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 25 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 19 [LightGBM] [Debug] Re-bagging, using 8659 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 23 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 21 [LightGBM] [Debug] Re-bagging, using 8695 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth 
= 20 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 27 [LightGBM] [Debug] Re-bagging, using 8584 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 22 [LightGBM] [Debug] Re-bagging, using 8625 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 22 Trial 73, Fold 5: Log loss = 0.19210909317188127, Average precision = 0.9761174581789205, ROC-AUC = 0.9729034799249392, Elapsed Time = 3.01584129999992 seconds
Optimization Progress: 74%|#######4 | 74/100 [17:18<07:21, 16.97s/it]
Trial 74, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 74, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[Verbose LightGBM [Debug]/[Info] per-tree training output trimmed.]
depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 10 [LightGBM] [Debug] Re-bagging, using 2874 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 11 [LightGBM] [Debug] Re-bagging, using 2773 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 12 [LightGBM] [Debug] Re-bagging, using 2860 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and 
depth = 12 [LightGBM] [Debug] Re-bagging, using 2898 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 11 [LightGBM] [Debug] Re-bagging, using 2860 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 14 [LightGBM] [Debug] Re-bagging, using 2913 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 14 [LightGBM] [Debug] Re-bagging, using 2894 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf 
[LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 12 [LightGBM] [Debug] Re-bagging, using 2917 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Debug] Re-bagging, using 2794 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 18 [LightGBM] [Debug] Re-bagging, using 2964 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf 
[LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 11 Trial 74, Fold 1: Log loss = 0.20789237807057712, Average precision = 0.9728302046346347, ROC-AUC = 0.9679757065427798, Elapsed Time = 1.160290700000587 seconds Trial 74, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 74, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986 [LightGBM] [Info] Number of positive: 10230, number of negative: 10471 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.795629 [LightGBM] [Info] Total Bins 19910 [LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 258 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285 [LightGBM] [Info] Start training from score -0.023285 [LightGBM] [Debug] Re-bagging, using 2952 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Debug] Re-bagging, using 2957 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf 
[LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 8 [LightGBM] [Debug] Re-bagging, using 2838 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 8 [LightGBM] [Debug] Re-bagging, using 2883 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 7 [LightGBM] [Debug] Re-bagging, using 2986 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] 
[Debug] Trained a tree with leaves = 27 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 8 [LightGBM] [Debug] Re-bagging, using 2921 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 12 [LightGBM] [Debug] Re-bagging, using 2847 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 8 [LightGBM] [Debug] Re-bagging, using 2996 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
Trained a tree with leaves = 35 and depth = 12 [LightGBM] [Debug] Re-bagging, using 2907 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 13 [LightGBM] [Debug] Re-bagging, using 2887 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 12 [LightGBM] [Debug] Re-bagging, using 2771 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Debug] Re-bagging, using 2857 data to train [LightGBM] [Warning] No further splits 
with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 10 [LightGBM] [Debug] Re-bagging, using 2906 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 11 [LightGBM] [Debug] Re-bagging, using 2857 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 15 [LightGBM] [Debug] Re-bagging, using 2916 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 11 [LightGBM] [Warning] No further splits 
with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 13 [LightGBM] [Debug] Re-bagging, using 2904 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 17 [LightGBM] [Debug] Re-bagging, using 2925 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 16 [LightGBM] [Debug] Re-bagging, using 2807 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 10 [LightGBM] [Warning] No further splits 
with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 13 [LightGBM] [Debug] Re-bagging, using 2972 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 12 Trial 74, Fold 2: Log loss = 0.20502511626084324, Average precision = 0.9716575349201336, ROC-AUC = 0.9684218928351069, Elapsed Time = 1.2803405000004204 seconds Trial 74, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 74, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 [LightGBM] [Info] Number of positive: 10165, number of negative: 10517 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.795455 [LightGBM] [Info] Total Bins 19863 [LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 258 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043 [LightGBM] [Info] Start training from score -0.034043 [LightGBM] [Debug] Re-bagging, using 2949 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 7 [LightGBM] [Warning] No further splits with 
positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 8 [LightGBM] [Debug] Re-bagging, using 2951 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 9 [LightGBM] [Debug] Re-bagging, using 2841 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 8 [LightGBM] [Debug] Re-bagging, using 2876 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 9 [LightGBM] [Warning] No further splits with positive 
gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 9 [LightGBM] [Debug] Re-bagging, using 2982 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 8 [LightGBM] [Debug] Re-bagging, using 2925 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 9 [LightGBM] [Debug] Re-bagging, using 2847 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Debug] Re-bagging, using 2986 data to train 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Debug] Re-bagging, using 2901 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 17 [LightGBM] [Debug] Re-bagging, using 2879 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Debug] Re-bagging, using 2782 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 13 
[LightGBM per-tree Debug/Warning output omitted]
Trial 74, Fold 3: Log loss = 0.20485285144190643, Average precision = 0.9725765501366499, ROC-AUC = 0.9683117879053462, Elapsed Time = 1.4039001000001008 seconds
Trial 74, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 74, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[LightGBM] [Info] Number of positive: 10177, number of negative: 10479
[LightGBM] [Info] Total Bins 19893
[LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243
[LightGBM] [Info] Start training from score -0.029243
[LightGBM per-tree Debug/Warning output omitted]
Trial 74, Fold 4: Log loss = 0.20620693439015805, Average precision = 0.9719124400290444, ROC-AUC = 0.9663950227830355, Elapsed Time = 1.486533999999665 seconds
Trial 74, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 74, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[LightGBM] [Info] Number of positive: 10150, number of negative: 10500
[LightGBM] [Info] Total Bins 19891
[LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902
[LightGBM] [Info] Start training from score -0.033902
[LightGBM per-tree Debug/Warning output omitted]
Trial 74, Fold 5: Log loss = 0.21038025074049305, Average precision = 0.9709620645493768, ROC-AUC = 0.9671163628931869, Elapsed Time = 1.355631500000527 seconds
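The per-fold figures reported above (log loss, average precision, ROC-AUC) come from scoring the held-out fold's predicted probabilities. A minimal sketch with the scikit-learn functions imported at the top of the notebook, using synthetic stand-ins (`y_val`, `proba`) for a fold's validation labels and predicted P(bad loan) rather than the notebook's actual data:

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

rng = np.random.default_rng(42)
# Hypothetical stand-ins for one CV fold: binary labels and predicted probabilities
y_val = rng.integers(0, 2, size=5000)
proba = np.clip(y_val * 0.7 + rng.normal(0.15, 0.2, size=5000), 0.001, 0.999)

lloss = log_loss(y_val, proba)               # penalizes confident wrong probabilities
ap = average_precision_score(y_val, proba)   # area under the precision-recall curve
auc = roc_auc_score(y_val, proba)            # rank-based class separability
print(f"Log loss = {lloss:.4f}, Average precision = {ap:.4f}, ROC-AUC = {auc:.4f}")
```

These are the same three quantities logged per fold, so averaging them across folds gives the objective values Optuna compares between trials.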
Optimization Progress: 75%|#######5 | 75/100 [17:32<06:43, 16.15s/it]
Trial 75, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 75, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 24366
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[LightGBM per-tree Debug/Warning output omitted]
gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10351 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10370 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10378 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10368 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10411 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 9 [LightGBM] [Debug] Re-bagging, using 10478 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 11 [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10342 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10371 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10431 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10482 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 9 [LightGBM] [Debug] Re-bagging, using 10335 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10582 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with 
leaves = 47 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10333 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10386 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10290 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10357 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10473 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10338 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf 
[LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10276 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10190 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 Trial 75, Fold 1: Log loss = 0.3386479276944911, Average precision = 0.9700048427156842, ROC-AUC = 0.9639866912337552, Elapsed Time = 1.6730855999994674 seconds Trial 75, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 75, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986 [LightGBM] [Info] Number of positive: 10230, number of negative: 10471 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791600 [LightGBM] [Info] Total Bins 24373 [LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285 [LightGBM] [Info] Start training from score -0.023285 [LightGBM] [Debug] Re-bagging, using 10518 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Debug] 
Re-bagging, using 10481 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 9 [LightGBM] [Debug] Re-bagging, using 10475 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 9 [LightGBM] [Debug] Re-bagging, using 10330 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 9 [LightGBM] [Debug] Re-bagging, using 10555 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 9 [LightGBM] [Debug] Re-bagging, using 10346 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10429 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and 
depth = 11 [LightGBM] [Debug] Re-bagging, using 10416 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10483 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10237 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10313 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 9 [LightGBM] [Debug] Re-bagging, using 10299 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Debug] Re-bagging, using 10290 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
Trained a tree with leaves = 39 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10360 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 9 [LightGBM] [Debug] Re-bagging, using 10507 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 12 [LightGBM] [Debug] Re-bagging, using 10381 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10420 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 13 [LightGBM] [Debug] Re-bagging, using 10426 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10420 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, 
best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10541 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10282 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10349 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10439 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10436 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10496 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 9 [LightGBM] [Warning] No 
further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 9 [LightGBM] [Debug] Re-bagging, using 10406 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10409 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10538 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10368 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 9 [LightGBM] [Debug] Re-bagging, using 10395 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10404 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and 
depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10382 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10416 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10498 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10362 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10387 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10443 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
Trained a tree with leaves = 50 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 9 [LightGBM] [Debug] Re-bagging, using 10509 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10338 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10598 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 9 [LightGBM] [Debug] Re-bagging, using 10360 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 9 [LightGBM] [Debug] Re-bagging, using 10409 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10317 data to train [LightGBM] [Warning] No further splits with positive gain, best 
gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 13 [LightGBM] [Debug] Re-bagging, using 10391 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10486 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 9 [LightGBM] [Debug] Re-bagging, using 10357 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10306 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10202 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 11 Trial 75, Fold 2: Log loss = 0.33831803482587663, Average precision = 0.968553079645746, 
ROC-AUC = 0.9648312367353467, Elapsed Time = 2.0940998000005493 seconds Trial 75, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 75, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 [LightGBM] [Info] Number of positive: 10165, number of negative: 10517 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791401 [LightGBM] [Info] Total Bins 24368 [LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043 [LightGBM] [Info] Start training from score -0.034043 [LightGBM] [Debug] Re-bagging, using 10506 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 9 [LightGBM] [Debug] Re-bagging, using 10470 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Debug] Re-bagging, using 10465 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10325 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 
44 and depth = 8 [LightGBM] [Debug] Re-bagging, using 10537 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10347 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10429 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10403 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10463 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 9 [LightGBM] [Debug] Re-bagging, using 10236 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
[... verbose LightGBM Debug/Warning output trimmed (per-tree leaf counts and depths, re-bagging sample sizes, repeated "No further splits with positive gain, best gain: -inf" warnings) ...]

Trial 75, Fold 3: Log loss = 0.3374498500721996, Average precision = 0.9709400920837394, ROC-AUC = 0.9658439856593888, Elapsed Time = 2.0018864000003305 seconds
Trial 75, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 75, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[LightGBM] [Info] Number of positive: 10177, number of negative: 10479
[LightGBM] [Info] Total Bins 24356
[LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243
[LightGBM] [Info] Start training from score -0.029243
[... verbose LightGBM Debug/Warning output trimmed ...]
Trial 75, Fold 4: Log loss = 0.33745229817959493, Average precision = 0.9701460653297643, ROC-AUC = 0.9644183470794558, Elapsed Time = 2.0010793999999805 seconds
Trial 75, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 75, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[LightGBM] [Info] Number of positive: 10150, number of negative: 10500
[LightGBM] [Info] Total Bins 24360
[LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902
[LightGBM] [Info] Start training from score -0.033902
[... verbose LightGBM Debug/Warning output trimmed ...]
= 50 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10363 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10374 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10376 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10349 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10393 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10467 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] 
[Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10348 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10365 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 12 [LightGBM] [Debug] Re-bagging, using 10427 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10477 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10323 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 12 [LightGBM] [Debug] Re-bagging, using 10580 data to train [LightGBM] [Warning] No further splits with positive 
gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 12 [LightGBM] [Debug] Re-bagging, using 10325 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 9 [LightGBM] [Debug] Re-bagging, using 10380 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10282 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10357 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10474 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10328 data to train [LightGBM] [Warning] 
No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10268 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 10 [LightGBM] [Debug] Re-bagging, using 10193 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 11 Trial 75, Fold 5: Log loss = 0.3406738747682316, Average precision = 0.9684612209351231, ROC-AUC = 0.96299306987719, Elapsed Time = 1.9582474000008006 seconds
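Each fold summary above reports the same three metrics. They can be reproduced with the scikit-learn functions already imported in this notebook; the sketch below uses toy arrays (not the notebook's data) and an assumed helper name `evaluate_fold`:

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

def evaluate_fold(y_true, y_prob):
    """Compute the three per-fold metrics reported in the training log."""
    return {
        "log_loss": log_loss(y_true, y_prob),
        "average_precision": average_precision_score(y_true, y_prob),
        "roc_auc": roc_auc_score(y_true, y_prob),
    }

# Toy example, illustrative only
y_true = np.array([0, 0, 1, 1, 1, 0])
y_prob = np.array([0.1, 0.4, 0.8, 0.9, 0.6, 0.3])
metrics = evaluate_fold(y_true, y_prob)
```

In the notebook, `y_prob` would be the positive-class probabilities from the fold's fitted booster on the validation split.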
Optimization Progress: 76%|#######6 | 76/100 [17:50<06:34, 16.45s/it]
Trial 76, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 76, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[Verbose LightGBM output trimmed: `[Info]` dataset lines (Total Bins 15448, 20663 data points, 255 used features, initscore=-0.039012) followed by per-iteration `[Debug]`/`[Warning]` lines.]
Trial 76, Fold 1: Log loss = 0.3155715255143001, Average precision = 0.9662255319221411, ROC-AUC = 0.9594275605454466, Elapsed Time = 0.8203442000003633 seconds
Trial 76, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 76, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[Verbose LightGBM output trimmed: `[Info]` dataset lines (Total Bins 15463, 20701 data points, 255 used features, initscore=-0.023285) followed by per-iteration `[Debug]`/`[Warning]` lines.]
Trial 76, Fold 2: Log loss = 0.3124389825131124, Average precision = 0.9638516831597628, ROC-AUC = 0.960129938593741, Elapsed Time = 0.881771999999728 seconds
Trial 76, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 76, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[Verbose LightGBM output trimmed: `[Info]` dataset lines (Total Bins 15465, 20682 data points, 255 used features, initscore=-0.034043) followed by per-iteration `[Debug]`/`[Warning]` lines.]
Trial 76, Fold 3: Log loss = 0.3054877716963616, Average precision = 0.9673556159887645, ROC-AUC = 0.9617291259354142, Elapsed Time = 0.9247225999997681 seconds
Trial 76, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 76, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[Verbose LightGBM output trimmed: `[Info]` dataset lines (Total Bins 15448, 20656 data points, 255 used features, initscore=-0.029243) followed by per-iteration `[Debug]`/`[Warning]` lines; the Fold 4 output is truncated here.]
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Debug] Re-bagging, using 12314 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and 
depth = 7 [LightGBM] [Debug] Re-bagging, using 12478 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 9 [LightGBM] [Debug] Re-bagging, using 12350 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 10 [LightGBM] [Warning] No further splits with positive 
gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 9 Trial 76, Fold 4: Log loss = 0.3103363416716246, Average precision = 0.9664082453702845, ROC-AUC = 0.9597652459161022, Elapsed Time = 0.9628636999996161 seconds Trial 76, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 76, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.792012 [LightGBM] [Info] Total Bins 15447 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 12446 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with 
leaves = 25 and depth = 7 [LightGBM] [Debug] Re-bagging, using 12469 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 11 [LightGBM] [Debug] Re-bagging, using 12532 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 7 [LightGBM] [Warning] No further 
splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 11 [LightGBM] [Debug] Re-bagging, using 12308 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Debug] Re-bagging, using 12472 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
Trained a tree with leaves = 30 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Debug] Re-bagging, using 12346 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 8 Trial 76, Fold 5: Log loss = 0.3148505789874095, Average precision = 0.9643632385544947, ROC-AUC = 0.9577471341248165, Elapsed Time = 0.9563079999998081 seconds
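Each fold line above reports the same three validation metrics. A minimal sketch of how such a summary line can be produced with the scikit-learn metrics already imported in this notebook (labels and predicted probabilities here are synthetic, hypothetical stand-ins for one fold's validation split):

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

# Hypothetical fold: binary labels and predicted probabilities of class 1.
rng = np.random.default_rng(42)
y_val = rng.integers(0, 2, size=5000)
proba = np.clip(0.7 * y_val + rng.normal(0.15, 0.2, size=5000), 1e-3, 1 - 1e-3)

lloss = log_loss(y_val, proba)                 # penalizes confident mistakes
ap = average_precision_score(y_val, proba)     # area under the PR curve
auc = roc_auc_score(y_val, proba)              # ranking quality of the scores
print(f"Fold: Log loss = {lloss}, Average precision = {ap}, ROC-AUC = {auc}")
```

All three take raw probabilities, not hard class labels, which is why the objective can optimize log loss while AP and ROC-AUC are reported alongside it.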
Optimization Progress: 77%|#######7 | 77/100 [18:01<05:47, 15.10s/it]
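The per-tree `[Debug]`/`[Warning]` chatter that dominates these cells can be suppressed at the source so that only the per-fold summaries and the tqdm progress bar remain. A minimal sketch, assuming LightGBM's documented `verbosity` parameter (the stdlib logger configuration shown is illustrative, not a LightGBM-specific API):

```python
import logging

# verbosity < 0 silences LightGBM's C++-side [Debug]/[Info]/[Warning] lines;
# "verbosity" is a documented LightGBM booster parameter (alias: "verbose").
params = {"objective": "binary", "verbosity": -1}

# Illustrative: raise the threshold on a Python-side logger so Debug/Info
# records from the training loop itself are dropped as well.
trainer_logger = logging.getLogger("trainer")
trainer_logger.setLevel(logging.WARNING)
```

Optuna's own per-trial messages can similarly be reduced via its logging controls, leaving the study's progress bar as the only running output.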
Trial 77, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 77, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 8297
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[repeated LightGBM per-tree Debug/Warning output omitted]
Trial 77, Fold 1: Log loss = 0.22585042950322304, Average precision = 0.9734482911791483, ROC-AUC = 0.9681996004263393, Elapsed Time = 1.0046127000005072 seconds
Trial 77, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 77, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Info] Total Bins 8313
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
[repeated LightGBM per-tree Debug/Warning output omitted]
Trial 77, Fold 2: Log loss = 0.22227296826877135, Average precision = 0.9713868294735187, ROC-AUC = 0.968683506017255, Elapsed Time = 1.0871165999997174 seconds
Trial 77, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 77, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[LightGBM] [Info] Number of positive: 10165, number of negative: 10517
[LightGBM] [Info] Total Bins 8313
[LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043
[LightGBM] [Info] Start training from score -0.034043
[repeated LightGBM per-tree Debug/Warning output omitted] [LightGBM] [Warning] No further splits with positive gain, best
gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 18 [LightGBM] [Debug] Re-bagging, using 6345 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 16 [LightGBM] [Debug] Re-bagging, using 6216 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 16 [LightGBM] [Debug] Re-bagging, using 6100 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 14 [LightGBM] [Debug] Re-bagging, using 6118 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 15 [LightGBM] [Debug] Re-bagging, using 6150 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 21 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 14 [LightGBM] [Debug] Re-bagging, using 6163 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 17 [LightGBM] [Warning] No further 
splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 17 [LightGBM] [Debug] Re-bagging, using 6264 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 18 [LightGBM] [Debug] Re-bagging, using 6368 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 17 [LightGBM] [Debug] Re-bagging, using 6263 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 15 [LightGBM] [Debug] Re-bagging, using 6220 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 16 [LightGBM] [Debug] Re-bagging, using 6202 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 17 [LightGBM] [Debug] Re-bagging, using 6304 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 17 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 17 [LightGBM] [Debug] Re-bagging, using 6315 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 17 [LightGBM] [Debug] Re-bagging, using 6157 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 16 [LightGBM] [Debug] Re-bagging, using 6210 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 16 [LightGBM] [Debug] Re-bagging, using 6266 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 17 [LightGBM] [Debug] Re-bagging, using 6217 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 16 [LightGBM] [Debug] Re-bagging, using 6383 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree 
with leaves = 76 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 16 [LightGBM] [Debug] Re-bagging, using 6279 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 14 Trial 77, Fold 3: Log loss = 0.22081356715814243, Average precision = 0.9731649895583823, ROC-AUC = 0.9689281741483757, Elapsed Time = 1.1825113999993846 seconds Trial 77, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 77, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.795144 [LightGBM] [Info] Total Bins 8295 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 258 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 6270 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 13 [LightGBM] [Debug] Re-bagging, using 6243 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 16 [LightGBM] [Debug] Re-bagging, using 6227 data to train 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 14 [LightGBM] [Debug] Re-bagging, using 6160 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 13 [LightGBM] [Debug] Re-bagging, using 6325 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 16 [LightGBM] [Debug] Re-bagging, using 6297 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 15 [LightGBM] [Debug] Re-bagging, using 6199 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 21 [LightGBM] [Debug] Re-bagging, using 6343 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 17 [LightGBM] [Debug] 
Re-bagging, using 6205 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 15 [LightGBM] [Debug] Re-bagging, using 6083 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 18 [LightGBM] [Debug] Re-bagging, using 6105 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 20 [LightGBM] [Debug] Re-bagging, using 6160 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 17 [LightGBM] [Debug] Re-bagging, using 6160 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 18 [LightGBM] [Debug] Re-bagging, using 6250 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and 
depth = 16 [LightGBM] [Debug] Re-bagging, using 6366 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 20 [LightGBM] [Debug] Re-bagging, using 6256 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 16 [LightGBM] [Debug] Re-bagging, using 6206 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 16 [LightGBM] [Debug] Re-bagging, using 6198 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 17 [LightGBM] [Debug] Re-bagging, using 6295 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 14 [LightGBM] [Debug] Re-bagging, using 6298 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained 
a tree with leaves = 79 and depth = 16 [LightGBM] [Debug] Re-bagging, using 6142 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 16 [LightGBM] [Debug] Re-bagging, using 6208 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 16 [LightGBM] [Debug] Re-bagging, using 6270 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 18 [LightGBM] [Debug] Re-bagging, using 6207 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 15 [LightGBM] [Debug] Re-bagging, using 6353 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 15 [LightGBM] [Debug] Re-bagging, using 6274 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: 
-inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 20 Trial 77, Fold 4: Log loss = 0.22251997590152925, Average precision = 0.9732956379599811, ROC-AUC = 0.9683390618778182, Elapsed Time = 1.1666599000000133 seconds Trial 77, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 77, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.796026 [LightGBM] [Info] Total Bins 8302 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 258 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 6268 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 12 [LightGBM] [Debug] Re-bagging, using 6243 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 13 [LightGBM] [Debug] Re-bagging, using 6225 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 16 [LightGBM] [Debug] Re-bagging, using 6155 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained 
a tree with leaves = 68 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 18 [LightGBM] [Debug] Re-bagging, using 6324 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 18 [LightGBM] [Debug] Re-bagging, using 6293 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 16 [LightGBM] [Debug] Re-bagging, using 6204 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 17 [LightGBM] [Debug] Re-bagging, using 6336 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 21 [LightGBM] [Debug] Re-bagging, using 6206 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 15 [LightGBM] [Debug] Re-bagging, using 6083 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: 
-inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 19 [LightGBM] [Debug] Re-bagging, using 6099 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 15 [LightGBM] [Debug] Re-bagging, using 6151 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 21 [LightGBM] [Debug] Re-bagging, using 6163 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 21 [LightGBM] [Debug] Re-bagging, using 6260 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 20 [LightGBM] [Debug] Re-bagging, using 6361 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 17 [LightGBM] [Debug] Re-bagging, using 6257 data to train [LightGBM] [Warning] No further splits 
with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 21 [LightGBM] [Debug] Re-bagging, using 6205 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 17 [LightGBM] [Debug] Re-bagging, using 6184 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 20 [LightGBM] [Debug] Re-bagging, using 6295 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 15 [LightGBM] [Debug] Re-bagging, using 6293 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 16 [LightGBM] [Debug] Re-bagging, using 6147 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 17 [LightGBM] [Debug] Re-bagging, using 6204 data to train [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 17 [LightGBM] [Debug] Re-bagging, using 6261 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 18 [LightGBM] [Debug] Re-bagging, using 6212 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 20 [LightGBM] [Debug] Re-bagging, using 6359 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 18 [LightGBM] [Debug] Re-bagging, using 6265 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 20 Trial 77, Fold 5: Log loss = 0.2264348846158163, Average precision = 0.9721371360826228, ROC-AUC = 0.9677990450179292, Elapsed Time = 1.1781983000000764 seconds
Optimization Progress: 78%|#######8 | 78/100 [18:15<05:19, 14.51s/it]
Trial 78, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 78, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 24158
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[... repeated LightGBM [Debug]/[Warning] per-tree training messages omitted; output truncated ...]
a tree with leaves = 50 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7556 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7540 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 15 [LightGBM] [Debug] Re-bagging, using 7655 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 14 [LightGBM] [Debug] Re-bagging, using 7527 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 12 [LightGBM] [Debug] Re-bagging, using 7536 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7528 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf 
[LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7485 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 13 [LightGBM] [Debug] Re-bagging, using 7553 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7454 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 13 [LightGBM] [Debug] Re-bagging, using 7534 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7428 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7560 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 12 [LightGBM] [Warning] No further splits with 
positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 12 [LightGBM] [Debug] Re-bagging, using 7522 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 13 [LightGBM] [Debug] Re-bagging, using 7476 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 12 [LightGBM] [Debug] Re-bagging, using 7516 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 13 [LightGBM] [Debug] Re-bagging, using 7561 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7460 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 17 [LightGBM] [Debug] Re-bagging, using 7705 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 11 [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7475 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 12 [LightGBM] [Debug] Re-bagging, using 7488 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7376 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 12 [LightGBM] [Debug] Re-bagging, using 7411 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 19 [LightGBM] [Debug] Re-bagging, using 7577 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 18 [LightGBM] [Debug] Re-bagging, using 7486 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 
60 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 11 Trial 78, Fold 1: Log loss = 0.20672061417487297, Average precision = 0.9748272179255015, ROC-AUC = 0.9703260695525826, Elapsed Time = 1.8581021999998484 seconds Trial 78, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 78, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986 [LightGBM] [Info] Number of positive: 10230, number of negative: 10471 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.794298 [LightGBM] [Info] Total Bins 24163 [LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 257 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285 [LightGBM] [Info] Start training from score -0.023285 [LightGBM] [Debug] Re-bagging, using 7542 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7605 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7474 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 12 [LightGBM] [Debug] Re-bagging, using 7470 data to train [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7625 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7523 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7546 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7613 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7560 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 11 [LightGBM] [Debug] Re-bagging, using 
7361 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7347 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7418 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 12 [LightGBM] [Debug] Re-bagging, using 7415 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7474 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7658 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 10 
[LightGBM] [Debug] Re-bagging, using 7568 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7527 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 12 [LightGBM] [Debug] Re-bagging, using 7517 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 12 [LightGBM] [Debug] Re-bagging, using 7601 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 12 [LightGBM] [Debug] Re-bagging, using 7654 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 13 [LightGBM] [Debug] Re-bagging, using 7406 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree 
with leaves = 57 and depth = 12 [LightGBM] [Debug] Re-bagging, using 7488 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7581 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 12 [LightGBM] [Debug] Re-bagging, using 7555 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7661 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 12 [LightGBM] [Debug] Re-bagging, using 7540 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 12 [LightGBM] [Debug] Re-bagging, using 7550 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf 
[LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 12 [LightGBM] [Debug] Re-bagging, using 7529 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 12 [LightGBM] [Debug] Re-bagging, using 7490 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7567 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7468 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 13 [LightGBM] [Debug] Re-bagging, using 7539 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 12 [LightGBM] [Debug] Re-bagging, using 7434 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 13 [LightGBM] [Warning] No further splits with 
positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7580 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 14 [LightGBM] [Debug] Re-bagging, using 7537 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 14 [LightGBM] [Debug] Re-bagging, using 7487 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 14 [LightGBM] [Debug] Re-bagging, using 7532 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 18 [LightGBM] [Debug] Re-bagging, using 7570 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 13 [LightGBM] [Debug] Re-bagging, using 7460 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 16 [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 12 [LightGBM] [Debug] Re-bagging, using 7718 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 12 [LightGBM] [Debug] Re-bagging, using 7495 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 13 [LightGBM] [Debug] Re-bagging, using 7512 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 13 [LightGBM] [Debug] Re-bagging, using 7390 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 12 [LightGBM] [Debug] Re-bagging, using 7436 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 15 [LightGBM] [Debug] Re-bagging, using 7604 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 
79 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 14 [LightGBM] [Debug] Re-bagging, using 7499 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 12 Trial 78, Fold 2: Log loss = 0.20328855586184658, Average precision = 0.9741812854286478, ROC-AUC = 0.9719177065689493, Elapsed Time = 2.229027600000336 seconds Trial 78, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 78, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 [LightGBM] [Info] Number of positive: 10165, number of negative: 10517 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.795455 [LightGBM] [Info] Total Bins 24160 [LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 258 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043 [LightGBM] [Info] Start training from score -0.034043 [LightGBM] [Debug] Re-bagging, using 7534 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7595 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7467 data to train [LightGBM] 
[LightGBM verbose [Debug]/[Warning] training output truncated]
Trial 78, Fold 3: Log loss = 0.20468469213154777, Average precision = 0.9756805906632632, ROC-AUC = 0.971607257225684, Elapsed Time = 2.315972100000181 seconds
Trial 78, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 78, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[LightGBM] [Info] Number of positive: 10177, number of negative: 10479
[LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243
[LightGBM verbose [Debug]/[Warning] training output truncated]
Trial 78, Fold 4: Log loss = 0.20804533960468277, Average precision = 0.9746462747165743, ROC-AUC = 0.9700041250289111, Elapsed Time = 2.1942557000002125 seconds
Trial 78, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 78, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[LightGBM] [Info] Number of positive: 10150, number of negative: 10500
[LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902
[LightGBM verbose [Debug]/[Warning] training output truncated]
a tree with leaves = 60 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7648 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7523 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7521 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7511 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 13 [LightGBM] [Debug] Re-bagging, using 7495 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7563 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: 
-inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 12 [LightGBM] [Debug] Re-bagging, using 7451 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7521 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7419 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7556 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 14 [LightGBM] [Debug] Re-bagging, using 7533 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 12 [LightGBM] [Debug] Re-bagging, using 7471 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 11 [LightGBM] [Warning] No further splits 
with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7507 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 13 [LightGBM] [Debug] Re-bagging, using 7558 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7449 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 13 [LightGBM] [Debug] Re-bagging, using 7703 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 12 [LightGBM] [Debug] Re-bagging, using 7468 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 13 [LightGBM] [Debug] Re-bagging, using 7489 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 13 [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7369 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 13 [LightGBM] [Debug] Re-bagging, using 7413 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 11 [LightGBM] [Debug] Re-bagging, using 7579 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 17 [LightGBM] [Debug] Re-bagging, using 7481 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 14 Trial 78, Fold 5: Log loss = 0.20993519748703102, Average precision = 0.9730666504093994, ROC-AUC = 0.9692958029095369, Elapsed Time = 2.297114899999542 seconds
Optimization Progress: 79%|#######9 | 79/100 [18:33<05:30, 15.71s/it]
Trial 79, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371 Trial 79, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913 [LightGBM] [Info] Number of positive: 10130, number of negative: 10533 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.795662 [LightGBM] [Info] Total Bins 9448 [LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 258 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012 [LightGBM] [Info] Start training from score -0.039012 [LightGBM] [Debug] Re-bagging, using 16585 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 12 [LightGBM] [Debug] Re-bagging, using 16662 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 13 [LightGBM] [Debug] Re-bagging, using 16603 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a 
tree with leaves = 58 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 14 Trial 79, Fold 1: Log loss = 0.35574942751144145, Average precision = 0.953768924563304, ROC-AUC = 0.9488460727310095, Elapsed Time = 0.5450129999999263 seconds Trial 79, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 79, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986 [LightGBM] [Info] Number of positive: 10230, number of negative: 10471 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.794298 [LightGBM] [Info] Total Bins 9436 [LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 257 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285 [LightGBM] [Info] Start training from score -0.023285 [LightGBM] [Debug] Re-bagging, using 16618 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 10 [LightGBM] [Debug] Re-bagging, using 16693 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a 
tree with leaves = 56 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 10 [LightGBM] [Debug] Re-bagging, using 16630 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 11 Trial 79, Fold 2: Log loss = 0.36948599381691194, Average precision = 0.947260346271916, ROC-AUC = 0.9453780052510802, Elapsed Time = 0.5532783000007839 seconds Trial 79, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 79, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 [LightGBM] [Info] Number of positive: 10165, number of negative: 10517 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.795455 [LightGBM] [Info] Total Bins 9446 [LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 258 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043 [LightGBM] [Info] Start training from score -0.034043 [LightGBM] [Debug] Re-bagging, using 16601 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a 
tree with leaves = 43 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 9 [LightGBM] [Debug] Re-bagging, using 16677 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 10 [LightGBM] [Debug] Re-bagging, using 16617 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 84 and depth = 13 Trial 79, Fold 3: Log loss = 0.36693852889771394, Average precision = 0.9495115214204645, ROC-AUC = 0.9450564499041711, Elapsed Time = 0.5432987000003777 seconds Trial 79, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 79, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.795144 [LightGBM] [Info] Total Bins 9446 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 258 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] 
Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 16579 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 12 [LightGBM] [Debug] Re-bagging, using 16655 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 102 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 9 [LightGBM] [Debug] Re-bagging, using 16599 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 10 Trial 79, Fold 4: Log loss = 0.35530964163414613, Average precision = 0.9599567042135402, ROC-AUC = 0.9534670510335225, Elapsed Time = 0.6347985000002154 seconds Trial 79, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 
1.0344827586206897 Trial 79, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.796026 [LightGBM] [Info] Total Bins 9428 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 258 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 16573 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 9 [LightGBM] [Debug] Re-bagging, using 16654 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 11 [LightGBM] [Debug] Re-bagging, using 16595 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 10 [LightGBM] [Warning] No further 
splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 8 Trial 79, Fold 5: Log loss = 0.35161469182991933, Average precision = 0.9587201253380273, ROC-AUC = 0.9516898723594003, Elapsed Time = 0.5850366999993639 seconds
Optimization Progress: 80%|######## | 80/100 [18:45<04:48, 14.44s/it]
Trial 80, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371 Trial 80, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913 [LightGBM] [Info] Number of positive: 10130, number of negative: 10533 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.795662 [LightGBM] [Info] Total Bins 23592 [LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 258 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012 [LightGBM] [Info] Start training from score -0.039012 [LightGBM] [Debug] Re-bagging, using 9953 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 13 [LightGBM] [Debug] Re-bagging, using 9912 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 12 [LightGBM] [Warning] 
No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 11 [LightGBM] [Debug] Re-bagging, using 9892 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 13 [LightGBM] [Debug] Re-bagging, using 9802 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 12 [LightGBM] [Debug] Re-bagging, using 9970 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 11 [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 12 [LightGBM] [Debug] Re-bagging, using 9834 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 11 [LightGBM] [Debug] Re-bagging, using 9910 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 13 [LightGBM] [Debug] Re-bagging, using 9921 data to train 
[repeated [LightGBM] [Warning] "No further splits with positive gain, best gain: -inf" lines and per-round [Debug] output (sparse-rate, re-bagging, and per-tree leaf/depth reports) elided]
Trial 80, Fold 1: Log loss = 0.29717970856942566, Average precision = 0.9729448820054003, ROC-AUC = 0.9678465082710793, Elapsed Time = 0.96355179999955 seconds
Trial 80, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 80, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Info] Total Bins 23517
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 257
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
Trial 80, Fold 2: Log loss = 0.2975817993313782, Average precision = 0.971209615433894, ROC-AUC = 0.9677456288291657, Elapsed Time = 1.021224500000244 seconds
Trial 80, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 80, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[LightGBM] [Info] Number of positive: 10165, number of negative: 10517
[LightGBM] [Info] Total Bins 23516
[LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043
[LightGBM] [Info] Start training from score -0.034043
Trial 80, Fold 3: Log loss = 0.29619044182555126, Average precision = 0.9728419392402384, ROC-AUC = 0.9684816853829782, Elapsed Time = 1.0942439000000377 seconds
Trial 80, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 80, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[LightGBM] [Info] Number of positive: 10177, number of negative: 10479
[LightGBM] [Info] Total Bins 23527
[LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243
[LightGBM] [Info] Start training from score -0.029243
Trial 80, Fold 4: Log loss = 0.29494973302308586, Average precision = 0.9727109825081937, ROC-AUC = 0.9675677053155648, Elapsed Time = 1.1352965000005497 seconds
Trial 80, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 80, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[LightGBM] [Info] Number of positive: 10150, number of negative: 10500
[LightGBM] [Info] Total Bins 23538
[LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902
[LightGBM] [Info] Start training from score -0.033902
Trial 80, Fold 5: Log loss = 0.29787818514134823, Average precision = 0.9707340038556295, ROC-AUC = 0.9663314196533079, Elapsed Time = 1.196688500000164 seconds
Optimization Progress: 81%|########1 | 81/100 [18:58<04:27, 14.09s/it]
[per-round [LightGBM] [Debug] output elided; Trial 81's trees consistently reach the 96-leaf cap at depths 12-19]
Trial 81, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 81, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 20226
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 259
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
Trial 81, Fold 1: Log loss = 0.32300591972987625, Average precision = 0.9734513638765099, ROC-AUC = 0.9695915064309373, Elapsed Time = 1.536184399999911 seconds
Trial 81, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 81, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Info] Total Bins 20188
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
Trial 81, Fold 2: Log loss = 0.32332806823816257, Average precision = 0.9720980957530481, ROC-AUC = 0.9681450290005192, Elapsed Time = 1.8439387999997052 seconds
Trial 81, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 81, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[LightGBM] [Info] Number of positive: 10165, number of negative: 10517
[LightGBM] [Info] Total Bins 20188
[LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043
[LightGBM] [Info] Start training from score -0.034043
= 96 and depth = 14 [LightGBM] [Debug] Re-bagging, using 13161 data to train [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 12 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 13 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 15 [LightGBM] [Debug] Re-bagging, using 13053 data to train [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 13 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 15 Trial 81, Fold 3: Log loss = 0.3216995749037648, Average precision = 0.9744215142566041, ROC-AUC = 0.9700077529832132, Elapsed Time = 1.765900899999906 seconds Trial 81, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 81, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.796474 [LightGBM] [Info] Total Bins 20226 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 259 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 13078 data to train [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 
15 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 12 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 17 [LightGBM] [Debug] Re-bagging, using 13147 data to train [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 15 [LightGBM] [Debug] Re-bagging, using 13212 data to train [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 13 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 18 [LightGBM] [Debug] Re-bagging, using 12981 data to train [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 13 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 13 [LightGBM] [Debug] Trained a tree with 
leaves = 96 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 13 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 13 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 12 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 15 [LightGBM] [Debug] Re-bagging, using 13145 data to train [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 13 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 17 [LightGBM] [Debug] Re-bagging, using 13050 data to train [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 15 Trial 81, Fold 4: Log loss = 0.32243371529582, Average precision = 0.9739904678441192, ROC-AUC = 0.9691439492025341, Elapsed Time = 1.831553900000472 seconds Trial 81, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 81, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.797340 [LightGBM] [Info] Total Bins 20221 [LightGBM] 
[Info] Number of data points in the train set: 20650, number of used features: 259 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 13075 data to train [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 13 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 15 [LightGBM] [Debug] Re-bagging, using 13143 data to train [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 12 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 11 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 12 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 14 [LightGBM] [Debug] Re-bagging, using 13212 data to train [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 15 [LightGBM] [Debug] Trained 
a tree with leaves = 96 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 14 [LightGBM] [Debug] Re-bagging, using 12974 data to train [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 13 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 15 [LightGBM] [Debug] Re-bagging, using 13140 data to train [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 13 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 13 [LightGBM] [Debug] Re-bagging, using 13044 data to train [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 15 Trial 81, Fold 5: Log loss = 0.32431580591947057, Average precision = 0.9719594880006669, ROC-AUC = 0.9685382828902143, Elapsed Time = 
1.7745714000002408 seconds
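Each fold line above reports three validation metrics: log loss, average precision, and ROC-AUC. A minimal sketch of how those per-fold numbers can be reproduced with scikit-learn on a fold's validation labels and predicted probabilities (the toy `y_true`/`y_prob` arrays here are hypothetical, not values from the notebook):

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

def fold_metrics(y_true, y_prob):
    """Validation metrics matching those reported per fold in the log."""
    return {
        "log_loss": log_loss(y_true, y_prob),
        "average_precision": average_precision_score(y_true, y_prob),
        "roc_auc": roc_auc_score(y_true, y_prob),
    }

# Hypothetical labels and predicted probabilities, for illustration only.
y_true = np.array([0, 0, 1, 1, 1, 0])
y_prob = np.array([0.1, 0.4, 0.8, 0.9, 0.6, 0.3])
m = fold_metrics(y_true, y_prob)
```

In the notebook these would be computed on each StratifiedGroupKFold validation split from the model's `predict_proba` output, then logged alongside the elapsed time.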
Optimization Progress: 82%|########2 | 82/100 [19:14<04:24, 14.70s/it]
Trial 82, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 82, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533 | Total Bins 24158 | 20663 data points, 258 used features | Start training from score -0.039012
[LightGBM per-tree [Debug] output trimmed: this trial's trees are smaller (~42–54 leaves, depth 8–13), each preceded by "[Warning] No further splits with positive gain, best gain: -inf", with a "Re-bagging, using ~5,800 data to train" message every 8 trees.]
Trial 82, Fold 1: Log loss = 0.30882726610776134, Average precision = 0.9673871031756777, ROC-AUC = 0.961117444066857, Elapsed Time = 1.5058353999993415 seconds
Trial 82, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 82, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471 | Total Bins 24166 | 20701 data points, 258 used features | Start training from score -0.023285
[Fold 2 per-tree output truncated mid-stream in the source.]
[Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5764 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5649 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5641 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with 
leaves = 45 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5736 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 Trial 82, Fold 2: Log loss = 0.3052057853384527, Average precision = 0.9662910981216197, ROC-AUC = 0.9624116941699411, Elapsed Time = 1.8040215000000899 seconds Trial 82, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 82, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 [LightGBM] [Info] Number of positive: 10165, number of negative: 10517 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.795455 [LightGBM] [Info] Total Bins 24160 [LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 258 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043 [LightGBM] [Info] Start training from score -0.034043 [LightGBM] [Debug] Re-bagging, using 5839 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with 
leaves = 46 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5805 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5807 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 11 [LightGBM] [Warning] No further 
splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5696 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 9 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5888 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 13 [LightGBM] [Debug] Re-bagging, using 5837 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Warning] No further splits with positive 
gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 12 [LightGBM] [Debug] Re-bagging, using 5737 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5869 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree 
with leaves = 47 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5749 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf 
[LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5643 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5651 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and 
depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5725 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 9 Trial 82, Fold 3: Log loss = 0.3009154014630842, Average precision = 0.9692448578588674, ROC-AUC = 0.9639620212171702, Elapsed Time = 1.7570170999997572 seconds Trial 82, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 82, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.795144 [LightGBM] [Info] Total Bins 24150 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 258 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 5831 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 
11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5797 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree 
with leaves = 50 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5801 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5688 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Warning] 
No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5872 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5842 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf 
[LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5727 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Warning] No further splits with 
positive gain, best gain: -inf
[… verbose LightGBM [Debug]/[Warning] per-tree output omitted …]
Trial 82, Fold 4: Log loss = 0.30238588560447804, Average precision = 0.9679750797460415, ROC-AUC = 0.9617191952140127, Elapsed Time = 1.7509574999994584 seconds
Trial 82, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 82, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[LightGBM] [Info] Number of positive: 10150, number of negative: 10500
[LightGBM] [Info] Total Bins 24154
[LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902
[LightGBM] [Info] Start training from score -0.033902
[… verbose LightGBM [Debug]/[Warning] per-tree output omitted …]
Trial 82, Fold 5: Log loss = 0.30639183738309717, Average precision = 0.9655618446251829, ROC-AUC = 0.9598895639410661, Elapsed Time = 1.7526851000002353 seconds
Optimization Progress: 83%|########2 | 83/100 [19:30<04:16, 15.08s/it]
Trial 83, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 83, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 28977
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[… verbose LightGBM [Debug]/[Warning] per-tree output omitted …]
Trial 83, Fold 1: Log loss = 0.3518844215145265, Average precision = 0.9577284032215908, ROC-AUC = 0.9496131221232846, Elapsed Time = 1.1002770000004602 seconds
Trial 83, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 83, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Info] Total Bins 28981
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
[… verbose LightGBM [Debug]/[Warning] per-tree output omitted …]
[LightGBM] [Debug] Trained a
tree with leaves = 14 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2745 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2833 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2884 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2844 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2892 data to train 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2877 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2905 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2792 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 6 [LightGBM] [Debug] Re-bagging, using 2951 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
Trained a tree with leaves = 15 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2911 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2806 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2928 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2931 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 4 [LightGBM] [Warning] No further splits with 
positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2855 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2998 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 Trial 83, Fold 2: Log loss = 0.3468327660580311, Average precision = 0.9540629330455027, ROC-AUC = 0.9482948253094668, Elapsed Time = 1.2421316999998453 seconds Trial 83, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 83, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 [LightGBM] [Info] Number of positive: 10165, number of negative: 10517 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791401 [LightGBM] [Info] Total Bins 28976 [LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043 [LightGBM] [Info] Start training from score -0.034043 [LightGBM] [Debug] Re-bagging, using 2930 data to train 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2937 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2824 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 6 [LightGBM] [Debug] Re-bagging, using 2865 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2963 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 6 [LightGBM] [Debug] Re-bagging, using 2907 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2832 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 6 [LightGBM] [Debug] Re-bagging, using 2961 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2879 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with 
positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2859 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2756 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2824 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 4 [LightGBM] [Debug] Re-bagging, using 2882 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2849 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 6 [LightGBM] [Debug] Re-bagging, using 2883 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 6 [LightGBM] [Debug] Re-bagging, using 2879 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2900 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2779 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2951 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2924 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2797 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2912 data to 
train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2923 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2862 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 6 [LightGBM] [Debug] Re-bagging, using 2998 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 5 Trial 83, Fold 3: Log loss = 0.33874433004893345, Average precision = 0.9606818061407815, ROC-AUC = 0.9534419212470264, Elapsed Time = 
1.305384199999935 seconds Trial 83, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 83, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791090 [LightGBM] [Info] Total Bins 28967 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 2928 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2931 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2819 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 11 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] 
[Debug] Trained a tree with leaves = 14 and depth = 6 [LightGBM] [Debug] Re-bagging, using 2864 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 4 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2957 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2906 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2829 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 6 [LightGBM] [Debug] Re-bagging, using 2961 
data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2866 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2853 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2747 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 12 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 13 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 14 and depth = 5 [LightGBM] [Debug] Re-bagging, using 2836 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf 
Trial 83, Fold 4: Log loss = 0.34823218671688666, Average precision = 0.9580260734276069, ROC-AUC = 0.9501910210281099, Elapsed Time = 1.2381753999998182 seconds
Trial 83, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 83, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[LightGBM] [Info] Number of positive: 10150, number of negative: 10500
[LightGBM] [Info] Total Bins 28973
[LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902
[LightGBM] [Info] Start training from score -0.033902
Trial 83, Fold 5: Log loss = 0.35669165442933026, Average precision = 0.9562607558042574, ROC-AUC = 0.9485020901844936, Elapsed Time = 1.1870617000004131 seconds
Optimization Progress: 84%|########4 | 84/100 [19:44<03:55, 14.74s/it]
Trial 84, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 84, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 11136
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 254
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
Trial 84, Fold 1: Log loss = 0.33400758482481047, Average precision = 0.9622280943411928, ROC-AUC = 0.9545879277134923, Elapsed Time = 1.0007489999998143 seconds
Trial 84, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 84, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Info] Total Bins 11153
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
35 and depth = 9 [LightGBM] [Debug] Re-bagging, using 9936 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 7 Trial 84, Fold 2: Log loss = 0.332232484757396, Average precision = 0.9601227420006524, ROC-AUC = 0.9546932706837543, Elapsed Time = 1.0486750999998549 seconds Trial 84, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 84, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 [LightGBM] [Info] Number of positive: 10165, number of negative: 10517 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.790045 [LightGBM] [Info] Total Bins 11153 [LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 254 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043 [LightGBM] [Info] Start training from score -0.034043 [LightGBM] [Debug] Re-bagging, using 9963 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and 
depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 7 [LightGBM] [Debug] Re-bagging, using 9919 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 7 [LightGBM] [Debug] Re-bagging, using 9897 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Warning] No further splits with positive 
gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 7 [LightGBM] [Debug] Re-bagging, using 9823 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further 
splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Debug] Re-bagging, using 9982 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Debug] Re-bagging, using 9829 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Debug] Re-bagging, using 9917 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf 
[LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Debug] Re-bagging, using 9927 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 9 Trial 84, Fold 3: Log loss = 0.32143217731769164, Average precision = 0.9645005379665792, ROC-AUC = 0.957877012886325, Elapsed Time = 1.1556067000001349 seconds Trial 84, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 84, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791090 [LightGBM] [Info] Total Bins 11137 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 9949 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] 
[Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Debug] Re-bagging, using 9908 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Debug] Re-bagging, using 9889 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 8 [LightGBM] [Debug] Re-bagging, using 9800 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and 
depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Debug] Re-bagging, using 9963 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Debug] Re-bagging, using 9832 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 7 [LightGBM] [Warning] No further splits with positive 
gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Debug] Re-bagging, using 9908 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 8 [LightGBM] [Warning] No further 
splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 9 [LightGBM] [Debug] Re-bagging, using 9920 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 Trial 84, Fold 4: Log loss = 0.33049159274169043, Average precision = 0.9629257029960119, ROC-AUC = 0.9557678574409083, Elapsed Time = 1.2255662999996275 seconds Trial 84, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 84, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.792012 [LightGBM] [Info] Total Bins 11139 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 9947 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 27 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Warning] No further splits 
with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Debug] Re-bagging, using 9905 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Debug] 
Re-bagging, using 9889 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Debug] Re-bagging, using 9793 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] 
[Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Debug] Re-bagging, using 9961 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Debug] Re-bagging, using 9826 data to train [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Debug] Re-bagging, using 9912 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and 
depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 9 [LightGBM] [Debug] Re-bagging, using 9914 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 9 Trial 84, Fold 5: Log loss = 0.33872359839674515, Average precision = 0.9608101868097333, ROC-AUC = 0.953562603348011, Elapsed Time = 1.1282185999998546 seconds
Optimization Progress: 85%|########5 | 85/100 [19:57<03:32, 14.19s/it]
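Each fold summary line reports validation log loss, average precision, and ROC-AUC together with elapsed time. A minimal sketch of producing such a line with the scikit-learn metrics imported at the top of the notebook (synthetic labels and probabilities, not the notebook's data; `trial` and `fold` are placeholders):

```python
import time
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

rng = np.random.default_rng(42)
y_val = rng.integers(0, 2, size=1000)                        # synthetic fold labels
noise = rng.normal(0.0, 0.2, size=1000)
y_prob = np.clip(0.2 + 0.6 * y_val + noise, 1e-3, 1 - 1e-3)  # synthetic predicted P(y=1)

t0 = time.perf_counter()
lloss = log_loss(y_val, y_prob)
ap = average_precision_score(y_val, y_prob)
auc = roc_auc_score(y_val, y_prob)
elapsed = time.perf_counter() - t0

trial, fold = 85, 1  # placeholders for the study's trial/fold counters
print(f"Trial {trial}, Fold {fold}: Log loss = {lloss}, Average precision = {ap}, "
      f"ROC-AUC = {auc}, Elapsed Time = {elapsed} seconds")
```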
Trial 85, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 85, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.795662
[LightGBM] [Info] Total Bins 25876
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[... repeated LightGBM [Warning]/[Debug] per-tree output elided ...]
Trial 85, Fold 1: Log loss = 0.4490798698237985, Average precision = 0.9661487854658025, ROC-AUC = 0.9595509336927583, Elapsed Time = 0.9809285999999702 seconds
Trial 85, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 85, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.794298
[LightGBM] [Info] Total Bins 25878
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 257
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
[... repeated LightGBM [Warning]/[Debug] per-tree output elided ...]
Trial 85, Fold 2: Log loss = 0.4472839815272487, Average precision = 0.9633424876446511, ROC-AUC = 0.9583642013620273, Elapsed Time = 1.0804873999995834 seconds
Trial 85, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 85, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[LightGBM] [Info] Number of positive: 10165, number of negative: 10517
[LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.795455
[LightGBM] [Info] Total Bins 25877
[LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043
[LightGBM] [Info] Start training from score -0.034043
[... repeated LightGBM [Warning]/[Debug] per-tree output elided ...]
Trial 85, Fold 3: Log loss = 0.4468608871653139, Average precision = 0.9668104377238605, ROC-AUC = 0.961073387885121, Elapsed Time = 1.1630433999998786 seconds
Trial 85, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 85, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[LightGBM] [Info] Number of positive: 10177, number of negative: 10479
[LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.795144
[LightGBM] [Info] Total Bins 25866
[LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243
[LightGBM] [Info] Start training from score -0.029243
[... repeated LightGBM [Warning]/[Debug] per-tree output elided ...]
[LightGBM] [Debug] Trained a
tree with leaves = 25 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 6 [LightGBM] [Debug] Re-bagging, using 2767 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 26 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 25 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 7 [LightGBM] [Debug] Re-bagging, using 2716 data to train [LightGBM] [Warning] No further 
splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 6 Trial 85, Fold 4: Log loss = 0.44792467876332814, Average precision = 0.9659312944127066, ROC-AUC = 0.959395066298514, Elapsed Time = 1.1747064999999566 seconds Trial 85, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 85, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.796026 [LightGBM] [Info] Total Bins 25873 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 258 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 2805 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Debug] Re-bagging, using 2794 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Debug] Re-bagging, using 2674 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 7 [LightGBM] [Warning] 
No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 7 [LightGBM] [Debug] Re-bagging, using 2739 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 7 [LightGBM] [Debug] Re-bagging, using 2820 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 24 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 6 [LightGBM] [Debug] Re-bagging, using 2765 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best 
gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Debug] Re-bagging, using 2720 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 7 Trial 85, Fold 5: Log loss = 0.44603286692157473, Average precision = 0.9641034079380562, ROC-AUC = 0.957529309031455, Elapsed Time = 1.116951500000141 seconds
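Each `Trial N, Fold K: Log loss = …` line above reports three validation metrics computed per cross-validation fold. A minimal, self-contained sketch of that computation on synthetic data (the array names `y_val` and `proba` are illustrative stand-ins, not the notebook's actual variables):

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

rng = np.random.default_rng(42)
y_val = rng.integers(0, 2, size=1_000)             # stand-in validation labels
# stand-in for the booster's predicted P(y=1), correlated with y_val
proba = np.clip(0.6 * y_val + 0.4 * rng.random(1_000), 1e-6, 1 - 1e-6)

# the three metrics logged per fold in the trial output
fold_metrics = {
    "log_loss": log_loss(y_val, proba),
    "average_precision": average_precision_score(y_val, proba),
    "roc_auc": roc_auc_score(y_val, proba),
}
```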
Optimization Progress: 86%|########6 | 86/100 [20:10<03:13, 13.83s/it]
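The flood of `[LightGBM] [Debug]`/`[Warning]` lines in this cell's output comes from LightGBM's verbose logging. A sketch of quieting it for future runs (the `verbosity` parameter is LightGBM's own; the `params` dict here is illustrative, not the notebook's actual configuration):

```python
import logging

# LightGBM's Python-side messages route through the "lightgbm" logger.
logging.getLogger("lightgbm").setLevel(logging.ERROR)

# The native backend is controlled by the `verbosity` training parameter
# (alias `verbose`): values < 0 keep only fatal errors, suppressing the
# "[Debug] Trained a tree ..." and "[Warning] No further splits ..." lines.
params = {
    "objective": "binary",
    "verbosity": -1,
}
```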
Trial 86, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 86, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 22771
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 255
[LightGBM] [Info] Start training from score -0.039012
[... verbose LightGBM [Debug]/[Warning] per-tree output trimmed ...]
Trial 86, Fold 1: Log loss = 0.305143960093892, Average precision = 0.9682342423734314, ROC-AUC = 0.961671428332449, Elapsed Time = 2.3427920000003724 seconds
Trial 86, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 86, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Info] Total Bins 22782
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 255
[LightGBM] [Info] Start training from score -0.023285
[... verbose LightGBM [Debug]/[Warning] per-tree output trimmed ...]
best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 84 and depth = 11 [LightGBM] [Debug] Re-bagging, using 16557 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 85 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 13 [LightGBM] [Debug] Re-bagging, using 16577 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 84 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 11 [LightGBM] [Debug] Re-bagging, using 16588 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 87 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 82 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 11 [LightGBM] [Debug] Re-bagging, using 16540 data to 
train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 13 [LightGBM] [Debug] Re-bagging, using 16657 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 13 [LightGBM] [Debug] Re-bagging, using 16606 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 85 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 11 [LightGBM] [Debug] Re-bagging, using 16654 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and 
depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 84 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 12 [LightGBM] [Debug] Re-bagging, using 16548 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 13 [LightGBM] [Debug] Re-bagging, using 16665 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 84 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 12 [LightGBM] [Debug] Re-bagging, using 16718 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 82 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 86 
and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 11 [LightGBM] [Debug] Re-bagging, using 16524 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 82 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 11 [LightGBM] [Debug] Re-bagging, using 16549 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 87 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 82 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 12 [LightGBM] [Debug] Re-bagging, using 16598 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 85 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves 
= 81 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 12 [LightGBM] [Debug] Re-bagging, using 16674 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 11 Trial 86, Fold 2: Log loss = 0.3031811940840515, Average precision = 0.9666190300590372, ROC-AUC = 0.9625416661481951, Elapsed Time = 2.989899200000764 seconds Trial 86, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 86, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 [LightGBM] [Info] Number of positive: 10165, number of negative: 10517 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791401 [LightGBM] [Info] Total Bins 22670 [LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043 [LightGBM] [Info] Start training from score -0.034043 [LightGBM] [Debug] Re-bagging, using 16601 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 11 [LightGBM] [Debug] Re-bagging, using 16677 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best 
gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 87 and depth = 12 [LightGBM] [Debug] Re-bagging, using 16617 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 15 [LightGBM] [Debug] Re-bagging, using 16574 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 84 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 12 [LightGBM] [Debug] Re-bagging, using 16618 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, 
best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 84 and depth = 13 [LightGBM] [Debug] Re-bagging, using 16613 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 85 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 85 and depth = 12 [LightGBM] [Debug] Re-bagging, using 16538 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 13 [LightGBM] [Debug] Re-bagging, using 16588 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 82 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 11 [LightGBM] [Warning] No further splits with positive 
gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 88 and depth = 11 [LightGBM] [Debug] Re-bagging, using 16646 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 11 [LightGBM] [Debug] Re-bagging, using 16529 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 11 [LightGBM] [Debug] Re-bagging, using 16544 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 86 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 85 and depth = 13 [LightGBM] [Debug] Re-bagging, using 16570 data 
to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 87 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 14 [LightGBM] [Debug] Re-bagging, using 16570 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 15 [LightGBM] [Debug] Re-bagging, using 16531 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 13 [LightGBM] [Debug] Re-bagging, using 16638 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 
and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 11 [LightGBM] [Debug] Re-bagging, using 16579 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 13 [LightGBM] [Debug] Re-bagging, using 16638 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 84 and depth = 13 [LightGBM] [Debug] Re-bagging, using 16536 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves 
= 77 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 12 [LightGBM] [Debug] Re-bagging, using 16646 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 12 [LightGBM] [Debug] Re-bagging, using 16725 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 11 [LightGBM] [Debug] Re-bagging, using 16503 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with 
leaves = 74 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 82 and depth = 14 [LightGBM] [Debug] Re-bagging, using 16520 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 85 and depth = 14 [LightGBM] [Debug] Re-bagging, using 16592 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 86 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 12 [LightGBM] [Debug] Re-bagging, using 16648 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 85 and depth = 12 Trial 86, Fold 3: Log loss = 0.30205762492099164, Average precision = 0.9691763190075562, ROC-AUC = 0.9639460698153729, Elapsed Time = 2.8637853000000177 seconds Trial 86, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 86, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, 
number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.792454 [LightGBM] [Info] Total Bins 22765 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 256 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 16579 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 82 and depth = 13 [LightGBM] [Debug] Re-bagging, using 16655 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 84 and depth = 12 [LightGBM] [Debug] Re-bagging, using 16599 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 12 [LightGBM] [Warning] No further splits with 
positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 13 [LightGBM] [Debug] Re-bagging, using 16549 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 12 [LightGBM] [Debug] Re-bagging, using 16604 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 88 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 12 [LightGBM] [Debug] Re-bagging, using 16593 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 13 [LightGBM] [Warning] No further splits 
with positive gain, best gain: -inf
[... repeated per-tree LightGBM [Debug]/[Warning] messages truncated ...]
Trial 86, Fold 4: Log loss = 0.30270349940476843, Average precision = 0.9686390023398933, ROC-AUC = 0.9624188704574013, Elapsed Time = 2.87171219999982 seconds
Trial 86, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 86, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[LightGBM] [Info] Number of positive: 10150, number of negative: 10500
[LightGBM] [Info] Total Bins 22721
[LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902
[LightGBM] [Info] Start training from score -0.033902
[... repeated per-tree LightGBM [Debug]/[Warning] messages truncated ...]
Trial 86, Fold 5: Log loss = 0.3074469106939738, Average precision = 0.9656815610375317, ROC-AUC = 0.9594476710699886, Elapsed Time = 2.8436714999998003 seconds
Optimization Progress: 87%|########7 | 87/100 [20:31<03:30, 16.17s/it]
Trial 87, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 87, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 17915
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 260
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[... repeated per-tree LightGBM [Debug]/[Warning] messages truncated ...]
Trial 87, Fold 1: Log loss = 0.3128623381671113, Average precision = 0.9699769960360858, ROC-AUC = 0.9640660025427417, Elapsed Time = 0.6332750999999917 seconds
Trial 87, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 87, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Info] Total Bins 17930
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 260
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
[... repeated per-tree LightGBM [Debug]/[Warning] messages truncated ...]
Trial 87, Fold 2: Log loss = 0.31324368228512456, Average precision = 0.9690407017541076, ROC-AUC = 0.9654967266485197, Elapsed Time = 0.7145328000005975 seconds
Trial 87, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 87, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[LightGBM] [Info] Number of positive: 10165, number of negative: 10517
[LightGBM] [Info] Total Bins 17934
[LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 260
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043
[LightGBM] [Info] Start training from score -0.034043
[... repeated per-tree LightGBM [Debug]/[Warning] messages truncated ...]
Trial 87, Fold 3: Log loss = 0.301574581235144, Average precision = 0.9727812892251233, ROC-AUC = 0.9680483640483093, Elapsed Time = 0.7058385999998791 seconds
Trial 87, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 87, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[LightGBM] [Info] Number of positive: 10177, number of negative: 10479
[LightGBM] [Info] Total Bins 17917
[LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 260
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243
[LightGBM] [Info] Start training from score -0.029243
[... repeated per-tree LightGBM [Debug]/[Warning] messages truncated ...]
Trial 87, Fold 4: Log loss = 0.31242890490674685, Average precision = 0.9695993487992027, ROC-AUC = 0.9642784120307781, Elapsed Time = 0.6826785000002928 seconds
Trial 87, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 87, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[LightGBM] [Info] Number of positive: 10150, number of negative: 10500
[LightGBM] [Info] Total Bins 17916
[LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 260
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902
[LightGBM] [Info] Start training from score -0.033902
[... repeated per-tree LightGBM [Debug]/[Warning] messages truncated ...] [LightGBM] [Debug] Re-bagging, using 15009 data to train [LightGBM] [Warning] No further splits with
positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 12 [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 10 [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 13 [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 13 [LightGBM] [Debug] Re-bagging, using 15028 data to train [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 13 [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 13 [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 12 Trial 87, Fold 5: Log loss = 0.3150488481613542, Average precision = 0.9685165788771627, ROC-AUC = 0.9634251528157108, Elapsed Time = 0.7011179999999513 seconds
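Each fold above reports a binary log loss on the held-out validation split. As a reference for what that number measures, here is a minimal pure-Python sketch of binary log loss (a stand-in for `sklearn.metrics.log_loss`; the function name and clipping epsilon are illustrative, not from the notebook):

```python
import math

def binary_log_loss(y_true, y_prob, eps=1e-15):
    """Mean negative log-likelihood of binary labels under predicted probabilities."""
    total = 0.0
    for y, p in zip(y_true, y_prob):
        p = min(max(p, eps), 1.0 - eps)  # clip so log(0) never occurs
        total += -(y * math.log(p) + (1 - y) * math.log(1.0 - p))
    return total / len(y_true)

# Confident, correct predictions yield a small loss; a value near 0.31,
# as in the folds above, reflects moderately confident probabilities.
loss = binary_log_loss([1, 0, 1], [0.9, 0.1, 0.8])
```

Lower is better, which is why the Optuna objective minimizes it across folds.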
Optimization Progress: 88%|########8 | 88/100 [20:43<02:57, 14.83s/it]
Trial 88, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 88, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Fold 1: 20663 data points, 255 used features, 22663 total bins; pavg=0.490248 -> initscore=-0.039012 (per-tree [Debug]/[Warning] output elided)
Trial 88, Fold 1: Log loss = 0.3061306220134525, Average precision = 0.9662695164729078, ROC-AUC = 0.9598435341716732, Elapsed Time = 0.7078392000003078 seconds
Trial 88, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 88, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Fold 2: 20701 data points, 255 used features, 22674 total bins; pavg=0.494179 -> initscore=-0.023285 (per-tree [Debug]/[Warning] output elided)
Trial 88, Fold 2: Log loss = 0.3087993840397771, Average precision = 0.9633442503351907, ROC-AUC = 0.9590662624914983, Elapsed Time = 0.7561420999991242 seconds
Trial 88, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 88, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[LightGBM] [Info] Fold 3: 20682 data points, 255 used features, 22561 total bins; pavg=0.491490 -> initscore=-0.034043 (per-tree [Debug]/[Warning] output elided)
Trial 88, Fold 3: Log loss = 0.30574315365828897, Average precision = 0.9666086984224752, ROC-AUC = 0.9610126672188462, Elapsed Time = 0.7820138999995834 seconds
Trial 88, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 88, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[LightGBM] [Info] Fold 4: 20656 data points, 256 used features, 22657 total bins; pavg=0.492690 -> initscore=-0.029243 (per-tree [Debug]/[Warning] output elided)
Trial 88, Fold 4: Log loss = 0.30470793975541044, Average precision = 0.9663651109636795, ROC-AUC = 0.9594859719211906, Elapsed Time = 0.8285013000004255 seconds
Trial 88, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 88, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[LightGBM] [Info] Fold 5: 20650 data points, 255 used features, 22613 total bins; pavg=0.491525 -> initscore=-0.033902 (per-tree [Debug]/[Warning] output elided)
Trial 88, Fold 5: Log loss = 0.3070984501419724, Average precision = 0.9657053793675368, ROC-AUC = 0.9597007970570203, Elapsed Time = 0.8024282999995194 seconds
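The ROC-AUC values logged per fold can be read as the probability that a randomly chosen defaulted loan is scored above a randomly chosen repaid one. A self-contained sketch of that rank-based (Mann-Whitney U) formulation, equivalent to `sklearn.metrics.roc_auc_score` (function name and tie handling shown here are illustrative):

```python
def roc_auc(y_true, scores):
    """ROC-AUC as the probability that a random positive outranks a
    random negative; tied scores count as half a win."""
    pos = [s for y, s in zip(y_true, scores) if y == 1]
    neg = [s for y, s in zip(y_true, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

roc_auc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8])  # → 0.75
```

An AUC around 0.96, as in Trial 88, means roughly 96% of positive/negative pairs are ranked correctly by the model's scores.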
Optimization Progress: 89%|########9 | 89/100 [20:55<02:32, 13.84s/it]
Trial 89, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 89, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Fold 1: 20663 data points, 259 used features, 9564 total bins; pavg=0.490248 -> initscore=-0.039012 (per-tree [Debug]/[Warning] output elided)
positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 14 [LightGBM] [Debug] Re-bagging, using 6209 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 12 Trial 89, 
Fold 1: Log loss = 0.33416366488173527, Average precision = 0.9717220616218375, ROC-AUC = 0.9662786536375065, Elapsed Time = 1.0397594999994908 seconds Trial 89, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 89, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986 [LightGBM] [Info] Number of positive: 10230, number of negative: 10471 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.796949 [LightGBM] [Info] Total Bins 9556 [LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 259 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285 [LightGBM] [Info] Start training from score -0.023285 [LightGBM] [Debug] Re-bagging, using 6196 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best 
gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 13 [LightGBM] [Debug] Re-bagging, using 6165 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 11 [LightGBM] [Debug] Re-bagging, using 6143 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with 
leaves = 59 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 14 [LightGBM] [Debug] Re-bagging, using 6093 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf 
[LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 12 [LightGBM] [Debug] Re-bagging, using 6263 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 12 [LightGBM] [Debug] Re-bagging, using 6203 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and 
depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 11 Trial 89, Fold 2: Log loss = 0.33738411543798374, Average precision = 0.9680395626741372, ROC-AUC = 0.9650552164666985, Elapsed Time = 1.1089407999988907 seconds Trial 89, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 89, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 [LightGBM] [Info] Number of positive: 10165, number of negative: 10517 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.796784 [LightGBM] [Info] Total Bins 9563 [LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 259 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043 [LightGBM] [Info] Start training from score -0.034043 [LightGBM] [Debug] Re-bagging, using 6189 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 13 [LightGBM] [Warning] No further splits with 
positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 12 [LightGBM] [Debug] Re-bagging, using 6157 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6141 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
Trained a tree with leaves = 61 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 13 [LightGBM] [Debug] Re-bagging, using 6086 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best 
gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6254 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 11 [LightGBM] [Debug] Re-bagging, using 6202 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with 
leaves = 66 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 12 Trial 89, Fold 3: Log loss = 0.332700635870908, Average precision = 0.9727254923682255, ROC-AUC = 0.9683099820862748, Elapsed Time = 1.2187215000012657 seconds Trial 89, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 89, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.796474 [LightGBM] [Info] Total Bins 9562 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 259 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 6181 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 10 [LightGBM] [Warning] No further 
splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 13 [LightGBM] [Debug] Re-bagging, using 6146 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6138 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] 
[Debug] Trained a tree with leaves = 68 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 10 [LightGBM] [Debug] Re-bagging, using 6077 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 10 [LightGBM] [Warning] No further splits with positive 
gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 11 [LightGBM] [Debug] Re-bagging, using 6235 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 13 [LightGBM] [Debug] Re-bagging, using 6206 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a 
tree with leaves = 77 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 10 Trial 89, Fold 4: Log loss = 0.3365947795505942, Average precision = 0.9718392168322076, ROC-AUC = 0.9664804144581072, Elapsed Time = 1.2304527000014787 seconds Trial 89, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 89, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.797340 [LightGBM] [Info] Total Bins 9544 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 259 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 6179 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 10 [LightGBM] [Warning] 
[... LightGBM per-iteration [Debug]/[Warning] output ("Trained a tree with leaves = … and depth = …", "No further splits with positive gain, best gain: -inf", "Re-bagging, using … data to train") truncated ...]
Trial 89, Fold 5: Log loss = 0.338789382130623, Average precision = 0.9691199604516071, ROC-AUC = 0.9640932686769597, Elapsed Time = 1.192119699999239 seconds
Optimization Progress: 90%|######### | 90/100 [21:08<02:17, 13.73s/it]
Trial 90, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 90, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791640
[LightGBM] [Info] Total Bins 12968
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[... LightGBM per-iteration [Debug]/[Warning] output truncated ...]
Trial 90, Fold 1: Log loss = 0.331844758191932, Average precision = 0.9606197678651246, ROC-AUC = 0.9531092422917175, Elapsed Time = 1.0667802999996638 seconds
Trial 90, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 90, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791600
[LightGBM] [Info] Total Bins 12982
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
[... LightGBM per-iteration [Debug]/[Warning] output truncated ...]
Trial 90, Fold 2: Log loss = 0.33514325857746285, Average precision = 0.9569969912288875, ROC-AUC = 0.9518625674936214, Elapsed Time = 1.074133999998594 seconds
Trial 90, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 90, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[LightGBM] [Info] Number of positive: 10165, number of negative: 10517
[LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791401
[LightGBM] [Info] Total Bins 12988
[LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043
[LightGBM] [Info] Start training from score -0.034043
[... LightGBM per-iteration [Debug]/[Warning] output truncated ...]
[Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Debug] Re-bagging, using 3994 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Debug] Re-bagging, using 3927 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 7 [LightGBM] [Debug] Re-bagging, using 3978 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 6 [LightGBM] [Debug] Re-bagging, using 4003 data to train [LightGBM] [Warning] No further splits 
with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Debug] Re-bagging, using 4063 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Debug] Re-bagging, using 4035 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 5 [LightGBM] [Debug] Re-bagging, using 4043 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 7 [LightGBM] [Warning] No further splits with 
positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 6 [LightGBM] [Debug] Re-bagging, using 4108 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 7 [LightGBM] [Debug] Re-bagging, using 3968 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 7 [LightGBM] [Debug] Re-bagging, using 4117 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 6 [LightGBM] [Warning] No further splits with positive 
gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 7 [LightGBM] [Debug] Re-bagging, using 4104 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 Trial 90, Fold 3: Log loss = 0.32265590326638777, Average precision = 0.9635499999456428, ROC-AUC = 0.9568613901435987, Elapsed Time = 1.1633805000001303 seconds Trial 90, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 90, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791090 [LightGBM] [Info] Total Bins 12968 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 4095 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, 
best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Debug] Re-bagging, using 4089 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Debug] Re-bagging, using 4057 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Debug] Re-bagging, using 3958 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Debug] Re-bagging, using 4111 data to train [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Debug] Re-bagging, using 4078 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 7 [LightGBM] [Debug] Re-bagging, using 4014 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Debug] Re-bagging, using 4154 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 8 [LightGBM] [Warning] 
No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 7 [LightGBM] [Debug] Re-bagging, using 4055 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Debug] Re-bagging, using 3985 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 7 [LightGBM] [Debug] Re-bagging, using 3910 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further 
splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Debug] Re-bagging, using 3991 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Debug] Re-bagging, using 4000 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 7 [LightGBM] [Debug] Re-bagging, using 4059 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 7 [LightGBM] [Warning] No further splits with 
positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 7 [LightGBM] [Debug] Re-bagging, using 4040 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Debug] Re-bagging, using 4033 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Debug] Re-bagging, using 4095 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 6 [LightGBM] [Debug] Re-bagging, using 3970 data to train 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Debug] Re-bagging, using 4107 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Debug] Re-bagging, using 4091 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 15 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 Trial 90, Fold 4: Log loss = 0.3344532381528901, Average precision = 0.9603714714322781, ROC-AUC = 0.9529044793522036, Elapsed Time = 1.1753873999987263 seconds Trial 90, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 90, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 
[LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.792012 [LightGBM] [Info] Total Bins 12970 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 4093 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 22 and depth = 7 [LightGBM] [Debug] Re-bagging, using 4089 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 23 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Debug] Re-bagging, using 4057 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 7 [LightGBM] [Debug] Re-bagging, using 3956 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Debug] Re-bagging, using 4107 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 16 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 7 [LightGBM] [Debug] Re-bagging, using 4075 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Warning] 
No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 7 [LightGBM] [Debug] Re-bagging, using 4019 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Debug] Re-bagging, using 4148 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 17 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 7 [LightGBM] [Debug] Re-bagging, using 4057 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 5 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 7 [LightGBM] [Debug] Re-bagging, 
using 3982 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 7 [LightGBM] [Debug] Re-bagging, using 3908 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 21 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 7 [LightGBM] [Debug] Re-bagging, using 3977 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 20 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 and depth = 6 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 19 and depth = 6 [LightGBM] [Debug] Re-bagging, using 4014 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 18 
[LightGBM per-tree [Debug]/[Warning] training output omitted; each fold also logged dataset stats ([Info] lines) and, for Trial 91 Fold 3, repeated "Met negative value in categorical features, will convert it to NaN" warnings]
Trial 90, Fold 5: Log loss = 0.3372951464810977, Average precision = 0.9593237982203765, ROC-AUC = 0.9517935455102837, Elapsed Time = 1.1603880000002391 seconds
Optimization Progress: 91%|#########1| 91/100 [21:21<02:02, 13.58s/it]
Trial 91, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 91, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Trial 91, Fold 1: Log loss = 0.3008881871994671, Average precision = 0.9718838331284857, ROC-AUC = 0.9666126274357981, Elapsed Time = 0.9970556000007491 seconds
Trial 91, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 91, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Trial 91, Fold 2: Log loss = 0.2991729032591439, Average precision = 0.9707289483964904, ROC-AUC = 0.9667573713763008, Elapsed Time = 1.1395652999999584 seconds
Trial 91, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 91, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Trial 91, Fold 3: Log loss = 0.2955545562487179, Average precision = 0.9719937847754485, ROC-AUC = 0.9677634960897998, Elapsed Time = 1.16592270000001 seconds
Trial 91, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 91, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
Trial 91, Fold 4: Log loss = 0.3017551683302678, Average precision = 0.9715725587706239, ROC-AUC = 0.9661564327799117, Elapsed Time = 1.1777863999996043 seconds
Trial 91, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 91, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 18 [LightGBM] [Debug] Re-bagging, using 10277 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 18 [LightGBM] [Debug] Re-bagging, using 10271 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 18 [LightGBM] [Debug] Re-bagging, using 10341 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 77 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 20 [LightGBM] [Debug] Re-bagging, using 10466 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 15 [LightGBM] [Debug] Re-bagging, using 10325 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 75 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 14 [LightGBM] [Debug] Re-bagging, using 10393 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a 
tree with leaves = 75 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 14 [LightGBM] [Debug] Re-bagging, using 10391 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 19 [LightGBM] [Debug] Re-bagging, using 10391 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 16 Trial 91, Fold 5: Log loss = 0.3005683826155532, Average precision = 0.9699361551117898, ROC-AUC = 0.9658108986864351, Elapsed Time = 1.171874800000296 seconds
Optimization Progress: 92%|#########2| 92/100 [21:34<01:47, 13.41s/it]
Trial 92, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 92, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 24804
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[LightGBM per-tree Debug/Warning output truncated]
Trial 92, Fold 1: Log loss = 0.24369748691301674, Average precision = 0.9719584055289463, ROC-AUC = 0.9668471260329887, Elapsed Time = 1.0470143999991706 seconds
Trial 92, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 92, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Info] Total Bins 24808
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 257
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
[LightGBM per-tree Debug/Warning output truncated]
Trial 92, Fold 2: Log loss = 0.24594562479299417, Average precision = 0.9695862442338552, ROC-AUC = 0.9657001445549405, Elapsed Time = 1.1108922000003076 seconds
Trial 92, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 92, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[LightGBM] [Info] Number of positive: 10165, number of negative: 10517
[LightGBM] [Info] Total Bins 24804
[LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043
[LightGBM] [Info] Start training from score -0.034043
[LightGBM per-tree Debug/Warning output truncated]
Trial 92, Fold 3: Log loss = 0.24196498044713488, Average precision = 0.9718823432725099, ROC-AUC = 0.9677592072695054, Elapsed Time = 1.1935706000003847 seconds
Trial 92, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 92, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[LightGBM] [Info] Number of positive: 10177, number of negative: 10479
[LightGBM] [Info] Total Bins 24794
[LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243
[LightGBM] [Info] Start training from score -0.029243
[LightGBM per-tree Debug/Warning output truncated]
[Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5255 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5130 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5153 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] 
[Debug] Trained a tree with leaves = 44 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 12 [LightGBM] [Debug] Re-bagging, using 5206 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5206 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 11 Trial 92, Fold 4: Log loss = 0.24702565761415668, Average precision = 0.9708847749496918, ROC-AUC = 0.9653432745337883, Elapsed Time = 1.2001282000001083 seconds Trial 92, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 92, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.796026 [LightGBM] [Info] Total Bins 24798 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 258 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 5342 data to train [LightGBM] [Warning] No further 
splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 8 [LightGBM] [Debug] Re-bagging, using 5292 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5324 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5157 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 10 [LightGBM] [Warning] No further 
splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5349 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 39 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5324 data to train [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5240 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree 
with leaves = 44 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5361 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5255 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Debug] Re-bagging, using 5130 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 38 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree 
with leaves = 36 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5148 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 41 and depth = 9 [LightGBM] [Debug] Re-bagging, using 5195 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 43 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 11 [LightGBM] [Debug] Re-bagging, using 5214 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 11 Trial 92, Fold 5: Log loss = 0.24949129955282773, Average precision = 0.9692402742286004, ROC-AUC = 0.9643114653586757, Elapsed Time = 1.1896702000012738 seconds
Optimization Progress: 93%|#########3| 93/100 [21:48<01:33, 13.35s/it]
Trial 93, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 93, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 15120
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[LightGBM per-tree Debug/Warning output elided]
Trial 93, Fold 1: Log loss = 0.28890787041309307, Average precision = 0.9721419694650522, ROC-AUC = 0.9666721482580787, Elapsed Time = 1.2925502000016422 seconds
Trial 93, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 93, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Info] Total Bins 15133
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 257
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
[LightGBM per-tree Debug/Warning output elided]
and depth = 13 [LightGBM] [Debug] Re-bagging, using 14416 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 45 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 12 [LightGBM] [Debug] Re-bagging, using 14319 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 12 [LightGBM] [Debug] Re-bagging, using 14567 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 14 [LightGBM] [Debug] Re-bagging, using 14483 data to train [LightGBM] [Warning] No further splits with positive gain, best 
gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 10 [LightGBM] [Debug] Re-bagging, using 14475 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 11 [LightGBM] [Debug] Re-bagging, using 14496 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 44 and depth = 12 Trial 93, Fold 2: Log loss = 0.2883054234552478, Average precision = 0.9700646790336668, ROC-AUC = 0.9665828614230118, Elapsed Time = 1.5125194000011106 seconds Trial 93, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 
93, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 [LightGBM] [Info] Number of positive: 10165, number of negative: 10517 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.795455 [LightGBM] [Info] Total Bins 15138 [LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 258 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043 [LightGBM] [Info] Start training from score -0.034043 [LightGBM] [Debug] Re-bagging, using 14391 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 14 [LightGBM] [Debug] Re-bagging, using 14510 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 12 [LightGBM] [Debug] Re-bagging, using 14537 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, 
best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 13 [LightGBM] [Debug] Re-bagging, using 14366 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 15 [LightGBM] [Debug] Re-bagging, using 14440 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 14 [LightGBM] [Debug] Re-bagging, using 14395 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 11 [LightGBM] [Warning] No further splits with positive 
gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 12 [LightGBM] [Debug] Re-bagging, using 14448 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 15 [LightGBM] [Debug] Re-bagging, using 14447 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 12 [LightGBM] [Debug] Re-bagging, using 14515 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 12 [LightGBM] [Warning] No further splits with 
positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 12 [LightGBM] [Debug] Re-bagging, using 14391 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 14 [LightGBM] [Debug] Re-bagging, using 14431 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 12 [LightGBM] [Debug] Re-bagging, using 14294 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 13 [LightGBM] [Debug] Re-bagging, using 
14406 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 11 [LightGBM] [Debug] Re-bagging, using 14302 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 14 [LightGBM] [Debug] Re-bagging, using 14557 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 13 [LightGBM] [Debug] Re-bagging, using 14450 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with 
leaves = 63 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Debug] Re-bagging, using 14457 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 13 [LightGBM] [Debug] Re-bagging, using 14482 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 14 Trial 93, Fold 3: Log loss = 0.2847499990984428, Average precision = 0.9724436861937583, ROC-AUC = 0.9676233193843842, Elapsed Time = 1.528636799999731 seconds Trial 93, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 93, Fold 4: Validation size = 5182 where 0 = 2646, 
1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.795144 [LightGBM] [Info] Total Bins 15120 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 258 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 14375 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 11 [LightGBM] [Debug] Re-bagging, using 14493 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 12 [LightGBM] [Debug] Re-bagging, using 14514 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree 
with leaves = 51 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 14 [LightGBM] [Debug] Re-bagging, using 14343 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 14 [LightGBM] [Debug] Re-bagging, using 14425 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 11 [LightGBM] [Debug] Re-bagging, using 14387 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a 
tree with leaves = 49 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 13 [LightGBM] [Debug] Re-bagging, using 14430 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 10 [LightGBM] [Debug] Re-bagging, using 14429 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 13 [LightGBM] [Debug] Re-bagging, using 14498 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
Trained a tree with leaves = 52 and depth = 12 [LightGBM] [Debug] Re-bagging, using 14360 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 12 [LightGBM] [Debug] Re-bagging, using 14420 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 13 [LightGBM] [Debug] Re-bagging, using 14281 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 11 [LightGBM] [Debug] Re-bagging, using 14390 data to train [LightGBM] [Warning] No further 
splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 12 [LightGBM] [Debug] Re-bagging, using 14286 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 11 [LightGBM] [Debug] Re-bagging, using 14542 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 12 [LightGBM] [Debug] Re-bagging, using 14431 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 12 [LightGBM] [Warning] No 
further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 12 [LightGBM] [Debug] Re-bagging, using 14433 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 49 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 14 [LightGBM] [Debug] Re-bagging, using 14451 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 14 Trial 93, Fold 4: Log loss = 0.28714861168790945, Average precision = 0.9713751042348534, ROC-AUC = 0.965691323848151, Elapsed Time = 1.502879200001189 seconds Trial 93, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 93, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] 
[LightGBM] [Info] Number of positive: 10150, number of negative: 10500
[LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.796026
[LightGBM] [Info] Total Bins 15119
[LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 258
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902
[LightGBM] [Info] Start training from score -0.033902
[... repetitive per-tree LightGBM [Debug]/[Warning] output omitted ...]
Trial 93, Fold 5: Log loss = 0.2910606291415502, Average precision = 0.969880620876491, ROC-AUC = 0.9651118666740985, Elapsed Time = 1.4681914000011602 seconds
Optimization Progress: 94%|#########3| 94/100 [22:02<01:22, 13.75s/it]
Trial 94, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 94, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791640
[LightGBM] [Info] Total Bins 19014
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[... repetitive per-tree LightGBM [Debug]/[Warning] output omitted ...]
Trial 94, Fold 1: Log loss = 0.24722053119581827, Average precision = 0.9700217835372049, ROC-AUC = 0.964115889505456, Elapsed Time = 1.3311601000004885 seconds
Trial 94, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 94, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791600
[LightGBM] [Info] Total Bins 19026
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
[... repetitive per-tree LightGBM [Debug]/[Warning] output omitted (log truncated mid-training) ...]
[LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 99 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 86 and depth = 14 [LightGBM] [Debug] Re-bagging, using 16524 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 102 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 98 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 87 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 95 and depth = 13 [LightGBM] [Debug] Re-bagging, using 16549 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 111 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 108 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 107 and depth = 14 [LightGBM] [Debug] Re-bagging, using 16598 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 87 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best 
gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 88 and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 98 and depth = 14 [LightGBM] [Debug] Re-bagging, using 16674 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 97 and depth = 20 Trial 94, Fold 2: Log loss = 0.2518173165478632, Average precision = 0.9688081030783827, ROC-AUC = 0.9652927017509569, Elapsed Time = 1.426636199999848 seconds Trial 94, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 94, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 [LightGBM] [Info] Number of positive: 10165, number of negative: 10517 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791401 [LightGBM] [Info] Total Bins 19029 [LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043 [LightGBM] [Info] Start training from score -0.034043 [LightGBM] [Debug] Re-bagging, using 16601 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 94 and depth = 16 [LightGBM] [Debug] Re-bagging, using 16677 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 91 and depth = 13 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 90 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 108 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 107 and depth = 17 [LightGBM] [Debug] Re-bagging, using 16617 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 103 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 81 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 101 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 91 and depth = 11 [LightGBM] [Debug] Re-bagging, using 16574 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 95 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 104 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 112 and depth = 23 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Re-bagging, using 16618 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 107 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 92 and 
depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 47 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 111 and depth = 14 [LightGBM] [Debug] Re-bagging, using 16613 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 103 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 97 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 91 and depth = 13 [LightGBM] [Debug] Re-bagging, using 16538 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 107 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 118 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 94 and depth = 21 [LightGBM] [Debug] Re-bagging, using 16588 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 105 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves 
= 96 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 108 and depth = 14 [LightGBM] [Debug] Re-bagging, using 16646 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 109 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 104 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 104 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 102 and depth = 17 [LightGBM] [Debug] Re-bagging, using 16529 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 85 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 100 and depth = 16 [LightGBM] [Debug] Re-bagging, using 16544 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 98 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 112 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree 
with leaves = 84 and depth = 14 [LightGBM] [Debug] Re-bagging, using 16570 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 97 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 93 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 106 and depth = 13 [LightGBM] [Debug] Re-bagging, using 16570 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 103 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 85 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 109 and depth = 18 [LightGBM] [Debug] Re-bagging, using 16531 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 103 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 93 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 93 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 108 and depth = 14 [LightGBM] [Debug] Re-bagging, using 16638 data to train [LightGBM] [Warning] No further splits 
with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 104 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 13 [LightGBM] [Debug] Re-bagging, using 16579 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 108 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 93 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 105 and depth = 17 [LightGBM] [Debug] Re-bagging, using 16638 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 112 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 94 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 91 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 99 and depth = 15 [LightGBM] [Debug] Re-bagging, using 16536 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 86 and depth = 17 [LightGBM] [Warning] No 
further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 114 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 113 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 93 and depth = 16 [LightGBM] [Debug] Re-bagging, using 16646 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 116 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 82 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 101 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 86 and depth = 14 [LightGBM] [Debug] Re-bagging, using 16725 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 111 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 84 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 12 [LightGBM] [Debug] Re-bagging, using 16503 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 107 and depth = 15 [LightGBM] 
[Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 101 and depth = 14 [LightGBM] [Debug] Re-bagging, using 16520 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 90 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 90 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 101 and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 78 and depth = 17 [LightGBM] [Debug] Re-bagging, using 16592 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 93 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 99 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 109 and depth = 14 [LightGBM] [Debug] Re-bagging, using 16648 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 100 and depth = 13 Trial 94, Fold 3: Log loss = 0.2580977406334601, Average precision = 0.9691910105800448, ROC-AUC = 0.964746649603683, Elapsed Time = 1.6245612000002438 seconds Trial 94, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 94, Fold 4: Validation size 
= 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.792454 [LightGBM] [Info] Total Bins 19013 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 256 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 16579 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 85 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 87 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 101 and depth = 15 [LightGBM] [Debug] Re-bagging, using 16655 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 111 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 82 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 93 and depth = 15 [LightGBM] [Debug] Re-bagging, using 16599 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 91 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] 
[Debug] Trained a tree with leaves = 92 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 91 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 100 and depth = 14 [LightGBM] [Debug] Re-bagging, using 16549 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 108 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 105 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 91 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 91 and depth = 16 [LightGBM] [Debug] Re-bagging, using 16604 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 106 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 99 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 90 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 13 [LightGBM] [Debug] Re-bagging, using 16593 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 88 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf 
[LightGBM] [Debug] Trained a tree with leaves = 100 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 93 and depth = 13 [LightGBM] [Debug] Re-bagging, using 16523 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 91 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 92 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 113 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 12 [LightGBM] [Debug] Re-bagging, using 16561 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 113 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 76 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 16 [LightGBM] [Debug] Re-bagging, using 16623 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 100 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 107 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best 
gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 84 and depth = 16 [LightGBM] [Debug] Re-bagging, using 16505 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 98 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 104 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 90 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 88 and depth = 18 [LightGBM] [Debug] Re-bagging, using 16522 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 84 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 93 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 80 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 105 and depth = 13 [LightGBM] [Debug] Re-bagging, using 16547 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 95 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 103 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 98 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 83 and depth = 15 [LightGBM] [Debug] Re-bagging, using 16563 data to train 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 96 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 79 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 17 [LightGBM] [Debug] Re-bagging, using 16511 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 105 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 105 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 105 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 90 and depth = 14 [LightGBM] [Debug] Re-bagging, using 16619 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 102 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 91 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 86 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 91 and depth = 12 [LightGBM] [Debug] Re-bagging, using 16563 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 99 and 
[verbose LightGBM [Debug]/[Warning] tree-growth output omitted]
Trial 94, Fold 4: Log loss = 0.24575069718207382, Average precision = 0.9701339402577004, ROC-AUC = 0.9641384769821002, Elapsed Time = 1.6441727000001265 seconds
Trial 94, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 94, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[LightGBM] [Info] Number of positive: 10150, number of negative: 10500
[LightGBM] [Info] Total Bins 19008
[LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902
[LightGBM] [Info] Start training from score -0.033902
[verbose LightGBM [Debug]/[Warning] tree-growth output omitted]
Trial 94, Fold 5: Log loss = 0.24745709343345237, Average precision = 0.9671251946854702, ROC-AUC = 0.9629334671051409, Elapsed Time = 1.5948129999997036 seconds
Optimization Progress: 95%|#########5| 95/100 [22:18<01:11, 14.24s/it]
Trial 95, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 95, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 24150
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[verbose LightGBM [Debug]/[Warning] tree-growth output omitted]
Trial 95, Fold 1: Log loss = 0.2819267212222898, Average precision = 0.9687208579186044, ROC-AUC = 0.962326082696453, Elapsed Time = 1.3650629999992816 seconds
Trial 95, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 95, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Info] Total Bins 24158
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
[verbose LightGBM [Debug]/[Warning] tree-growth output omitted]
splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7613 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7560 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] 
Trained a tree with leaves = 33 and depth = 12 [LightGBM] [Debug] Re-bagging, using 7361 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7347 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7418 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained 
a tree with leaves = 32 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7415 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7474 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7658 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree 
with leaves = 36 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7568 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 Trial 95, Fold 2: Log loss = 0.27767917005180226, Average precision = 0.9672489021741485, ROC-AUC = 0.9627532025611384, Elapsed Time = 1.6671499000003678 seconds Trial 95, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 95, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 [LightGBM] [Info] Number of positive: 10165, number of negative: 10517 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791401 [LightGBM] [Info] Total Bins 24152 [LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043 [LightGBM] [Info] Start training from score -0.034043 [LightGBM] [Debug] Re-bagging, using 7534 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with 
leaves = 27 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 28 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7595 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7467 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Warning] No further splits 
with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7468 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7609 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7525 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with 
positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7543 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7606 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7540 data to train [LightGBM] [Warning] No further splits with 
positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 12 [LightGBM] [Debug] Re-bagging, using 7361 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7349 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a 
tree with leaves = 32 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7416 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7401 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7469 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree 
with leaves = 31 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7643 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 12 [LightGBM] [Debug] Re-bagging, using 7559 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 37 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 Trial 95, Fold 3: Log loss = 0.2742493392445868, Average precision = 0.9705207772076768, ROC-AUC = 0.9650506291473644, Elapsed Time = 1.686474399999497 seconds Trial 95, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 95, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 
2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791090 [LightGBM] [Info] Total Bins 24142 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 7523 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7587 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 10 [LightGBM] [Debug] Re-bagging, using 
7457 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7457 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 29 and depth = 7 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7587 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Warning] No further splits with 
positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 30 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7530 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7537 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 36 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 10 [LightGBM] [Debug] Re-bagging, using 7597 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with 
positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 8 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 9 [LightGBM] [Debug] Re-bagging, using 7534 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 32 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 8 [LightGBM] [Debug] Re-bagging, using 7339 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 35 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 31 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 34 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 33 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a 
[LightGBM per-tree debug output truncated: repeated "[Warning] No further splits with positive gain, best gain: -inf", "[Debug] Trained a tree with leaves = ... and depth = ...", and "[Debug] Re-bagging, using ... data to train" messages omitted]
Trial 95, Fold 4: Log loss = 0.2786265672968445, Average precision = 0.9688874155910504, ROC-AUC = 0.9626656568691268, Elapsed Time = 1.6847445999992487 seconds
Trial 95, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 95, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[LightGBM] [Info] Number of positive: 10150, number of negative: 10500
[LightGBM] [Info] Total Bins 24146
[LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902
[LightGBM] [Info] Start training from score -0.033902
[LightGBM per-tree debug output truncated]
Trial 95, Fold 5: Log loss = 0.2833765556609774, Average precision = 0.9667874212231111, ROC-AUC = 0.96182643107965, Elapsed Time = 1.6412778000012622 seconds
Optimization Progress: 96%|#########6| 96/100 [22:33<00:58, 14.65s/it]
Trial 96, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 96, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 25868
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[LightGBM per-tree debug output truncated]
Trial 96, Fold 1: Log loss = 0.19265856405157333, Average precision = 0.9761579744323005, ROC-AUC = 0.9717249955788799, Elapsed Time = 2.0507928999995784 seconds
Trial 96, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 96, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Info] Total Bins 25873
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
[LightGBM per-tree debug output truncated]
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 72 and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 17 [LightGBM] [Debug] Re-bagging, using 10125 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 23 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 18 [LightGBM] [Debug] Re-bagging, using 10184 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 17 [LightGBM] [Warning] No further splits with positive 
gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 19 [LightGBM] [Debug] Re-bagging, using 9899 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 25 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 19 [LightGBM] [Debug] Re-bagging, using 9967 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a 
tree with leaves = 65 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 22 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 21 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 21 Trial 96, Fold 2: Log loss = 0.18734632777382243, Average precision = 0.9734495986173171, ROC-AUC = 0.9719588302188112, Elapsed Time = 2.1794813000014983 seconds Trial 96, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 96, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 [LightGBM] [Info] Number of positive: 10165, number of negative: 10517 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791401 [LightGBM] [Info] Total Bins 25869 [LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043 [LightGBM] [Info] Start training from score -0.034043 [LightGBM] [Debug] Re-bagging, using 10177 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 40 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 46 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 10 [LightGBM] [Warning] 
No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 12 [LightGBM] [Debug] Re-bagging, using 10134 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10120 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: 
-inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 12 [LightGBM] [Debug] Re-bagging, using 10021 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 11 [LightGBM] [Warning] No further splits 
with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 11 [LightGBM] [Debug] Re-bagging, using 10189 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 13 [LightGBM] [Debug] Re-bagging, using 10037 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] 
[Debug] Trained a tree with leaves = 57 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 14 [LightGBM] [Debug] Re-bagging, using 10125 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 14 [LightGBM] [Debug] Re-bagging, using 10114 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 71 and depth = 
14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 69 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 21 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 22 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 19 [LightGBM] [Debug] Re-bagging, using 10163 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a 
tree with leaves = 74 and depth = 15 [LightGBM] [Debug] Re-bagging, using 9899 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 21 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 21 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 73 and depth = 17 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 68 and depth = 18 [LightGBM] [Debug] Re-bagging, using 9959 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 67 and depth = 20 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 22 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 74 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 19 Trial 96, Fold 3: 
Log loss = 0.1883741298901994, Average precision = 0.9752283440379164, ROC-AUC = 0.9723419246178886, Elapsed Time = 2.1102914000002784 seconds Trial 96, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 96, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 [LightGBM] [Info] Number of positive: 10177, number of negative: 10479 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.791090 [LightGBM] [Info] Total Bins 25858 [LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 255 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243 [LightGBM] [Info] Start training from score -0.029243 [LightGBM] [Debug] Re-bagging, using 10163 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 42 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 48 and depth = 9 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 51 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 14 [LightGBM] [Debug] Re-bagging, using 10123 data to train [LightGBM] [Warning] 
No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 12 [LightGBM] [Debug] Re-bagging, using 10110 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 54 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 10 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and 
depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 13 [LightGBM] [Debug] Re-bagging, using 9999 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 50 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 52 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 53 and depth = 12 [LightGBM] [Debug] Re-bagging, using 10171 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 60 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 11 [LightGBM] [Warning] No further splits 
with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 55 and depth = 12 [LightGBM] [Debug] Re-bagging, using 10039 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 61 and depth = 11 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 14 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 58 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 65 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 57 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 15 [LightGBM] [Debug] Re-bagging, using 10117 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] 
[Debug] Trained a tree with leaves = 65 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 12 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 56 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 21 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 59 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 13 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 63 and depth = 12 [LightGBM] [Debug] Re-bagging, using 10106 data to train [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 16 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 62 and depth = 18 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 19 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 70 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 66 and depth = 15 [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Debug] Trained a tree with leaves = 64 and depth = 14 [LightGBM] [Warning] No further splits with positive 
[LightGBM verbose per-tree Debug/Warning output omitted]
Trial 96, Fold 4: Log loss = 0.19104105517645986, Average precision = 0.9752330648341077, ROC-AUC = 0.9707036512466887, Elapsed Time = 2.073414300000877 seconds
Trial 96, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 96, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[LightGBM] [Info] Number of positive: 10150, number of negative: 10500
[LightGBM] [Info] Total Bins 25865
[LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 255
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902
[LightGBM] [Info] Start training from score -0.033902
[LightGBM verbose per-tree Debug/Warning output omitted]
Trial 96, Fold 5: Log loss = 0.19329716201171468, Average precision = 0.9739603585368279, ROC-AUC = 0.9715274139308474, Elapsed Time = 2.0674993000011455 seconds
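The per-fold lines above report log loss, average precision, and ROC-AUC on the validation split. A minimal sketch of how such metrics can be computed with the scikit-learn functions imported at the top of this notebook (the labels and probabilities below are hypothetical, not taken from any trial):

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

# Hypothetical validation labels and predicted probabilities for one fold
y_val = np.array([0, 1, 1, 0, 1, 0, 1, 1])
p_val = np.array([0.1, 0.8, 0.7, 0.3, 0.9, 0.2, 0.6, 0.4])

lloss = log_loss(y_val, p_val)              # penalizes confident wrong predictions
ap = average_precision_score(y_val, p_val)  # area under the precision-recall curve
auc = roc_auc_score(y_val, p_val)           # ranking quality; here every positive outranks every negative, so 1.0

print(f"Log loss = {lloss}, Average precision = {ap}, ROC-AUC = {auc}")
```

Reporting all three per fold, as the log does, separates calibration quality (log loss) from ranking quality (AP, ROC-AUC).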
Optimization Progress: 97%|#########7| 97/100 [22:52<00:47, 15.98s/it]
Trial 97, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 97, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[LightGBM] [Info] Number of positive: 10130, number of negative: 10533
[LightGBM] [Info] Total Bins 29002
[LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 265
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012
[LightGBM] [Info] Start training from score -0.039012
[LightGBM verbose per-tree Debug output omitted]
Trial 97, Fold 1: Log loss = 0.3439567228683248, Average precision = 0.9642014557062354, ROC-AUC = 0.9580364013516679, Elapsed Time = 1.6739467000006698 seconds
Trial 97, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 97, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[LightGBM] [Info] Number of positive: 10230, number of negative: 10471
[LightGBM] [Info] Total Bins 29006
[LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 265
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285
[LightGBM] [Info] Start training from score -0.023285
[LightGBM verbose per-tree Debug output omitted]
Trial 97, Fold 2: Log loss = 0.3371119253550399, Average precision = 0.9651042928311725, ROC-AUC = 0.9614018885315396, Elapsed Time = 2.0464709000007133 seconds
Trial 97, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 97, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[LightGBM] [Info] Number of positive: 10165, number of negative: 10517
[LightGBM] [Info] Total Bins 29001
[LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 265
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043
[LightGBM] [Info] Start training from score -0.034043
[LightGBM verbose per-tree Debug output omitted]
Trial 97, Fold 3: Log loss = 0.3337444841372523, Average precision = 0.9682855889749199, ROC-AUC = 0.963323212720671, Elapsed Time = 2.080849899999521 seconds
Trial 97, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 97, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[LightGBM] [Info] Number of positive: 10177, number of negative: 10479
[LightGBM] [Info] Total Bins 28992
[LightGBM] [Info] Number of data points in the train set: 20656, number of used features: 265
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.492690 -> initscore=-0.029243
[LightGBM] [Info] Start training from score -0.029243
[LightGBM verbose per-tree Debug output omitted]
[LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 13 [LightGBM] [Debug] Re-bagging, using 19462 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 22 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 17 [LightGBM] [Debug] Re-bagging, using 19487 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 17 [LightGBM] [Debug] Re-bagging, using 19428 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 20 [LightGBM] [Debug] Re-bagging, using 19465 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 24 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 17 [LightGBM] [Debug] Re-bagging, using 19426 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Re-bagging, using 19444 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 16 [LightGBM] [Debug] Re-bagging, using 19483 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 15 [LightGBM] [Debug] Re-bagging, using 19451 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 22 
[LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 21 [LightGBM] [Debug] Re-bagging, using 19463 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 18 [LightGBM] [Debug] Re-bagging, using 19422 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 23 [LightGBM] [Debug] Re-bagging, using 19523 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 21 [LightGBM] [Debug] Re-bagging, using 19498 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 17 [LightGBM] [Debug] Re-bagging, using 19521 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 18 [LightGBM] [Debug] Re-bagging, using 19399 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 24 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 16 [LightGBM] [Debug] Re-bagging, using 19422 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 16 [LightGBM] [Debug] Re-bagging, using 19464 data to train 
[LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Re-bagging, using 19488 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 22 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 23 [LightGBM] [Debug] Re-bagging, using 19512 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 22 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 22 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 Trial 97, Fold 4: Log loss = 0.3345920525712018, Average precision = 0.9672589026802049, ROC-AUC = 0.961008492075414, Elapsed Time = 2.0492997000001196 seconds Trial 97, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 97, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 [LightGBM] [Info] Number of positive: 10150, number of negative: 10500 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.804939 [LightGBM] [Info] Total Bins 28998 [LightGBM] [Info] Number of data points in the train set: 20650, number of used features: 265 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491525 -> initscore=-0.033902 [LightGBM] [Info] Start training from score -0.033902 [LightGBM] [Debug] Re-bagging, using 19484 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 24 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 23 [LightGBM] [Debug] Re-bagging, using 19440 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 17 [LightGBM] 
[Debug] Re-bagging, using 19503 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 26 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 17 [LightGBM] [Debug] Re-bagging, using 19460 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Re-bagging, using 19463 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 21 [LightGBM] [Debug] Re-bagging, using 19465 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 16 [LightGBM] [Debug] Re-bagging, using 19433 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Re-bagging, using 19454 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 23 [LightGBM] [Debug] Re-bagging, using 19478 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 18 [LightGBM] [Debug] Re-bagging, using 19423 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] 
[Debug] Trained a tree with leaves = 209 and depth = 16 [LightGBM] [Debug] Re-bagging, using 19459 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 24 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 25 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 20 [LightGBM] [Debug] Re-bagging, using 19422 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 22 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 16 [LightGBM] [Debug] Re-bagging, using 19437 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Re-bagging, using 19477 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 27 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 18 [LightGBM] [Debug] Re-bagging, using 19449 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 21 [LightGBM] [Debug] Re-bagging, using 19453 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 24 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 22 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 20 [LightGBM] [Debug] Re-bagging, using 19419 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Re-bagging, using 19518 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 21 [LightGBM] 
[Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 16 [LightGBM] [Debug] Re-bagging, using 19489 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 24 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Re-bagging, using 19518 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 26 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Re-bagging, using 19393 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 22 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Re-bagging, using 19419 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 25 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 36 [LightGBM] [Debug] Re-bagging, using 19452 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 15 [LightGBM] [Debug] Re-bagging, using 19487 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 22 [LightGBM] [Debug] Re-bagging, using 19507 data to train [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 209 and depth = 16 Trial 97, Fold 5: Log loss = 0.34868921272817555, Average precision = 
0.9628221406956591, ROC-AUC = 0.9572423685041711, Elapsed Time = 2.0245357000003423 seconds
Optimization Progress: 98%|#########8| 98/100 [23:10<00:32, 16.41s/it]
Trial 98, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371 Trial 98, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913 [LightGBM] [Info] Number of positive: 10130, number of negative: 10533 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.805817 [LightGBM] [Info] Total Bins 7969 [LightGBM] [Info] Number of data points in the train set: 20663, number of used features: 267 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490248 -> initscore=-0.039012 [LightGBM] [Info] Start training from score -0.039012 [LightGBM] [Debug] Re-bagging, using 6760 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 13 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 12 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 17 [LightGBM] [Debug] Re-bagging, using 6778 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 13 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 12 [LightGBM] [Debug] Re-bagging, using 6679 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 13 [LightGBM] [Debug] Trained a tree with leaves = 
89 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 13 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 14 [LightGBM] [Debug] Re-bagging, using 6628 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 25 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Re-bagging, using 6807 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 14 [LightGBM] [Debug] Re-bagging, using 6755 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 25 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 21 [LightGBM] [Debug] Trained a 
tree with leaves = 89 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Re-bagging, using 6700 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 17 Trial 98, Fold 1: Log loss = 0.2059870375568024, Average precision = 0.9736278666055719, ROC-AUC = 0.9687441748756112, Elapsed Time = 1.1208126999990782 seconds Trial 98, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 98, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986 [LightGBM] [Info] Number of positive: 10230, number of negative: 10471 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.806987 [LightGBM] [Info] Total Bins 7985 [LightGBM] [Info] Number of data points in the train set: 20701, number of used features: 267 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.494179 -> initscore=-0.023285 [LightGBM] [Info] Start training from score -0.023285 [LightGBM] [Debug] Re-bagging, using 6773 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 13 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 12 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Re-bagging, using 6795 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 
89 and depth = 13 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 12 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 13 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 16 [LightGBM] [Debug] Re-bagging, using 6687 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Re-bagging, using 6643 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 20 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 20 [LightGBM] [Debug] Re-bagging, using 6835 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 24 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 21 [LightGBM] [Debug] Trained a 
tree with leaves = 89 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 19 [LightGBM] [Debug] Re-bagging, using 6744 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 26 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 20 [LightGBM] [Debug] Re-bagging, using 6713 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 19 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 17 Trial 98, Fold 2: Log loss = 0.1950855705719745, Average precision = 0.9740338406308985, ROC-AUC = 0.9714351486111586, Elapsed Time = 1.53894020000007 seconds Trial 98, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 98, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 [LightGBM] [Info] Number of positive: 10165, number of negative: 10517 [LightGBM] [Debug] Dataset::GetMultiBinFromSparseFeatures: sparse rate 0.806893 [LightGBM] [Info] Total Bins 7992 [LightGBM] [Info] Number of data points in the train set: 20682, number of used features: 267 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.491490 -> initscore=-0.034043 [LightGBM] [Info] Start training from score -0.034043 [LightGBM] [Debug] Re-bagging, using 6766 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 12 [LightGBM] [Debug] Trained a tree with leaves = 89 
and depth = 13 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 14 [LightGBM] [Debug] Re-bagging, using 6785 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 21 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 19 [LightGBM] [Debug] Re-bagging, using 6681 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 13 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 15 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 12 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 16 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 14 [LightGBM] [Debug] Re-bagging, using 6641 data to train [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 22 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 14 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 24 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 18 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 17 [LightGBM] [Debug] Trained a tree with leaves = 89 and depth = 14 [LightGBM] [Debug] Trained a tree 
[LightGBM per-tree debug output trimmed: trees trained with 89 leaves, depths 11-30]
Trial 98, Fold 3: Log loss = 0.19211088925521713, Average precision = 0.974235235731083, ROC-AUC = 0.9724076865290712, Elapsed Time = 1.3739041999997426 seconds
Trial 98, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 98, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
Trial 98, Fold 4: Log loss = 0.20014693335544856, Average precision = 0.97502289346449, ROC-AUC = 0.970527950051384, Elapsed Time = 1.3837733000000298 seconds
Trial 98, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 98, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
Trial 98, Fold 5: Log loss = 0.20402423794622532, Average precision = 0.9736777729507122, ROC-AUC = 0.9695342139977332, Elapsed Time = 1.3356522999984008 seconds
Optimization Progress: 99%|#########9| 99/100 [23:24<00:15, 15.83s/it]
[LightGBM per-tree debug output trimmed: trees trained with 31-40 leaves, depths 8-13]
Trial 99, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 99, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Trial 99, Fold 1: Log loss = 0.4228630313718012, Average precision = 0.9653912293403434, ROC-AUC = 0.9590646104873748, Elapsed Time = 0.658761600001526 seconds
Trial 99, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 99, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Trial 99, Fold 2: Log loss = 0.42258646154902085, Average precision = 0.9609599053042722, ROC-AUC = 0.9579277747310195, Elapsed Time = 0.7333526000002166 seconds
Trial 99, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 99, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Trial 99, Fold 3: Log loss = 0.42016989402690114, Average precision = 0.9661421572245269, ROC-AUC = 0.9603194584228216, Elapsed Time = 0.7783467999997811 seconds
Trial 99, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 99, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
Trial 99, Fold 4: Log loss = 0.42064763482572876, Average precision = 0.9653052217148508, ROC-AUC = 0.9584859355589415, Elapsed Time = 0.8043866000007256 seconds
Trial 99, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 99, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
Trial 99, Fold 5: Log loss = 0.42703576123191167, Average precision = 0.9626720951126174, ROC-AUC = 0.9566728721921856, Elapsed Time = 0.772207600000911 seconds
Optimization Progress: 100%|##########| 100/100 [23:36<00:00, 14.51s/it]
Optuna Optimization Elapsed Time: 1416.1865574000003 seconds
Training with Best Trial 68
[LightGBM] [Warning] Found whitespace in feature_names, replace with underlines
[LightGBM per-tree debug output trimmed: trees trained with 81 leaves, depths 12-28]
Training Elapsed Time: 3.8860796999997547 seconds
Log loss: (Train) 0.18691289880514814 vs (Test) 0.19337372842261577
PR-AUC: (Train) 0.9769836663793002 vs (Test) 0.9751402347947055
ROC-AUC: (Train) 0.9735624134708025 vs (Test) 0.9717662835980452
save_results(clf_name = "LGBM",
best_trials = best_trials_lgbm,
exec_time = exec_time_lgbm,
lloss_auc_train = lloss_auc_train_lgbm,
lloss_auc_test = lloss_auc_test_lgbm,
df_metrics = df_metrics_lgbm,
cm_final = cm_final_lgbm,
cm_all = cm_lgbm_all,
cm_labels = cm_labels_lgbm_all)
Optuna with HistGradientBoostingClassifier¶
gc.collect();
X_df = clean_df.drop(columns = ["target", "anon_ssn"])
y_df = clean_df.target
anon_ssn = clean_df.anon_ssn;
# A single train-test split (80%-20%) using GroupShuffleSplit, ensuring that no borrower (grouped by anon_ssn) appears in both sets
gss = GroupShuffleSplit(n_splits = 1, test_size = 0.2, random_state = seed)
train_idx, test_idx = next(gss.split(X_df, y_df, groups = anon_ssn))
X_train, X_test = X_df.iloc[train_idx], X_df.iloc[test_idx]
y_train, y_test = y_df.iloc[train_idx], y_df.iloc[test_idx]
anon_ssn_train = anon_ssn.iloc[train_idx] # Keeping track of anon_ssn for cross-validation; .iloc since train_idx holds positions, not labels
del X_df, y_df, gss, train_idx, test_idx;
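As a quick sanity check on the grouped split above, a minimal sketch with synthetic data (the 20-group `groups` array is made up) confirming that `GroupShuffleSplit` assigns each group to exactly one side of the split:

```python
import numpy as np
from sklearn.model_selection import GroupShuffleSplit

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = rng.integers(0, 2, size=100)
groups = rng.integers(0, 20, size=100)  # 20 synthetic "borrowers"

gss = GroupShuffleSplit(n_splits=1, test_size=0.2, random_state=0)
train_idx, test_idx = next(gss.split(X, y, groups=groups))

# No group appears on both sides of the split
overlap = set(groups[train_idx]) & set(groups[test_idx])
print(overlap)  # -> set()
```

Note that `test_size=0.2` applies to groups, so the realized row-level test fraction can deviate from 20% when group sizes are uneven.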
def objective(trial):
# https://scikit-learn.org/stable/modules/generated/sklearn.ensemble.HistGradientBoostingClassifier.html
# https://scikit-learn.org/stable/modules/ensemble.html#categorical-support-gbdt
param = {"loss": "log_loss",
"verbose": 2,
"learning_rate": trial.suggest_float("learning_rate", 1e-2, 1e-1, log = True),
"max_iter": trial.suggest_int("max_iter", 5, 100),
"max_leaf_nodes": trial.suggest_int("max_leaf_nodes", 2, 256),
"min_samples_leaf": trial.suggest_int("min_samples_leaf", 5, 200),
"l2_regularization": trial.suggest_float("l2_regularization", 1e-8, 1e1, log = True),
"class_weight": trial.suggest_categorical("class_weight", [None, "balanced"]), # classes are balance
"random_state": seed,
"categorical_features": "from_dtype",
"max_depth": None,
"max_features": trial.suggest_float("max_features", 1e-1, 1e0),
"max_bins": trial.suggest_int("max_bins", 40, 255),
#"monotonic_cst": trial.suggest_categorical("monotonic_cst", [None, -1, 0, 1]),
"interaction_cst": trial.suggest_categorical("interaction_cst",
["pairwise"
,"no_interactions"]),
"warm_start": trial.suggest_categorical("warm_start",[False, True])
}
sgkf = StratifiedGroupKFold(n_splits = 5, shuffle = True, random_state = seed)
lloss_scores, pr_auc_scores, roc_auc_scores = [], [], []
for fold_idx, (train_index, valid_index) in enumerate(sgkf.split(X_train, y_train, groups = anon_ssn_train), start = 1):
# Extract train and validation sets
X_train_fold, X_valid_fold = X_train.iloc[train_index], X_train.iloc[valid_index]
y_train_fold, y_valid_fold = y_train.iloc[train_index], y_train.iloc[valid_index]
# Summarize the composition of classes in the train and validation sets
train_0, train_1 = len(y_train_fold[y_train_fold == 0]), len(y_train_fold[y_train_fold == 1])
valid_0, valid_1 = len(y_valid_fold[y_valid_fold == 0]), len(y_valid_fold[y_valid_fold == 1])
print(f'Trial {trial.number}, Fold {fold_idx}: Train size = {len(train_index)} where 0 = {train_0}, 1 = {train_1}, 0/1 = {train_0/train_1}')
print(f'Trial {trial.number}, Fold {fold_idx}: Validation size = {len(valid_index)} where 0 = {valid_0}, 1 = {valid_1}, 0/1 = {valid_0/valid_1}')
clf = HistGradientBoostingClassifier(**param)
start_fold = time.perf_counter()
clf.fit(X_train_fold, y_train_fold)
end_fold = time.perf_counter()
# Predict probabilities
y_prob_fold = clf.predict_proba(X_valid_fold)[:, 1]
print(f'Trial {trial.number}, Fold {fold_idx}: '
f'Log loss = {log_loss(y_valid_fold, y_prob_fold)}, '
f'Average precision = {average_precision_score(y_valid_fold, y_prob_fold)}, '
f'ROC-AUC = {roc_auc_score(y_valid_fold, y_prob_fold)}, '
f'Elapsed Time = {end_fold - start_fold} seconds')
# Calculate and store the evaluation metrics for this fold
lloss_scores.append(log_loss(y_valid_fold, y_prob_fold))
pr_auc_scores.append(average_precision_score(y_valid_fold, y_prob_fold))
roc_auc_scores.append(roc_auc_score(y_valid_fold, y_prob_fold))
del X_train_fold, X_valid_fold, y_train_fold, y_valid_fold, clf, start_fold, end_fold
gc.collect()
# Calculate average metrics across all folds for Optuna to optimize
mean_lloss = np.mean(lloss_scores)
mean_pr_auc = np.mean(pr_auc_scores)
mean_roc_auc = np.mean(roc_auc_scores)
del lloss_scores, pr_auc_scores, roc_auc_scores
gc.collect()
# Return the metrics to Optuna for optimization
return mean_lloss, mean_pr_auc, mean_roc_auc
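Because the objective returns three values (log loss to minimize, PR-AUC and ROC-AUC to maximize), Optuna treats the study as multi-objective, and `study.best_trials` holds the Pareto-optimal (non-dominated) set rather than a single winner. A minimal, library-independent sketch of that non-domination rule, over hypothetical `(log loss, PR-AUC, ROC-AUC)` tuples:

```python
def pareto_front(trials):
    """Indices of non-dominated trials for objectives
    (log loss: minimize, PR-AUC: maximize, ROC-AUC: maximize)."""
    def dominates(a, b):
        # a dominates b if it is no worse on every objective
        # and strictly better on at least one
        no_worse = a[0] <= b[0] and a[1] >= b[1] and a[2] >= b[2]
        strictly = a[0] < b[0] or a[1] > b[1] or a[2] > b[2]
        return no_worse and strictly
    return [i for i, t in enumerate(trials)
            if not any(dominates(u, t)
                       for j, u in enumerate(trials) if j != i)]

# Hypothetical metric tuples for three trials; the third is dominated
# by both of the others, so only the first two are Pareto-optimal.
trials = [(0.37, 0.95, 0.94), (0.36, 0.94, 0.95), (0.40, 0.93, 0.93)]
print(pareto_front(trials))  # → [0, 1]
```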
trial_progress = tqdm(total = n_trials, desc = "Optimization Progress", leave = True,
ascii = True, # Plain text mode
dynamic_ncols = True # Auto-fit width
)
def update_progress(study_hgbc, trial):
trial_progress.update(1)
optuna.logging.disable_default_handler()
optuna.logging.enable_propagation()
optuna.logging.set_verbosity(optuna.logging.DEBUG)
logging.basicConfig(filename = "optuna_debug_HistGradientBoostingClassifier.log", filemode = "w", level = logging.DEBUG, format="%(asctime)s %(levelname)s %(message)s")
study_hgbc = optuna.create_study(study_name = "Optuna with HistGradientBoostingClassifier",
directions = ["minimize", "maximize", "maximize"],
sampler = module.AutoSampler(seed = seed)
)
start_optuna = time.perf_counter()
study_hgbc.optimize(objective, n_trials = n_trials, n_jobs = 1, callbacks = [update_progress])
end_optuna = time.perf_counter()
print(f'Optuna Optimization Elapsed Time: {end_optuna - start_optuna} seconds')
fig = plot_pareto_front(study_hgbc, target_names = ["Log loss", "PR-AUC", "ROC-AUC"])
fig.update_layout(width = 900, height = 400) # Set desired width and height in pixels
fig.show()
trial_progress.close()
metrics = ["Log loss", "PR-AUC", "ROC-AUC"]
for i, obj in enumerate(metrics):
optuna.visualization.plot_optimization_history(study_hgbc,
target = lambda t: t.values[i], # Correctly target each objective
target_name = obj).show()
best_trials = study_hgbc.best_trials
best_trials_hgbc = {}
exec_time_hgbc, lloss_auc_train_hgbc, lloss_auc_test_hgbc, all_metrics = [], [], [], []
cm_hgbc_all, cm_labels_hgbc_all = [], []
for i, trial in enumerate(best_trials):
display(Markdown(f'<span style = "font-size: 18px; font-weight: bold;"> Training with Best Trial {trial.number}: </span>'))
best_params = trial.params
# Non-optimized and best Optuna optimized parameters
full_params = {"loss": "log_loss",
"verbose": 2,
"random_state": seed,
"categorical_features": "from_dtype",
"max_depth": None,
**best_params
}
print("Full_params:", full_params)
best_trials_hgbc[trial.number] = full_params
final_hgbc = HistGradientBoostingClassifier(**full_params)
start_train = time.perf_counter()
final_hgbc.fit(X_train, y_train)
end_train = time.perf_counter()
print(f'Training Elapsed Time: {end_train - start_train} seconds')
exec_time_hgbc.append({"Classifier": "HistGradientBoostingClassifier",
"Best Trial": trial.number,
"Optimization Elapsed Time (s)": end_optuna - start_optuna,
"Training Elapsed Time (s)": end_train - start_train})
y_prob_all = final_hgbc.predict_proba(X_test)[:, 1]
y_pred_all = final_hgbc.predict(X_test)
print(f'Log loss: (Train) {trial.values[0]} vs (Test) {log_loss(y_test, y_prob_all)}')
print(f'PR-AUC: (Train) {trial.values[1]} vs (Test) {average_precision_score(y_test, y_prob_all)}')
print(f'ROC-AUC: (Train) {trial.values[2]} vs (Test) {roc_auc_score(y_test, y_prob_all)}')
lloss_auc_train_hgbc.append({"Classifier": "HistGradientBoostingClassifier",
"Best Trial": trial.number,
"Set": "Training",
"Log loss": trial.values[0],
"PR-AUC": trial.values[1],
"ROC-AUC": trial.values[2]})
lloss_auc_test_hgbc.append({"Classifier": "HistGradientBoostingClassifier",
"Best Trial": trial.number,
"Set": "Test",
"Log loss": log_loss(y_test, y_prob_all),
"PR-AUC": average_precision_score(y_test, y_prob_all),
"ROC-AUC": roc_auc_score(y_test, y_prob_all)})
report = classification_report(y_test, y_pred_all, target_names = ["Safe", "Risky"], output_dict = True)
all_metrics.append({"Classifier": "HistGradientBoostingClassifier",
"Trial": trial.number,
"Accuracy": accuracy_score(y_test, y_pred_all),
"Precision (Safe)": report["Safe"]["precision"],
"Recall (Safe)": report["Safe"]["recall"],
"F1-score (Safe)": report["Safe"]["f1-score"],
"Precision (Risky)": report["Risky"]["precision"],
"Recall (Risky)": report["Risky"]["recall"],
"F1-score (Risky)": report["Risky"]["f1-score"],
"Precision (Macro avg)": report["macro avg"]["precision"],
"Recall (Macro avg)": report["macro avg"]["recall"],
"F1-score (Macro avg)": report["macro avg"]["f1-score"],
"Precision (Weighted avg)": report["weighted avg"]["precision"],
"Recall (Weighted avg)": report["weighted avg"]["recall"],
"F1-score (Weighted avg)": report["weighted avg"]["f1-score"]})
# Store confusion matrix
cm_final_hgbc = confusion_matrix(y_test, y_pred_all)
cm_hgbc_all.append(cm_final_hgbc)
cm_labels_hgbc_all.append(f'HistGradientBoostingClassifier Confusion Matrix for Best Trial {trial.number}') # Store label for subplots
df_metrics_hgbc = pd.DataFrame(all_metrics)
gc.collect();
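The metric rows above index into the nested dict returned by `classification_report(..., output_dict=True)`. A minimal sketch with toy labels (hypothetical values) showing that layout:

```python
from sklearn.metrics import classification_report

# Toy labels, purely illustrative
y_true = [0, 0, 1, 1, 1, 0]
y_pred = [0, 1, 1, 1, 0, 0]
report = classification_report(y_true, y_pred,
                               target_names=["Safe", "Risky"],
                               output_dict=True)

# Per-class and aggregate entries are nested dicts keyed by metric name,
# e.g. report["Risky"]["recall"] = fraction of true Risky loans caught.
print(sorted(report.keys()))
```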
Optimization Progress:   0%|          | 0/100 [00:00<?, ?it/s]
Trial 0, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 0, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Trial 0, Fold 1: Log loss = 0.3744440833225494, Average precision = 0.9491437057327736, ROC-AUC = 0.9445759578296841, Elapsed Time = 1.0311564000003273 seconds
Trial 0, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 0, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Trial 0, Fold 2: Log loss = 0.375627800803536, Average precision = 0.9455558130432133, ROC-AUC = 0.9450537961074265, Elapsed Time = 0.9430072000013752 seconds
Trial 0, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 0, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Trial 0, Fold 3: Log loss = 0.36979576733385733, Average precision = 0.9509140324614288, ROC-AUC = 0.9483565090146489, Elapsed Time = 0.9244606000011117 seconds
Trial 0, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 0, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[Per-round HistGradientBoostingClassifier verbose fitting output truncated]
0.41889, val loss: 0.40173, in 0.016s 1 tree, 19 leaves, max depth = 8, train loss: 0.41613, val loss: 0.39910, in 0.016s 1 tree, 7 leaves, max depth = 5, train loss: 0.41465, val loss: 0.39742, in 0.016s 1 tree, 19 leaves, max depth = 8, train loss: 0.41326, val loss: 0.39579, in 0.000s 1 tree, 19 leaves, max depth = 8, train loss: 0.41064, val loss: 0.39331, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.40892, val loss: 0.39146, in 0.016s 1 tree, 19 leaves, max depth = 8, train loss: 0.40727, val loss: 0.38990, in 0.016s 1 tree, 19 leaves, max depth = 8, train loss: 0.40485, val loss: 0.38759, in 0.000s 1 tree, 19 leaves, max depth = 6, train loss: 0.40342, val loss: 0.38621, in 0.000s 1 tree, 19 leaves, max depth = 8, train loss: 0.40217, val loss: 0.38475, in 0.016s 1 tree, 19 leaves, max depth = 10, train loss: 0.40103, val loss: 0.38338, in 0.016s 1 tree, 19 leaves, max depth = 6, train loss: 0.39968, val loss: 0.38209, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.39809, val loss: 0.38037, in 0.016s 1 tree, 19 leaves, max depth = 6, train loss: 0.39686, val loss: 0.37911, in 0.000s 1 tree, 19 leaves, max depth = 8, train loss: 0.39472, val loss: 0.37709, in 0.016s 1 tree, 19 leaves, max depth = 7, train loss: 0.39267, val loss: 0.37517, in 0.016s 1 tree, 19 leaves, max depth = 9, train loss: 0.39158, val loss: 0.37386, in 0.000s 1 tree, 19 leaves, max depth = 8, train loss: 0.39029, val loss: 0.37265, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.38878, val loss: 0.37102, in 0.000s 1 tree, 19 leaves, max depth = 7, train loss: 0.38771, val loss: 0.36993, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.38627, val loss: 0.36837, in 0.016s 1 tree, 19 leaves, max depth = 7, train loss: 0.38525, val loss: 0.36718, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.38388, val loss: 0.36570, in 0.016s 1 tree, 19 leaves, max depth = 9, train loss: 0.38297, val loss: 0.36462, in 0.016s 1 tree, 19 leaves, max depth = 9, 
train loss: 0.38207, val loss: 0.36354, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.38077, val loss: 0.36214, in 0.016s 1 tree, 19 leaves, max depth = 7, train loss: 0.37990, val loss: 0.36111, in 0.000s 1 tree, 19 leaves, max depth = 8, train loss: 0.37802, val loss: 0.35936, in 0.016s 1 tree, 19 leaves, max depth = 9, train loss: 0.37701, val loss: 0.35835, in 0.016s 1 tree, 19 leaves, max depth = 8, train loss: 0.37526, val loss: 0.35672, in 0.000s Fit 84 trees in 1.126 s, (1461 total leaves) Time spent computing histograms: 0.462s Time spent finding best splits: 0.064s Time spent applying splits: 0.050s Time spent predicting: 0.000s Trial 0, Fold 4: Log loss = 0.37527621904651487, Average precision = 0.9487464960271949, ROC-AUC = 0.9447211104911646, Elapsed Time = 1.1381462000008469 seconds Trial 0, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 0, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.203 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 19 leaves, max depth = 10, train loss: 0.67998, val loss: 0.67905, in 0.016s 1 tree, 19 leaves, max depth = 8, train loss: 0.66756, val loss: 0.66584, in 0.000s 1 tree, 19 leaves, max depth = 9, train loss: 0.65575, val loss: 0.65320, in 0.016s 1 tree, 7 leaves, max depth = 5, train loss: 0.64456, val loss: 0.64124, in 0.000s 1 tree, 19 leaves, max depth = 8, train loss: 0.63405, val loss: 0.62998, in 0.016s 1 tree, 19 leaves, max depth = 7, train loss: 0.62408, val loss: 0.61925, in 0.000s 1 tree, 19 leaves, max depth = 9, train loss: 0.61465, val loss: 0.60911, in 0.000s 1 tree, 19 leaves, max depth = 10, train loss: 0.60571, val loss: 0.59947, in 0.016s 1 tree, 19 leaves, max depth = 7, train loss: 0.59715, val loss: 0.59025, in 0.000s 1 tree, 19 leaves, max depth = 8, train loss: 0.58892, val loss: 0.58133, in 0.016s 1 tree, 19 leaves, max depth = 7, train loss: 
0.58125, val loss: 0.57300, in 0.016s 1 tree, 19 leaves, max depth = 9, train loss: 0.57398, val loss: 0.56510, in 0.000s 1 tree, 19 leaves, max depth = 9, train loss: 0.56707, val loss: 0.55758, in 0.000s 1 tree, 19 leaves, max depth = 9, train loss: 0.56052, val loss: 0.55037, in 0.016s 1 tree, 19 leaves, max depth = 9, train loss: 0.55426, val loss: 0.54354, in 0.000s 1 tree, 19 leaves, max depth = 9, train loss: 0.54799, val loss: 0.53751, in 0.016s 1 tree, 19 leaves, max depth = 9, train loss: 0.54209, val loss: 0.53104, in 0.000s 1 tree, 19 leaves, max depth = 9, train loss: 0.53648, val loss: 0.52487, in 0.016s 1 tree, 6 leaves, max depth = 4, train loss: 0.53111, val loss: 0.51897, in 0.016s 1 tree, 19 leaves, max depth = 7, train loss: 0.52593, val loss: 0.51324, in 0.000s 1 tree, 19 leaves, max depth = 8, train loss: 0.52100, val loss: 0.50778, in 0.016s 1 tree, 19 leaves, max depth = 8, train loss: 0.51631, val loss: 0.50257, in 0.000s 1 tree, 19 leaves, max depth = 8, train loss: 0.51183, val loss: 0.49759, in 0.016s 1 tree, 19 leaves, max depth = 8, train loss: 0.50757, val loss: 0.49283, in 0.000s 1 tree, 19 leaves, max depth = 9, train loss: 0.50358, val loss: 0.48837, in 0.016s 1 tree, 19 leaves, max depth = 9, train loss: 0.49978, val loss: 0.48410, in 0.000s 1 tree, 19 leaves, max depth = 7, train loss: 0.49620, val loss: 0.48005, in 0.016s 1 tree, 19 leaves, max depth = 10, train loss: 0.49278, val loss: 0.47621, in 0.016s 1 tree, 19 leaves, max depth = 10, train loss: 0.48953, val loss: 0.47254, in 0.000s 1 tree, 19 leaves, max depth = 7, train loss: 0.48449, val loss: 0.46783, in 0.016s 1 tree, 19 leaves, max depth = 11, train loss: 0.48146, val loss: 0.46439, in 0.000s 1 tree, 19 leaves, max depth = 7, train loss: 0.47856, val loss: 0.46108, in 0.016s 1 tree, 19 leaves, max depth = 8, train loss: 0.47569, val loss: 0.45781, in 0.000s 1 tree, 19 leaves, max depth = 7, train loss: 0.47101, val loss: 0.45347, in 0.016s 1 tree, 19 leaves, max 
depth = 8, train loss: 0.46834, val loss: 0.45041, in 0.000s 1 tree, 19 leaves, max depth = 8, train loss: 0.46589, val loss: 0.44760, in 0.016s 1 tree, 19 leaves, max depth = 8, train loss: 0.46352, val loss: 0.44486, in 0.000s 1 tree, 19 leaves, max depth = 7, train loss: 0.45915, val loss: 0.44084, in 0.016s 1 tree, 19 leaves, max depth = 8, train loss: 0.45695, val loss: 0.43828, in 0.016s 1 tree, 19 leaves, max depth = 7, train loss: 0.45404, val loss: 0.43575, in 0.000s 1 tree, 19 leaves, max depth = 8, train loss: 0.45199, val loss: 0.43335, in 0.016s 1 tree, 19 leaves, max depth = 7, train loss: 0.44995, val loss: 0.43097, in 0.000s 1 tree, 19 leaves, max depth = 7, train loss: 0.44796, val loss: 0.42898, in 0.016s 1 tree, 19 leaves, max depth = 6, train loss: 0.44404, val loss: 0.42539, in 0.000s 1 tree, 19 leaves, max depth = 7, train loss: 0.44149, val loss: 0.42320, in 0.016s 1 tree, 19 leaves, max depth = 7, train loss: 0.43905, val loss: 0.42112, in 0.000s 1 tree, 19 leaves, max depth = 6, train loss: 0.43555, val loss: 0.41794, in 0.016s 1 tree, 19 leaves, max depth = 7, train loss: 0.43375, val loss: 0.41583, in 0.000s 1 tree, 19 leaves, max depth = 7, train loss: 0.43156, val loss: 0.41399, in 0.016s 1 tree, 19 leaves, max depth = 6, train loss: 0.42831, val loss: 0.41103, in 0.000s 1 tree, 19 leaves, max depth = 7, train loss: 0.42673, val loss: 0.40915, in 0.016s 1 tree, 19 leaves, max depth = 6, train loss: 0.42363, val loss: 0.40634, in 0.016s 1 tree, 19 leaves, max depth = 7, train loss: 0.42070, val loss: 0.40372, in 0.000s 1 tree, 19 leaves, max depth = 7, train loss: 0.41889, val loss: 0.40224, in 0.016s 1 tree, 19 leaves, max depth = 7, train loss: 0.41744, val loss: 0.40050, in 0.000s 1 tree, 19 leaves, max depth = 6, train loss: 0.41470, val loss: 0.39805, in 0.016s 1 tree, 7 leaves, max depth = 5, train loss: 0.41327, val loss: 0.39635, in 0.000s 1 tree, 19 leaves, max depth = 11, train loss: 0.41199, val loss: 0.39482, in 0.016s 1 
tree, 19 leaves, max depth = 6, train loss: 0.40938, val loss: 0.39249, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.40759, val loss: 0.39077, in 0.016s 1 tree, 19 leaves, max depth = 7, train loss: 0.40625, val loss: 0.38947, in 0.000s 1 tree, 19 leaves, max depth = 6, train loss: 0.40382, val loss: 0.38730, in 0.016s 1 tree, 19 leaves, max depth = 7, train loss: 0.40236, val loss: 0.38616, in 0.000s 1 tree, 19 leaves, max depth = 8, train loss: 0.40115, val loss: 0.38468, in 0.016s 1 tree, 19 leaves, max depth = 9, train loss: 0.40002, val loss: 0.38330, in 0.000s 1 tree, 19 leaves, max depth = 7, train loss: 0.39865, val loss: 0.38225, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.39698, val loss: 0.38065, in 0.000s 1 tree, 19 leaves, max depth = 8, train loss: 0.39579, val loss: 0.37948, in 0.000s 1 tree, 19 leaves, max depth = 6, train loss: 0.39363, val loss: 0.37758, in 0.016s 1 tree, 19 leaves, max depth = 7, train loss: 0.39157, val loss: 0.37576, in 0.000s 1 tree, 19 leaves, max depth = 7, train loss: 0.39049, val loss: 0.37445, in 0.016s 1 tree, 19 leaves, max depth = 6, train loss: 0.38919, val loss: 0.37320, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.38761, val loss: 0.37170, in 0.016s 1 tree, 19 leaves, max depth = 8, train loss: 0.38656, val loss: 0.37069, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.38506, val loss: 0.36925, in 0.016s 1 tree, 19 leaves, max depth = 8, train loss: 0.38407, val loss: 0.36804, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.38264, val loss: 0.36668, in 0.016s 1 tree, 19 leaves, max depth = 9, train loss: 0.38140, val loss: 0.36558, in 0.000s 1 tree, 19 leaves, max depth = 7, train loss: 0.38048, val loss: 0.36445, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.37913, val loss: 0.36316, in 0.000s 1 tree, 19 leaves, max depth = 8, train loss: 0.37824, val loss: 0.36206, in 0.016s 1 tree, 19 leaves, max depth = 7, train loss: 0.37640, val loss: 0.36047, 
in 0.016s 1 tree, 19 leaves, max depth = 8, train loss: 0.37544, val loss: 0.35956, in 0.000s 1 tree, 19 leaves, max depth = 6, train loss: 0.37372, val loss: 0.35808, in 0.016s Fit 84 trees in 1.063 s, (1475 total leaves) Time spent computing histograms: 0.406s Time spent finding best splits: 0.041s Time spent applying splits: 0.037s Time spent predicting: 0.000s Trial 0, Fold 5: Log loss = 0.3795476722528022, Average precision = 0.9488541635377791, ROC-AUC = 0.9447077271798302, Elapsed Time = 1.0641242000001512 seconds
Optimization Progress: 1% | 1/100 [00:12<21:16, 12.90s/it]
[verbose per-iteration output trimmed; fold-level results retained]
Trial 1, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0398
Trial 1, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0035
Trial 1, Fold 1: Fit 74 trees in 1.690 s (7096 total leaves)
Trial 1, Fold 1: Log loss = 0.30656, Average precision = 0.95777, ROC-AUC = 0.95218, Elapsed Time = 1.71 s
Trial 1, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0236
Trial 1, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0689
Trial 1, Fold 2: Fit 74 trees in 1.830 s (7413 total leaves)
Trial 1, Fold 2: Log loss = 0.30159, Average precision = 0.96004, ROC-AUC = 0.95710, Elapsed Time = 1.84 s
Trial 1, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.0346
Trial 1, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235
Trial 1, Fold 3: Fit 74 trees in 1.956 s (7280 total leaves)
Trial 1, Fold 3: Log loss = 0.29956, Average precision = 0.95822, ROC-AUC = 0.95504, Elapsed Time = 1.96 s
Trial 1, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0297
Trial 1, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0434
in 0.031s 1 tree, 104 leaves, max depth = 12, train loss: 0.57253, val loss: 0.56811, in 0.031s 1 tree, 99 leaves, max depth = 12, train loss: 0.56322, val loss: 0.55845, in 0.016s 1 tree, 113 leaves, max depth = 17, train loss: 0.55466, val loss: 0.54952, in 0.031s 1 tree, 102 leaves, max depth = 11, train loss: 0.54659, val loss: 0.54129, in 0.016s 1 tree, 76 leaves, max depth = 12, train loss: 0.53839, val loss: 0.53273, in 0.031s 1 tree, 99 leaves, max depth = 11, train loss: 0.53035, val loss: 0.52436, in 0.016s 1 tree, 131 leaves, max depth = 13, train loss: 0.52272, val loss: 0.51636, in 0.031s 1 tree, 100 leaves, max depth = 13, train loss: 0.51530, val loss: 0.50864, in 0.031s 1 tree, 99 leaves, max depth = 12, train loss: 0.50821, val loss: 0.50120, in 0.016s 1 tree, 102 leaves, max depth = 12, train loss: 0.50137, val loss: 0.49406, in 0.031s 1 tree, 133 leaves, max depth = 13, train loss: 0.49489, val loss: 0.48724, in 0.016s 1 tree, 101 leaves, max depth = 13, train loss: 0.48856, val loss: 0.48063, in 0.031s 1 tree, 102 leaves, max depth = 13, train loss: 0.48249, val loss: 0.47428, in 0.016s 1 tree, 134 leaves, max depth = 13, train loss: 0.47673, val loss: 0.46820, in 0.031s 1 tree, 137 leaves, max depth = 13, train loss: 0.47119, val loss: 0.46235, in 0.016s 1 tree, 133 leaves, max depth = 13, train loss: 0.46586, val loss: 0.45674, in 0.031s 1 tree, 125 leaves, max depth = 15, train loss: 0.45868, val loss: 0.44940, in 0.031s 1 tree, 97 leaves, max depth = 13, train loss: 0.45172, val loss: 0.44237, in 0.016s 1 tree, 107 leaves, max depth = 14, train loss: 0.44679, val loss: 0.43720, in 0.031s 1 tree, 95 leaves, max depth = 15, train loss: 0.44024, val loss: 0.43056, in 0.016s 1 tree, 139 leaves, max depth = 13, train loss: 0.43572, val loss: 0.42577, in 0.031s 1 tree, 107 leaves, max depth = 13, train loss: 0.43126, val loss: 0.42112, in 0.031s 1 tree, 111 leaves, max depth = 13, train loss: 0.42696, val loss: 0.41662, in 0.016s 1 tree, 105 
leaves, max depth = 12, train loss: 0.42281, val loss: 0.41228, in 0.016s 1 tree, 96 leaves, max depth = 15, train loss: 0.41701, val loss: 0.40642, in 0.031s 1 tree, 136 leaves, max depth = 13, train loss: 0.41322, val loss: 0.40242, in 0.016s 1 tree, 116 leaves, max depth = 17, train loss: 0.40963, val loss: 0.39868, in 0.031s 1 tree, 97 leaves, max depth = 15, train loss: 0.40426, val loss: 0.39328, in 0.016s 1 tree, 113 leaves, max depth = 13, train loss: 0.40071, val loss: 0.38955, in 0.016s 1 tree, 112 leaves, max depth = 14, train loss: 0.39727, val loss: 0.38594, in 0.031s 1 tree, 110 leaves, max depth = 14, train loss: 0.39396, val loss: 0.38245, in 0.016s 1 tree, 130 leaves, max depth = 14, train loss: 0.38920, val loss: 0.37763, in 0.031s 1 tree, 99 leaves, max depth = 16, train loss: 0.38451, val loss: 0.37292, in 0.016s 1 tree, 111 leaves, max depth = 14, train loss: 0.38146, val loss: 0.36973, in 0.031s 1 tree, 132 leaves, max depth = 14, train loss: 0.37716, val loss: 0.36538, in 0.016s 1 tree, 138 leaves, max depth = 14, train loss: 0.37438, val loss: 0.36240, in 0.031s 1 tree, 108 leaves, max depth = 12, train loss: 0.37160, val loss: 0.35948, in 0.016s 1 tree, 102 leaves, max depth = 14, train loss: 0.36750, val loss: 0.35538, in 0.032s 1 tree, 110 leaves, max depth = 14, train loss: 0.36487, val loss: 0.35263, in 0.016s 1 tree, 98 leaves, max depth = 15, train loss: 0.36097, val loss: 0.34874, in 0.031s 1 tree, 136 leaves, max depth = 12, train loss: 0.35857, val loss: 0.34619, in 0.016s 1 tree, 52 leaves, max depth = 14, train loss: 0.35510, val loss: 0.34261, in 0.016s 1 tree, 52 leaves, max depth = 14, train loss: 0.35178, val loss: 0.33917, in 0.016s 1 tree, 126 leaves, max depth = 16, train loss: 0.34908, val loss: 0.33648, in 0.031s 1 tree, 56 leaves, max depth = 13, train loss: 0.34595, val loss: 0.33324, in 0.016s 1 tree, 150 leaves, max depth = 18, train loss: 0.34328, val loss: 0.33054, in 0.031s 1 tree, 86 leaves, max depth = 13, train 
loss: 0.34026, val loss: 0.32767, in 0.016s 1 tree, 53 leaves, max depth = 14, train loss: 0.33738, val loss: 0.32468, in 0.016s 1 tree, 86 leaves, max depth = 14, train loss: 0.33455, val loss: 0.32203, in 0.016s 1 tree, 126 leaves, max depth = 18, train loss: 0.33218, val loss: 0.31967, in 0.031s 1 tree, 53 leaves, max depth = 14, train loss: 0.32953, val loss: 0.31689, in 0.016s 1 tree, 86 leaves, max depth = 14, train loss: 0.32693, val loss: 0.31445, in 0.031s 1 tree, 56 leaves, max depth = 14, train loss: 0.32446, val loss: 0.31189, in 0.016s 1 tree, 124 leaves, max depth = 17, train loss: 0.32227, val loss: 0.30969, in 0.016s 1 tree, 53 leaves, max depth = 14, train loss: 0.31993, val loss: 0.30726, in 0.031s 1 tree, 86 leaves, max depth = 14, train loss: 0.31758, val loss: 0.30508, in 0.016s 1 tree, 126 leaves, max depth = 14, train loss: 0.31551, val loss: 0.30303, in 0.016s 1 tree, 86 leaves, max depth = 14, train loss: 0.31329, val loss: 0.30093, in 0.031s 1 tree, 126 leaves, max depth = 14, train loss: 0.31132, val loss: 0.29898, in 0.016s 1 tree, 86 leaves, max depth = 14, train loss: 0.30922, val loss: 0.29703, in 0.016s 1 tree, 53 leaves, max depth = 14, train loss: 0.30719, val loss: 0.29493, in 0.031s 1 tree, 85 leaves, max depth = 14, train loss: 0.30522, val loss: 0.29307, in 0.016s 1 tree, 108 leaves, max depth = 14, train loss: 0.30369, val loss: 0.29145, in 0.016s 1 tree, 51 leaves, max depth = 13, train loss: 0.30182, val loss: 0.28948, in 0.031s 1 tree, 149 leaves, max depth = 17, train loss: 0.29991, val loss: 0.28754, in 0.031s Fit 74 trees in 2.096 s, (7611 total leaves) Time spent computing histograms: 0.623s Time spent finding best splits: 0.176s Time spent applying splits: 0.163s Time spent predicting: 0.016s Trial 1, Fold 4: Log loss = 0.3012262393830042, Average precision = 0.9599601894426121, ROC-AUC = 0.956310683228777, Elapsed Time = 2.0986773000004177 seconds Trial 1, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 
1.0344827586206897 Trial 1, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.173 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 66 leaves, max depth = 11, train loss: 0.67817, val loss: 0.67751, in 0.016s 1 tree, 115 leaves, max depth = 15, train loss: 0.66462, val loss: 0.66343, in 0.031s 1 tree, 94 leaves, max depth = 15, train loss: 0.65091, val loss: 0.64928, in 0.016s 1 tree, 97 leaves, max depth = 15, train loss: 0.63782, val loss: 0.63568, in 0.016s 1 tree, 99 leaves, max depth = 15, train loss: 0.62536, val loss: 0.62275, in 0.031s 1 tree, 101 leaves, max depth = 15, train loss: 0.61347, val loss: 0.61038, in 0.016s 1 tree, 100 leaves, max depth = 15, train loss: 0.60215, val loss: 0.59870, in 0.016s 1 tree, 100 leaves, max depth = 15, train loss: 0.59131, val loss: 0.58742, in 0.016s 1 tree, 126 leaves, max depth = 13, train loss: 0.58106, val loss: 0.57674, in 0.016s 1 tree, 99 leaves, max depth = 15, train loss: 0.57115, val loss: 0.56643, in 0.031s 1 tree, 97 leaves, max depth = 15, train loss: 0.56168, val loss: 0.55664, in 0.016s 1 tree, 130 leaves, max depth = 12, train loss: 0.55271, val loss: 0.54727, in 0.016s 1 tree, 128 leaves, max depth = 14, train loss: 0.54411, val loss: 0.53830, in 0.031s 1 tree, 103 leaves, max depth = 15, train loss: 0.53575, val loss: 0.52959, in 0.016s 1 tree, 106 leaves, max depth = 15, train loss: 0.52774, val loss: 0.52126, in 0.016s 1 tree, 90 leaves, max depth = 13, train loss: 0.52066, val loss: 0.51381, in 0.016s 1 tree, 106 leaves, max depth = 15, train loss: 0.51326, val loss: 0.50609, in 0.016s 1 tree, 129 leaves, max depth = 15, train loss: 0.50625, val loss: 0.49876, in 0.016s 1 tree, 122 leaves, max depth = 15, train loss: 0.49976, val loss: 0.49204, in 0.031s 1 tree, 120 leaves, max depth = 15, train loss: 0.49354, val loss: 0.48558, in 0.016s 1 tree, 109 leaves, max depth = 15, train loss: 0.48717, val 
loss: 0.47899, in 0.016s 1 tree, 110 leaves, max depth = 15, train loss: 0.48104, val loss: 0.47262, in 0.016s 1 tree, 112 leaves, max depth = 15, train loss: 0.47514, val loss: 0.46647, in 0.031s 1 tree, 82 leaves, max depth = 13, train loss: 0.46768, val loss: 0.45885, in 0.016s 1 tree, 124 leaves, max depth = 15, train loss: 0.46251, val loss: 0.45347, in 0.031s 1 tree, 109 leaves, max depth = 15, train loss: 0.45718, val loss: 0.44794, in 0.016s 1 tree, 110 leaves, max depth = 15, train loss: 0.45205, val loss: 0.44260, in 0.016s 1 tree, 109 leaves, max depth = 15, train loss: 0.44712, val loss: 0.43748, in 0.031s 1 tree, 76 leaves, max depth = 15, train loss: 0.44255, val loss: 0.43264, in 0.016s 1 tree, 110 leaves, max depth = 15, train loss: 0.43794, val loss: 0.42791, in 0.016s 1 tree, 103 leaves, max depth = 15, train loss: 0.43148, val loss: 0.42148, in 0.031s 1 tree, 112 leaves, max depth = 15, train loss: 0.42717, val loss: 0.41702, in 0.016s 1 tree, 109 leaves, max depth = 15, train loss: 0.42303, val loss: 0.41272, in 0.031s 1 tree, 102 leaves, max depth = 15, train loss: 0.41708, val loss: 0.40681, in 0.016s 1 tree, 112 leaves, max depth = 15, train loss: 0.41318, val loss: 0.40279, in 0.031s 1 tree, 134 leaves, max depth = 12, train loss: 0.40948, val loss: 0.39895, in 0.016s 1 tree, 101 leaves, max depth = 15, train loss: 0.40398, val loss: 0.39350, in 0.031s 1 tree, 111 leaves, max depth = 15, train loss: 0.40044, val loss: 0.38985, in 0.016s 1 tree, 133 leaves, max depth = 11, train loss: 0.39709, val loss: 0.38640, in 0.031s 1 tree, 134 leaves, max depth = 11, train loss: 0.39385, val loss: 0.38307, in 0.016s 1 tree, 113 leaves, max depth = 15, train loss: 0.39066, val loss: 0.37977, in 0.031s 1 tree, 103 leaves, max depth = 15, train loss: 0.38572, val loss: 0.37491, in 0.031s 1 tree, 103 leaves, max depth = 15, train loss: 0.38099, val loss: 0.37025, in 0.016s 1 tree, 125 leaves, max depth = 14, train loss: 0.37653, val loss: 0.36585, in 
0.031s 1 tree, 102 leaves, max depth = 14, train loss: 0.37217, val loss: 0.36155, in 0.031s 1 tree, 125 leaves, max depth = 14, train loss: 0.36806, val loss: 0.35752, in 0.016s 1 tree, 111 leaves, max depth = 14, train loss: 0.36531, val loss: 0.35470, in 0.016s 1 tree, 101 leaves, max depth = 14, train loss: 0.36135, val loss: 0.35081, in 0.031s 1 tree, 128 leaves, max depth = 13, train loss: 0.35764, val loss: 0.34715, in 0.016s 1 tree, 109 leaves, max depth = 14, train loss: 0.35507, val loss: 0.34461, in 0.031s 1 tree, 101 leaves, max depth = 15, train loss: 0.35147, val loss: 0.34109, in 0.016s 1 tree, 126 leaves, max depth = 17, train loss: 0.34864, val loss: 0.33830, in 0.016s 1 tree, 54 leaves, max depth = 13, train loss: 0.34542, val loss: 0.33503, in 0.016s 1 tree, 125 leaves, max depth = 17, train loss: 0.34273, val loss: 0.33238, in 0.016s 1 tree, 54 leaves, max depth = 13, train loss: 0.33969, val loss: 0.32929, in 0.016s 1 tree, 123 leaves, max depth = 17, train loss: 0.33713, val loss: 0.32678, in 0.016s 1 tree, 124 leaves, max depth = 17, train loss: 0.33467, val loss: 0.32436, in 0.031s 1 tree, 83 leaves, max depth = 13, train loss: 0.33178, val loss: 0.32175, in 0.016s 1 tree, 51 leaves, max depth = 12, train loss: 0.32903, val loss: 0.31896, in 0.016s 1 tree, 82 leaves, max depth = 13, train loss: 0.32632, val loss: 0.31653, in 0.016s 1 tree, 152 leaves, max depth = 17, train loss: 0.32393, val loss: 0.31411, in 0.031s 1 tree, 54 leaves, max depth = 16, train loss: 0.32139, val loss: 0.31155, in 0.016s 1 tree, 81 leaves, max depth = 13, train loss: 0.31890, val loss: 0.30931, in 0.016s 1 tree, 50 leaves, max depth = 12, train loss: 0.31654, val loss: 0.30691, in 0.016s 1 tree, 50 leaves, max depth = 12, train loss: 0.31427, val loss: 0.30459, in 0.016s 1 tree, 110 leaves, max depth = 14, train loss: 0.31253, val loss: 0.30285, in 0.031s 1 tree, 83 leaves, max depth = 12, train loss: 0.31027, val loss: 0.30080, in 0.016s 1 tree, 53 leaves, max 
depth = 14, train loss: 0.30817, val loss: 0.29867, in 0.016s 1 tree, 150 leaves, max depth = 17, train loss: 0.30609, val loss: 0.29656, in 0.031s 1 tree, 83 leaves, max depth = 12, train loss: 0.30400, val loss: 0.29471, in 0.016s 1 tree, 83 leaves, max depth = 12, train loss: 0.30199, val loss: 0.29294, in 0.016s 1 tree, 50 leaves, max depth = 14, train loss: 0.30011, val loss: 0.29106, in 0.016s 1 tree, 83 leaves, max depth = 12, train loss: 0.29822, val loss: 0.28940, in 0.016s 1 tree, 156 leaves, max depth = 18, train loss: 0.29629, val loss: 0.28745, in 0.031s Fit 74 trees in 1.925 s, (7611 total leaves) Time spent computing histograms: 0.567s Time spent finding best splits: 0.152s Time spent applying splits: 0.134s Time spent predicting: 0.016s Trial 1, Fold 5: Log loss = 0.3059257783920364, Average precision = 0.9571162576778489, ROC-AUC = 0.952768147446259, Elapsed Time = 1.9239307999996527 seconds
Optimization Progress:   2% | 2/100 [00:29<24:13, 14.83s/it]
Trial 2, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 2, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Binning 0.040 GB of training data: 0.142 s
Binning 0.004 GB of validation data: 0.000 s
Fitting gradient boosted rounds:
[per-tree fitting log elided; train loss 0.66988 → 0.49942, val loss 0.66936 → 0.49652]
Fit 12 trees in 0.486 s, (430 total leaves)
Time spent computing histograms: 0.071s
Time spent finding best splits: 0.009s
Time spent applying splits: 0.008s
Time spent predicting: 0.016s
Trial 2, Fold 1: Log loss = 0.5002498264839964, Average precision = 0.9215519475609961, ROC-AUC = 0.9313925811215783, Elapsed Time = 0.48928890000024694 seconds

Trial 2, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 2, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Binning 0.040 GB of training data: 0.142 s
Binning 0.004 GB of validation data: 0.016 s
Fitting gradient boosted rounds:
[per-tree fitting log elided; train loss 0.66975 → 0.49882, val loss 0.66923 → 0.49509]
Fit 12 trees in 0.533 s, (422 total leaves)
Time spent computing histograms: 0.087s
Time spent finding best splits: 0.011s
Time spent applying splits: 0.010s
Time spent predicting: 0.000s
Trial 2, Fold 2: Log loss = 0.4999592546829724, Average precision = 0.9167748509612672, ROC-AUC = 0.9306778179032644, Elapsed Time = 0.5291692999999213 seconds

Trial 2, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 2, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Binning 0.040 GB of training data: 0.127 s
Binning 0.004 GB of validation data: 0.016 s
Fitting gradient boosted rounds:
[per-tree fitting log elided; train loss 0.66986 → 0.49965, val loss 0.66962 → 0.49929]
Fit 12 trees in 0.502 s, (443 total leaves)
Time spent computing histograms: 0.082s
Time spent finding best splits: 0.008s
Time spent applying splits: 0.009s
Time spent predicting: 0.000s
Trial 2, Fold 3: Log loss = 0.4973438762829475, Average precision = 0.924954635794069, ROC-AUC = 0.9353176676522426, Elapsed Time = 0.5145811000002141 seconds

Trial 2, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 2, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
Binning 0.040 GB of training data: 0.142 s
Binning 0.004 GB of validation data: 0.000 s
Fitting gradient boosted rounds:
[per-tree fitting log elided; train loss 0.66985 → 0.50014, val loss 0.66880 → 0.49096]
Fit 12 trees in 0.501 s, (429 total leaves)
Time spent computing histograms: 0.075s
Time spent finding best splits: 0.009s
Time spent applying splits: 0.009s
Time spent predicting: 0.000s
Trial 2, Fold 4: Log loss = 0.499519425604958, Average precision = 0.9219241863020324, ROC-AUC = 0.9324585380945227, Elapsed Time = 0.5062674999990122 seconds

Trial 2, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 2, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
Binning 0.040 GB of training data: 0.157 s
Binning 0.004 GB of validation data: 0.000 s
Fitting gradient boosted rounds:
[per-tree fitting log elided; train loss 0.66967 → 0.49927, val loss 0.66820 → 0.49066]
Fit 12 trees in 0.485 s, (421 total leaves)
Time spent computing histograms: 0.075s
Time spent finding best splits: 0.008s
Time spent applying splits: 0.008s
Time spent predicting: 0.000s
Trial 2, Fold 5: Log loss = 0.5021190230330362, Average precision = 0.9164841006897811, ROC-AUC = 0.9291524441224014, Elapsed Time = 0.5015739999998914 seconds
Optimization Progress:   3% | 3/100 [00:38<19:37, 12.14s/it]
Trial 3, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 3, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Binning 0.040 GB of training data: 0.141 s
Binning 0.004 GB of validation data: 0.000 s
Fitting gradient boosted rounds:
[per-tree fitting log elided; train loss 0.68265 → 0.38807, val loss 0.68255 → 0.38746]
Fit 59 trees in 1.344 s, (4932 total leaves)
Time spent computing histograms: 0.405s
Time spent finding best splits: 0.082s
Time spent applying splits: 0.077s
Time spent predicting: 0.000s
Trial 3, Fold 1: Log loss = 0.3930796249502737, Average precision = 0.9518336436663236, ROC-AUC = 0.9472694057297716, Elapsed Time = 1.3563878999993904 seconds

Trial 3, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 3, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Binning 0.040 GB of training data: 0.141 s
Binning 0.004 GB of validation data: 0.016 s
Fitting gradient boosted rounds:
[per-tree fitting log elided; train loss 0.68264 → 0.38560, val loss 0.68243 → 0.38218]
Fit 59 trees in 1.407 s, (4915 total leaves)
Time spent computing histograms: 0.415s
Time spent finding best splits: 0.089s
Time spent applying splits: 0.083s
Time spent predicting: 0.016s
Trial 3, Fold 2: Log loss = 0.3883934943415345, Average precision = 0.9514920456568772, ROC-AUC = 0.949003108098142, Elapsed Time = 1.4139042999995581 seconds

Trial 3, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 3, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Binning 0.040 GB of training data: 0.174 s
Binning 0.004 GB of validation data: 0.000 s
Fitting gradient boosted rounds:
[per-tree fitting log elided; train loss 0.68269 → 0.49979, val loss 0.68256 → 0.50017]
1 tree, 83 leaves, max depth = 14, train loss: 0.49535, val loss: 0.49576,
in 0.047s 1 tree, 84 leaves, max depth = 14, train loss: 0.49066, val loss: 0.49113, in 0.031s 1 tree, 83 leaves, max depth = 14, train loss: 0.48611, val loss: 0.48662, in 0.016s 1 tree, 64 leaves, max depth = 15, train loss: 0.48043, val loss: 0.48129, in 0.016s 1 tree, 83 leaves, max depth = 14, train loss: 0.47611, val loss: 0.47697, in 0.031s 1 tree, 82 leaves, max depth = 14, train loss: 0.47224, val loss: 0.47318, in 0.016s 1 tree, 81 leaves, max depth = 14, train loss: 0.46846, val loss: 0.46943, in 0.016s 1 tree, 56 leaves, max depth = 15, train loss: 0.46324, val loss: 0.46457, in 0.031s 1 tree, 85 leaves, max depth = 14, train loss: 0.45930, val loss: 0.46070, in 0.016s 1 tree, 83 leaves, max depth = 14, train loss: 0.45549, val loss: 0.45689, in 0.016s 1 tree, 70 leaves, max depth = 14, train loss: 0.45188, val loss: 0.45330, in 0.016s 1 tree, 71 leaves, max depth = 14, train loss: 0.44837, val loss: 0.44982, in 0.031s 1 tree, 71 leaves, max depth = 14, train loss: 0.44496, val loss: 0.44644, in 0.016s 1 tree, 83 leaves, max depth = 14, train loss: 0.44181, val loss: 0.44333, in 0.016s 1 tree, 82 leaves, max depth = 14, train loss: 0.43846, val loss: 0.43999, in 0.016s 1 tree, 100 leaves, max depth = 13, train loss: 0.43526, val loss: 0.43682, in 0.016s 1 tree, 101 leaves, max depth = 12, train loss: 0.43214, val loss: 0.43378, in 0.016s 1 tree, 76 leaves, max depth = 14, train loss: 0.42758, val loss: 0.42960, in 0.016s 1 tree, 77 leaves, max depth = 15, train loss: 0.42320, val loss: 0.42559, in 0.016s 1 tree, 82 leaves, max depth = 14, train loss: 0.42020, val loss: 0.42266, in 0.016s 1 tree, 99 leaves, max depth = 13, train loss: 0.41737, val loss: 0.41986, in 0.016s 1 tree, 95 leaves, max depth = 15, train loss: 0.41327, val loss: 0.41615, in 0.016s 1 tree, 69 leaves, max depth = 11, train loss: 0.41062, val loss: 0.41350, in 0.016s 1 tree, 77 leaves, max depth = 15, train loss: 0.40666, val loss: 0.40989, in 0.016s 1 tree, 62 leaves, max depth = 
14, train loss: 0.40414, val loss: 0.40739, in 0.016s 1 tree, 82 leaves, max depth = 14, train loss: 0.40175, val loss: 0.40505, in 0.016s 1 tree, 84 leaves, max depth = 15, train loss: 0.39948, val loss: 0.40281, in 0.031s 1 tree, 101 leaves, max depth = 15, train loss: 0.39719, val loss: 0.40063, in 0.016s 1 tree, 99 leaves, max depth = 14, train loss: 0.39484, val loss: 0.39835, in 0.016s 1 tree, 81 leaves, max depth = 15, train loss: 0.39269, val loss: 0.39628, in 0.016s 1 tree, 73 leaves, max depth = 16, train loss: 0.38911, val loss: 0.39305, in 0.031s 1 tree, 80 leaves, max depth = 15, train loss: 0.38706, val loss: 0.39108, in 0.016s Fit 59 trees in 1.596 s, (4886 total leaves) Time spent computing histograms: 0.492s Time spent finding best splits: 0.115s Time spent applying splits: 0.093s Time spent predicting: 0.000s Trial 3, Fold 3: Log loss = 0.38656042904283894, Average precision = 0.9553353227323066, ROC-AUC = 0.9516027697652917, Elapsed Time = 1.598971100000199 seconds Trial 3, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 3, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 Binning 0.040 GB of training data: 0.157 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 58 leaves, max depth = 12, train loss: 0.68270, val loss: 0.68226, in 0.016s 1 tree, 80 leaves, max depth = 11, train loss: 0.67219, val loss: 0.67131, in 0.016s 1 tree, 105 leaves, max depth = 15, train loss: 0.66270, val loss: 0.66134, in 0.031s 1 tree, 82 leaves, max depth = 11, train loss: 0.65294, val loss: 0.65123, in 0.016s 1 tree, 71 leaves, max depth = 13, train loss: 0.64426, val loss: 0.64216, in 0.016s 1 tree, 84 leaves, max depth = 13, train loss: 0.63514, val loss: 0.63269, in 0.031s 1 tree, 82 leaves, max depth = 13, train loss: 0.62685, val loss: 0.62410, in 0.016s 1 tree, 84 leaves, max depth = 13, train loss: 0.61828, val loss: 0.61516, in 0.016s 1 tree, 83 leaves, max 
depth = 13, train loss: 0.61046, val loss: 0.60700, in 0.031s 1 tree, 103 leaves, max depth = 13, train loss: 0.60307, val loss: 0.59923, in 0.016s 1 tree, 103 leaves, max depth = 13, train loss: 0.59541, val loss: 0.59118, in 0.016s 1 tree, 79 leaves, max depth = 11, train loss: 0.58790, val loss: 0.58338, in 0.016s 1 tree, 84 leaves, max depth = 14, train loss: 0.58095, val loss: 0.57613, in 0.031s 1 tree, 72 leaves, max depth = 12, train loss: 0.57391, val loss: 0.56877, in 0.016s 1 tree, 82 leaves, max depth = 13, train loss: 0.56704, val loss: 0.56160, in 0.016s 1 tree, 85 leaves, max depth = 14, train loss: 0.56071, val loss: 0.55499, in 0.031s 1 tree, 106 leaves, max depth = 12, train loss: 0.55434, val loss: 0.54828, in 0.016s 1 tree, 105 leaves, max depth = 13, train loss: 0.54817, val loss: 0.54177, in 0.031s 1 tree, 73 leaves, max depth = 12, train loss: 0.54226, val loss: 0.53555, in 0.016s 1 tree, 82 leaves, max depth = 12, train loss: 0.53639, val loss: 0.52941, in 0.016s 1 tree, 82 leaves, max depth = 14, train loss: 0.53094, val loss: 0.52369, in 0.016s 1 tree, 81 leaves, max depth = 12, train loss: 0.52540, val loss: 0.51788, in 0.016s 1 tree, 104 leaves, max depth = 13, train loss: 0.52009, val loss: 0.51225, in 0.016s 1 tree, 84 leaves, max depth = 13, train loss: 0.51519, val loss: 0.50723, in 0.031s 1 tree, 82 leaves, max depth = 14, train loss: 0.51033, val loss: 0.50212, in 0.016s 1 tree, 84 leaves, max depth = 13, train loss: 0.50571, val loss: 0.49737, in 0.016s 1 tree, 107 leaves, max depth = 13, train loss: 0.50083, val loss: 0.49226, in 0.016s 1 tree, 81 leaves, max depth = 15, train loss: 0.49641, val loss: 0.48762, in 0.016s 1 tree, 85 leaves, max depth = 11, train loss: 0.49214, val loss: 0.48322, in 0.016s 1 tree, 84 leaves, max depth = 13, train loss: 0.48755, val loss: 0.47845, in 0.016s 1 tree, 72 leaves, max depth = 14, train loss: 0.48363, val loss: 0.47431, in 0.016s 1 tree, 82 leaves, max depth = 12, train loss: 0.47932, val 
loss: 0.46976, in 0.016s 1 tree, 84 leaves, max depth = 13, train loss: 0.47538, val loss: 0.46571, in 0.016s 1 tree, 83 leaves, max depth = 13, train loss: 0.47159, val loss: 0.46181, in 0.016s 1 tree, 66 leaves, max depth = 12, train loss: 0.46639, val loss: 0.45651, in 0.016s 1 tree, 86 leaves, max depth = 12, train loss: 0.46245, val loss: 0.45240, in 0.031s 1 tree, 84 leaves, max depth = 12, train loss: 0.45861, val loss: 0.44840, in 0.016s 1 tree, 73 leaves, max depth = 12, train loss: 0.45502, val loss: 0.44461, in 0.016s 1 tree, 105 leaves, max depth = 12, train loss: 0.45150, val loss: 0.44085, in 0.031s 1 tree, 63 leaves, max depth = 12, train loss: 0.44814, val loss: 0.43724, in 0.016s 1 tree, 86 leaves, max depth = 13, train loss: 0.44492, val loss: 0.43390, in 0.016s 1 tree, 82 leaves, max depth = 12, train loss: 0.44160, val loss: 0.43038, in 0.016s 1 tree, 103 leaves, max depth = 12, train loss: 0.43844, val loss: 0.42699, in 0.031s 1 tree, 107 leaves, max depth = 12, train loss: 0.43531, val loss: 0.42371, in 0.016s 1 tree, 74 leaves, max depth = 12, train loss: 0.43074, val loss: 0.41907, in 0.016s 1 tree, 76 leaves, max depth = 12, train loss: 0.42633, val loss: 0.41463, in 0.016s 1 tree, 86 leaves, max depth = 12, train loss: 0.42333, val loss: 0.41150, in 0.016s 1 tree, 103 leaves, max depth = 12, train loss: 0.42053, val loss: 0.40850, in 0.031s 1 tree, 95 leaves, max depth = 12, train loss: 0.41643, val loss: 0.40431, in 0.031s 1 tree, 69 leaves, max depth = 12, train loss: 0.41381, val loss: 0.40148, in 0.016s 1 tree, 78 leaves, max depth = 12, train loss: 0.40982, val loss: 0.39747, in 0.016s 1 tree, 72 leaves, max depth = 12, train loss: 0.40732, val loss: 0.39477, in 0.016s 1 tree, 84 leaves, max depth = 12, train loss: 0.40494, val loss: 0.39235, in 0.031s 1 tree, 86 leaves, max depth = 11, train loss: 0.40267, val loss: 0.39003, in 0.016s 1 tree, 106 leaves, max depth = 15, train loss: 0.40049, val loss: 0.38773, in 0.031s 1 tree, 72 
leaves, max depth = 11, train loss: 0.39820, val loss: 0.38529, in 0.016s 1 tree, 74 leaves, max depth = 12, train loss: 0.39454, val loss: 0.38160, in 0.016s 1 tree, 74 leaves, max depth = 12, train loss: 0.39098, val loss: 0.37802, in 0.031s 1 tree, 84 leaves, max depth = 12, train loss: 0.38876, val loss: 0.37568, in 0.016s Fit 59 trees in 1.533 s, (4986 total leaves) Time spent computing histograms: 0.452s Time spent finding best splits: 0.096s Time spent applying splits: 0.090s Time spent predicting: 0.000s Trial 3, Fold 4: Log loss = 0.38953950043832297, Average precision = 0.9538868791853735, ROC-AUC = 0.9492232487106304, Elapsed Time = 1.5329940999999963 seconds Trial 3, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 3, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.173 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 61 leaves, max depth = 14, train loss: 0.68267, val loss: 0.68208, in 0.016s 1 tree, 81 leaves, max depth = 12, train loss: 0.67200, val loss: 0.67100, in 0.016s 1 tree, 103 leaves, max depth = 15, train loss: 0.66251, val loss: 0.66098, in 0.031s 1 tree, 77 leaves, max depth = 12, train loss: 0.65257, val loss: 0.65070, in 0.016s 1 tree, 82 leaves, max depth = 14, train loss: 0.64343, val loss: 0.64118, in 0.016s 1 tree, 82 leaves, max depth = 12, train loss: 0.63415, val loss: 0.63158, in 0.016s 1 tree, 83 leaves, max depth = 14, train loss: 0.62570, val loss: 0.62276, in 0.031s 1 tree, 79 leaves, max depth = 11, train loss: 0.61705, val loss: 0.61379, in 0.016s 1 tree, 85 leaves, max depth = 14, train loss: 0.60916, val loss: 0.60554, in 0.016s 1 tree, 104 leaves, max depth = 14, train loss: 0.60165, val loss: 0.59769, in 0.016s 1 tree, 80 leaves, max depth = 12, train loss: 0.59376, val loss: 0.58952, in 0.016s 1 tree, 82 leaves, max depth = 13, train loss: 0.58638, val loss: 0.58179, in 0.016s 1 
tree, 83 leaves, max depth = 14, train loss: 0.57945, val loss: 0.57456, in 0.031s 1 tree, 73 leaves, max depth = 11, train loss: 0.57243, val loss: 0.56719, in 0.016s 1 tree, 85 leaves, max depth = 12, train loss: 0.56547, val loss: 0.55997, in 0.016s 1 tree, 102 leaves, max depth = 13, train loss: 0.55927, val loss: 0.55349, in 0.016s 1 tree, 102 leaves, max depth = 12, train loss: 0.55278, val loss: 0.54674, in 0.031s 1 tree, 102 leaves, max depth = 12, train loss: 0.54652, val loss: 0.54024, in 0.031s 1 tree, 81 leaves, max depth = 13, train loss: 0.54076, val loss: 0.53426, in 0.016s 1 tree, 80 leaves, max depth = 13, train loss: 0.53518, val loss: 0.52846, in 0.016s 1 tree, 82 leaves, max depth = 14, train loss: 0.52969, val loss: 0.52273, in 0.016s 1 tree, 81 leaves, max depth = 13, train loss: 0.52400, val loss: 0.51686, in 0.016s 1 tree, 101 leaves, max depth = 13, train loss: 0.51858, val loss: 0.51124, in 0.031s 1 tree, 85 leaves, max depth = 13, train loss: 0.51357, val loss: 0.50614, in 0.016s 1 tree, 85 leaves, max depth = 13, train loss: 0.50875, val loss: 0.50113, in 0.016s 1 tree, 81 leaves, max depth = 13, train loss: 0.50407, val loss: 0.49626, in 0.016s 1 tree, 100 leaves, max depth = 13, train loss: 0.49919, val loss: 0.49119, in 0.016s 1 tree, 82 leaves, max depth = 13, train loss: 0.49477, val loss: 0.48660, in 0.016s 1 tree, 83 leaves, max depth = 12, train loss: 0.49006, val loss: 0.48171, in 0.031s 1 tree, 83 leaves, max depth = 13, train loss: 0.48548, val loss: 0.47696, in 0.016s 1 tree, 72 leaves, max depth = 11, train loss: 0.48157, val loss: 0.47280, in 0.016s 1 tree, 80 leaves, max depth = 13, train loss: 0.47725, val loss: 0.46832, in 0.016s 1 tree, 86 leaves, max depth = 12, train loss: 0.47334, val loss: 0.46434, in 0.016s 1 tree, 82 leaves, max depth = 13, train loss: 0.46954, val loss: 0.46047, in 0.016s 1 tree, 57 leaves, max depth = 13, train loss: 0.46432, val loss: 0.45519, in 0.031s 1 tree, 82 leaves, max depth = 13, train 
loss: 0.46038, val loss: 0.45112, in 0.016s 1 tree, 85 leaves, max depth = 13, train loss: 0.45654, val loss: 0.44716, in 0.016s 1 tree, 64 leaves, max depth = 12, train loss: 0.45297, val loss: 0.44341, in 0.016s 1 tree, 64 leaves, max depth = 12, train loss: 0.44949, val loss: 0.43978, in 0.016s 1 tree, 101 leaves, max depth = 13, train loss: 0.44605, val loss: 0.43621, in 0.031s 1 tree, 84 leaves, max depth = 12, train loss: 0.44292, val loss: 0.43296, in 0.016s 1 tree, 84 leaves, max depth = 13, train loss: 0.43958, val loss: 0.42950, in 0.016s 1 tree, 103 leaves, max depth = 13, train loss: 0.43640, val loss: 0.42620, in 0.031s 1 tree, 103 leaves, max depth = 13, train loss: 0.43329, val loss: 0.42301, in 0.016s 1 tree, 75 leaves, max depth = 14, train loss: 0.42871, val loss: 0.41845, in 0.016s 1 tree, 73 leaves, max depth = 14, train loss: 0.42427, val loss: 0.41406, in 0.016s 1 tree, 82 leaves, max depth = 13, train loss: 0.42128, val loss: 0.41097, in 0.031s 1 tree, 82 leaves, max depth = 13, train loss: 0.41838, val loss: 0.40798, in 0.016s 1 tree, 82 leaves, max depth = 12, train loss: 0.41553, val loss: 0.40508, in 0.016s 1 tree, 79 leaves, max depth = 14, train loss: 0.41295, val loss: 0.40244, in 0.031s 1 tree, 74 leaves, max depth = 14, train loss: 0.40888, val loss: 0.39840, in 0.016s 1 tree, 83 leaves, max depth = 12, train loss: 0.40626, val loss: 0.39575, in 0.016s 1 tree, 77 leaves, max depth = 12, train loss: 0.40369, val loss: 0.39310, in 0.016s 1 tree, 84 leaves, max depth = 13, train loss: 0.40142, val loss: 0.39076, in 0.031s 1 tree, 102 leaves, max depth = 12, train loss: 0.39923, val loss: 0.38857, in 0.016s 1 tree, 91 leaves, max depth = 14, train loss: 0.39554, val loss: 0.38494, in 0.016s 1 tree, 84 leaves, max depth = 13, train loss: 0.39337, val loss: 0.38278, in 0.031s 1 tree, 80 leaves, max depth = 14, train loss: 0.38972, val loss: 0.37921, in 0.016s 1 tree, 65 leaves, max depth = 12, train loss: 0.38757, val loss: 0.37696, in 
0.016s Fit 59 trees in 1.502 s, (4930 total leaves) Time spent computing histograms: 0.441s Time spent finding best splits: 0.095s Time spent applying splits: 0.089s Time spent predicting: 0.000s Trial 3, Fold 5: Log loss = 0.39399982523344507, Average precision = 0.9492866062937685, ROC-AUC = 0.9478948404956988, Elapsed Time = 1.5117119999995339 seconds
Optimization Progress: 4%|4 | 4/100 [00:51<20:32, 12.84s/it]
[Per-round gradient-boosting fitting logs (verbose output) elided; Trial 4 per-fold summaries below.]
Trial 4, Fold 1: Train size = 20663 (0 = 10533, 1 = 10130, 0/1 = 1.0398); Validation size = 5175 (0 = 2592, 1 = 2583, 0/1 = 1.0035). Fit 35 trees in 0.987 s (570 total leaves). Log loss = 0.38573, Average precision = 0.95464, ROC-AUC = 0.94660, Elapsed Time = 1.001 s
Trial 4, Fold 2: Train size = 20701 (0 = 10471, 1 = 10230, 0/1 = 1.0236); Validation size = 5137 (0 = 2654, 1 = 2483, 0/1 = 1.0689). Fit 35 trees in 1.283 s (573 total leaves). Log loss = 0.38691, Average precision = 0.95078, ROC-AUC = 0.94651, Elapsed Time = 1.288 s
Trial 4, Fold 3: Train size = 20682 (0 = 10517, 1 = 10165, 0/1 = 1.0346); Validation size = 5156 (0 = 2608, 1 = 2548, 0/1 = 1.0235). Fit 35 trees in 1.002 s (559 total leaves). Log loss = 0.37607, Average precision = 0.95681, ROC-AUC = 0.94997, Elapsed Time = 1.019 s
Trial 4, Fold 4: Train size = 20656 (0 = 10479, 1 = 10177, 0/1 = 1.0297); Validation size = 5182 (0 = 2646, 1 = 2536, 0/1 = 1.0434). Fit 35 trees in 0.985 s (582 total leaves). Log loss = 0.38388, Average precision = 0.95375, ROC-AUC = 0.94723, Elapsed Time = 0.999 s
Trial 4, Fold 5: Train size = 20650 (0 = 10500, 1 = 10150, 0/1 = 1.0345); Validation size = 5188 (0 = 2625, 1 = 2563, 0/1 = 1.0242). [Per-round log truncated mid-fit; fold metrics not captured in this excerpt.]
0.47453, val loss: 0.46197, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.46784, val loss: 0.45473, in 0.016s 1 tree, 6 leaves, max depth = 4, train loss: 0.46093, val loss: 0.44757, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.45322, val loss: 0.43958, in 0.016s 1 tree, 17 leaves, max depth = 6, train loss: 0.44623, val loss: 0.43221, in 0.016s 1 tree, 17 leaves, max depth = 6, train loss: 0.43958, val loss: 0.42519, in 0.016s 1 tree, 17 leaves, max depth = 5, train loss: 0.43457, val loss: 0.41991, in 0.016s 1 tree, 17 leaves, max depth = 8, train loss: 0.42919, val loss: 0.41417, in 0.016s 1 tree, 17 leaves, max depth = 8, train loss: 0.42288, val loss: 0.40770, in 0.016s 1 tree, 17 leaves, max depth = 6, train loss: 0.41731, val loss: 0.40193, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.41294, val loss: 0.39738, in 0.031s 1 tree, 17 leaves, max depth = 6, train loss: 0.40817, val loss: 0.39247, in 0.016s 1 tree, 17 leaves, max depth = 6, train loss: 0.40400, val loss: 0.38794, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.39875, val loss: 0.38266, in 0.016s 1 tree, 17 leaves, max depth = 6, train loss: 0.39373, val loss: 0.37764, in 0.016s 1 tree, 17 leaves, max depth = 10, train loss: 0.38908, val loss: 0.37311, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.38617, val loss: 0.36985, in 0.016s 1 tree, 17 leaves, max depth = 6, train loss: 0.38333, val loss: 0.36672, in 0.031s 1 tree, 5 leaves, max depth = 3, train loss: 0.37712, val loss: 0.36046, in 0.000s Fit 35 trees in 0.987 s, (563 total leaves) Time spent computing histograms: 0.277s Time spent finding best splits: 0.021s Time spent applying splits: 0.017s Time spent predicting: 0.000s Trial 4, Fold 5: Log loss = 0.3835427485060661, Average precision = 0.953093001420166, ROC-AUC = 0.9466856176727422, Elapsed Time = 1.0019355999993422 seconds
Optimization Progress: 5%|5 | 5/100 [01:05<20:30, 12.96s/it]
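The "Optimization Progress" lines interleaved with the fold logs are tqdm output. A minimal sketch of one way to drive such a bar from an Optuna callback; `on_trial_complete`, `n_trials`, and the commented-out `study.optimize` call are illustrative assumptions, not the notebook's actual code:

```python
# Hypothetical sketch: a manual tqdm bar ticked once per finished Optuna trial.
from tqdm import tqdm

n_trials = 100
pbar = tqdm(total=n_trials, desc="Optimization Progress")

def on_trial_complete(study, trial):
    # Optuna passes the study and the completed trial; here we only tick the bar.
    pbar.update(1)

# study.optimize(objective, n_trials=n_trials, callbacks=[on_trial_complete])
```

Optuna's `study.optimize(..., show_progress_bar=True)` offers a built-in alternative; a manual callback like the above gives control over the bar's description and total.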
Trial 5, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 5, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Binning 0.040 GB of training data: 0.158 s; 0.004 GB of validation data: 0.000 s
[per-tree fitting log elided] Fit 20 trees in 0.408 s (282 total leaves)
Trial 5, Fold 1: Log loss = 0.5874254284962492, Average precision = 0.8367587948394197, ROC-AUC = 0.8674685622517603, Elapsed Time = 0.42420400000082736 seconds
Trial 5, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 5, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Binning 0.040 GB of training data: 0.174 s; 0.004 GB of validation data: 0.000 s
[per-tree fitting log elided] Fit 20 trees in 0.440 s (277 total leaves)
Trial 5, Fold 2: Log loss = 0.5878867246029013, Average precision = 0.8337268872759109, ROC-AUC = 0.869113741338616, Elapsed Time = 0.45685410000078264 seconds
Trial 5, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 5, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Binning 0.040 GB of training data: 0.173 s; 0.004 GB of validation data: 0.000 s
[per-tree fitting log elided] Fit 20 trees in 0.439 s (313 total leaves)
Trial 5, Fold 3: Log loss = 0.5860022740246483, Average precision = 0.8476111167806911, ROC-AUC = 0.8774352975026725, Elapsed Time = 0.4495236999991903 seconds
Trial 5, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 5, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
Binning 0.040 GB of training data: 0.157 s; 0.004 GB of validation data: 0.016 s
[per-tree fitting log elided] Fit 20 trees in 0.454 s (294 total leaves)
Trial 5, Fold 4: Log loss = 0.5879750852345283, Average precision = 0.8255683557170996, ROC-AUC = 0.8667012853160894, Elapsed Time = 0.45265290000133973 seconds
Trial 5, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 5, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
Binning 0.040 GB of training data: 0.173 s; 0.004 GB of validation data: 0.000 s
[per-tree fitting log elided] Fit 20 trees in 0.439 s (292 total leaves)
Trial 5, Fold 5: Log loss = 0.5906025603832106, Average precision = 0.829664343872623, ROC-AUC = 0.8594015941140405, Elapsed Time = 0.4455651999996917 seconds
Optimization Progress: 6%|6 | 6/100 [01:14<18:16, 11.67s/it]
Trial 6, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 6, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Binning 0.040 GB of training data: 0.142 s; 0.004 GB of validation data: 0.000 s
[per-tree fitting log elided] Fit 18 trees in 0.564 s (514 total leaves)
Trial 6, Fold 1: Log loss = 0.351445931369262, Average precision = 0.9549264452505613, ROC-AUC = 0.9488231456388638, Elapsed Time = 0.5713984000012715 seconds
Trial 6, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 6, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Binning 0.040 GB of training data: 0.157 s; 0.004 GB of validation data: 0.000 s
[per-tree fitting log elided] Fit 18 trees in 0.626 s (479 total leaves)
Trial 6, Fold 2: Log loss = 0.3485951215531161, Average precision = 0.9527528591451231, ROC-AUC = 0.9498124549119393, Elapsed Time = 0.6378046999998332 seconds
Trial 6, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 6, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Binning 0.040 GB of training data: 0.173 s; 0.004 GB of validation data: 0.000 s
[per-tree fitting log elided] Fit 18 trees in 0.642 s (481 total leaves)
Trial 6, Fold 3: Log loss = 0.33131520081959953, Average precision = 0.9602025195875663, ROC-AUC = 0.9562246884360162, Elapsed Time = 0.6537537000003795 seconds
Trial 6, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 6, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
Binning 0.040 GB of training data: 0.173 s; 0.004 GB of validation data: 0.000 s
[per-tree fitting log elided] Fit 18 trees in 0.657 s (483 total leaves)
Trial 6, Fold 4: Log loss = 0.33944525800884284, Average precision = 0.9586036453769143, ROC-AUC = 0.9534304652460353, Elapsed Time = 0.6549568999998883 seconds
Trial 6, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 6, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
Binning 0.040 GB of training data: 0.172 s; 0.004 GB of validation data: 0.000 s
[per-tree fitting log elided] Fit 18 trees in 0.641 s (472 total leaves)
Trial 6, Fold 5: Log loss = 0.3542236616621092, Average precision = 0.9494919234787114, ROC-AUC = 0.9467516860821583, Elapsed Time = 0.6471075999997993 seconds
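Each trial yields five validation log losses. Assuming the study minimizes their mean (a common choice; the notebook's actual aggregation is not shown in this excerpt), Trial 6's objective value would be computed as:

```python
# Hedged sketch: reduce Trial 6's five per-fold log losses (values taken from
# the log above) to one scalar objective with a plain mean.
from statistics import mean

trial6_logloss = [
    0.351445931369262,    # Fold 1
    0.3485951215531161,   # Fold 2
    0.33131520081959953,  # Fold 3
    0.33944525800884284,  # Fold 4
    0.3542236616621092,   # Fold 5
]
objective_value = mean(trial6_logloss)
print(f"Trial 6 mean CV log loss: {objective_value:.5f}")  # → 0.34501
```

On this aggregation, Trial 6 (mean log loss ≈ 0.345) clearly beats Trial 5 (per-fold losses ≈ 0.586–0.591), consistent with the optimizer continuing to explore deeper/longer-fitting configurations in Trial 7.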
Optimization Progress: 7%|7 | 7/100 [01:24<17:07, 11.05s/it]
Trial 7, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 7, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Binning 0.040 GB of training data: 0.220 s; 0.004 GB of validation data: 0.000 s
[per-tree fitting log elided] Fit 36 trees in 1.330 s (1686 total leaves)
Trial 7, Fold 1: Log loss = 0.2971298608703609, Average precision = 0.9601508407125532, ROC-AUC = 0.954229981287908, Elapsed Time = 1.3452409000001353 seconds
Trial 7, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 7, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Binning 0.040 GB of training data: 0.236 s; 0.004 GB of validation data: 0.016 s
[per-tree fitting log elided] Fit 36 trees in 1.346 s (1694 total leaves)
Trial 7, Fold 2: Log loss = 0.29597727707207266, Average precision = 0.9594641087138922, ROC-AUC = 0.9563577921425602, Elapsed Time = 1.3618537000002107 seconds
Trial 7, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 7, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Binning 0.040 GB of training data: 0.237 s; 0.004 GB of validation data: 0.000 s
[per-tree fitting log elided] … 1 tree, 32 leaves, max depth = 10, train loss: 0.33303, val loss: 0.33877, in 0.016s
1 tree, 50 leaves, max depth = 13, train loss: 0.32756, val loss: 0.33420, in 0.016s 1 tree, 30 leaves, max depth = 10, train loss: 0.32242, val loss: 0.32949, in 0.016s 1 tree, 49 leaves, max depth = 13, train loss: 0.31769, val loss: 0.32553, in 0.016s 1 tree, 51 leaves, max depth = 12, train loss: 0.31329, val loss: 0.32079, in 0.031s 1 tree, 28 leaves, max depth = 10, train loss: 0.30897, val loss: 0.31685, in 0.016s 1 tree, 48 leaves, max depth = 12, train loss: 0.30501, val loss: 0.31263, in 0.016s 1 tree, 48 leaves, max depth = 13, train loss: 0.30119, val loss: 0.30945, in 0.016s [35/36] 1 tree, 31 leaves, max depth = 10, train loss: 0.29756, val loss: 0.30616, in 0.016s 1 tree, 46 leaves, max depth = 14, train loss: 0.29408, val loss: 0.30241, in 0.016s Fit 36 trees in 1.222 s, (1727 total leaves) Time spent computing histograms: 0.293s Time spent finding best splits: 0.047s Time spent applying splits: 0.044s Time spent predicting: 0.016s Trial 7, Fold 3: Log loss = 0.2932838393195154, Average precision = 0.9595868858698278, ROC-AUC = 0.9563772801475475, Elapsed Time = 1.2324594000001525 seconds Trial 7, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 7, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 0.173 s 0.040 GB of training data: 0.016 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 46 leaves, max depth = 9, train loss: 0.66148, val loss: 0.66016, in 0.016s 1 tree, 46 leaves, max depth = 9, train loss: 0.63340, val loss: 0.63081, in 0.016s 1 tree, 46 leaves, max depth = 9, train loss: 0.60809, val loss: 0.60438, in 0.031s 1 tree, 53 leaves, max depth = 11, train loss: 0.58620, val loss: 0.58132, in 0.016s 1 tree, 51 leaves, max depth = 11, train loss: 0.56504, val loss: 0.55906, in 0.031s 1 tree, 54 leaves, max depth = 13, train loss: 0.54696, val loss: 0.54001, in 0.016s 1 tree, 52 leaves, max depth = 11, train loss: 0.52929, val loss: 0.52137, in 0.016s 
1 tree, 51 leaves, max depth = 10, train loss: 0.51313, val loss: 0.50434, in 0.016s 1 tree, 53 leaves, max depth = 10, train loss: 0.49843, val loss: 0.48873, in 0.016s 1 tree, 52 leaves, max depth = 11, train loss: 0.48494, val loss: 0.47441, in 0.016s 1 tree, 52 leaves, max depth = 10, train loss: 0.47255, val loss: 0.46141, in 0.016s 1 tree, 52 leaves, max depth = 10, train loss: 0.46121, val loss: 0.44939, in 0.031s 1 tree, 49 leaves, max depth = 10, train loss: 0.45075, val loss: 0.43820, in 0.016s 1 tree, 49 leaves, max depth = 10, train loss: 0.44110, val loss: 0.42796, in 0.016s 1 tree, 53 leaves, max depth = 13, train loss: 0.42733, val loss: 0.41394, in 0.031s 1 tree, 51 leaves, max depth = 9, train loss: 0.41898, val loss: 0.40503, in 0.016s 1 tree, 52 leaves, max depth = 13, train loss: 0.40697, val loss: 0.39281, in 0.016s 1 tree, 49 leaves, max depth = 10, train loss: 0.39972, val loss: 0.38506, in 0.031s 1 tree, 54 leaves, max depth = 13, train loss: 0.38923, val loss: 0.37444, in 0.016s 1 tree, 49 leaves, max depth = 9, train loss: 0.38286, val loss: 0.36757, in 0.016s 1 tree, 53 leaves, max depth = 13, train loss: 0.37360, val loss: 0.35817, in 0.031s 1 tree, 48 leaves, max depth = 10, train loss: 0.36802, val loss: 0.35209, in 0.016s 1 tree, 31 leaves, max depth = 12, train loss: 0.36019, val loss: 0.34397, in 0.016s 1 tree, 54 leaves, max depth = 12, train loss: 0.35269, val loss: 0.33636, in 0.031s 1 tree, 31 leaves, max depth = 12, train loss: 0.34619, val loss: 0.32964, in 0.000s 1 tree, 51 leaves, max depth = 11, train loss: 0.34083, val loss: 0.32413, in 0.016s 1 tree, 49 leaves, max depth = 12, train loss: 0.33513, val loss: 0.31871, in 0.016s 1 tree, 52 leaves, max depth = 13, train loss: 0.33031, val loss: 0.31376, in 0.016s 1 tree, 31 leaves, max depth = 12, train loss: 0.32506, val loss: 0.30829, in 0.031s 1 tree, 49 leaves, max depth = 12, train loss: 0.32029, val loss: 0.30378, in 0.016s 1 tree, 31 leaves, max depth = 12, train loss: 
0.31577, val loss: 0.29909, in 0.016s 1 tree, 59 leaves, max depth = 14, train loss: 0.31162, val loss: 0.29473, in 0.016s 1 tree, 49 leaves, max depth = 14, train loss: 0.30762, val loss: 0.29098, in 0.031s 1 tree, 31 leaves, max depth = 11, train loss: 0.30380, val loss: 0.28696, in 0.016s 1 tree, 49 leaves, max depth = 14, train loss: 0.30019, val loss: 0.28327, in 0.016s 1 tree, 49 leaves, max depth = 14, train loss: 0.29681, val loss: 0.28013, in 0.031s Fit 36 trees in 1.111 s, (1731 total leaves) Time spent computing histograms: 0.286s Time spent finding best splits: 0.046s Time spent applying splits: 0.043s Time spent predicting: 0.016s Trial 7, Fold 4: Log loss = 0.2960871708318758, Average precision = 0.9603063654296586, ROC-AUC = 0.9556672651535202, Elapsed Time = 1.122271999998702 seconds Trial 7, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 7, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.173 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 51 leaves, max depth = 10, train loss: 0.66088, val loss: 0.65953, in 0.016s 1 tree, 51 leaves, max depth = 10, train loss: 0.63219, val loss: 0.62956, in 0.016s 1 tree, 49 leaves, max depth = 9, train loss: 0.60645, val loss: 0.60265, in 0.016s 1 tree, 53 leaves, max depth = 11, train loss: 0.58433, val loss: 0.57944, in 0.031s 1 tree, 52 leaves, max depth = 10, train loss: 0.56314, val loss: 0.55732, in 0.016s 1 tree, 53 leaves, max depth = 11, train loss: 0.54495, val loss: 0.53821, in 0.016s 1 tree, 55 leaves, max depth = 10, train loss: 0.52727, val loss: 0.51973, in 0.031s 1 tree, 53 leaves, max depth = 10, train loss: 0.51118, val loss: 0.50295, in 0.016s 1 tree, 50 leaves, max depth = 9, train loss: 0.49644, val loss: 0.48754, in 0.016s 1 tree, 49 leaves, max depth = 11, train loss: 0.48294, val loss: 0.47345, in 0.016s 1 tree, 53 leaves, max depth = 10, train loss: 
0.47053, val loss: 0.46045, in 0.031s 1 tree, 49 leaves, max depth = 11, train loss: 0.45914, val loss: 0.44850, in 0.016s 1 tree, 49 leaves, max depth = 11, train loss: 0.44865, val loss: 0.43752, in 0.031s 1 tree, 49 leaves, max depth = 9, train loss: 0.43897, val loss: 0.42744, in 0.016s 1 tree, 52 leaves, max depth = 11, train loss: 0.42513, val loss: 0.41360, in 0.016s 1 tree, 50 leaves, max depth = 9, train loss: 0.41675, val loss: 0.40490, in 0.016s 1 tree, 51 leaves, max depth = 11, train loss: 0.40467, val loss: 0.39284, in 0.016s 1 tree, 50 leaves, max depth = 11, train loss: 0.39734, val loss: 0.38523, in 0.031s 1 tree, 54 leaves, max depth = 9, train loss: 0.39059, val loss: 0.37825, in 0.016s 1 tree, 50 leaves, max depth = 11, train loss: 0.38430, val loss: 0.37173, in 0.016s 1 tree, 52 leaves, max depth = 12, train loss: 0.37444, val loss: 0.36192, in 0.031s 1 tree, 53 leaves, max depth = 11, train loss: 0.36544, val loss: 0.35301, in 0.016s 1 tree, 53 leaves, max depth = 11, train loss: 0.35723, val loss: 0.34485, in 0.031s 1 tree, 52 leaves, max depth = 11, train loss: 0.35224, val loss: 0.33979, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.34525, val loss: 0.33263, in 0.016s 1 tree, 32 leaves, max depth = 10, train loss: 0.33889, val loss: 0.32610, in 0.016s 1 tree, 32 leaves, max depth = 10, train loss: 0.33310, val loss: 0.32015, in 0.016s 1 tree, 51 leaves, max depth = 12, train loss: 0.32816, val loss: 0.31523, in 0.031s 1 tree, 50 leaves, max depth = 12, train loss: 0.32291, val loss: 0.31059, in 0.016s 1 tree, 32 leaves, max depth = 10, train loss: 0.31809, val loss: 0.30561, in 0.000s 1 tree, 52 leaves, max depth = 14, train loss: 0.31376, val loss: 0.30131, in 0.016s 1 tree, 32 leaves, max depth = 10, train loss: 0.30951, val loss: 0.29696, in 0.016s 1 tree, 61 leaves, max depth = 10, train loss: 0.30552, val loss: 0.29298, in 0.031s 1 tree, 48 leaves, max depth = 11, train loss: 0.30149, val loss: 0.28942, in 0.016s 1 tree, 
48 leaves, max depth = 11, train loss: 0.29780, val loss: 0.28617, in 0.016s 1 tree, 32 leaves, max depth = 13, train loss: 0.29439, val loss: 0.28269, in 0.016s Fit 36 trees in 1.095 s, (1734 total leaves) Time spent computing histograms: 0.281s Time spent finding best splits: 0.044s Time spent applying splits: 0.042s Time spent predicting: 0.000s Trial 7, Fold 5: Log loss = 0.30208427335159127, Average precision = 0.9562072059339609, ROC-AUC = 0.9511322668747562, Elapsed Time = 1.1022063000000344 seconds
Optimization Progress: 8% | 8/100 [01:38<18:50, 12.29s/it]
(per-iteration boosting log for Trial 8 trimmed; fold summaries retained)
Trial 8, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 8, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Trial 8, Fold 1: Fit 95 trees in 0.923 s (1545 total leaves)
Trial 8, Fold 1: Log loss = 0.3164149077430296, Average precision = 0.9534281493802323, ROC-AUC = 0.9489805733595255, Elapsed Time = 0.9312449999997625 seconds
Trial 8, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 8, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Trial 8, Fold 2: Fit 95 trees in 1.049 s (1419 total leaves)
Trial 8, Fold 2: Log loss = 0.31504257772127, Average precision = 0.9514261562429316, ROC-AUC = 0.9503154229468751, Elapsed Time = 1.0518565000002127 seconds
Trial 8, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 8, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
loss: 0.33240, val loss: 0.33277, in 0.016s 1 tree, 11 leaves, max depth = 7, train loss: 0.33170, val loss: 0.33185, in 0.000s 1 tree, 4 leaves, max depth = 2, train loss: 0.32998, val loss: 0.33025, in 0.016s 1 tree, 11 leaves, max depth = 6, train loss: 0.32938, val loss: 0.32946, in 0.000s 1 tree, 26 leaves, max depth = 8, train loss: 0.32769, val loss: 0.32816, in 0.016s 1 tree, 68 leaves, max depth = 13, train loss: 0.32662, val loss: 0.32737, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.32514, val loss: 0.32623, in 0.000s 1 tree, 37 leaves, max depth = 13, train loss: 0.32376, val loss: 0.32521, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.32210, val loss: 0.32366, in 0.000s 1 tree, 12 leaves, max depth = 5, train loss: 0.32147, val loss: 0.32280, in 0.016s 1 tree, 18 leaves, max depth = 11, train loss: 0.32084, val loss: 0.32218, in 0.000s 1 tree, 46 leaves, max depth = 14, train loss: 0.32004, val loss: 0.32181, in 0.016s 1 tree, 4 leaves, max depth = 2, train loss: 0.31848, val loss: 0.32035, in 0.000s 1 tree, 73 leaves, max depth = 17, train loss: 0.31756, val loss: 0.31971, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.31630, val loss: 0.31883, in 0.000s 1 tree, 12 leaves, max depth = 5, train loss: 0.31571, val loss: 0.31800, in 0.016s 1 tree, 74 leaves, max depth = 17, train loss: 0.31489, val loss: 0.31744, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.31339, val loss: 0.31605, in 0.000s Fit 95 trees in 1.080 s, (1767 total leaves) Time spent computing histograms: 0.428s Time spent finding best splits: 0.049s Time spent applying splits: 0.040s Time spent predicting: 0.000s Trial 8, Fold 3: Log loss = 0.3097361123640661, Average precision = 0.956834791175193, ROC-AUC = 0.9536477846211633, Elapsed Time = 1.09967609999876 seconds Trial 8, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 8, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 
Binning 0.040 GB of training data: 0.189 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 5 leaves, max depth = 4, train loss: 0.67015, val loss: 0.66875, in 0.016s 1 tree, 6 leaves, max depth = 4, train loss: 0.64905, val loss: 0.64636, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.62994, val loss: 0.62599, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.61261, val loss: 0.60759, in 0.000s 1 tree, 8 leaves, max depth = 4, train loss: 0.59694, val loss: 0.59086, in 0.016s 1 tree, 6 leaves, max depth = 3, train loss: 0.58276, val loss: 0.57562, in 0.000s 1 tree, 4 leaves, max depth = 3, train loss: 0.56991, val loss: 0.56179, in 0.016s 1 tree, 4 leaves, max depth = 2, train loss: 0.55817, val loss: 0.54917, in 0.000s 1 tree, 7 leaves, max depth = 3, train loss: 0.54736, val loss: 0.53745, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.53729, val loss: 0.52640, in 0.000s 1 tree, 6 leaves, max depth = 3, train loss: 0.52834, val loss: 0.51658, in 0.016s 1 tree, 7 leaves, max depth = 4, train loss: 0.52016, val loss: 0.50764, in 0.016s 1 tree, 6 leaves, max depth = 3, train loss: 0.51261, val loss: 0.49931, in 0.000s 1 tree, 6 leaves, max depth = 4, train loss: 0.50573, val loss: 0.49160, in 0.016s 1 tree, 7 leaves, max depth = 4, train loss: 0.49938, val loss: 0.48452, in 0.000s 1 tree, 17 leaves, max depth = 7, train loss: 0.48977, val loss: 0.47517, in 0.016s 1 tree, 8 leaves, max depth = 4, train loss: 0.48419, val loss: 0.46893, in 0.000s 1 tree, 8 leaves, max depth = 4, train loss: 0.47907, val loss: 0.46318, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.47424, val loss: 0.45776, in 0.000s 1 tree, 6 leaves, max depth = 4, train loss: 0.46977, val loss: 0.45262, in 0.016s 1 tree, 30 leaves, max depth = 10, train loss: 0.46438, val loss: 0.44729, in 0.016s 1 tree, 17 leaves, max depth = 8, train loss: 0.45894, val loss: 0.44208, in 0.000s 1 tree, 9 leaves, max depth = 4, train loss: 0.45530, 
val loss: 0.43787, in 0.016s 1 tree, 8 leaves, max depth = 5, train loss: 0.45178, val loss: 0.43377, in 0.000s 1 tree, 26 leaves, max depth = 8, train loss: 0.44719, val loss: 0.42925, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.44391, val loss: 0.42577, in 0.000s 1 tree, 9 leaves, max depth = 4, train loss: 0.44101, val loss: 0.42235, in 0.016s 1 tree, 24 leaves, max depth = 11, train loss: 0.43760, val loss: 0.41871, in 0.016s 1 tree, 19 leaves, max depth = 8, train loss: 0.43327, val loss: 0.41459, in 0.000s 1 tree, 13 leaves, max depth = 5, train loss: 0.42702, val loss: 0.40859, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.42338, val loss: 0.40515, in 0.016s 1 tree, 22 leaves, max depth = 8, train loss: 0.42041, val loss: 0.40213, in 0.000s 1 tree, 22 leaves, max depth = 8, train loss: 0.41770, val loss: 0.39935, in 0.016s 1 tree, 21 leaves, max depth = 7, train loss: 0.41521, val loss: 0.39680, in 0.000s 1 tree, 9 leaves, max depth = 6, train loss: 0.41279, val loss: 0.39393, in 0.016s 1 tree, 27 leaves, max depth = 10, train loss: 0.41052, val loss: 0.39159, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.40825, val loss: 0.38892, in 0.000s 1 tree, 18 leaves, max depth = 9, train loss: 0.40355, val loss: 0.38449, in 0.016s 1 tree, 18 leaves, max depth = 8, train loss: 0.40154, val loss: 0.38210, in 0.000s 1 tree, 24 leaves, max depth = 8, train loss: 0.39887, val loss: 0.37972, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.39618, val loss: 0.37684, in 0.000s 1 tree, 11 leaves, max depth = 6, train loss: 0.39442, val loss: 0.37470, in 0.016s 1 tree, 26 leaves, max depth = 9, train loss: 0.39252, val loss: 0.37275, in 0.000s 1 tree, 16 leaves, max depth = 9, train loss: 0.38855, val loss: 0.36905, in 0.000s 1 tree, 32 leaves, max depth = 8, train loss: 0.38625, val loss: 0.36691, in 0.016s 1 tree, 23 leaves, max depth = 9, train loss: 0.38417, val loss: 0.36509, in 0.000s 1 tree, 12 leaves, max depth = 5, train 
loss: 0.38086, val loss: 0.36199, in 0.016s 1 tree, 15 leaves, max depth = 8, train loss: 0.37935, val loss: 0.36015, in 0.000s 1 tree, 24 leaves, max depth = 8, train loss: 0.37761, val loss: 0.35861, in 0.016s 1 tree, 22 leaves, max depth = 9, train loss: 0.37468, val loss: 0.35593, in 0.016s 1 tree, 27 leaves, max depth = 9, train loss: 0.37336, val loss: 0.35458, in 0.000s 1 tree, 22 leaves, max depth = 8, train loss: 0.37076, val loss: 0.35222, in 0.016s 1 tree, 15 leaves, max depth = 8, train loss: 0.36941, val loss: 0.35057, in 0.000s 1 tree, 11 leaves, max depth = 8, train loss: 0.36818, val loss: 0.34905, in 0.016s 1 tree, 16 leaves, max depth = 9, train loss: 0.36705, val loss: 0.34766, in 0.000s 1 tree, 17 leaves, max depth = 9, train loss: 0.36464, val loss: 0.34547, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.36346, val loss: 0.34408, in 0.000s 1 tree, 7 leaves, max depth = 5, train loss: 0.36247, val loss: 0.34302, in 0.016s 1 tree, 21 leaves, max depth = 9, train loss: 0.36025, val loss: 0.34099, in 0.000s 1 tree, 3 leaves, max depth = 2, train loss: 0.35783, val loss: 0.33837, in 0.016s 1 tree, 54 leaves, max depth = 16, train loss: 0.35643, val loss: 0.33721, in 0.016s 1 tree, 26 leaves, max depth = 9, train loss: 0.35447, val loss: 0.33550, in 0.031s 1 tree, 8 leaves, max depth = 4, train loss: 0.35348, val loss: 0.33439, in 0.016s 1 tree, 6 leaves, max depth = 4, train loss: 0.35264, val loss: 0.33352, in 0.000s 1 tree, 12 leaves, max depth = 6, train loss: 0.35178, val loss: 0.33252, in 0.016s 1 tree, 33 leaves, max depth = 11, train loss: 0.35038, val loss: 0.33118, in 0.031s 1 tree, 3 leaves, max depth = 2, train loss: 0.34814, val loss: 0.32875, in 0.000s 1 tree, 57 leaves, max depth = 16, train loss: 0.34692, val loss: 0.32779, in 0.031s 1 tree, 21 leaves, max depth = 8, train loss: 0.34520, val loss: 0.32619, in 0.000s 1 tree, 28 leaves, max depth = 9, train loss: 0.34362, val loss: 0.32479, in 0.016s 1 tree, 10 leaves, max 
depth = 4, train loss: 0.34274, val loss: 0.32380, in 0.016s 1 tree, 7 leaves, max depth = 4, train loss: 0.34195, val loss: 0.32292, in 0.000s 1 tree, 3 leaves, max depth = 2, train loss: 0.33986, val loss: 0.32064, in 0.016s 1 tree, 44 leaves, max depth = 16, train loss: 0.33895, val loss: 0.31979, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.33701, val loss: 0.31768, in 0.000s 1 tree, 15 leaves, max depth = 7, train loss: 0.33626, val loss: 0.31685, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.33447, val loss: 0.31490, in 0.016s 1 tree, 13 leaves, max depth = 7, train loss: 0.33382, val loss: 0.31422, in 0.000s 1 tree, 11 leaves, max depth = 6, train loss: 0.33315, val loss: 0.31351, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.33149, val loss: 0.31170, in 0.000s 1 tree, 15 leaves, max depth = 7, train loss: 0.33091, val loss: 0.31108, in 0.016s 1 tree, 13 leaves, max depth = 6, train loss: 0.32933, val loss: 0.30964, in 0.000s 1 tree, 52 leaves, max depth = 11, train loss: 0.32823, val loss: 0.30883, in 0.016s 1 tree, 19 leaves, max depth = 9, train loss: 0.32685, val loss: 0.30759, in 0.016s 1 tree, 16 leaves, max depth = 9, train loss: 0.32561, val loss: 0.30654, in 0.000s 1 tree, 3 leaves, max depth = 2, train loss: 0.32401, val loss: 0.30478, in 0.016s 1 tree, 9 leaves, max depth = 5, train loss: 0.32340, val loss: 0.30414, in 0.016s 1 tree, 7 leaves, max depth = 5, train loss: 0.32282, val loss: 0.30359, in 0.000s 1 tree, 12 leaves, max depth = 5, train loss: 0.32226, val loss: 0.30301, in 0.016s 1 tree, 4 leaves, max depth = 2, train loss: 0.32076, val loss: 0.30136, in 0.000s 1 tree, 57 leaves, max depth = 14, train loss: 0.31977, val loss: 0.30067, in 0.016s 1 tree, 21 leaves, max depth = 9, train loss: 0.31859, val loss: 0.29964, in 0.016s 1 tree, 11 leaves, max depth = 5, train loss: 0.31806, val loss: 0.29909, in 0.016s 1 tree, 50 leaves, max depth = 13, train loss: 0.31719, val loss: 0.29848, in 0.000s 1 tree, 4 
leaves, max depth = 2, train loss: 0.31575, val loss: 0.29690, in 0.016s Fit 95 trees in 1.283 s, (1467 total leaves) Time spent computing histograms: 0.535s Time spent finding best splits: 0.066s Time spent applying splits: 0.054s Time spent predicting: 0.000s Trial 8, Fold 4: Log loss = 0.31759633195988035, Average precision = 0.9537953575007956, ROC-AUC = 0.9493490263262684, Elapsed Time = 1.2855565000008937 seconds Trial 8, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 8, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.158 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 4 leaves, max depth = 3, train loss: 0.66986, val loss: 0.66819, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.64852, val loss: 0.64533, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.62922, val loss: 0.62458, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.61167, val loss: 0.60572, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.59591, val loss: 0.58863, in 0.016s 1 tree, 5 leaves, max depth = 4, train loss: 0.58158, val loss: 0.57303, in 0.000s 1 tree, 6 leaves, max depth = 4, train loss: 0.56856, val loss: 0.55885, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.55669, val loss: 0.54582, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.54576, val loss: 0.53380, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.53560, val loss: 0.52259, in 0.016s 1 tree, 5 leaves, max depth = 4, train loss: 0.52656, val loss: 0.51251, in 0.000s 1 tree, 6 leaves, max depth = 4, train loss: 0.51830, val loss: 0.50327, in 0.016s 1 tree, 7 leaves, max depth = 4, train loss: 0.51066, val loss: 0.49473, in 0.000s 1 tree, 6 leaves, max depth = 4, train loss: 0.50372, val loss: 0.48684, in 0.016s 1 tree, 7 leaves, max depth = 4, train loss: 0.49730, val loss: 0.47961, in 0.000s 1 tree, 17 leaves, max depth = 9, 
train loss: 0.48781, val loss: 0.47076, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.48217, val loss: 0.46431, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.47700, val loss: 0.45837, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.47212, val loss: 0.45278, in 0.000s 1 tree, 10 leaves, max depth = 6, train loss: 0.46762, val loss: 0.44761, in 0.000s 1 tree, 24 leaves, max depth = 9, train loss: 0.46233, val loss: 0.44243, in 0.016s 1 tree, 13 leaves, max depth = 8, train loss: 0.45696, val loss: 0.43780, in 0.000s 1 tree, 8 leaves, max depth = 4, train loss: 0.45327, val loss: 0.43346, in 0.016s 1 tree, 9 leaves, max depth = 6, train loss: 0.44970, val loss: 0.42931, in 0.000s 1 tree, 28 leaves, max depth = 9, train loss: 0.44521, val loss: 0.42495, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.44191, val loss: 0.42174, in 0.000s 1 tree, 6 leaves, max depth = 4, train loss: 0.43898, val loss: 0.41824, in 0.016s 1 tree, 20 leaves, max depth = 7, train loss: 0.43562, val loss: 0.41489, in 0.000s 1 tree, 20 leaves, max depth = 8, train loss: 0.43134, val loss: 0.41136, in 0.016s 1 tree, 13 leaves, max depth = 6, train loss: 0.42517, val loss: 0.40587, in 0.000s 1 tree, 22 leaves, max depth = 9, train loss: 0.42158, val loss: 0.40297, in 0.016s 1 tree, 18 leaves, max depth = 7, train loss: 0.41864, val loss: 0.40007, in 0.000s 1 tree, 19 leaves, max depth = 7, train loss: 0.41594, val loss: 0.39742, in 0.016s 1 tree, 18 leaves, max depth = 7, train loss: 0.41347, val loss: 0.39499, in 0.000s 1 tree, 9 leaves, max depth = 5, train loss: 0.41099, val loss: 0.39206, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.40871, val loss: 0.38981, in 0.016s 1 tree, 6 leaves, max depth = 4, train loss: 0.40659, val loss: 0.38728, in 0.000s 1 tree, 14 leaves, max depth = 6, train loss: 0.40195, val loss: 0.38325, in 0.016s 1 tree, 12 leaves, max depth = 7, train loss: 0.39993, val loss: 0.38079, in 0.000s 1 tree, 28 leaves, max 
depth = 9, train loss: 0.39729, val loss: 0.37880, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.39456, val loss: 0.37617, in 0.000s 1 tree, 10 leaves, max depth = 5, train loss: 0.39275, val loss: 0.37399, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.39085, val loss: 0.37215, in 0.000s 1 tree, 15 leaves, max depth = 6, train loss: 0.38694, val loss: 0.36881, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.38466, val loss: 0.36674, in 0.000s 1 tree, 33 leaves, max depth = 10, train loss: 0.38261, val loss: 0.36528, in 0.016s 1 tree, 15 leaves, max depth = 7, train loss: 0.37934, val loss: 0.36254, in 0.016s 1 tree, 11 leaves, max depth = 7, train loss: 0.37780, val loss: 0.36066, in 0.000s 1 tree, 30 leaves, max depth = 10, train loss: 0.37608, val loss: 0.35949, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.37320, val loss: 0.35705, in 0.000s 1 tree, 35 leaves, max depth = 15, train loss: 0.37184, val loss: 0.35574, in 0.016s 1 tree, 20 leaves, max depth = 7, train loss: 0.36925, val loss: 0.35349, in 0.000s 1 tree, 16 leaves, max depth = 8, train loss: 0.36788, val loss: 0.35177, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.36668, val loss: 0.35026, in 0.000s 1 tree, 17 leaves, max depth = 10, train loss: 0.36547, val loss: 0.34889, in 0.016s 1 tree, 21 leaves, max depth = 7, train loss: 0.36304, val loss: 0.34682, in 0.000s 1 tree, 4 leaves, max depth = 3, train loss: 0.36183, val loss: 0.34533, in 0.016s 1 tree, 8 leaves, max depth = 6, train loss: 0.36086, val loss: 0.34421, in 0.000s 1 tree, 22 leaves, max depth = 7, train loss: 0.35863, val loss: 0.34233, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.35619, val loss: 0.33997, in 0.000s 1 tree, 49 leaves, max depth = 13, train loss: 0.35481, val loss: 0.33880, in 0.016s 1 tree, 23 leaves, max depth = 8, train loss: 0.35284, val loss: 0.33716, in 0.000s 1 tree, 8 leaves, max depth = 4, train loss: 0.35182, val loss: 0.33599, in 0.000s 1 
tree, 12 leaves, max depth = 6, train loss: 0.35090, val loss: 0.33493, in 0.000s 1 tree, 13 leaves, max depth = 6, train loss: 0.35005, val loss: 0.33381, in 0.000s 1 tree, 36 leaves, max depth = 13, train loss: 0.34866, val loss: 0.33268, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.34640, val loss: 0.33050, in 0.000s 1 tree, 47 leaves, max depth = 12, train loss: 0.34519, val loss: 0.32951, in 0.016s 1 tree, 22 leaves, max depth = 7, train loss: 0.34347, val loss: 0.32813, in 0.000s 1 tree, 19 leaves, max depth = 7, train loss: 0.34192, val loss: 0.32690, in 0.016s 1 tree, 10 leaves, max depth = 5, train loss: 0.34104, val loss: 0.32586, in 0.000s 1 tree, 11 leaves, max depth = 5, train loss: 0.34025, val loss: 0.32492, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.33814, val loss: 0.32288, in 0.000s 1 tree, 31 leaves, max depth = 13, train loss: 0.33697, val loss: 0.32198, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.33502, val loss: 0.32009, in 0.016s 1 tree, 12 leaves, max depth = 6, train loss: 0.33424, val loss: 0.31920, in 0.000s 1 tree, 3 leaves, max depth = 2, train loss: 0.33244, val loss: 0.31746, in 0.000s 1 tree, 12 leaves, max depth = 6, train loss: 0.33178, val loss: 0.31671, in 0.016s 1 tree, 14 leaves, max depth = 6, train loss: 0.33107, val loss: 0.31590, in 0.000s 1 tree, 3 leaves, max depth = 2, train loss: 0.32939, val loss: 0.31429, in 0.016s 1 tree, 12 leaves, max depth = 5, train loss: 0.32879, val loss: 0.31360, in 0.000s 1 tree, 19 leaves, max depth = 7, train loss: 0.32714, val loss: 0.31229, in 0.016s 1 tree, 49 leaves, max depth = 14, train loss: 0.32609, val loss: 0.31145, in 0.016s 1 tree, 23 leaves, max depth = 7, train loss: 0.32464, val loss: 0.31030, in 0.000s 1 tree, 26 leaves, max depth = 9, train loss: 0.32330, val loss: 0.30930, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.32168, val loss: 0.30773, in 0.000s 1 tree, 10 leaves, max depth = 6, train loss: 0.32106, val loss: 
0.30700, in 0.016s 1 tree, 7 leaves, max depth = 5, train loss: 0.32049, val loss: 0.30636, in 0.000s 1 tree, 43 leaves, max depth = 14, train loss: 0.31967, val loss: 0.30566, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.31816, val loss: 0.30418, in 0.000s 1 tree, 52 leaves, max depth = 14, train loss: 0.31725, val loss: 0.30347, in 0.016s 1 tree, 24 leaves, max depth = 10, train loss: 0.31607, val loss: 0.30257, in 0.000s 1 tree, 12 leaves, max depth = 5, train loss: 0.31547, val loss: 0.30191, in 0.000s 1 tree, 54 leaves, max depth = 14, train loss: 0.31467, val loss: 0.30128, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.31321, val loss: 0.29987, in 0.000s Fit 95 trees in 1.033 s, (1455 total leaves) Time spent computing histograms: 0.434s Time spent finding best splits: 0.044s Time spent applying splits: 0.036s Time spent predicting: 0.000s Trial 8, Fold 5: Log loss = 0.319615800834132, Average precision = 0.9529594062011224, ROC-AUC = 0.9482054883599947, Elapsed Time = 1.0457816000016464 seconds
Optimization Progress: 9%|9 | 9/100 [01:51<18:51, 12.44s/it]
[... verbose per-round boosting log condensed: 46 boosting rounds per fold; per-fold summaries retained below ...]

Trial 9, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 9, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Fit 46 trees in 0.799 s, (4201 total leaves)
Trial 9, Fold 1: Log loss = 0.30940654560773445, Average precision = 0.9546708274374668, ROC-AUC = 0.9510657438474738, Elapsed Time = 0.8121594000003824 seconds
Trial 9, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 9, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Fit 46 trees in 0.798 s, (4239 total leaves)
Trial 9, Fold 2: Log loss = 0.30944503444618643, Average precision = 0.952180089458451, ROC-AUC = 0.9511007936105684, Elapsed Time = 0.8179369999998016 seconds
Trial 9, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 9, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Fit 46 trees in 0.830 s, (4096 total leaves)
Trial 9, Fold 3: Log loss = 0.3068492381650382, Average precision = 0.9564745796419276, ROC-AUC = 0.9535465835107049, Elapsed Time = 0.8307597000002716 seconds
Trial 9, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 9, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[per-round boosting output for Trial 9, Fold 4 continues]
= 14, train loss: 0.64907, val loss: 0.64680, in 0.000s 1 tree, 76 leaves, max depth = 14, train loss: 0.61275, val loss: 0.60823, in 0.016s 1 tree, 76 leaves, max depth = 13, train loss: 0.58283, val loss: 0.57639, in 0.016s 1 tree, 76 leaves, max depth = 13, train loss: 0.55820, val loss: 0.55014, in 0.016s 1 tree, 124 leaves, max depth = 16, train loss: 0.53569, val loss: 0.52826, in 0.016s 1 tree, 76 leaves, max depth = 13, train loss: 0.51655, val loss: 0.50746, in 0.000s 1 tree, 124 leaves, max depth = 16, train loss: 0.49887, val loss: 0.49034, in 0.016s 1 tree, 76 leaves, max depth = 13, train loss: 0.48407, val loss: 0.47406, in 0.031s 1 tree, 76 leaves, max depth = 13, train loss: 0.47145, val loss: 0.46009, in 0.016s 1 tree, 76 leaves, max depth = 14, train loss: 0.46054, val loss: 0.44777, in 0.000s 1 tree, 124 leaves, max depth = 16, train loss: 0.44735, val loss: 0.43517, in 0.031s 1 tree, 76 leaves, max depth = 13, train loss: 0.43906, val loss: 0.42575, in 0.000s 1 tree, 75 leaves, max depth = 15, train loss: 0.43157, val loss: 0.41737, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.42510, val loss: 0.41044, in 0.016s 1 tree, 168 leaves, max depth = 14, train loss: 0.41760, val loss: 0.40375, in 0.016s 1 tree, 124 leaves, max depth = 14, train loss: 0.40800, val loss: 0.39474, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.40270, val loss: 0.38905, in 0.000s 1 tree, 75 leaves, max depth = 12, train loss: 0.39727, val loss: 0.38279, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.39286, val loss: 0.37804, in 0.000s 1 tree, 76 leaves, max depth = 13, train loss: 0.38832, val loss: 0.37273, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.38463, val loss: 0.36875, in 0.016s 1 tree, 152 leaves, max depth = 27, train loss: 0.37991, val loss: 0.36440, in 0.016s 1 tree, 124 leaves, max depth = 15, train loss: 0.37232, val loss: 0.35750, in 0.016s 1 tree, 76 leaves, max depth = 11, train loss: 0.36873, val loss: 0.35327, 
in 0.016s 1 tree, 163 leaves, max depth = 18, train loss: 0.36407, val loss: 0.34928, in 0.016s 1 tree, 124 leaves, max depth = 14, train loss: 0.35821, val loss: 0.34402, in 0.016s 1 tree, 124 leaves, max depth = 14, train loss: 0.35324, val loss: 0.33957, in 0.016s 1 tree, 46 leaves, max depth = 13, train loss: 0.35063, val loss: 0.33688, in 0.016s 1 tree, 124 leaves, max depth = 15, train loss: 0.34645, val loss: 0.33317, in 0.016s 1 tree, 124 leaves, max depth = 15, train loss: 0.34291, val loss: 0.33005, in 0.016s 1 tree, 96 leaves, max depth = 14, train loss: 0.34010, val loss: 0.32708, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.33672, val loss: 0.32338, in 0.000s 1 tree, 97 leaves, max depth = 14, train loss: 0.33440, val loss: 0.32095, in 0.016s 1 tree, 124 leaves, max depth = 13, train loss: 0.33119, val loss: 0.31816, in 0.016s 1 tree, 75 leaves, max depth = 12, train loss: 0.32896, val loss: 0.31561, in 0.016s 1 tree, 97 leaves, max depth = 15, train loss: 0.32694, val loss: 0.31360, in 0.016s 1 tree, 163 leaves, max depth = 22, train loss: 0.32451, val loss: 0.31184, in 0.016s 1 tree, 124 leaves, max depth = 15, train loss: 0.32193, val loss: 0.30963, in 0.016s 1 tree, 97 leaves, max depth = 19, train loss: 0.32023, val loss: 0.30776, in 0.000s 1 tree, 163 leaves, max depth = 21, train loss: 0.31830, val loss: 0.30644, in 0.031s 1 tree, 5 leaves, max depth = 3, train loss: 0.31521, val loss: 0.30306, in 0.000s 1 tree, 97 leaves, max depth = 16, train loss: 0.31352, val loss: 0.30142, in 0.016s 1 tree, 152 leaves, max depth = 30, train loss: 0.31152, val loss: 0.29988, in 0.016s 1 tree, 124 leaves, max depth = 14, train loss: 0.30952, val loss: 0.29822, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.30679, val loss: 0.29522, in 0.016s 1 tree, 75 leaves, max depth = 13, train loss: 0.30525, val loss: 0.29346, in 0.016s Fit 46 trees in 0.923 s, (4149 total leaves) Time spent computing histograms: 0.242s Time spent finding best 
splits: 0.098s Time spent applying splits: 0.070s Time spent predicting: 0.000s Trial 9, Fold 4: Log loss = 0.3106716960208532, Average precision = 0.9542641250120021, ROC-AUC = 0.9510418082409973, Elapsed Time = 0.9211533999987296 seconds Trial 9, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 9, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.157 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 76 leaves, max depth = 12, train loss: 0.64857, val loss: 0.64569, in 0.016s 1 tree, 76 leaves, max depth = 14, train loss: 0.61206, val loss: 0.60642, in 0.016s 1 tree, 76 leaves, max depth = 16, train loss: 0.58208, val loss: 0.57397, in 0.016s 1 tree, 76 leaves, max depth = 14, train loss: 0.55722, val loss: 0.54677, in 0.016s 1 tree, 121 leaves, max depth = 17, train loss: 0.53504, val loss: 0.52573, in 0.000s 1 tree, 76 leaves, max depth = 15, train loss: 0.51560, val loss: 0.50445, in 0.016s 1 tree, 121 leaves, max depth = 18, train loss: 0.49819, val loss: 0.48810, in 0.031s 1 tree, 76 leaves, max depth = 14, train loss: 0.48315, val loss: 0.47132, in 0.000s 1 tree, 76 leaves, max depth = 15, train loss: 0.47030, val loss: 0.45701, in 0.016s 1 tree, 76 leaves, max depth = 15, train loss: 0.45915, val loss: 0.44442, in 0.016s 1 tree, 121 leaves, max depth = 19, train loss: 0.44613, val loss: 0.43252, in 0.016s 1 tree, 76 leaves, max depth = 16, train loss: 0.43768, val loss: 0.42285, in 0.016s 1 tree, 76 leaves, max depth = 14, train loss: 0.43054, val loss: 0.41463, in 0.016s [14/46] 1 tree, 5 leaves, max depth = 3, train loss: 0.42404, val loss: 0.40839, in 0.000s 1 tree, 168 leaves, max depth = 13, train loss: 0.41646, val loss: 0.40243, in 0.016s 1 tree, 121 leaves, max depth = 20, train loss: 0.40693, val loss: 0.39395, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.40160, val loss: 0.38887, in 0.016s 1 tree, 76 
leaves, max depth = 14, train loss: 0.39611, val loss: 0.38223, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.39169, val loss: 0.37802, in 0.016s 1 tree, 76 leaves, max depth = 14, train loss: 0.38699, val loss: 0.37250, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.38329, val loss: 0.36900, in 0.016s 1 tree, 152 leaves, max depth = 30, train loss: 0.37850, val loss: 0.36462, in 0.016s 1 tree, 121 leaves, max depth = 20, train loss: 0.37098, val loss: 0.35818, in 0.016s 1 tree, 76 leaves, max depth = 14, train loss: 0.36725, val loss: 0.35376, in 0.016s 1 tree, 164 leaves, max depth = 18, train loss: 0.36264, val loss: 0.34991, in 0.016s 1 tree, 121 leaves, max depth = 20, train loss: 0.35684, val loss: 0.34505, in 0.016s 1 tree, 121 leaves, max depth = 19, train loss: 0.35193, val loss: 0.34098, in 0.016s 1 tree, 97 leaves, max depth = 13, train loss: 0.34875, val loss: 0.33742, in 0.016s 1 tree, 121 leaves, max depth = 21, train loss: 0.34455, val loss: 0.33397, in 0.016s 1 tree, 121 leaves, max depth = 21, train loss: 0.34099, val loss: 0.33109, in 0.016s 1 tree, 76 leaves, max depth = 13, train loss: 0.33828, val loss: 0.32763, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.33488, val loss: 0.32438, in 0.016s 1 tree, 97 leaves, max depth = 13, train loss: 0.33245, val loss: 0.32181, in 0.016s 1 tree, 121 leaves, max depth = 21, train loss: 0.32925, val loss: 0.31928, in 0.016s 1 tree, 97 leaves, max depth = 14, train loss: 0.32715, val loss: 0.31705, in 0.000s 1 tree, 76 leaves, max depth = 17, train loss: 0.32524, val loss: 0.31468, in 0.016s 1 tree, 164 leaves, max depth = 17, train loss: 0.32281, val loss: 0.31290, in 0.063s 1 tree, 121 leaves, max depth = 20, train loss: 0.32024, val loss: 0.31091, in 0.031s 1 tree, 94 leaves, max depth = 18, train loss: 0.31844, val loss: 0.30896, in 0.031s 1 tree, 97 leaves, max depth = 15, train loss: 0.31676, val loss: 0.30707, in 0.031s 1 tree, 5 leaves, max depth = 3, train loss: 
0.31371, val loss: 0.30412, in 0.000s 1 tree, 46 leaves, max depth = 11, train loss: 0.31235, val loss: 0.30254, in 0.016s 1 tree, 152 leaves, max depth = 28, train loss: 0.31024, val loss: 0.30088, in 0.031s 1 tree, 121 leaves, max depth = 19, train loss: 0.30806, val loss: 0.29925, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.30536, val loss: 0.29662, in 0.016s 1 tree, 121 leaves, max depth = 20, train loss: 0.30346, val loss: 0.29523, in 0.016s Fit 46 trees in 1.017 s, (4076 total leaves) Time spent computing histograms: 0.284s Time spent finding best splits: 0.123s Time spent applying splits: 0.090s Time spent predicting: 0.000s Trial 9, Fold 5: Log loss = 0.31477065490313444, Average precision = 0.9525868854943338, ROC-AUC = 0.9487047544729948, Elapsed Time = 1.028342600000542 seconds
Optimization Progress: 10%|# | 10/100 [02:02<18:03, 12.04s/it]
[Verbose per-iteration training output (one line per boosting round: trees, leaves, max depth, train/val loss, elapsed time) and binning/timing diagnostics omitted for each fold; per-fold summaries retained below.]

Trial 10, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 10, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Trial 10, Fold 1: Fit 55 trees in 0.987 s, (5055 total leaves)
Trial 10, Fold 1: Log loss = 0.42790914008392067, Average precision = 0.9175382538763569, ROC-AUC = 0.9268080588654212, Elapsed Time = 0.9964141999989806 seconds
Trial 10, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 10, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Trial 10, Fold 2: Fit 55 trees in 1.049 s, (5042 total leaves)
Trial 10, Fold 2: Log loss = 0.4291473245433425, Average precision = 0.9123761684258548, ROC-AUC = 0.9273861049408776, Elapsed Time = 1.0500720000000001 seconds
Trial 10, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 10, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Trial 10, Fold 3: Fit 55 trees in 1.065 s, (4972 total leaves)
Trial 10, Fold 3: Log loss = 0.4249057059562685, Average precision = 0.9207502708543311, ROC-AUC = 0.9313444142404484, Elapsed Time = 1.0818947999996453 seconds
Trial 10, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 10, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
Trial 10, Fold 4: Fit 55 trees in 1.017 s, (5039 total leaves)
Trial 10, Fold 4: Log loss = 0.42822515332766903, Average precision = 0.9172057006370695, ROC-AUC = 0.9287372940764108, Elapsed Time = 1.0293785000012576 seconds
Trial 10, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 10, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
Trial 10, Fold 5: [per-iteration training output continues; truncated here]
train loss: 0.50598, val loss: 0.49328, in 0.000s 1 tree, 122 leaves, max depth = 19, train loss: 0.50156, val loss: 0.48914, in 0.016s 1 tree, 122 leaves, max depth = 19, train loss: 0.49732, val loss: 0.48517, in 0.016s 1 tree, 78 leaves, max depth = 14, train loss: 0.49388, val loss: 0.48135, in 0.016s 1 tree, 77 leaves, max depth = 15, train loss: 0.49056, val loss: 0.47767, in 0.016s 1 tree, 79 leaves, max depth = 15, train loss: 0.48732, val loss: 0.47408, in 0.000s 1 tree, 122 leaves, max depth = 17, train loss: 0.48337, val loss: 0.47040, in 0.016s 1 tree, 79 leaves, max depth = 15, train loss: 0.48031, val loss: 0.46699, in 0.016s 1 tree, 77 leaves, max depth = 15, train loss: 0.47743, val loss: 0.46379, in 0.016s 1 tree, 79 leaves, max depth = 15, train loss: 0.47461, val loss: 0.46063, in 0.000s 1 tree, 122 leaves, max depth = 17, train loss: 0.47091, val loss: 0.45721, in 0.031s 1 tree, 78 leaves, max depth = 15, train loss: 0.46828, val loss: 0.45427, in 0.016s 1 tree, 78 leaves, max depth = 15, train loss: 0.46577, val loss: 0.45143, in 0.000s 1 tree, 78 leaves, max depth = 14, train loss: 0.46335, val loss: 0.44870, in 0.016s 1 tree, 122 leaves, max depth = 18, train loss: 0.45987, val loss: 0.44551, in 0.016s 1 tree, 78 leaves, max depth = 15, train loss: 0.45758, val loss: 0.44292, in 0.016s 1 tree, 122 leaves, max depth = 17, train loss: 0.45426, val loss: 0.43988, in 0.016s 1 tree, 78 leaves, max depth = 15, train loss: 0.45210, val loss: 0.43743, in 0.016s 1 tree, 122 leaves, max depth = 17, train loss: 0.44892, val loss: 0.43453, in 0.016s 1 tree, 122 leaves, max depth = 17, train loss: 0.44586, val loss: 0.43174, in 0.016s 1 tree, 79 leaves, max depth = 15, train loss: 0.44380, val loss: 0.42940, in 0.016s 1 tree, 122 leaves, max depth = 17, train loss: 0.44088, val loss: 0.42675, in 0.016s 1 tree, 122 leaves, max depth = 17, train loss: 0.43807, val loss: 0.42421, in 0.016s 1 tree, 78 leaves, max depth = 13, train loss: 0.43618, val loss: 
0.42206, in 0.016s 1 tree, 122 leaves, max depth = 17, train loss: 0.43349, val loss: 0.41963, in 0.016s 1 tree, 122 leaves, max depth = 17, train loss: 0.43090, val loss: 0.41729, in 0.016s 1 tree, 79 leaves, max depth = 16, train loss: 0.42909, val loss: 0.41523, in 0.016s 1 tree, 122 leaves, max depth = 17, train loss: 0.42661, val loss: 0.41300, in 0.016s Fit 55 trees in 1.018 s, (5007 total leaves) Time spent computing histograms: 0.299s Time spent finding best splits: 0.075s Time spent applying splits: 0.090s Time spent predicting: 0.016s Trial 10, Fold 5: Log loss = 0.4337304957267745, Average precision = 0.912977732762567, ROC-AUC = 0.9249993497203797, Elapsed Time = 1.0223001000013028 seconds
Optimization Progress: 11%|#1 | 11/100 [02:15<18:14, 12.30s/it]
(per-round gradient-boosting fitting log elided; fold summaries retained below)
Trial 11, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 11, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Trial 11, Fold 1: Fit 52 trees in 0.735 s, (2175 total leaves)
Trial 11, Fold 1: Log loss = 0.45635692276042383, Average precision = 0.9041750786722952, ROC-AUC = 0.9112115870387099, Elapsed Time = 0.7370410000003176 seconds
Trial 11, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 11, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Trial 11, Fold 2: Fit 52 trees in 0.766 s, (2202 total leaves)
Trial 11, Fold 2: Log loss = 0.45779518972191213, Average precision = 0.9013062927885805, ROC-AUC = 0.9160451583199821, Elapsed Time = 0.7835273000000598 seconds
Trial 11, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 11, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Trial 11, Fold 3: Fit 52 trees in 0.831 s, (2202 total leaves)
Trial 11, Fold 3: Log loss = 0.4524066912549197, Average precision = 0.9099322068591883, ROC-AUC = 0.9202060469657424, Elapsed Time = 0.8413478000002215 seconds
Trial 11, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 11, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
Trial 11, Fold 4: Fit 52 trees in 0.846 s, (2130 total leaves)
Trial 11, Fold 4: Log loss = 0.45743982263479455, Average precision = 0.9038198909903736, ROC-AUC = 0.9152758106397132, Elapsed Time = 0.8614232999989326 seconds
Trial 11, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 11, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
Trial 11, Fold 5: Fit 52 trees in 0.845 s, (2144 total leaves)
Trial 11, Fold 5: Log loss = 0.46344583690742613, Average precision = 0.8988737071441618, ROC-AUC = 0.9086291734017056, Elapsed Time = 0.8489270999998553 seconds
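The "Train size = N where 0 = a, 1 = b, 0/1 = r" lines report the class balance of each fold split. A minimal sketch of such a reporter using `collections.Counter` (which the notebook imports); `describe_split` is a hypothetical helper name, not necessarily the notebook's:

```python
from collections import Counter

import numpy as np


def describe_split(name, y):
    # Report split size, per-class counts, and the majority/minority ratio,
    # mirroring the "size = N where 0 = a, 1 = b, 0/1 = r" log lines.
    c = Counter(y)
    print(f"{name} size = {len(y)} where 0 = {c[0]}, 1 = {c[1]}, 0/1 = {c[0] / c[1]}")


# Counts taken from Trial 11, Fold 5 above.
y_train = np.array([0] * 10500 + [1] * 10150)
describe_split("Train", y_train)
# → Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
```

A 0/1 ratio near 1.0 on every fold indicates the resampled training data is close to balanced, so the log-loss values across folds are directly comparable.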
Optimization Progress: 12%|#2 | 12/100 [02:26<17:21, 11.83s/it]
[Trial 12 — verbose per-iteration tree logs truncated; per-fold summary below]
Trial 12, Fold 1: Train size = 20663 (0 = 10533, 1 = 10130, 0/1 = 1.0398); Validation size = 5175 (0 = 2592, 1 = 2583, 0/1 = 1.0035)
Trial 12, Fold 1: Fit 16 trees in 0.407 s (256 total leaves); Log loss = 0.51219, Average precision = 0.87921, ROC-AUC = 0.88294, Elapsed Time = 0.418 s
Trial 12, Fold 2: Train size = 20701 (0 = 10471, 1 = 10230, 0/1 = 1.0236); Validation size = 5137 (0 = 2654, 1 = 2483, 0/1 = 1.0689)
Trial 12, Fold 2: Fit 16 trees in 0.423 s (287 total leaves); Log loss = 0.51323, Average precision = 0.87769, ROC-AUC = 0.88877, Elapsed Time = 0.434 s
Trial 12, Fold 3: Train size = 20682 (0 = 10517, 1 = 10165, 0/1 = 1.0346); Validation size = 5156 (0 = 2608, 1 = 2548, 0/1 = 1.0235)
Trial 12, Fold 3: Fit 16 trees in 0.393 s (297 total leaves); Log loss = 0.50976, Average precision = 0.88500, ROC-AUC = 0.89302, Elapsed Time = 0.396 s
Trial 12, Fold 4: Train size = 20656 (0 = 10479, 1 = 10177, 0/1 = 1.0297); Validation size = 5182 (0 = 2646, 1 = 2536, 0/1 = 1.0434)
Trial 12, Fold 4: Fit 16 trees in 0.360 s (283 total leaves); Log loss = 0.51309, Average precision = 0.87566, ROC-AUC = 0.88637, Elapsed Time = 0.379 s
Trial 12, Fold 5: Train size = 20650 (0 = 10500, 1 = 10150, 0/1 = 1.0345); Validation size = 5188 (0 = 2625, 1 = 2563, 0/1 = 1.0242)
Trial 12, Fold 5: Fit 16 trees in 0.376 s (269 total leaves); Log loss = 0.51750, Average precision = 0.87603, ROC-AUC = 0.88105, Elapsed Time = 0.384 s
Optimization Progress: 13%|#3 | 13/100 [02:35<15:56, 11.00s/it]
[Trial 13 — verbose per-iteration tree logs truncated; per-fold summary below]
Trial 13, Fold 1: Train size = 20663 (0 = 10533, 1 = 10130, 0/1 = 1.0398); Validation size = 5175 (0 = 2592, 1 = 2583, 0/1 = 1.0035)
Trial 13, Fold 1: Fit 51 trees in 0.610 s (590 total leaves); Log loss = 0.41086, Average precision = 0.94608, ROC-AUC = 0.94155, Elapsed Time = 0.609 s
Trial 13, Fold 2: Train size = 20701 (0 = 10471, 1 = 10230, 0/1 = 1.0236); Validation size = 5137 (0 = 2654, 1 = 2483, 0/1 = 1.0689)
Trial 13, Fold 2: Fit 51 trees in 0.643 s (558 total leaves); Log loss = 0.41149, Average precision = 0.94365, ROC-AUC = 0.94251, Elapsed Time = 0.655 s
Trial 13, Fold 3: Train size = 20682 (0 = 10517, 1 = 10165, 0/1 = 1.0346); Validation size = 5156 (0 = 2608, 1 = 2548, 0/1 = 1.0235)
Trial 13, Fold 3: Fit 51 trees in 0.658 s (612 total leaves); Log loss = 0.40523, Average precision = 0.95116, ROC-AUC = 0.94811, Elapsed Time = 0.670 s
Trial 13, Fold 4: Train size = 20656 (0 = 10479, 1 = 10177, 0/1 = 1.0297); Validation size = 5182 (0 = 2646, 1 = 2536, 0/1 = 1.0434)
Trial 13, Fold 4: Fit 51 trees in 0.658 s (556 total leaves); Log loss = 0.41072, Average precision = 0.94598, ROC-AUC = 0.94201, Elapsed Time = 0.663 s
Trial 13, Fold 5: Train size = 20650 (0 = 10500, 1 = 10150, 0/1 = 1.0345); Validation size = 5188 (0 = 2625, 1 = 2563, 0/1 = 1.0242)
Trial 13, Fold 5: Fit 51 trees in 0.642 s (569 total leaves); Log loss = 0.41527, Average precision = 0.94662, ROC-AUC = 0.94290, Elapsed Time = 0.647 s
Optimization Progress: 14%|#4 | 14/100 [02:45<15:11, 10.60s/it]
[… verbose per-tree fitting log from HistGradientBoostingClassifier truncated; per-fold summaries retained below …]
Trial 14, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 14, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Trial 14, Fold 1: Fit 75 trees in 0.908 s (2782 total leaves)
Trial 14, Fold 1: Log loss = 0.4434066575603427, Average precision = 0.9299184649791674, ROC-AUC = 0.9270560000573551, Elapsed Time = 0.9077753000001394 seconds
Trial 14, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 14, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Trial 14, Fold 2: Fit 75 trees in 0.955 s (2737 total leaves)
Trial 14, Fold 2: Log loss = 0.4453315201550713, Average precision = 0.9283547951673355, ROC-AUC = 0.9304742482490582, Elapsed Time = 0.9605992000015249 seconds
Trial 14, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 14, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Trial 14, Fold 3: Fit 75 trees in 1.017 s (2741 total leaves)
Trial 14, Fold 3: Log loss = 0.4398167373771425, Average precision = 0.9355649438911422, ROC-AUC = 0.9350192560506977, Elapsed Time = 1.0301173999996536 seconds
Trial 14, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 14, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
Trial 14, Fold 4: Fit 75 trees in 0.971 s (2757 total leaves)
Trial 14, Fold 4: Log loss = 0.44463370939777047, Average precision = 0.9301654073756966, ROC-AUC = 0.9295039563319194, Elapsed Time = 0.9744702000007237 seconds
Trial 14, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 14, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[… per-tree fitting log for Trial 14, Fold 5 continues …]
57 leaves, max depth = 14, train loss: 0.50017, val loss: 0.48501, in 0.016s 1 tree, 6 leaves, max depth = 4, train loss: 0.49792, val loss: 0.48251, in 0.000s 1 tree, 35 leaves, max depth = 11, train loss: 0.49577, val loss: 0.48009, in 0.016s 1 tree, 57 leaves, max depth = 12, train loss: 0.49277, val loss: 0.47728, in 0.000s 1 tree, 34 leaves, max depth = 10, train loss: 0.49076, val loss: 0.47505, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.48874, val loss: 0.47278, in 0.031s 1 tree, 57 leaves, max depth = 13, train loss: 0.48586, val loss: 0.47009, in 0.032s 1 tree, 36 leaves, max depth = 12, train loss: 0.48394, val loss: 0.46793, in 0.047s 1 tree, 57 leaves, max depth = 13, train loss: 0.48115, val loss: 0.46533, in 0.016s 1 tree, 39 leaves, max depth = 12, train loss: 0.47930, val loss: 0.46325, in 0.000s 1 tree, 39 leaves, max depth = 12, train loss: 0.47751, val loss: 0.46122, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.47574, val loss: 0.45923, in 0.000s 1 tree, 57 leaves, max depth = 12, train loss: 0.47307, val loss: 0.45674, in 0.016s 1 tree, 6 leaves, max depth = 4, train loss: 0.47135, val loss: 0.45479, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.46971, val loss: 0.45293, in 0.000s 1 tree, 57 leaves, max depth = 12, train loss: 0.46714, val loss: 0.45055, in 0.016s 1 tree, 36 leaves, max depth = 12, train loss: 0.46557, val loss: 0.44874, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.46403, val loss: 0.44699, in 0.016s 1 tree, 57 leaves, max depth = 13, train loss: 0.46156, val loss: 0.44471, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.46009, val loss: 0.44304, in 0.016s 1 tree, 36 leaves, max depth = 11, train loss: 0.45865, val loss: 0.44139, in 0.000s 1 tree, 36 leaves, max depth = 12, train loss: 0.45727, val loss: 0.43978, in 0.016s 1 tree, 57 leaves, max depth = 9, train loss: 0.45559, val loss: 0.43837, in 0.016s 1 tree, 35 leaves, max depth = 14, train loss: 0.45423, 
val loss: 0.43682, in 0.000s 1 tree, 35 leaves, max depth = 12, train loss: 0.45293, val loss: 0.43530, in 0.016s 1 tree, 57 leaves, max depth = 12, train loss: 0.45061, val loss: 0.43318, in 0.016s 1 tree, 38 leaves, max depth = 9, train loss: 0.44930, val loss: 0.43167, in 0.016s 1 tree, 38 leaves, max depth = 9, train loss: 0.44803, val loss: 0.43019, in 0.016s 1 tree, 35 leaves, max depth = 13, train loss: 0.44683, val loss: 0.42881, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.44569, val loss: 0.42770, in 0.000s 1 tree, 35 leaves, max depth = 13, train loss: 0.44455, val loss: 0.42638, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.44344, val loss: 0.42531, in 0.000s Fit 75 trees in 1.096 s, (2748 total leaves) Time spent computing histograms: 0.393s Time spent finding best splits: 0.071s Time spent applying splits: 0.082s Time spent predicting: 0.000s Trial 14, Fold 5: Log loss = 0.4511288488395596, Average precision = 0.925886440789031, ROC-AUC = 0.924868922207978, Elapsed Time = 1.1130646000001434 seconds
Optimization Progress: 15%|#5 | 15/100 [02:56<15:20, 10.83s/it]
Trial 15, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 15, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[verbose per-round fitting log omitted] Fit 45 trees in 1.095 s, (2018 total leaves)
Trial 15, Fold 1: Log loss = 0.2758642335299996, Average precision = 0.963677897916902, ROC-AUC = 0.9577225914454912, Elapsed Time = 1.1047766000010597 seconds
Trial 15, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 15, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[verbose per-round fitting log omitted] Fit 45 trees in 1.189 s, (2034 total leaves)
Trial 15, Fold 2: Log loss = 0.2732321948794839, Average precision = 0.9636508261454557, ROC-AUC = 0.9604291852266854, Elapsed Time = 1.1898820999995223 seconds
Trial 15, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 15, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[verbose per-round fitting log omitted] Fit 45 trees in 1.143 s, (2028 total leaves)
Trial 15, Fold 3: Log loss = 0.27160769915984334, Average precision = 0.9628420835833131, ROC-AUC = 0.9593151972917529, Elapsed Time = 1.1477941000011924 seconds
Trial 15, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 15, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[verbose per-round fitting log omitted] Fit 45 trees in 1.236 s, (1998 total leaves)
Trial 15, Fold 4: Log loss = 0.2721552688535955, Average precision = 0.9637009760932421, ROC-AUC = 0.9597320132048613, Elapsed Time = 1.2341109000008146 seconds
Trial 15, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 15, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[verbose per-round fitting log omitted] Fit 45 trees in 1.173 s, (2029 total leaves)
Trial 15, Fold 5: Log loss = 0.27970886264491346, Average precision = 0.959729690231182, ROC-AUC = 0.9547875815171952, Elapsed Time = 1.1758101999985229 seconds
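Not part of the notebook's code: a minimal sketch of how the per-fold metrics printed for a trial could be aggregated into a single cross-validation score, with the values transcribed (rounded) from the Trial 15 fold summaries above.

```python
import statistics

# Per-fold validation metrics transcribed (rounded) from the Trial 15 log.
log_losses = [0.27586, 0.27323, 0.27161, 0.27216, 0.27971]
roc_aucs   = [0.95772, 0.96043, 0.95932, 0.95973, 0.95479]

# Report the cross-validation score as mean +/- std across the five folds;
# a small std indicates the trial's hyperparameters are stable across folds.
print(f"log loss: {statistics.mean(log_losses):.5f} +/- {statistics.stdev(log_losses):.5f}")
print(f"ROC-AUC : {statistics.mean(roc_aucs):.5f} +/- {statistics.stdev(roc_aucs):.5f}")
```

The mean across folds is what an Optuna objective would typically return, so that the sampler compares trials on out-of-fold performance rather than any single split.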
Optimization Progress: 16%|#6 | 16/100 [03:10<16:26, 11.75s/it]
Trial 16, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 16, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[verbose per-round fitting log omitted] Fit 74 trees in 1.517 s, (2394 total leaves)
Trial 16, Fold 1: Log loss = 0.3938402636701185, Average precision = 0.9528827845173964, ROC-AUC = 0.9465832359492026, Elapsed Time = 1.5249160999992455 seconds
Trial 16, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 16, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[verbose per-round fitting log truncated]
0.016s 1 tree, 35 leaves, max depth = 13, train loss: 0.49957, val loss: 0.49515, in 0.016s 1 tree, 37 leaves, max depth = 12, train loss: 0.49614, val loss: 0.49160, in 0.016s 1 tree, 37 leaves, max depth = 12, train loss: 0.49264, val loss: 0.48812, in 0.016s 1 tree, 32 leaves, max depth = 11, train loss: 0.48930, val loss: 0.48475, in 0.016s 1 tree, 34 leaves, max depth = 12, train loss: 0.48582, val loss: 0.48123, in 0.031s 1 tree, 36 leaves, max depth = 9, train loss: 0.48249, val loss: 0.47789, in 0.016s 1 tree, 36 leaves, max depth = 12, train loss: 0.47936, val loss: 0.47471, in 0.016s 1 tree, 30 leaves, max depth = 9, train loss: 0.47651, val loss: 0.47175, in 0.016s 1 tree, 40 leaves, max depth = 13, train loss: 0.47316, val loss: 0.46838, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.47023, val loss: 0.46542, in 0.031s 1 tree, 34 leaves, max depth = 11, train loss: 0.46703, val loss: 0.46216, in 0.016s 1 tree, 27 leaves, max depth = 12, train loss: 0.46429, val loss: 0.45934, in 0.016s 1 tree, 36 leaves, max depth = 13, train loss: 0.46157, val loss: 0.45661, in 0.016s 1 tree, 11 leaves, max depth = 7, train loss: 0.45754, val loss: 0.45257, in 0.016s 1 tree, 13 leaves, max depth = 7, train loss: 0.45489, val loss: 0.44988, in 0.016s 1 tree, 41 leaves, max depth = 9, train loss: 0.45204, val loss: 0.44700, in 0.016s 1 tree, 36 leaves, max depth = 14, train loss: 0.44930, val loss: 0.44419, in 0.031s 1 tree, 35 leaves, max depth = 9, train loss: 0.44660, val loss: 0.44143, in 0.016s 1 tree, 20 leaves, max depth = 9, train loss: 0.44422, val loss: 0.43895, in 0.016s 1 tree, 12 leaves, max depth = 6, train loss: 0.44045, val loss: 0.43522, in 0.016s 1 tree, 36 leaves, max depth = 11, train loss: 0.43783, val loss: 0.43258, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.43430, val loss: 0.42904, in 0.016s 1 tree, 42 leaves, max depth = 11, train loss: 0.43178, val loss: 0.42651, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 
0.42955, val loss: 0.42422, in 0.031s 1 tree, 35 leaves, max depth = 10, train loss: 0.42743, val loss: 0.42205, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.42530, val loss: 0.41996, in 0.016s 1 tree, 39 leaves, max depth = 12, train loss: 0.42347, val loss: 0.41808, in 0.031s 1 tree, 36 leaves, max depth = 11, train loss: 0.42104, val loss: 0.41563, in 0.016s 1 tree, 32 leaves, max depth = 10, train loss: 0.41847, val loss: 0.41307, in 0.031s 1 tree, 37 leaves, max depth = 12, train loss: 0.41598, val loss: 0.41061, in 0.031s 1 tree, 35 leaves, max depth = 10, train loss: 0.41433, val loss: 0.40884, in 0.016s 1 tree, 44 leaves, max depth = 11, train loss: 0.41189, val loss: 0.40643, in 0.016s 1 tree, 39 leaves, max depth = 11, train loss: 0.40998, val loss: 0.40454, in 0.031s 1 tree, 33 leaves, max depth = 12, train loss: 0.40837, val loss: 0.40289, in 0.016s 1 tree, 36 leaves, max depth = 10, train loss: 0.40608, val loss: 0.40062, in 0.016s 1 tree, 42 leaves, max depth = 11, train loss: 0.40401, val loss: 0.39851, in 0.016s 1 tree, 43 leaves, max depth = 11, train loss: 0.40178, val loss: 0.39631, in 0.031s 1 tree, 5 leaves, max depth = 3, train loss: 0.39906, val loss: 0.39371, in 0.016s 1 tree, 7 leaves, max depth = 4, train loss: 0.39722, val loss: 0.39184, in 0.016s 1 tree, 23 leaves, max depth = 9, train loss: 0.39560, val loss: 0.39021, in 0.016s 1 tree, 11 leaves, max depth = 7, train loss: 0.39299, val loss: 0.38773, in 0.016s Fit 74 trees in 1.706 s, (2392 total leaves) Time spent computing histograms: 0.574s Time spent finding best splits: 0.070s Time spent applying splits: 0.062s Time spent predicting: 0.000s Trial 16, Fold 2: Log loss = 0.3954167607410999, Average precision = 0.9511083064377481, ROC-AUC = 0.9473437460640419, Elapsed Time = 1.711078099999213 seconds Trial 16, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 16, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 
1.0235478806907379 Binning 0.040 GB of training data: 0.173 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 33 leaves, max depth = 13, train loss: 0.68432, val loss: 0.68415, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.67628, val loss: 0.67603, in 0.016s 1 tree, 33 leaves, max depth = 13, train loss: 0.66836, val loss: 0.66792, in 0.031s 1 tree, 35 leaves, max depth = 11, train loss: 0.66030, val loss: 0.65979, in 0.016s 1 tree, 39 leaves, max depth = 10, train loss: 0.65290, val loss: 0.65227, in 0.016s 1 tree, 39 leaves, max depth = 10, train loss: 0.64492, val loss: 0.64428, in 0.016s 1 tree, 37 leaves, max depth = 14, train loss: 0.63789, val loss: 0.63716, in 0.016s 1 tree, 35 leaves, max depth = 13, train loss: 0.63093, val loss: 0.63018, in 0.016s 1 tree, 34 leaves, max depth = 11, train loss: 0.62433, val loss: 0.62348, in 0.016s 1 tree, 33 leaves, max depth = 11, train loss: 0.61767, val loss: 0.61671, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.61120, val loss: 0.61010, in 0.031s 1 tree, 35 leaves, max depth = 10, train loss: 0.60466, val loss: 0.60351, in 0.016s 1 tree, 33 leaves, max depth = 12, train loss: 0.59862, val loss: 0.59745, in 0.016s 1 tree, 26 leaves, max depth = 9, train loss: 0.59273, val loss: 0.59144, in 0.016s 1 tree, 37 leaves, max depth = 9, train loss: 0.58705, val loss: 0.58572, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.58160, val loss: 0.58016, in 0.016s 1 tree, 33 leaves, max depth = 9, train loss: 0.57574, val loss: 0.57422, in 0.031s 1 tree, 30 leaves, max depth = 11, train loss: 0.57043, val loss: 0.56882, in 0.016s 1 tree, 28 leaves, max depth = 8, train loss: 0.56522, val loss: 0.56357, in 0.016s 1 tree, 34 leaves, max depth = 11, train loss: 0.56029, val loss: 0.55855, in 0.016s 1 tree, 34 leaves, max depth = 14, train loss: 0.55492, val loss: 0.55315, in 0.031s 1 tree, 43 leaves, max depth = 11, train loss: 0.55041, val loss: 0.54858, in 0.016s 1 
tree, 37 leaves, max depth = 11, train loss: 0.54545, val loss: 0.54357, in 0.031s 1 tree, 26 leaves, max depth = 8, train loss: 0.54085, val loss: 0.53897, in 0.016s 1 tree, 34 leaves, max depth = 11, train loss: 0.53646, val loss: 0.53456, in 0.016s 1 tree, 33 leaves, max depth = 10, train loss: 0.53253, val loss: 0.53054, in 0.016s 1 tree, 33 leaves, max depth = 10, train loss: 0.52793, val loss: 0.52592, in 0.031s 1 tree, 34 leaves, max depth = 11, train loss: 0.52374, val loss: 0.52160, in 0.016s 1 tree, 34 leaves, max depth = 12, train loss: 0.51941, val loss: 0.51727, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.51518, val loss: 0.51302, in 0.016s 1 tree, 32 leaves, max depth = 11, train loss: 0.51136, val loss: 0.50924, in 0.016s 1 tree, 27 leaves, max depth = 9, train loss: 0.50748, val loss: 0.50530, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.50364, val loss: 0.50148, in 0.031s 1 tree, 34 leaves, max depth = 11, train loss: 0.50002, val loss: 0.49790, in 0.016s 1 tree, 36 leaves, max depth = 13, train loss: 0.49611, val loss: 0.49400, in 0.016s 1 tree, 48 leaves, max depth = 10, train loss: 0.49242, val loss: 0.49027, in 0.016s 1 tree, 32 leaves, max depth = 13, train loss: 0.48908, val loss: 0.48692, in 0.031s 1 tree, 19 leaves, max depth = 9, train loss: 0.48455, val loss: 0.48265, in 0.016s 1 tree, 39 leaves, max depth = 11, train loss: 0.48125, val loss: 0.47931, in 0.016s 1 tree, 7 leaves, max depth = 4, train loss: 0.47685, val loss: 0.47515, in 0.016s 1 tree, 6 leaves, max depth = 3, train loss: 0.47404, val loss: 0.47235, in 0.016s 1 tree, 46 leaves, max depth = 11, train loss: 0.47072, val loss: 0.46902, in 0.016s 1 tree, 34 leaves, max depth = 10, train loss: 0.46783, val loss: 0.46610, in 0.016s 1 tree, 37 leaves, max depth = 10, train loss: 0.46473, val loss: 0.46299, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.46201, val loss: 0.46028, in 0.016s 1 tree, 39 leaves, max depth = 14, train loss: 0.45926, 
val loss: 0.45753, in 0.031s 1 tree, 10 leaves, max depth = 5, train loss: 0.45538, val loss: 0.45391, in 0.016s 1 tree, 13 leaves, max depth = 6, train loss: 0.45288, val loss: 0.45130, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.45026, val loss: 0.44870, in 0.016s 1 tree, 34 leaves, max depth = 13, train loss: 0.44774, val loss: 0.44621, in 0.016s 1 tree, 8 leaves, max depth = 4, train loss: 0.44533, val loss: 0.44383, in 0.016s 1 tree, 18 leaves, max depth = 7, train loss: 0.44278, val loss: 0.44125, in 0.031s 1 tree, 4 leaves, max depth = 2, train loss: 0.44061, val loss: 0.43910, in 0.016s 1 tree, 40 leaves, max depth = 11, train loss: 0.43827, val loss: 0.43681, in 0.016s 1 tree, 31 leaves, max depth = 12, train loss: 0.43604, val loss: 0.43457, in 0.016s 1 tree, 33 leaves, max depth = 12, train loss: 0.43353, val loss: 0.43207, in 0.031s 1 tree, 35 leaves, max depth = 12, train loss: 0.43116, val loss: 0.42978, in 0.016s 1 tree, 34 leaves, max depth = 10, train loss: 0.42879, val loss: 0.42740, in 0.016s 1 tree, 46 leaves, max depth = 12, train loss: 0.42612, val loss: 0.42469, in 0.031s 1 tree, 33 leaves, max depth = 12, train loss: 0.42426, val loss: 0.42284, in 0.016s 1 tree, 36 leaves, max depth = 10, train loss: 0.42167, val loss: 0.42020, in 0.016s 1 tree, 36 leaves, max depth = 11, train loss: 0.41935, val loss: 0.41791, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.41717, val loss: 0.41570, in 0.031s 1 tree, 40 leaves, max depth = 12, train loss: 0.41470, val loss: 0.41315, in 0.016s 1 tree, 37 leaves, max depth = 10, train loss: 0.41263, val loss: 0.41112, in 0.016s 1 tree, 36 leaves, max depth = 11, train loss: 0.41081, val loss: 0.40927, in 0.031s 1 tree, 35 leaves, max depth = 14, train loss: 0.40876, val loss: 0.40725, in 0.016s 1 tree, 37 leaves, max depth = 10, train loss: 0.40682, val loss: 0.40532, in 0.016s 1 tree, 40 leaves, max depth = 11, train loss: 0.40474, val loss: 0.40324, in 0.016s 1 tree, 38 leaves, max 
depth = 11, train loss: 0.40253, val loss: 0.40096, in 0.031s 1 tree, 35 leaves, max depth = 11, train loss: 0.40068, val loss: 0.39916, in 0.016s 1 tree, 13 leaves, max depth = 6, train loss: 0.39868, val loss: 0.39730, in 0.016s 1 tree, 11 leaves, max depth = 7, train loss: 0.39575, val loss: 0.39462, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.39411, val loss: 0.39301, in 0.016s Fit 74 trees in 1.752 s, (2308 total leaves) Time spent computing histograms: 0.604s Time spent finding best splits: 0.070s Time spent applying splits: 0.061s Time spent predicting: 0.000s Trial 16, Fold 3: Log loss = 0.39023381825865555, Average precision = 0.9561549356669634, ROC-AUC = 0.9512085745105027, Elapsed Time = 1.7605428999995638 seconds Trial 16, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 16, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 Binning 0.040 GB of training data: 0.157 s 0.016 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 32 leaves, max depth = 11, train loss: 0.68448, val loss: 0.68410, in 0.016s 1 tree, 34 leaves, max depth = 11, train loss: 0.67644, val loss: 0.67568, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.66844, val loss: 0.66726, in 0.016s 1 tree, 36 leaves, max depth = 12, train loss: 0.66060, val loss: 0.65906, in 0.016s 1 tree, 38 leaves, max depth = 17, train loss: 0.65309, val loss: 0.65120, in 0.016s 1 tree, 34 leaves, max depth = 11, train loss: 0.64586, val loss: 0.64373, in 0.031s 1 tree, 34 leaves, max depth = 10, train loss: 0.63842, val loss: 0.63592, in 0.016s 1 tree, 34 leaves, max depth = 12, train loss: 0.63161, val loss: 0.62882, in 0.016s 1 tree, 39 leaves, max depth = 10, train loss: 0.62463, val loss: 0.62156, in 0.016s 1 tree, 32 leaves, max depth = 12, train loss: 0.61781, val loss: 0.61437, in 0.016s 1 tree, 32 leaves, max depth = 12, train loss: 0.61140, val loss: 0.60764, in 0.016s 1 tree, 33 leaves, 
max depth = 12, train loss: 0.60514, val loss: 0.60107, in 0.031s 1 tree, 32 leaves, max depth = 15, train loss: 0.59923, val loss: 0.59500, in 0.016s 1 tree, 32 leaves, max depth = 11, train loss: 0.59306, val loss: 0.58853, in 0.016s 1 tree, 39 leaves, max depth = 11, train loss: 0.58740, val loss: 0.58263, in 0.016s 1 tree, 33 leaves, max depth = 10, train loss: 0.58194, val loss: 0.57695, in 0.016s 1 tree, 35 leaves, max depth = 16, train loss: 0.57594, val loss: 0.57071, in 0.031s 1 tree, 34 leaves, max depth = 11, train loss: 0.57038, val loss: 0.56485, in 0.016s 1 tree, 36 leaves, max depth = 12, train loss: 0.56533, val loss: 0.55956, in 0.016s 1 tree, 36 leaves, max depth = 12, train loss: 0.55999, val loss: 0.55390, in 0.016s 1 tree, 29 leaves, max depth = 9, train loss: 0.55519, val loss: 0.54875, in 0.016s 1 tree, 37 leaves, max depth = 11, train loss: 0.55030, val loss: 0.54363, in 0.016s 1 tree, 36 leaves, max depth = 13, train loss: 0.54530, val loss: 0.53835, in 0.031s 1 tree, 35 leaves, max depth = 13, train loss: 0.54065, val loss: 0.53345, in 0.016s 1 tree, 28 leaves, max depth = 11, train loss: 0.53616, val loss: 0.52865, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.53199, val loss: 0.52431, in 0.016s 1 tree, 30 leaves, max depth = 11, train loss: 0.52779, val loss: 0.51979, in 0.016s 1 tree, 37 leaves, max depth = 11, train loss: 0.52372, val loss: 0.51546, in 0.016s 1 tree, 37 leaves, max depth = 13, train loss: 0.51958, val loss: 0.51108, in 0.031s 1 tree, 31 leaves, max depth = 9, train loss: 0.51595, val loss: 0.50717, in 0.016s 1 tree, 33 leaves, max depth = 11, train loss: 0.51220, val loss: 0.50325, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.50836, val loss: 0.49918, in 0.016s 1 tree, 34 leaves, max depth = 12, train loss: 0.50437, val loss: 0.49499, in 0.016s 1 tree, 34 leaves, max depth = 9, train loss: 0.50082, val loss: 0.49131, in 0.016s 1 tree, 36 leaves, max depth = 13, train loss: 0.49743, val loss: 
0.48775, in 0.016s 1 tree, 32 leaves, max depth = 13, train loss: 0.49433, val loss: 0.48450, in 0.016s 1 tree, 39 leaves, max depth = 11, train loss: 0.49088, val loss: 0.48085, in 0.031s 1 tree, 35 leaves, max depth = 13, train loss: 0.48760, val loss: 0.47738, in 0.016s 1 tree, 8 leaves, max depth = 4, train loss: 0.48309, val loss: 0.47272, in 0.016s 1 tree, 34 leaves, max depth = 11, train loss: 0.48005, val loss: 0.46954, in 0.031s 1 tree, 5 leaves, max depth = 3, train loss: 0.47590, val loss: 0.46526, in 0.016s 1 tree, 35 leaves, max depth = 14, train loss: 0.47310, val loss: 0.46239, in 0.016s 1 tree, 36 leaves, max depth = 13, train loss: 0.46987, val loss: 0.45896, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.46659, val loss: 0.45552, in 0.031s 1 tree, 34 leaves, max depth = 11, train loss: 0.46387, val loss: 0.45269, in 0.016s 1 tree, 33 leaves, max depth = 12, train loss: 0.46120, val loss: 0.44990, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.45833, val loss: 0.44684, in 0.016s 1 tree, 31 leaves, max depth = 12, train loss: 0.45548, val loss: 0.44376, in 0.031s 1 tree, 35 leaves, max depth = 11, train loss: 0.45294, val loss: 0.44102, in 0.016s 1 tree, 37 leaves, max depth = 14, train loss: 0.45015, val loss: 0.43806, in 0.016s 1 tree, 37 leaves, max depth = 15, train loss: 0.44769, val loss: 0.43542, in 0.016s 1 tree, 37 leaves, max depth = 12, train loss: 0.44502, val loss: 0.43257, in 0.031s 1 tree, 30 leaves, max depth = 14, train loss: 0.44293, val loss: 0.43040, in 0.016s 1 tree, 36 leaves, max depth = 11, train loss: 0.44039, val loss: 0.42769, in 0.016s 1 tree, 11 leaves, max depth = 8, train loss: 0.43693, val loss: 0.42415, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.43439, val loss: 0.42143, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.43216, val loss: 0.41904, in 0.031s 1 tree, 33 leaves, max depth = 11, train loss: 0.42983, val loss: 0.41656, in 0.016s 1 tree, 31 leaves, max depth = 
12, train loss: 0.42790, val loss: 0.41456, in 0.016s 1 tree, 48 leaves, max depth = 11, train loss: 0.42588, val loss: 0.41243, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.42370, val loss: 0.41011, in 0.016s 1 tree, 37 leaves, max depth = 8, train loss: 0.42174, val loss: 0.40800, in 0.016s 1 tree, 36 leaves, max depth = 9, train loss: 0.41976, val loss: 0.40596, in 0.016s 1 tree, 39 leaves, max depth = 12, train loss: 0.41752, val loss: 0.40364, in 0.016s 1 tree, 36 leaves, max depth = 12, train loss: 0.41569, val loss: 0.40166, in 0.016s 1 tree, 4 leaves, max depth = 2, train loss: 0.41395, val loss: 0.39977, in 0.031s 1 tree, 31 leaves, max depth = 13, train loss: 0.41219, val loss: 0.39786, in 0.016s 1 tree, 37 leaves, max depth = 12, train loss: 0.41006, val loss: 0.39568, in 0.016s 1 tree, 42 leaves, max depth = 12, train loss: 0.40784, val loss: 0.39347, in 0.031s 1 tree, 40 leaves, max depth = 13, train loss: 0.40561, val loss: 0.39121, in 0.016s 1 tree, 11 leaves, max depth = 6, train loss: 0.40360, val loss: 0.38915, in 0.016s 1 tree, 38 leaves, max depth = 9, train loss: 0.40171, val loss: 0.38717, in 0.031s 1 tree, 35 leaves, max depth = 12, train loss: 0.40018, val loss: 0.38555, in 0.016s 1 tree, 34 leaves, max depth = 11, train loss: 0.39847, val loss: 0.38367, in 0.016s Fit 74 trees in 1.751 s, (2447 total leaves) Time spent computing histograms: 0.598s Time spent finding best splits: 0.072s Time spent applying splits: 0.064s Time spent predicting: 0.000s Trial 16, Fold 4: Log loss = 0.3980620539886425, Average precision = 0.951851052319919, ROC-AUC = 0.9459866359793129, Elapsed Time = 1.7685146000003442 seconds Trial 16, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 16, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.158 s 0.016 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 34 leaves, max depth = 
11, train loss: 0.68426, val loss: 0.68370, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.67617, val loss: 0.67516, in 0.016s 1 tree, 33 leaves, max depth = 11, train loss: 0.66823, val loss: 0.66676, in 0.016s 1 tree, 30 leaves, max depth = 10, train loss: 0.66016, val loss: 0.65826, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.65272, val loss: 0.65057, in 0.016s 1 tree, 36 leaves, max depth = 12, train loss: 0.64472, val loss: 0.64225, in 0.016s 1 tree, 32 leaves, max depth = 11, train loss: 0.63796, val loss: 0.63515, in 0.031s 1 tree, 38 leaves, max depth = 12, train loss: 0.63079, val loss: 0.62764, in 0.016s 1 tree, 40 leaves, max depth = 11, train loss: 0.62391, val loss: 0.62044, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.61718, val loss: 0.61326, in 0.016s 1 tree, 40 leaves, max depth = 12, train loss: 0.61083, val loss: 0.60659, in 0.016s 1 tree, 34 leaves, max depth = 13, train loss: 0.60419, val loss: 0.59965, in 0.016s 1 tree, 35 leaves, max depth = 14, train loss: 0.59815, val loss: 0.59330, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.59213, val loss: 0.58705, in 0.016s 1 tree, 33 leaves, max depth = 12, train loss: 0.58642, val loss: 0.58118, in 0.016s 1 tree, 39 leaves, max depth = 10, train loss: 0.58092, val loss: 0.57545, in 0.031s 1 tree, 41 leaves, max depth = 11, train loss: 0.57514, val loss: 0.56935, in 0.016s 1 tree, 15 leaves, max depth = 6, train loss: 0.56979, val loss: 0.56366, in 0.016s 1 tree, 21 leaves, max depth = 9, train loss: 0.56468, val loss: 0.55828, in 0.016s 1 tree, 36 leaves, max depth = 11, train loss: 0.55916, val loss: 0.55257, in 0.016s 1 tree, 36 leaves, max depth = 12, train loss: 0.55392, val loss: 0.54711, in 0.016s 1 tree, 38 leaves, max depth = 11, train loss: 0.54862, val loss: 0.54158, in 0.031s 1 tree, 28 leaves, max depth = 9, train loss: 0.54397, val loss: 0.53668, in 0.031s 1 tree, 27 leaves, max depth = 7, train loss: 0.53959, val loss: 0.53213, in 
0.031s 1 tree, 34 leaves, max depth = 13, train loss: 0.53524, val loss: 0.52760, in 0.031s 1 tree, 35 leaves, max depth = 12, train loss: 0.53128, val loss: 0.52340, in 0.032s 1 tree, 34 leaves, max depth = 12, train loss: 0.52661, val loss: 0.51851, in 0.031s 1 tree, 35 leaves, max depth = 11, train loss: 0.52240, val loss: 0.51400, in 0.016s 1 tree, 34 leaves, max depth = 12, train loss: 0.51810, val loss: 0.50953, in 0.031s 1 tree, 34 leaves, max depth = 11, train loss: 0.51382, val loss: 0.50500, in 0.016s 1 tree, 38 leaves, max depth = 12, train loss: 0.50992, val loss: 0.50086, in 0.016s 1 tree, 32 leaves, max depth = 9, train loss: 0.50612, val loss: 0.49682, in 0.016s 1 tree, 38 leaves, max depth = 12, train loss: 0.50201, val loss: 0.49254, in 0.016s 1 tree, 22 leaves, max depth = 7, train loss: 0.49802, val loss: 0.48838, in 0.031s 1 tree, 41 leaves, max depth = 13, train loss: 0.49408, val loss: 0.48431, in 0.016s 1 tree, 39 leaves, max depth = 14, train loss: 0.49034, val loss: 0.48044, in 0.031s 1 tree, 38 leaves, max depth = 11, train loss: 0.48725, val loss: 0.47718, in 0.016s 1 tree, 33 leaves, max depth = 10, train loss: 0.48394, val loss: 0.47367, in 0.016s 1 tree, 38 leaves, max depth = 12, train loss: 0.48064, val loss: 0.47009, in 0.016s 1 tree, 7 leaves, max depth = 4, train loss: 0.47614, val loss: 0.46555, in 0.031s 1 tree, 42 leaves, max depth = 11, train loss: 0.47315, val loss: 0.46246, in 0.016s 1 tree, 45 leaves, max depth = 12, train loss: 0.46988, val loss: 0.45909, in 0.016s 1 tree, 36 leaves, max depth = 13, train loss: 0.46698, val loss: 0.45599, in 0.031s 1 tree, 35 leaves, max depth = 12, train loss: 0.46389, val loss: 0.45271, in 0.016s 1 tree, 8 leaves, max depth = 5, train loss: 0.46123, val loss: 0.44994, in 0.016s 1 tree, 40 leaves, max depth = 12, train loss: 0.45850, val loss: 0.44717, in 0.031s 1 tree, 9 leaves, max depth = 5, train loss: 0.45461, val loss: 0.44322, in 0.016s 1 tree, 16 leaves, max depth = 7, train loss: 
0.45204, val loss: 0.44062, in 0.016s 1 tree, 42 leaves, max depth = 10, train loss: 0.44935, val loss: 0.43771, in 0.016s 1 tree, 36 leaves, max depth = 11, train loss: 0.44656, val loss: 0.43475, in 0.031s 1 tree, 35 leaves, max depth = 11, train loss: 0.44416, val loss: 0.43224, in 0.016s 1 tree, 38 leaves, max depth = 11, train loss: 0.44160, val loss: 0.42949, in 0.016s 1 tree, 9 leaves, max depth = 4, train loss: 0.43792, val loss: 0.42574, in 0.016s 1 tree, 36 leaves, max depth = 12, train loss: 0.43554, val loss: 0.42323, in 0.031s 1 tree, 39 leaves, max depth = 11, train loss: 0.43333, val loss: 0.42091, in 0.016s 1 tree, 34 leaves, max depth = 12, train loss: 0.43080, val loss: 0.41827, in 0.031s 1 tree, 32 leaves, max depth = 11, train loss: 0.42855, val loss: 0.41591, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.42641, val loss: 0.41369, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.42427, val loss: 0.41141, in 0.016s 1 tree, 27 leaves, max depth = 8, train loss: 0.42244, val loss: 0.40947, in 0.016s 1 tree, 36 leaves, max depth = 11, train loss: 0.41985, val loss: 0.40685, in 0.031s 1 tree, 33 leaves, max depth = 11, train loss: 0.41770, val loss: 0.40461, in 0.016s 1 tree, 39 leaves, max depth = 10, train loss: 0.41524, val loss: 0.40211, in 0.016s 1 tree, 33 leaves, max depth = 9, train loss: 0.41362, val loss: 0.40034, in 0.016s 1 tree, 36 leaves, max depth = 12, train loss: 0.41120, val loss: 0.39789, in 0.016s 1 tree, 34 leaves, max depth = 12, train loss: 0.40942, val loss: 0.39594, in 0.031s 1 tree, 37 leaves, max depth = 12, train loss: 0.40741, val loss: 0.39380, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.40550, val loss: 0.39180, in 0.016s 1 tree, 44 leaves, max depth = 14, train loss: 0.40345, val loss: 0.38970, in 0.016s 1 tree, 42 leaves, max depth = 11, train loss: 0.40175, val loss: 0.38799, in 0.031s 1 tree, 31 leaves, max depth = 9, train loss: 0.39960, val loss: 0.38582, in 0.016s 1 tree, 5 
leaves, max depth = 3, train loss: 0.39673, val loss: 0.38290, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.39411, val loss: 0.38014, in 0.016s 1 tree, 9 leaves, max depth = 4, train loss: 0.39244, val loss: 0.37839, in 0.016s Fit 74 trees in 1.846 s, (2384 total leaves) Time spent computing histograms: 0.627s Time spent finding best splits: 0.081s Time spent applying splits: 0.078s Time spent predicting: 0.000s Trial 16, Fold 5: Log loss = 0.3984597419290703, Average precision = 0.9524122235739496, ROC-AUC = 0.9470420452223027, Elapsed Time = 1.8619557000001805 seconds
Optimization Progress: 17%|#7 | 17/100 [03:25<17:42, 12.80s/it]
[per-iteration boosting log omitted — 21 rounds per fold; per-fold summaries retained below]
Trial 17, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0398; Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0035
Fit 21 trees in 0.596 s (468 total leaves)
Trial 17, Fold 1: Log loss = 0.39757, Average precision = 0.95417, ROC-AUC = 0.94759, Elapsed Time = 0.59 s
Trial 17, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0236; Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0689
Fit 21 trees in 0.610 s (526 total leaves)
Trial 17, Fold 2: Log loss = 0.40518, Average precision = 0.95273, ROC-AUC = 0.94955, Elapsed Time = 0.61 s
Trial 17, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.0346; Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235
Fit 21 trees in 0.673 s (481 total leaves)
Trial 17, Fold 3: Log loss = 0.39429, Average precision = 0.95671, ROC-AUC = 0.95226, Elapsed Time = 0.68 s
Trial 17, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0297; Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0434
Fitting gradient boosted rounds: 1 tree, 36 leaves, max depth = 9, train loss: 0.56350, val 
loss: 0.55780, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.54596, val loss: 0.53959, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.52977, val loss: 0.52266, in 0.016s 1 tree, 23 leaves, max depth = 7, train loss: 0.51579, val loss: 0.50794, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.50278, val loss: 0.49436, in 0.016s 1 tree, 46 leaves, max depth = 11, train loss: 0.49103, val loss: 0.48235, in 0.016s 1 tree, 42 leaves, max depth = 10, train loss: 0.48049, val loss: 0.47128, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.46961, val loss: 0.45968, in 0.016s 1 tree, 9 leaves, max depth = 5, train loss: 0.45616, val loss: 0.44584, in 0.016s 1 tree, 29 leaves, max depth = 9, train loss: 0.44701, val loss: 0.43613, in 0.016s 1 tree, 42 leaves, max depth = 9, train loss: 0.43925, val loss: 0.42805, in 0.016s 1 tree, 44 leaves, max depth = 9, train loss: 0.43107, val loss: 0.41942, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.42377, val loss: 0.41158, in 0.016s 1 tree, 15 leaves, max depth = 7, train loss: 0.41728, val loss: 0.40452, in 0.016s 1 tree, 11 leaves, max depth = 7, train loss: 0.41171, val loss: 0.39847, in 0.016s 1 tree, 26 leaves, max depth = 8, train loss: 0.40425, val loss: 0.39089, in 0.000s Fit 21 trees in 0.674 s, (588 total leaves) Time spent computing histograms: 0.143s Time spent finding best splits: 0.016s Time spent applying splits: 0.012s Time spent predicting: 0.000s Trial 17, Fold 4: Log loss = 0.4040348495124786, Average precision = 0.954412483575146, ROC-AUC = 0.9494533442539301, Elapsed Time = 0.6907933000002231 seconds Trial 17, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 17, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.158 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 28 leaves, max depth = 11, train loss: 0.66527, val loss: 
0.66365, in 0.016s 1 tree, 24 leaves, max depth = 8, train loss: 0.63975, val loss: 0.63669, in 0.016s 1 tree, 15 leaves, max depth = 7, train loss: 0.61722, val loss: 0.61283, in 0.016s 1 tree, 20 leaves, max depth = 8, train loss: 0.59465, val loss: 0.58947, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.57630, val loss: 0.57058, in 0.016s 1 tree, 21 leaves, max depth = 9, train loss: 0.55935, val loss: 0.55280, in 0.016s 1 tree, 30 leaves, max depth = 10, train loss: 0.54307, val loss: 0.53549, in 0.016s 1 tree, 46 leaves, max depth = 9, train loss: 0.52860, val loss: 0.52063, in 0.016s 1 tree, 19 leaves, max depth = 7, train loss: 0.51506, val loss: 0.50630, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.50106, val loss: 0.49176, in 0.016s 1 tree, 22 leaves, max depth = 7, train loss: 0.48909, val loss: 0.47910, in 0.016s 1 tree, 13 leaves, max depth = 8, train loss: 0.47812, val loss: 0.46755, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.46845, val loss: 0.45713, in 0.016s 1 tree, 32 leaves, max depth = 10, train loss: 0.45849, val loss: 0.44673, in 0.016s 1 tree, 42 leaves, max depth = 12, train loss: 0.44993, val loss: 0.43794, in 0.016s 1 tree, 17 leaves, max depth = 8, train loss: 0.44236, val loss: 0.42992, in 0.000s 1 tree, 42 leaves, max depth = 11, train loss: 0.43443, val loss: 0.42171, in 0.016s 1 tree, 30 leaves, max depth = 10, train loss: 0.42614, val loss: 0.41325, in 0.016s 1 tree, 20 leaves, max depth = 8, train loss: 0.41917, val loss: 0.40599, in 0.031s 1 tree, 21 leaves, max depth = 8, train loss: 0.41301, val loss: 0.39956, in 0.016s 1 tree, 21 leaves, max depth = 9, train loss: 0.40720, val loss: 0.39350, in 0.016s Fit 21 trees in 0.690 s, (556 total leaves) Time spent computing histograms: 0.138s Time spent finding best splits: 0.015s Time spent applying splits: 0.012s Time spent predicting: 0.000s Trial 17, Fold 5: Log loss = 0.41314811982724015, Average precision = 0.9197794119472843, ROC-AUC = 
0.9302998346431822, Elapsed Time = 0.6928602000007231 seconds
Optimization Progress: 18%|#8 | 18/100 [03:35<16:17, 11.92s/it]
(Per-iteration fitting and binning logs condensed to fold-level summaries.)
Trial 18, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371; Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913. Fit 5 trees in 0.314 s (261 total leaves). Log loss = 0.5758696069682718, Average precision = 0.8140119097748746, ROC-AUC = 0.8614314929524958, Elapsed Time = 0.3241126000011718 seconds
Trial 18, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396; Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986. Fit 5 trees in 0.313 s (262 total leaves). Log loss = 0.5768084023705091, Average precision = 0.8172011369396084, ROC-AUC = 0.8660990439586019, Elapsed Time = 0.3173349999997299 seconds
Trial 18, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876; Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379. Fit 5 trees in 0.314 s (259 total leaves). Log loss = 0.5743287091850737, Average precision = 0.8266571807986699, ROC-AUC = 0.8716865477314095, Elapsed Time = 0.31165280000095663 seconds
Trial 18, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592; Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665. Fit 5 trees in 0.315 s (264 total leaves). Log loss = 0.5755665976093852, Average precision = 0.8188990668810218, ROC-AUC = 0.8676090003123578, Elapsed Time = 0.3268798999997671 seconds
Trial 18, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897; Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054. Fit 5 trees in 0.344 s (252 total leaves). Log loss = 0.5793771384317353, Average precision = 0.8077174872619299, ROC-AUC = 0.858010664585772, Elapsed Time = 0.34722619999956805 seconds
Optimization Progress: 19%|#9 | 19/100 [03:43<14:33, 10.79s/it]
(Per-iteration fitting and binning logs condensed to fold-level summaries.)
Trial 19, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371; Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913. Fit 27 trees in 0.736 s (1351 total leaves). Log loss = 0.4759809969926131, Average precision = 0.9192054238246357, ROC-AUC = 0.9301531738862362, Elapsed Time = 0.743226999999024 seconds
Trial 19, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396; Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986. Fit 27 trees in 0.750 s (1174 total leaves). Log loss = 0.47331780547530794, Average precision = 0.915780116516474, ROC-AUC = 0.9320602402288841, Elapsed Time = 0.765905100000964 seconds
Trial 19, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876; Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379. Fit 27 trees in 0.799 s (1280 total leaves). Log loss = 0.47063638550940856, Average precision = 0.9229946765596092, ROC-AUC = 0.9352773376929819, Elapsed Time = 0.803773600000568 seconds
Trial 19, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592; Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665. Fit 27 trees in 0.814 s (1296 total leaves). Log loss = 0.47506838673715585, Average precision = 0.9201397556504622, ROC-AUC = 0.9309760313168379, Elapsed Time = 0.8261987999994744 seconds
Trial 19, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897; Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054. Fit 27 trees in 0.799 s (1325 total leaves). Log loss = 0.47660104352388494, Average precision = 0.9162011773787254, ROC-AUC = 0.9303857458707244, Elapsed Time = 0.8032679999996617 seconds
Optimization Progress: 20%|## | 20/100 [03:54<14:10, 10.63s/it]
(Per-iteration fitting and binning logs condensed.)
Trial 20, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371; Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913. Fitting gradient boosted rounds: ... 1 tree, 159 leaves, max depth = 16, train loss: 0.40222, val
loss: 0.40019, in 0.016s 1 tree, 105 leaves, max depth = 18, train loss: 0.40112, val loss: 0.39898, in 0.016s Fit 91 trees in 1.815 s, (10974 total leaves) Time spent computing histograms: 0.564s Time spent finding best splits: 0.175s Time spent applying splits: 0.207s Time spent predicting: 0.000s Trial 20, Fold 1: Log loss = 0.4053307122495676, Average precision = 0.9457528949247934, ROC-AUC = 0.9421825785167024, Elapsed Time = 1.8230208000004495 seconds Trial 20, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 20, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986 Binning 0.040 GB of training data: 0.158 s 0.016 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 105 leaves, max depth = 18, train loss: 0.68479, val loss: 0.68447, in 0.016s 1 tree, 106 leaves, max depth = 15, train loss: 0.67669, val loss: 0.67601, in 0.016s 1 tree, 106 leaves, max depth = 15, train loss: 0.66888, val loss: 0.66784, in 0.016s 1 tree, 106 leaves, max depth = 15, train loss: 0.66134, val loss: 0.65995, in 0.016s 1 tree, 105 leaves, max depth = 17, train loss: 0.65408, val loss: 0.65235, in 0.000s 1 tree, 105 leaves, max depth = 17, train loss: 0.64707, val loss: 0.64501, in 0.016s 1 tree, 106 leaves, max depth = 21, train loss: 0.64030, val loss: 0.63796, in 0.016s 1 tree, 104 leaves, max depth = 21, train loss: 0.63376, val loss: 0.63114, in 0.016s 1 tree, 8 leaves, max depth = 6, train loss: 0.62743, val loss: 0.62449, in 0.016s 1 tree, 106 leaves, max depth = 14, train loss: 0.62123, val loss: 0.61798, in 0.016s 1 tree, 104 leaves, max depth = 16, train loss: 0.61531, val loss: 0.61177, in 0.016s 1 tree, 106 leaves, max depth = 15, train loss: 0.60956, val loss: 0.60573, in 0.016s 1 tree, 105 leaves, max depth = 16, train loss: 0.60402, val loss: 0.59991, in 0.016s 1 tree, 106 leaves, max depth = 14, train loss: 0.59858, val loss: 0.59419, in 0.016s 1 tree, 105 leaves, max depth = 19, train 
loss: 0.59340, val loss: 0.58877, in 0.000s 1 tree, 106 leaves, max depth = 17, train loss: 0.58837, val loss: 0.58353, in 0.016s 1 tree, 106 leaves, max depth = 14, train loss: 0.58342, val loss: 0.57832, in 0.016s 1 tree, 106 leaves, max depth = 15, train loss: 0.57869, val loss: 0.57334, in 0.016s 1 tree, 104 leaves, max depth = 23, train loss: 0.57413, val loss: 0.56857, in 0.016s 1 tree, 106 leaves, max depth = 14, train loss: 0.56963, val loss: 0.56383, in 0.016s 1 tree, 106 leaves, max depth = 14, train loss: 0.56528, val loss: 0.55923, in 0.016s 1 tree, 106 leaves, max depth = 14, train loss: 0.56106, val loss: 0.55478, in 0.016s 1 tree, 106 leaves, max depth = 14, train loss: 0.55697, val loss: 0.55046, in 0.016s 1 tree, 106 leaves, max depth = 14, train loss: 0.55300, val loss: 0.54628, in 0.016s 1 tree, 106 leaves, max depth = 15, train loss: 0.54921, val loss: 0.54227, in 0.016s 1 tree, 159 leaves, max depth = 17, train loss: 0.54499, val loss: 0.53820, in 0.016s 1 tree, 159 leaves, max depth = 17, train loss: 0.54089, val loss: 0.53427, in 0.031s 1 tree, 105 leaves, max depth = 17, train loss: 0.53734, val loss: 0.53051, in 0.016s 1 tree, 159 leaves, max depth = 17, train loss: 0.53342, val loss: 0.52674, in 0.016s 1 tree, 159 leaves, max depth = 17, train loss: 0.52961, val loss: 0.52309, in 0.016s 1 tree, 106 leaves, max depth = 18, train loss: 0.52626, val loss: 0.51957, in 0.016s 1 tree, 104 leaves, max depth = 17, train loss: 0.52301, val loss: 0.51613, in 0.016s 1 tree, 106 leaves, max depth = 14, train loss: 0.51980, val loss: 0.51272, in 0.031s 1 tree, 159 leaves, max depth = 16, train loss: 0.51622, val loss: 0.50930, in 0.016s 1 tree, 159 leaves, max depth = 16, train loss: 0.51275, val loss: 0.50598, in 0.016s 1 tree, 106 leaves, max depth = 15, train loss: 0.50977, val loss: 0.50281, in 0.016s 1 tree, 106 leaves, max depth = 14, train loss: 0.50683, val loss: 0.49968, in 0.016s 1 tree, 159 leaves, max depth = 16, train loss: 0.50352, val 
loss: 0.49652, in 0.031s 1 tree, 106 leaves, max depth = 15, train loss: 0.50075, val loss: 0.49358, in 0.016s 1 tree, 105 leaves, max depth = 16, train loss: 0.49807, val loss: 0.49073, in 0.000s 1 tree, 106 leaves, max depth = 15, train loss: 0.49547, val loss: 0.48795, in 0.016s 1 tree, 159 leaves, max depth = 17, train loss: 0.49234, val loss: 0.48496, in 0.032s 1 tree, 106 leaves, max depth = 15, train loss: 0.48984, val loss: 0.48230, in 0.016s 1 tree, 159 leaves, max depth = 17, train loss: 0.48683, val loss: 0.47943, in 0.016s 1 tree, 159 leaves, max depth = 17, train loss: 0.48391, val loss: 0.47665, in 0.016s 1 tree, 159 leaves, max depth = 17, train loss: 0.48107, val loss: 0.47394, in 0.016s 1 tree, 159 leaves, max depth = 17, train loss: 0.47832, val loss: 0.47132, in 0.016s 1 tree, 106 leaves, max depth = 15, train loss: 0.47597, val loss: 0.46882, in 0.016s 1 tree, 159 leaves, max depth = 16, train loss: 0.47331, val loss: 0.46629, in 0.016s 1 tree, 159 leaves, max depth = 16, train loss: 0.47074, val loss: 0.46384, in 0.016s 1 tree, 105 leaves, max depth = 14, train loss: 0.46855, val loss: 0.46151, in 0.016s 1 tree, 159 leaves, max depth = 16, train loss: 0.46606, val loss: 0.45915, in 0.016s 1 tree, 159 leaves, max depth = 16, train loss: 0.46365, val loss: 0.45686, in 0.031s 1 tree, 106 leaves, max depth = 14, train loss: 0.46153, val loss: 0.45459, in 0.016s 1 tree, 159 leaves, max depth = 16, train loss: 0.45920, val loss: 0.45238, in 0.016s 1 tree, 159 leaves, max depth = 16, train loss: 0.45694, val loss: 0.45024, in 0.031s 1 tree, 159 leaves, max depth = 16, train loss: 0.45474, val loss: 0.44816, in 0.016s 1 tree, 105 leaves, max depth = 14, train loss: 0.45278, val loss: 0.44607, in 0.016s 1 tree, 159 leaves, max depth = 16, train loss: 0.45066, val loss: 0.44406, in 0.031s 1 tree, 106 leaves, max depth = 13, train loss: 0.44877, val loss: 0.44204, in 0.016s 1 tree, 105 leaves, max depth = 14, train loss: 0.44694, val loss: 0.44008, in 
0.000s 1 tree, 159 leaves, max depth = 16, train loss: 0.44490, val loss: 0.43816, in 0.016s 1 tree, 159 leaves, max depth = 16, train loss: 0.44292, val loss: 0.43629, in 0.016s 1 tree, 105 leaves, max depth = 20, train loss: 0.44117, val loss: 0.43444, in 0.016s 1 tree, 159 leaves, max depth = 15, train loss: 0.43925, val loss: 0.43264, in 0.016s 1 tree, 105 leaves, max depth = 14, train loss: 0.43757, val loss: 0.43084, in 0.016s 1 tree, 105 leaves, max depth = 14, train loss: 0.43594, val loss: 0.42909, in 0.016s 1 tree, 104 leaves, max depth = 21, train loss: 0.43436, val loss: 0.42742, in 0.016s 1 tree, 159 leaves, max depth = 16, train loss: 0.43252, val loss: 0.42569, in 0.016s 1 tree, 159 leaves, max depth = 16, train loss: 0.43073, val loss: 0.42402, in 0.016s 1 tree, 159 leaves, max depth = 16, train loss: 0.42899, val loss: 0.42239, in 0.031s 1 tree, 159 leaves, max depth = 16, train loss: 0.42730, val loss: 0.42081, in 0.016s 1 tree, 105 leaves, max depth = 15, train loss: 0.42581, val loss: 0.41921, in 0.016s 1 tree, 105 leaves, max depth = 19, train loss: 0.42436, val loss: 0.41768, in 0.016s 1 tree, 105 leaves, max depth = 15, train loss: 0.42295, val loss: 0.41617, in 0.016s 1 tree, 105 leaves, max depth = 15, train loss: 0.42159, val loss: 0.41471, in 0.016s 1 tree, 106 leaves, max depth = 17, train loss: 0.42023, val loss: 0.41324, in 0.016s 1 tree, 106 leaves, max depth = 13, train loss: 0.41894, val loss: 0.41185, in 0.016s 1 tree, 106 leaves, max depth = 13, train loss: 0.41770, val loss: 0.41051, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.41648, val loss: 0.40924, in 0.000s 1 tree, 106 leaves, max depth = 18, train loss: 0.41524, val loss: 0.40790, in 0.000s 1 tree, 159 leaves, max depth = 15, train loss: 0.41364, val loss: 0.40642, in 0.031s 1 tree, 106 leaves, max depth = 18, train loss: 0.41245, val loss: 0.40512, in 0.016s 1 tree, 159 leaves, max depth = 15, train loss: 0.41089, val loss: 0.40368, in 0.016s 1 tree, 159 
leaves, max depth = 15, train loss: 0.40939, val loss: 0.40229, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.40822, val loss: 0.40108, in 0.016s 1 tree, 159 leaves, max depth = 15, train loss: 0.40676, val loss: 0.39972, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.40563, val loss: 0.39855, in 0.016s 1 tree, 105 leaves, max depth = 13, train loss: 0.40452, val loss: 0.39736, in 0.016s 1 tree, 159 leaves, max depth = 15, train loss: 0.40310, val loss: 0.39604, in 0.016s 1 tree, 105 leaves, max depth = 18, train loss: 0.40198, val loss: 0.39488, in 0.016s Fit 91 trees in 1.830 s, (10912 total leaves) Time spent computing histograms: 0.555s Time spent finding best splits: 0.167s Time spent applying splits: 0.202s Time spent predicting: 0.000s Trial 20, Fold 2: Log loss = 0.4061719563790699, Average precision = 0.9421895019520601, ROC-AUC = 0.9427703713055864, Elapsed Time = 1.8393180000002758 seconds Trial 20, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 20, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 Binning 0.040 GB of training data: 0.157 s 0.016 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 99 leaves, max depth = 15, train loss: 0.68486, val loss: 0.68458, in 0.016s 1 tree, 105 leaves, max depth = 14, train loss: 0.67682, val loss: 0.67626, in 0.000s 1 tree, 105 leaves, max depth = 14, train loss: 0.66905, val loss: 0.66824, in 0.016s 1 tree, 105 leaves, max depth = 14, train loss: 0.66155, val loss: 0.66049, in 0.016s 1 tree, 105 leaves, max depth = 16, train loss: 0.65433, val loss: 0.65307, in 0.016s 1 tree, 105 leaves, max depth = 16, train loss: 0.64736, val loss: 0.64591, in 0.016s 1 tree, 101 leaves, max depth = 15, train loss: 0.64065, val loss: 0.63895, in 0.016s 1 tree, 103 leaves, max depth = 19, train loss: 0.63417, val loss: 0.63224, in 0.016s 1 tree, 105 leaves, max depth = 15, train loss: 0.62786, val loss: 0.62569, in 
0.016s 1 tree, 105 leaves, max depth = 14, train loss: 0.62168, val loss: 0.61932, in 0.016s 1 tree, 102 leaves, max depth = 20, train loss: 0.61582, val loss: 0.61326, in 0.016s 1 tree, 104 leaves, max depth = 14, train loss: 0.61011, val loss: 0.60733, in 0.016s 1 tree, 105 leaves, max depth = 16, train loss: 0.60460, val loss: 0.60167, in 0.016s 1 tree, 105 leaves, max depth = 14, train loss: 0.59917, val loss: 0.59608, in 0.016s 1 tree, 102 leaves, max depth = 20, train loss: 0.59404, val loss: 0.59076, in 0.016s 1 tree, 105 leaves, max depth = 14, train loss: 0.58903, val loss: 0.58553, in 0.016s 1 tree, 105 leaves, max depth = 14, train loss: 0.58411, val loss: 0.58045, in 0.016s 1 tree, 104 leaves, max depth = 14, train loss: 0.57941, val loss: 0.57557, in 0.016s 1 tree, 103 leaves, max depth = 20, train loss: 0.57489, val loss: 0.57087, in 0.016s 1 tree, 104 leaves, max depth = 14, train loss: 0.57041, val loss: 0.56624, in 0.016s 1 tree, 104 leaves, max depth = 14, train loss: 0.56607, val loss: 0.56176, in 0.016s 1 tree, 104 leaves, max depth = 14, train loss: 0.56187, val loss: 0.55742, in 0.016s 1 tree, 159 leaves, max depth = 18, train loss: 0.55739, val loss: 0.55329, in 0.031s 1 tree, 105 leaves, max depth = 14, train loss: 0.55338, val loss: 0.54914, in 0.016s 1 tree, 105 leaves, max depth = 15, train loss: 0.54955, val loss: 0.54515, in 0.016s 1 tree, 159 leaves, max depth = 18, train loss: 0.54531, val loss: 0.54125, in 0.016s 1 tree, 159 leaves, max depth = 18, train loss: 0.54121, val loss: 0.53748, in 0.031s 1 tree, 105 leaves, max depth = 16, train loss: 0.53761, val loss: 0.53376, in 0.016s 1 tree, 159 leaves, max depth = 17, train loss: 0.53368, val loss: 0.53015, in 0.016s 1 tree, 159 leaves, max depth = 17, train loss: 0.52987, val loss: 0.52665, in 0.031s 1 tree, 101 leaves, max depth = 15, train loss: 0.52649, val loss: 0.52312, in 0.016s 1 tree, 105 leaves, max depth = 16, train loss: 0.52321, val loss: 0.51972, in 0.016s 1 tree, 105 
leaves, max depth = 14, train loss: 0.51996, val loss: 0.51634, in 0.016s 1 tree, 159 leaves, max depth = 18, train loss: 0.51637, val loss: 0.51305, in 0.016s 1 tree, 159 leaves, max depth = 18, train loss: 0.51289, val loss: 0.50987, in 0.031s 1 tree, 104 leaves, max depth = 15, train loss: 0.50989, val loss: 0.50671, in 0.016s 1 tree, 105 leaves, max depth = 14, train loss: 0.50692, val loss: 0.50361, in 0.016s 1 tree, 159 leaves, max depth = 18, train loss: 0.50360, val loss: 0.50058, in 0.016s 1 tree, 105 leaves, max depth = 15, train loss: 0.50081, val loss: 0.49764, in 0.016s 1 tree, 105 leaves, max depth = 16, train loss: 0.49810, val loss: 0.49482, in 0.031s 1 tree, 104 leaves, max depth = 15, train loss: 0.49548, val loss: 0.49204, in 0.016s 1 tree, 159 leaves, max depth = 17, train loss: 0.49234, val loss: 0.48918, in 0.016s 1 tree, 105 leaves, max depth = 15, train loss: 0.48983, val loss: 0.48652, in 0.016s 1 tree, 159 leaves, max depth = 18, train loss: 0.48681, val loss: 0.48378, in 0.016s [45/91] 1 tree, 159 leaves, max depth = 18, train loss: 0.48389, val loss: 0.48111, in 0.031s 1 tree, 159 leaves, max depth = 18, train loss: 0.48104, val loss: 0.47853, in 0.031s 1 tree, 159 leaves, max depth = 18, train loss: 0.47828, val loss: 0.47603, in 0.016s 1 tree, 105 leaves, max depth = 14, train loss: 0.47591, val loss: 0.47353, in 0.016s 1 tree, 159 leaves, max depth = 18, train loss: 0.47325, val loss: 0.47112, in 0.031s 1 tree, 159 leaves, max depth = 18, train loss: 0.47067, val loss: 0.46878, in 0.016s 1 tree, 104 leaves, max depth = 15, train loss: 0.46846, val loss: 0.46647, in 0.016s 1 tree, 159 leaves, max depth = 18, train loss: 0.46596, val loss: 0.46421, in 0.016s 1 tree, 159 leaves, max depth = 18, train loss: 0.46354, val loss: 0.46203, in 0.031s 1 tree, 105 leaves, max depth = 14, train loss: 0.46141, val loss: 0.45977, in 0.016s 1 tree, 159 leaves, max depth = 15, train loss: 0.45907, val loss: 0.45766, in 0.031s 1 tree, 159 leaves, max 
depth = 15, train loss: 0.45680, val loss: 0.45562, in 0.016s 1 tree, 159 leaves, max depth = 18, train loss: 0.45459, val loss: 0.45364, in 0.016s 1 tree, 104 leaves, max depth = 16, train loss: 0.45261, val loss: 0.45156, in 0.031s 1 tree, 159 leaves, max depth = 17, train loss: 0.45048, val loss: 0.44965, in 0.016s 1 tree, 105 leaves, max depth = 15, train loss: 0.44859, val loss: 0.44761, in 0.016s 1 tree, 105 leaves, max depth = 18, train loss: 0.44674, val loss: 0.44567, in 0.016s 1 tree, 159 leaves, max depth = 16, train loss: 0.44469, val loss: 0.44383, in 0.031s 1 tree, 159 leaves, max depth = 16, train loss: 0.44270, val loss: 0.44205, in 0.031s 1 tree, 105 leaves, max depth = 15, train loss: 0.44095, val loss: 0.44016, in 0.016s 1 tree, 159 leaves, max depth = 16, train loss: 0.43903, val loss: 0.43845, in 0.016s 1 tree, 105 leaves, max depth = 15, train loss: 0.43735, val loss: 0.43663, in 0.016s 1 tree, 105 leaves, max depth = 15, train loss: 0.43571, val loss: 0.43486, in 0.031s 1 tree, 104 leaves, max depth = 14, train loss: 0.43407, val loss: 0.43310, in 0.016s 1 tree, 159 leaves, max depth = 16, train loss: 0.43222, val loss: 0.43146, in 0.016s 1 tree, 159 leaves, max depth = 16, train loss: 0.43043, val loss: 0.42986, in 0.016s 1 tree, 159 leaves, max depth = 16, train loss: 0.42869, val loss: 0.42832, in 0.031s 1 tree, 159 leaves, max depth = 16, train loss: 0.42700, val loss: 0.42682, in 0.016s 1 tree, 105 leaves, max depth = 15, train loss: 0.42551, val loss: 0.42520, in 0.016s 1 tree, 104 leaves, max depth = 16, train loss: 0.42406, val loss: 0.42364, in 0.016s 1 tree, 105 leaves, max depth = 15, train loss: 0.42266, val loss: 0.42211, in 0.031s 1 tree, 104 leaves, max depth = 13, train loss: 0.42128, val loss: 0.42062, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.42000, val loss: 0.41943, in 0.016s 1 tree, 104 leaves, max depth = 13, train loss: 0.41863, val loss: 0.41795, in 0.016s 1 tree, 105 leaves, max depth = 15, train loss: 
0.41735, val loss: 0.41656, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.41612, val loss: 0.41541, in 0.016s 1 tree, 104 leaves, max depth = 13, train loss: 0.41483, val loss: 0.41403, in 0.016s 1 tree, 159 leaves, max depth = 16, train loss: 0.41323, val loss: 0.41262, in 0.016s 1 tree, 105 leaves, max depth = 14, train loss: 0.41200, val loss: 0.41129, in 0.016s 1 tree, 159 leaves, max depth = 16, train loss: 0.41045, val loss: 0.40993, in 0.031s 1 tree, 159 leaves, max depth = 16, train loss: 0.40895, val loss: 0.40861, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.40777, val loss: 0.40751, in 0.000s 1 tree, 159 leaves, max depth = 16, train loss: 0.40631, val loss: 0.40623, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.40517, val loss: 0.40517, in 0.016s 1 tree, 104 leaves, max depth = 13, train loss: 0.40404, val loss: 0.40394, in 0.016s 1 tree, 159 leaves, max depth = 16, train loss: 0.40262, val loss: 0.40271, in 0.016s 1 tree, 103 leaves, max depth = 13, train loss: 0.40150, val loss: 0.40149, in 0.016s Fit 91 trees in 1.955 s, (10896 total leaves) Time spent computing histograms: 0.608s Time spent finding best splits: 0.184s Time spent applying splits: 0.224s Time spent predicting: 0.063s Trial 20, Fold 3: Log loss = 0.40054153446053187, Average precision = 0.948156783525392, ROC-AUC = 0.9469457128651367, Elapsed Time = 1.969958299998325 seconds Trial 20, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 20, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 Binning 0.040 GB of training data: 0.158 s 0.016 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 96 leaves, max depth = 16, train loss: 0.68486, val loss: 0.68446, in 0.016s 1 tree, 102 leaves, max depth = 20, train loss: 0.67674, val loss: 0.67586, in 0.016s 1 tree, 102 leaves, max depth = 20, train loss: 0.66890, val loss: 0.66754, in 0.016s 1 tree, 103 leaves, max depth = 19, 
train loss: 0.66138, val loss: 0.65964, in 0.016s 1 tree, 105 leaves, max depth = 13, train loss: 0.65413, val loss: 0.65198, in 0.016s 1 tree, 103 leaves, max depth = 19, train loss: 0.64716, val loss: 0.64458, in 0.016s 1 tree, 103 leaves, max depth = 15, train loss: 0.64044, val loss: 0.63746, in 0.016s 1 tree, 103 leaves, max depth = 15, train loss: 0.63394, val loss: 0.63057, in 0.000s 1 tree, 105 leaves, max depth = 13, train loss: 0.62762, val loss: 0.62386, in 0.031s 1 tree, 105 leaves, max depth = 20, train loss: 0.62144, val loss: 0.61727, in 0.000s 1 tree, 103 leaves, max depth = 19, train loss: 0.61556, val loss: 0.61098, in 0.016s 1 tree, 104 leaves, max depth = 19, train loss: 0.60981, val loss: 0.60492, in 0.016s 1 tree, 105 leaves, max depth = 13, train loss: 0.60428, val loss: 0.59902, in 0.000s 1 tree, 105 leaves, max depth = 20, train loss: 0.59887, val loss: 0.59321, in 0.031s 1 tree, 104 leaves, max depth = 13, train loss: 0.59368, val loss: 0.58768, in 0.016s 1 tree, 104 leaves, max depth = 13, train loss: 0.58867, val loss: 0.58231, in 0.000s 1 tree, 104 leaves, max depth = 19, train loss: 0.58376, val loss: 0.57702, in 0.000s 1 tree, 104 leaves, max depth = 18, train loss: 0.57903, val loss: 0.57201, in 0.016s 1 tree, 104 leaves, max depth = 14, train loss: 0.57450, val loss: 0.56716, in 0.016s 1 tree, 105 leaves, max depth = 20, train loss: 0.57003, val loss: 0.56233, in 0.016s 1 tree, 105 leaves, max depth = 20, train loss: 0.56571, val loss: 0.55765, in 0.047s 1 tree, 105 leaves, max depth = 20, train loss: 0.56152, val loss: 0.55310, in 0.016s 1 tree, 105 leaves, max depth = 20, train loss: 0.55746, val loss: 0.54870, in 0.063s 1 tree, 105 leaves, max depth = 20, train loss: 0.55352, val loss: 0.54443, in 0.016s 1 tree, 104 leaves, max depth = 21, train loss: 0.54974, val loss: 0.54038, in 0.031s 1 tree, 159 leaves, max depth = 16, train loss: 0.54552, val loss: 0.53628, in 0.031s 1 tree, 159 leaves, max depth = 18, train loss: 0.54143, 
val loss: 0.53232, in 0.031s 1 tree, 103 leaves, max depth = 21, train loss: 0.53789, val loss: 0.52848, in 0.031s 1 tree, 159 leaves, max depth = 16, train loss: 0.53397, val loss: 0.52468, in 0.031s 1 tree, 159 leaves, max depth = 16, train loss: 0.53018, val loss: 0.52100, in 0.031s 1 tree, 103 leaves, max depth = 15, train loss: 0.52685, val loss: 0.51742, in 0.016s 1 tree, 103 leaves, max depth = 20, train loss: 0.52361, val loss: 0.51390, in 0.016s 1 tree, 104 leaves, max depth = 20, train loss: 0.52042, val loss: 0.51041, in 0.016s 1 tree, 159 leaves, max depth = 16, train loss: 0.51685, val loss: 0.50696, in 0.031s 1 tree, 159 leaves, max depth = 16, train loss: 0.51339, val loss: 0.50361, in 0.016s 1 tree, 104 leaves, max depth = 21, train loss: 0.51041, val loss: 0.50040, in 0.016s 1 tree, 105 leaves, max depth = 18, train loss: 0.50750, val loss: 0.49720, in 0.016s 1 tree, 159 leaves, max depth = 18, train loss: 0.50420, val loss: 0.49401, in 0.031s 1 tree, 104 leaves, max depth = 21, train loss: 0.50144, val loss: 0.49103, in 0.016s 1 tree, 103 leaves, max depth = 22, train loss: 0.49877, val loss: 0.48810, in 0.031s 1 tree, 104 leaves, max depth = 21, train loss: 0.49616, val loss: 0.48529, in 0.016s 1 tree, 159 leaves, max depth = 18, train loss: 0.49304, val loss: 0.48228, in 0.016s 1 tree, 104 leaves, max depth = 21, train loss: 0.49055, val loss: 0.47959, in 0.016s 1 tree, 159 leaves, max depth = 18, train loss: 0.48755, val loss: 0.47670, in 0.031s 1 tree, 159 leaves, max depth = 18, train loss: 0.48464, val loss: 0.47389, in 0.016s 1 tree, 159 leaves, max depth = 18, train loss: 0.48181, val loss: 0.47117, in 0.031s 1 tree, 159 leaves, max depth = 18, train loss: 0.47907, val loss: 0.46853, in 0.016s 1 tree, 105 leaves, max depth = 15, train loss: 0.47674, val loss: 0.46594, in 0.016s 1 tree, 159 leaves, max depth = 19, train loss: 0.47410, val loss: 0.46340, in 0.031s 1 tree, 159 leaves, max depth = 19, train loss: 0.47153, val loss: 0.46093, in 
0.016s 1 tree, 103 leaves, max depth = 20, train loss: 0.46935, val loss: 0.45852, in 0.016s 1 tree, 159 leaves, max depth = 19, train loss: 0.46688, val loss: 0.45614, in 0.031s 1 tree, 159 leaves, max depth = 19, train loss: 0.46447, val loss: 0.45383, in 0.016s 1 tree, 105 leaves, max depth = 15, train loss: 0.46237, val loss: 0.45149, in 0.016s 1 tree, 159 leaves, max depth = 18, train loss: 0.46006, val loss: 0.44927, in 0.031s 1 tree, 159 leaves, max depth = 18, train loss: 0.45780, val loss: 0.44710, in 0.031s 1 tree, 159 leaves, max depth = 21, train loss: 0.45561, val loss: 0.44501, in 0.016s 1 tree, 104 leaves, max depth = 14, train loss: 0.45360, val loss: 0.44281, in 0.016s 1 tree, 159 leaves, max depth = 18, train loss: 0.45149, val loss: 0.44079, in 0.031s 1 tree, 104 leaves, max depth = 14, train loss: 0.44961, val loss: 0.43871, in 0.016s 1 tree, 103 leaves, max depth = 19, train loss: 0.44778, val loss: 0.43668, in 0.016s 1 tree, 159 leaves, max depth = 16, train loss: 0.44576, val loss: 0.43474, in 0.031s 1 tree, 159 leaves, max depth = 16, train loss: 0.44379, val loss: 0.43286, in 0.031s 1 tree, 104 leaves, max depth = 14, train loss: 0.44199, val loss: 0.43089, in 0.016s 1 tree, 159 leaves, max depth = 18, train loss: 0.44009, val loss: 0.42908, in 0.031s 1 tree, 103 leaves, max depth = 17, train loss: 0.43841, val loss: 0.42720, in 0.016s 1 tree, 104 leaves, max depth = 15, train loss: 0.43672, val loss: 0.42536, in 0.031s 1 tree, 104 leaves, max depth = 16, train loss: 0.43515, val loss: 0.42360, in 0.031s 1 tree, 159 leaves, max depth = 18, train loss: 0.43333, val loss: 0.42186, in 0.016s 1 tree, 159 leaves, max depth = 18, train loss: 0.43156, val loss: 0.42018, in 0.031s 1 tree, 159 leaves, max depth = 18, train loss: 0.42985, val loss: 0.41855, in 0.016s 1 tree, 159 leaves, max depth = 18, train loss: 0.42818, val loss: 0.41696, in 0.031s 1 tree, 103 leaves, max depth = 17, train loss: 0.42668, val loss: 0.41528, in 0.016s 1 tree, 104 
leaves, max depth = 15, train loss: 0.42523, val loss: 0.41367, in 0.016s 1 tree, 103 leaves, max depth = 16, train loss: 0.42382, val loss: 0.41209, in 0.016s 1 tree, 104 leaves, max depth = 15, train loss: 0.42246, val loss: 0.41057, in 0.016s 1 tree, 105 leaves, max depth = 17, train loss: 0.42111, val loss: 0.40902, in 0.016s 1 tree, 104 leaves, max depth = 14, train loss: 0.41977, val loss: 0.40755, in 0.016s 1 tree, 103 leaves, max depth = 16, train loss: 0.41852, val loss: 0.40613, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.41730, val loss: 0.40482, in 0.000s 1 tree, 105 leaves, max depth = 17, train loss: 0.41607, val loss: 0.40341, in 0.016s 1 tree, 159 leaves, max depth = 17, train loss: 0.41449, val loss: 0.40192, in 0.031s 1 tree, 105 leaves, max depth = 17, train loss: 0.41331, val loss: 0.40056, in 0.016s 1 tree, 159 leaves, max depth = 17, train loss: 0.41178, val loss: 0.39911, in 0.016s 1 tree, 159 leaves, max depth = 17, train loss: 0.41029, val loss: 0.39771, in 0.031s 1 tree, 5 leaves, max depth = 3, train loss: 0.40913, val loss: 0.39646, in 0.016s 1 tree, 159 leaves, max depth = 17, train loss: 0.40769, val loss: 0.39510, in 0.031s 1 tree, 5 leaves, max depth = 3, train loss: 0.40656, val loss: 0.39389, in 0.000s 1 tree, 105 leaves, max depth = 17, train loss: 0.40545, val loss: 0.39261, in 0.031s 1 tree, 159 leaves, max depth = 20, train loss: 0.40405, val loss: 0.39130, in 0.031s 1 tree, 105 leaves, max depth = 16, train loss: 0.40298, val loss: 0.39005, in 0.016s Fit 91 trees in 2.221 s, (10918 total leaves) Time spent computing histograms: 0.679s Time spent finding best splits: 0.226s Time spent applying splits: 0.279s Time spent predicting: 0.000s Trial 20, Fold 4: Log loss = 0.40511396654818094, Average precision = 0.9467197122369408, ROC-AUC = 0.9438916339406426, Elapsed Time = 2.220715299999938 seconds Trial 20, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 20, Fold 5: Validation 
size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[91 rounds of per-tree fitting logs omitted. Fit 91 trees in 2.017 s (10928 total leaves).]
Trial 20, Fold 5: Log loss = 0.41079164018892395, Average precision = 0.9446465371422839, ROC-AUC = 0.941807182802891, Elapsed Time = 2.021551100000579 seconds
Optimization Progress: 21%|##1 | 21/100 [04:10<16:19, 12.40s/it]
Trial 21, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 21, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[70 rounds of per-tree fitting logs omitted. Fit 70 trees in 0.736 s (1070 total leaves).]
Trial 21, Fold 1: Log loss = 0.374868091326567, Average precision = 0.9452746861639605, ROC-AUC = 0.9428602346539338, Elapsed Time = 0.7437253999996756 seconds
Trial 21, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 21, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[70 rounds of per-tree fitting logs omitted. Fit 70 trees in 0.768 s (1082 total leaves).]
Trial 21, Fold 2: Log loss = 0.3752575024880809, Average precision = 0.9417755109207956, ROC-AUC = 0.9443603087278346, Elapsed Time = 0.7794021999998222 seconds
Trial 21, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 21, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[70 rounds of per-tree fitting logs omitted. Fit 70 trees in 0.829 s (1070 total leaves).]
Trial 21, Fold 3: Log loss = 0.3701317586291271, Average precision = 0.9496235379991602, ROC-AUC = 0.9494132893837101, Elapsed Time = 0.8211824000009074 seconds
Trial 21, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 21, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[70 rounds of per-tree fitting logs omitted. Fit 70 trees in 0.830 s (1070 total leaves).]
Trial 21, Fold 4: Log loss = 0.3720529426817586, Average precision = 0.948352786118597, ROC-AUC = 0.9466053903159581, Elapsed Time = 0.8287339999988035 seconds
Trial 21, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 21, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[Per-tree fitting logs for Fold 5 omitted.] 1 tree, 17 leaves,
max depth = 9, train loss: 0.37639, val loss: 0.36368, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.37504, val loss: 0.36238, in 0.000s 1 tree, 17 leaves, max depth = 10, train loss: 0.37345, val loss: 0.36099, in 0.016s 1 tree, 17 leaves, max depth = 9, train loss: 0.37188, val loss: 0.35917, in 0.000s Fit 70 trees in 0.892 s, (1082 total leaves) Time spent computing histograms: 0.331s Time spent finding best splits: 0.038s Time spent applying splits: 0.050s Time spent predicting: 0.000s Trial 21, Fold 5: Log loss = 0.37708270831005164, Average precision = 0.9454919453985955, ROC-AUC = 0.9430249521579993, Elapsed Time = 0.8994215000002441 seconds
Optimization Progress: 22%|##2 | 22/100 [04:21<15:25, 11.86s/it]
Trial 22 — per-fold summary (verbose per-round fitting output condensed; 49 trees fitted per fold):

Fold  Train size (class 0/1)   Validation size (class 0/1)  Log loss  Avg precision  ROC-AUC  Elapsed
1     20663 (10533/10130)      5175 (2592/2583)             0.26859   0.96314        0.95851  1.13 s
2     20701 (10471/10230)      5137 (2654/2483)             0.26611   0.96342        0.96064  1.23 s
3     20682 (10517/10165)      5156 (2608/2548)             0.26343   0.96408        0.96095  1.28 s
4     20656 (10479/10177)      5182 (2646/2536)             0.26679   0.96423        0.96019  1.39 s
5     20650 (10500/10150)      5188 (2625/2563)             0.27212   0.96077        0.95658  1.31 s
Optimization Progress: 23%|##3 | 23/100 [04:34<15:40, 12.21s/it]
Trial 23, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130 (0/1 = 1.0398); Validation size = 5175 where 0 = 2592, 1 = 2583 (0/1 = 1.0035)
[per-round fitting output condensed] … 1 tree, 51 leaves, max depth = 12,
train loss: 0.36488, val loss: 0.36427, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.36366, val loss: 0.36282, in 0.016s 1 tree, 50 leaves, max depth = 12, train loss: 0.36216, val loss: 0.36126, in 0.000s 1 tree, 83 leaves, max depth = 12, train loss: 0.36055, val loss: 0.36000, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.35939, val loss: 0.35861, in 0.000s 1 tree, 48 leaves, max depth = 12, train loss: 0.35795, val loss: 0.35711, in 0.016s 1 tree, 83 leaves, max depth = 13, train loss: 0.35642, val loss: 0.35591, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.35531, val loss: 0.35460, in 0.000s 1 tree, 52 leaves, max depth = 12, train loss: 0.35397, val loss: 0.35322, in 0.016s Fit 76 trees in 1.032 s, (4129 total leaves) Time spent computing histograms: 0.350s Time spent finding best splits: 0.070s Time spent applying splits: 0.070s Time spent predicting: 0.000s Trial 23, Fold 1: Log loss = 0.35838270196769895, Average precision = 0.946139576660052, ROC-AUC = 0.9448678861788617, Elapsed Time = 1.0405602000009821 seconds Trial 23, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 23, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986 Binning 0.040 GB of training data: 0.142 s 0.016 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 48 leaves, max depth = 16, train loss: 0.67725, val loss: 0.67651, in 0.000s 1 tree, 47 leaves, max depth = 14, train loss: 0.66252, val loss: 0.66111, in 0.016s 1 tree, 49 leaves, max depth = 14, train loss: 0.64864, val loss: 0.64657, in 0.016s 1 tree, 48 leaves, max depth = 14, train loss: 0.63577, val loss: 0.63309, in 0.000s 1 tree, 50 leaves, max depth = 13, train loss: 0.62361, val loss: 0.62031, in 0.016s 1 tree, 51 leaves, max depth = 15, train loss: 0.61229, val loss: 0.60843, in 0.016s 1 tree, 49 leaves, max depth = 14, train loss: 0.60169, val loss: 0.59729, in 0.000s 1 tree, 48 leaves, max depth = 16, 
train loss: 0.59175, val loss: 0.58682, in 0.016s 1 tree, 50 leaves, max depth = 13, train loss: 0.58228, val loss: 0.57683, in 0.016s 1 tree, 48 leaves, max depth = 16, train loss: 0.57350, val loss: 0.56755, in 0.000s 1 tree, 50 leaves, max depth = 13, train loss: 0.56512, val loss: 0.55868, in 0.016s 1 tree, 51 leaves, max depth = 14, train loss: 0.55732, val loss: 0.55045, in 0.016s 1 tree, 83 leaves, max depth = 13, train loss: 0.54932, val loss: 0.54275, in 0.016s 1 tree, 51 leaves, max depth = 18, train loss: 0.54218, val loss: 0.53519, in 0.000s 1 tree, 51 leaves, max depth = 13, train loss: 0.53534, val loss: 0.52793, in 0.016s 1 tree, 82 leaves, max depth = 14, train loss: 0.52812, val loss: 0.52099, in 0.016s 1 tree, 82 leaves, max depth = 14, train loss: 0.52133, val loss: 0.51446, in 0.016s 1 tree, 81 leaves, max depth = 14, train loss: 0.51492, val loss: 0.50832, in 0.016s 1 tree, 53 leaves, max depth = 14, train loss: 0.50904, val loss: 0.50209, in 0.000s 1 tree, 82 leaves, max depth = 14, train loss: 0.50311, val loss: 0.49641, in 0.016s 1 tree, 53 leaves, max depth = 14, train loss: 0.49768, val loss: 0.49066, in 0.016s 1 tree, 82 leaves, max depth = 15, train loss: 0.49217, val loss: 0.48540, in 0.016s 1 tree, 53 leaves, max depth = 14, train loss: 0.48716, val loss: 0.48009, in 0.000s 1 tree, 82 leaves, max depth = 15, train loss: 0.48204, val loss: 0.47520, in 0.016s 1 tree, 83 leaves, max depth = 15, train loss: 0.47721, val loss: 0.47059, in 0.016s 1 tree, 53 leaves, max depth = 12, train loss: 0.47266, val loss: 0.46577, in 0.016s 1 tree, 80 leaves, max depth = 15, train loss: 0.46815, val loss: 0.46149, in 0.016s 1 tree, 53 leaves, max depth = 12, train loss: 0.46394, val loss: 0.45701, in 0.016s 1 tree, 81 leaves, max depth = 15, train loss: 0.45973, val loss: 0.45303, in 0.016s 1 tree, 53 leaves, max depth = 15, train loss: 0.45583, val loss: 0.44887, in 0.000s 1 tree, 79 leaves, max depth = 15, train loss: 0.45189, val loss: 0.44516, in 
0.016s 1 tree, 53 leaves, max depth = 15, train loss: 0.44826, val loss: 0.44128, in 0.016s 1 tree, 80 leaves, max depth = 13, train loss: 0.44457, val loss: 0.43780, in 0.016s 1 tree, 81 leaves, max depth = 13, train loss: 0.44109, val loss: 0.43450, in 0.016s 1 tree, 53 leaves, max depth = 17, train loss: 0.43776, val loss: 0.43095, in 0.000s 1 tree, 81 leaves, max depth = 14, train loss: 0.43449, val loss: 0.42788, in 0.016s 1 tree, 53 leaves, max depth = 17, train loss: 0.43139, val loss: 0.42456, in 0.016s 1 tree, 80 leaves, max depth = 14, train loss: 0.42832, val loss: 0.42169, in 0.016s 1 tree, 53 leaves, max depth = 13, train loss: 0.42543, val loss: 0.41860, in 0.016s 1 tree, 78 leaves, max depth = 12, train loss: 0.42256, val loss: 0.41594, in 0.000s 1 tree, 52 leaves, max depth = 11, train loss: 0.41986, val loss: 0.41305, in 0.016s 1 tree, 79 leaves, max depth = 12, train loss: 0.41716, val loss: 0.41055, in 0.016s 1 tree, 79 leaves, max depth = 12, train loss: 0.41459, val loss: 0.40816, in 0.016s 1 tree, 52 leaves, max depth = 10, train loss: 0.41209, val loss: 0.40548, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.40985, val loss: 0.40316, in 0.000s 1 tree, 82 leaves, max depth = 12, train loss: 0.40744, val loss: 0.40093, in 0.016s 1 tree, 53 leaves, max depth = 11, train loss: 0.40512, val loss: 0.39844, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.40304, val loss: 0.39627, in 0.000s 1 tree, 80 leaves, max depth = 13, train loss: 0.40075, val loss: 0.39416, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.39878, val loss: 0.39211, in 0.016s 1 tree, 82 leaves, max depth = 12, train loss: 0.39662, val loss: 0.39012, in 0.000s 1 tree, 53 leaves, max depth = 11, train loss: 0.39451, val loss: 0.38785, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.39267, val loss: 0.38594, in 0.016s 1 tree, 81 leaves, max depth = 13, train loss: 0.39064, val loss: 0.38408, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 
0.38890, val loss: 0.38227, in 0.000s 1 tree, 84 leaves, max depth = 14, train loss: 0.38697, val loss: 0.38051, in 0.016s 1 tree, 51 leaves, max depth = 11, train loss: 0.38503, val loss: 0.37843, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.38339, val loss: 0.37673, in 0.000s 1 tree, 84 leaves, max depth = 14, train loss: 0.38157, val loss: 0.37506, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.38009, val loss: 0.37351, in 0.016s 1 tree, 52 leaves, max depth = 12, train loss: 0.37824, val loss: 0.37152, in 0.000s 1 tree, 82 leaves, max depth = 12, train loss: 0.37650, val loss: 0.36996, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.37510, val loss: 0.36850, in 0.000s 1 tree, 52 leaves, max depth = 11, train loss: 0.37337, val loss: 0.36664, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.37205, val loss: 0.36526, in 0.000s 1 tree, 79 leaves, max depth = 13, train loss: 0.37040, val loss: 0.36379, in 0.016s 1 tree, 52 leaves, max depth = 11, train loss: 0.36877, val loss: 0.36204, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.36752, val loss: 0.36073, in 0.000s 1 tree, 80 leaves, max depth = 12, train loss: 0.36594, val loss: 0.35933, in 0.016s 1 tree, 52 leaves, max depth = 11, train loss: 0.36442, val loss: 0.35770, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.36322, val loss: 0.35645, in 0.000s 1 tree, 52 leaves, max depth = 11, train loss: 0.36179, val loss: 0.35491, in 0.016s 1 tree, 80 leaves, max depth = 14, train loss: 0.36028, val loss: 0.35358, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.35914, val loss: 0.35239, in 0.000s 1 tree, 52 leaves, max depth = 15, train loss: 0.35775, val loss: 0.35090, in 0.016s 1 tree, 80 leaves, max depth = 14, train loss: 0.35632, val loss: 0.34963, in 0.016s Fit 76 trees in 1.142 s, (4173 total leaves) Time spent computing histograms: 0.378s Time spent finding best splits: 0.081s Time spent applying splits: 0.085s Time spent predicting: 0.000s 
Trial 23, Fold 2: Log loss = 0.3599834861148231, Average precision = 0.9429529922221647, ROC-AUC = 0.9446858836015578, Elapsed Time = 1.1502019999988988 seconds Trial 23, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 23, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 Binning 0.040 GB of training data: 0.157 s 0.016 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 51 leaves, max depth = 13, train loss: 0.67733, val loss: 0.67686, in 0.016s 1 tree, 51 leaves, max depth = 13, train loss: 0.66257, val loss: 0.66165, in 0.000s 1 tree, 51 leaves, max depth = 13, train loss: 0.64877, val loss: 0.64742, in 0.016s 1 tree, 51 leaves, max depth = 11, train loss: 0.63587, val loss: 0.63409, in 0.000s 1 tree, 51 leaves, max depth = 13, train loss: 0.62378, val loss: 0.62160, in 0.016s 1 tree, 51 leaves, max depth = 11, train loss: 0.61246, val loss: 0.60989, in 0.016s 1 tree, 49 leaves, max depth = 13, train loss: 0.60193, val loss: 0.59899, in 0.016s 1 tree, 51 leaves, max depth = 13, train loss: 0.59194, val loss: 0.58865, in 0.000s 1 tree, 47 leaves, max depth = 14, train loss: 0.58266, val loss: 0.57897, in 0.016s 1 tree, 51 leaves, max depth = 13, train loss: 0.57383, val loss: 0.56981, in 0.016s 1 tree, 49 leaves, max depth = 13, train loss: 0.56561, val loss: 0.56130, in 0.000s 1 tree, 51 leaves, max depth = 13, train loss: 0.55779, val loss: 0.55317, in 0.016s 1 tree, 80 leaves, max depth = 12, train loss: 0.54966, val loss: 0.54560, in 0.016s 1 tree, 82 leaves, max depth = 12, train loss: 0.54202, val loss: 0.53850, in 0.016s 1 tree, 49 leaves, max depth = 13, train loss: 0.53512, val loss: 0.53131, in 0.000s 1 tree, 81 leaves, max depth = 12, train loss: 0.52807, val loss: 0.52478, in 0.016s 1 tree, 49 leaves, max depth = 13, train loss: 0.52174, val loss: 0.51817, in 0.016s 1 tree, 81 leaves, max depth = 12, train loss: 0.51522, val loss: 0.51214, in 0.000s 1 tree, 49 
leaves, max depth = 13, train loss: 0.50941, val loss: 0.50605, in 0.016s 1 tree, 81 leaves, max depth = 12, train loss: 0.50337, val loss: 0.50047, in 0.016s 1 tree, 50 leaves, max depth = 13, train loss: 0.49801, val loss: 0.49483, in 0.016s 1 tree, 81 leaves, max depth = 12, train loss: 0.49240, val loss: 0.48967, in 0.016s 1 tree, 80 leaves, max depth = 12, train loss: 0.48710, val loss: 0.48481, in 0.016s 1 tree, 49 leaves, max depth = 13, train loss: 0.48226, val loss: 0.47971, in 0.000s 1 tree, 80 leaves, max depth = 14, train loss: 0.47733, val loss: 0.47520, in 0.016s 1 tree, 48 leaves, max depth = 13, train loss: 0.47285, val loss: 0.47047, in 0.016s 1 tree, 81 leaves, max depth = 13, train loss: 0.46825, val loss: 0.46627, in 0.016s 1 tree, 50 leaves, max depth = 13, train loss: 0.46411, val loss: 0.46188, in 0.016s 1 tree, 81 leaves, max depth = 13, train loss: 0.45981, val loss: 0.45797, in 0.016s 1 tree, 81 leaves, max depth = 13, train loss: 0.45574, val loss: 0.45428, in 0.016s 1 tree, 48 leaves, max depth = 16, train loss: 0.45198, val loss: 0.45026, in 0.000s 1 tree, 80 leaves, max depth = 12, train loss: 0.44817, val loss: 0.44682, in 0.016s 1 tree, 49 leaves, max depth = 13, train loss: 0.44466, val loss: 0.44308, in 0.016s 1 tree, 81 leaves, max depth = 12, train loss: 0.44109, val loss: 0.43985, in 0.016s 1 tree, 51 leaves, max depth = 13, train loss: 0.43783, val loss: 0.43637, in 0.016s 1 tree, 82 leaves, max depth = 12, train loss: 0.43447, val loss: 0.43336, in 0.000s 1 tree, 50 leaves, max depth = 13, train loss: 0.43144, val loss: 0.43011, in 0.016s 1 tree, 82 leaves, max depth = 12, train loss: 0.42828, val loss: 0.42728, in 0.016s 1 tree, 80 leaves, max depth = 12, train loss: 0.42530, val loss: 0.42462, in 0.016s 1 tree, 49 leaves, max depth = 13, train loss: 0.42250, val loss: 0.42160, in 0.016s 1 tree, 80 leaves, max depth = 13, train loss: 0.41969, val loss: 0.41911, in 0.000s 1 tree, 52 leaves, max depth = 13, train loss: 0.41707, 
val loss: 0.41628, in 0.016s 1 tree, 78 leaves, max depth = 13, train loss: 0.41442, val loss: 0.41393, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.41208, val loss: 0.41178, in 0.000s 1 tree, 81 leaves, max depth = 13, train loss: 0.40958, val loss: 0.40959, in 0.016s 1 tree, 50 leaves, max depth = 11, train loss: 0.40717, val loss: 0.40692, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.40499, val loss: 0.40491, in 0.000s 1 tree, 81 leaves, max depth = 12, train loss: 0.40262, val loss: 0.40285, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.40057, val loss: 0.40096, in 0.016s 1 tree, 50 leaves, max depth = 11, train loss: 0.39836, val loss: 0.39851, in 0.000s 1 tree, 81 leaves, max depth = 12, train loss: 0.39614, val loss: 0.39657, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.39422, val loss: 0.39480, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.39240, val loss: 0.39313, in 0.000s 1 tree, 80 leaves, max depth = 12, train loss: 0.39029, val loss: 0.39131, in 0.016s 1 tree, 49 leaves, max depth = 11, train loss: 0.38826, val loss: 0.38906, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.38656, val loss: 0.38749, in 0.000s 1 tree, 82 leaves, max depth = 13, train loss: 0.38456, val loss: 0.38578, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.38295, val loss: 0.38430, in 0.016s 1 tree, 82 leaves, max depth = 12, train loss: 0.38104, val loss: 0.38267, in 0.016s 1 tree, 47 leaves, max depth = 12, train loss: 0.37917, val loss: 0.38058, in 0.000s 1 tree, 3 leaves, max depth = 2, train loss: 0.37765, val loss: 0.37918, in 0.016s 1 tree, 80 leaves, max depth = 13, train loss: 0.37585, val loss: 0.37766, in 0.000s 1 tree, 2 leaves, max depth = 1, train loss: 0.37446, val loss: 0.37638, in 0.000s 1 tree, 49 leaves, max depth = 15, train loss: 0.37268, val loss: 0.37442, in 0.016s 1 tree, 82 leaves, max depth = 13, train loss: 0.37096, val loss: 0.37299, in 0.000s 1 tree, 3 leaves, max depth = 2, 
train loss: 0.36965, val loss: 0.37177, in 0.016s 1 tree, 49 leaves, max depth = 15, train loss: 0.36797, val loss: 0.36993, in 0.016s 1 tree, 50 leaves, max depth = 15, train loss: 0.36639, val loss: 0.36819, in 0.000s 1 tree, 82 leaves, max depth = 13, train loss: 0.36474, val loss: 0.36681, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.36350, val loss: 0.36567, in 0.016s 1 tree, 49 leaves, max depth = 13, train loss: 0.36202, val loss: 0.36403, in 0.000s 1 tree, 3 leaves, max depth = 2, train loss: 0.36084, val loss: 0.36294, in 0.016s 1 tree, 83 leaves, max depth = 14, train loss: 0.35926, val loss: 0.36164, in 0.016s 1 tree, 48 leaves, max depth = 10, train loss: 0.35788, val loss: 0.36009, in 0.000s 1 tree, 3 leaves, max depth = 2, train loss: 0.35676, val loss: 0.35905, in 0.000s 1 tree, 82 leaves, max depth = 14, train loss: 0.35525, val loss: 0.35782, in 0.016s Fit 76 trees in 1.173 s, (4073 total leaves) Time spent computing histograms: 0.378s Time spent finding best splits: 0.081s Time spent applying splits: 0.084s Time spent predicting: 0.000s Trial 23, Fold 3: Log loss = 0.3552397610452097, Average precision = 0.9484253015879371, ROC-AUC = 0.9490830502210323, Elapsed Time = 1.1743360999989818 seconds Trial 23, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 23, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 Binning 0.040 GB of training data: 0.157 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 48 leaves, max depth = 15, train loss: 0.67731, val loss: 0.67637, in 0.000s 1 tree, 48 leaves, max depth = 15, train loss: 0.66252, val loss: 0.66068, in 0.016s 1 tree, 49 leaves, max depth = 15, train loss: 0.64870, val loss: 0.64600, in 0.016s 1 tree, 47 leaves, max depth = 15, train loss: 0.63578, val loss: 0.63222, in 0.000s 1 tree, 48 leaves, max depth = 13, train loss: 0.62368, val loss: 0.61930, in 0.016s 1 tree, 49 leaves, max depth = 13, 
train loss: 0.61233, val loss: 0.60716, in 0.016s 1 tree, 48 leaves, max depth = 15, train loss: 0.60169, val loss: 0.59574, in 0.016s 1 tree, 48 leaves, max depth = 13, train loss: 0.59170, val loss: 0.58500, in 0.016s 1 tree, 50 leaves, max depth = 15, train loss: 0.58232, val loss: 0.57489, in 0.016s 1 tree, 48 leaves, max depth = 13, train loss: 0.57350, val loss: 0.56536, in 0.016s 1 tree, 49 leaves, max depth = 13, train loss: 0.56521, val loss: 0.55637, in 0.016s 1 tree, 48 leaves, max depth = 15, train loss: 0.55740, val loss: 0.54790, in 0.016s 1 tree, 79 leaves, max depth = 13, train loss: 0.54941, val loss: 0.54010, in 0.031s 1 tree, 49 leaves, max depth = 15, train loss: 0.54227, val loss: 0.53231, in 0.016s 1 tree, 49 leaves, max depth = 15, train loss: 0.53554, val loss: 0.52497, in 0.016s 1 tree, 78 leaves, max depth = 13, train loss: 0.52833, val loss: 0.51795, in 0.016s 1 tree, 77 leaves, max depth = 13, train loss: 0.52153, val loss: 0.51133, in 0.016s 1 tree, 78 leaves, max depth = 13, train loss: 0.51514, val loss: 0.50510, in 0.016s 1 tree, 47 leaves, max depth = 12, train loss: 0.50927, val loss: 0.49874, in 0.016s 1 tree, 79 leaves, max depth = 13, train loss: 0.50334, val loss: 0.49296, in 0.016s 1 tree, 49 leaves, max depth = 12, train loss: 0.49794, val loss: 0.48709, in 0.016s 1 tree, 80 leaves, max depth = 13, train loss: 0.49244, val loss: 0.48174, in 0.016s 1 tree, 47 leaves, max depth = 12, train loss: 0.48745, val loss: 0.47631, in 0.016s 1 tree, 79 leaves, max depth = 13, train loss: 0.48234, val loss: 0.47134, in 0.016s 1 tree, 49 leaves, max depth = 12, train loss: 0.47774, val loss: 0.46631, in 0.016s 1 tree, 80 leaves, max depth = 15, train loss: 0.47298, val loss: 0.46170, in 0.016s 1 tree, 80 leaves, max depth = 15, train loss: 0.46848, val loss: 0.45736, in 0.016s 1 tree, 50 leaves, max depth = 15, train loss: 0.46429, val loss: 0.45272, in 0.000s 1 tree, 79 leaves, max depth = 14, train loss: 0.46009, val loss: 0.44867, in 
0.016s 1 tree, 50 leaves, max depth = 15, train loss: 0.45619, val loss: 0.44434, in 0.016s 1 tree, 78 leaves, max depth = 13, train loss: 0.45228, val loss: 0.44056, in 0.016s 1 tree, 50 leaves, max depth = 15, train loss: 0.44866, val loss: 0.43655, in 0.016s 1 tree, 77 leaves, max depth = 14, train loss: 0.44499, val loss: 0.43299, in 0.016s 1 tree, 50 leaves, max depth = 14, train loss: 0.44162, val loss: 0.42924, in 0.000s 1 tree, 78 leaves, max depth = 13, train loss: 0.43818, val loss: 0.42593, in 0.031s 1 tree, 79 leaves, max depth = 13, train loss: 0.43491, val loss: 0.42278, in 0.016s 1 tree, 50 leaves, max depth = 14, train loss: 0.43183, val loss: 0.41932, in 0.000s 1 tree, 79 leaves, max depth = 13, train loss: 0.42876, val loss: 0.41638, in 0.031s 1 tree, 50 leaves, max depth = 12, train loss: 0.42588, val loss: 0.41314, in 0.016s 1 tree, 79 leaves, max depth = 14, train loss: 0.42301, val loss: 0.41038, in 0.016s 1 tree, 51 leaves, max depth = 15, train loss: 0.42032, val loss: 0.40734, in 0.000s 1 tree, 80 leaves, max depth = 14, train loss: 0.41761, val loss: 0.40475, in 0.016s 1 tree, 50 leaves, max depth = 12, train loss: 0.41510, val loss: 0.40190, in 0.016s 1 tree, 78 leaves, max depth = 14, train loss: 0.41255, val loss: 0.39948, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.41031, val loss: 0.39711, in 0.000s 1 tree, 78 leaves, max depth = 12, train loss: 0.40790, val loss: 0.39481, in 0.000s 1 tree, 3 leaves, max depth = 2, train loss: 0.40580, val loss: 0.39257, in 0.016s 1 tree, 49 leaves, max depth = 12, train loss: 0.40353, val loss: 0.38999, in 0.016s 1 tree, 80 leaves, max depth = 12, train loss: 0.40126, val loss: 0.38783, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.39929, val loss: 0.38574, in 0.000s 1 tree, 80 leaves, max depth = 12, train loss: 0.39714, val loss: 0.38372, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.39528, val loss: 0.38174, in 0.000s 1 tree, 49 leaves, max depth = 12, train 
loss: 0.39320, val loss: 0.37938, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.39146, val loss: 0.37753, in 0.000s 1 tree, 79 leaves, max depth = 12, train loss: 0.38943, val loss: 0.37562, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.38785, val loss: 0.37393, in 0.000s 1 tree, 80 leaves, max depth = 12, train loss: 0.38592, val loss: 0.37213, in 0.016s 1 tree, 49 leaves, max depth = 11, train loss: 0.38401, val loss: 0.36996, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.38251, val loss: 0.36837, in 0.000s 1 tree, 80 leaves, max depth = 12, train loss: 0.38068, val loss: 0.36666, in 0.031s 1 tree, 2 leaves, max depth = 1, train loss: 0.37926, val loss: 0.36515, in 0.000s 1 tree, 53 leaves, max depth = 13, train loss: 0.37745, val loss: 0.36308, in 0.016s 1 tree, 81 leaves, max depth = 12, train loss: 0.37571, val loss: 0.36148, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.37437, val loss: 0.36005, in 0.000s 1 tree, 51 leaves, max depth = 12, train loss: 0.37270, val loss: 0.35815, in 0.016s 1 tree, 80 leaves, max depth = 13, train loss: 0.37104, val loss: 0.35662, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.36977, val loss: 0.35526, in 0.016s 1 tree, 54 leaves, max depth = 14, train loss: 0.36817, val loss: 0.35344, in 0.016s 1 tree, 80 leaves, max depth = 12, train loss: 0.36659, val loss: 0.35200, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.36538, val loss: 0.35071, in 0.000s 1 tree, 51 leaves, max depth = 13, train loss: 0.36390, val loss: 0.34902, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.36276, val loss: 0.34780, in 0.016s 1 tree, 51 leaves, max depth = 12, train loss: 0.36136, val loss: 0.34620, in 0.000s 1 tree, 80 leaves, max depth = 12, train loss: 0.35984, val loss: 0.34482, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.35875, val loss: 0.34366, in 0.016s 1 tree, 51 leaves, max depth = 13, train loss: 0.35744, val loss: 0.34216, in 0.000s Fit 76 trees in 
1.329 s, (3975 total leaves) Time spent computing histograms: 0.444s Time spent finding best splits: 0.101s Time spent applying splits: 0.101s Time spent predicting: 0.000s Trial 23, Fold 4: Log loss = 0.35819156661274026, Average precision = 0.9483507958808596, ROC-AUC = 0.946384683386148, Elapsed Time = 1.3260851000013645 seconds Trial 23, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 23, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.158 s 0.016 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 47 leaves, max depth = 12, train loss: 0.67714, val loss: 0.67608, in 0.000s 1 tree, 47 leaves, max depth = 12, train loss: 0.66220, val loss: 0.66013, in 0.016s 1 tree, 48 leaves, max depth = 13, train loss: 0.64837, val loss: 0.64533, in 0.016s 1 tree, 48 leaves, max depth = 12, train loss: 0.63529, val loss: 0.63130, in 0.000s 1 tree, 47 leaves, max depth = 12, train loss: 0.62304, val loss: 0.61814, in 0.016s 1 tree, 48 leaves, max depth = 12, train loss: 0.61156, val loss: 0.60577, in 0.016s 1 tree, 48 leaves, max depth = 12, train loss: 0.60078, val loss: 0.59414, in 0.016s 1 tree, 49 leaves, max depth = 13, train loss: 0.59075, val loss: 0.58330, in 0.000s 1 tree, 47 leaves, max depth = 12, train loss: 0.58124, val loss: 0.57299, in 0.016s 1 tree, 48 leaves, max depth = 12, train loss: 0.57230, val loss: 0.56327, in 0.016s 1 tree, 48 leaves, max depth = 13, train loss: 0.56395, val loss: 0.55418, in 0.000s 1 tree, 48 leaves, max depth = 12, train loss: 0.55602, val loss: 0.54554, in 0.016s 1 tree, 84 leaves, max depth = 14, train loss: 0.54801, val loss: 0.53787, in 0.016s 1 tree, 48 leaves, max depth = 13, train loss: 0.54082, val loss: 0.53000, in 0.000s 1 tree, 48 leaves, max depth = 12, train loss: 0.53397, val loss: 0.52248, in 0.016s 1 tree, 84 leaves, max depth = 15, train loss: 0.52676, val loss: 0.51564, in 0.016s 1 tree, 84 
leaves, max depth = 16, train loss: 0.51996, val loss: 0.50921, in 0.016s 1 tree, 47 leaves, max depth = 13, train loss: 0.51390, val loss: 0.50254, in 0.016s 1 tree, 84 leaves, max depth = 16, train loss: 0.50760, val loss: 0.49656, in 0.000s 1 tree, 84 leaves, max depth = 16, train loss: 0.50165, val loss: 0.49093, in 0.016s 1 tree, 47 leaves, max depth = 13, train loss: 0.49621, val loss: 0.48492, in 0.016s 1 tree, 83 leaves, max depth = 13, train loss: 0.49069, val loss: 0.47971, in 0.016s 1 tree, 48 leaves, max depth = 11, train loss: 0.48559, val loss: 0.47405, in 0.016s 1 tree, 83 leaves, max depth = 17, train loss: 0.48046, val loss: 0.46923, in 0.016s 1 tree, 49 leaves, max depth = 13, train loss: 0.47579, val loss: 0.46402, in 0.000s 1 tree, 84 leaves, max depth = 17, train loss: 0.47100, val loss: 0.45954, in 0.016s 1 tree, 49 leaves, max depth = 13, train loss: 0.46668, val loss: 0.45471, in 0.016s 1 tree, 84 leaves, max depth = 16, train loss: 0.46221, val loss: 0.45054, in 0.016s 1 tree, 85 leaves, max depth = 16, train loss: 0.45799, val loss: 0.44660, in 0.016s 1 tree, 49 leaves, max depth = 13, train loss: 0.45405, val loss: 0.44218, in 0.016s 1 tree, 84 leaves, max depth = 15, train loss: 0.45010, val loss: 0.43852, in 0.016s 1 tree, 49 leaves, max depth = 13, train loss: 0.44644, val loss: 0.43440, in 0.016s 1 tree, 85 leaves, max depth = 13, train loss: 0.44273, val loss: 0.43100, in 0.000s 1 tree, 48 leaves, max depth = 13, train loss: 0.43935, val loss: 0.42719, in 0.016s 1 tree, 84 leaves, max depth = 12, train loss: 0.43588, val loss: 0.42400, in 0.016s 1 tree, 85 leaves, max depth = 11, train loss: 0.43258, val loss: 0.42100, in 0.016s 1 tree, 50 leaves, max depth = 13, train loss: 0.42946, val loss: 0.41744, in 0.016s 1 tree, 82 leaves, max depth = 14, train loss: 0.42636, val loss: 0.41464, in 0.016s 1 tree, 50 leaves, max depth = 13, train loss: 0.42345, val loss: 0.41131, in 0.016s 1 tree, 81 leaves, max depth = 14, train loss: 0.42053, 
val loss: 0.40866, in 0.016s 1 tree, 50 leaves, max depth = 13, train loss: 0.41781, val loss: 0.40554, in 0.000s 1 tree, 85 leaves, max depth = 13, train loss: 0.41507, val loss: 0.40309, in 0.016s 1 tree, 49 leaves, max depth = 13, train loss: 0.41253, val loss: 0.40015, in 0.016s 1 tree, 83 leaves, max depth = 14, train loss: 0.40994, val loss: 0.39785, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.40770, val loss: 0.39567, in 0.016s 1 tree, 82 leaves, max depth = 14, train loss: 0.40525, val loss: 0.39349, in 0.016s 1 tree, 50 leaves, max depth = 13, train loss: 0.40292, val loss: 0.39080, in 0.000s 1 tree, 3 leaves, max depth = 2, train loss: 0.40083, val loss: 0.38877, in 0.016s 1 tree, 82 leaves, max depth = 14, train loss: 0.39852, val loss: 0.38674, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.39655, val loss: 0.38483, in 0.000s 1 tree, 82 leaves, max depth = 13, train loss: 0.39436, val loss: 0.38290, in 0.031s 1 tree, 51 leaves, max depth = 12, train loss: 0.39224, val loss: 0.38044, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.39039, val loss: 0.37866, in 0.000s 1 tree, 82 leaves, max depth = 13, train loss: 0.38832, val loss: 0.37684, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.38665, val loss: 0.37518, in 0.000s 1 tree, 80 leaves, max depth = 14, train loss: 0.38468, val loss: 0.37348, in 0.016s 1 tree, 51 leaves, max depth = 11, train loss: 0.38272, val loss: 0.37119, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.38116, val loss: 0.36963, in 0.016s 1 tree, 82 leaves, max depth = 14, train loss: 0.37929, val loss: 0.36804, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.37780, val loss: 0.36656, in 0.000s 1 tree, 50 leaves, max depth = 11, train loss: 0.37598, val loss: 0.36443, in 0.016s 1 tree, 82 leaves, max depth = 14, train loss: 0.37420, val loss: 0.36292, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.37280, val loss: 0.36152, in 0.000s 1 tree, 50 leaves, max depth = 
11, train loss: 0.37106, val loss: 0.35952, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.36973, val loss: 0.35820, in 0.016s 1 tree, 79 leaves, max depth = 15, train loss: 0.36803, val loss: 0.35676, in 0.016s 1 tree, 51 leaves, max depth = 12, train loss: 0.36640, val loss: 0.35487, in 0.000s 1 tree, 2 leaves, max depth = 1, train loss: 0.36514, val loss: 0.35362, in 0.016s 1 tree, 80 leaves, max depth = 15, train loss: 0.36352, val loss: 0.35224, in 0.016s 1 tree, 50 leaves, max depth = 12, train loss: 0.36199, val loss: 0.35047, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.36080, val loss: 0.34929, in 0.000s 1 tree, 51 leaves, max depth = 10, train loss: 0.35937, val loss: 0.34760, in 0.016s 1 tree, 81 leaves, max depth = 15, train loss: 0.35780, val loss: 0.34630, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.35667, val loss: 0.34517, in 0.000s 1 tree, 50 leaves, max depth = 10, train loss: 0.35533, val loss: 0.34359, in 0.016s 1 tree, 81 leaves, max depth = 15, train loss: 0.35383, val loss: 0.34235, in 0.016s Fit 76 trees in 1.236 s, (4137 total leaves) Time spent computing histograms: 0.417s Time spent finding best splits: 0.088s Time spent applying splits: 0.089s Time spent predicting: 0.016s Trial 23, Fold 5: Log loss = 0.3607328935159004, Average precision = 0.946009279757922, ROC-AUC = 0.9442421269717407, Elapsed Time = 1.2442246000009618 seconds
Optimization Progress: 24%|##4 | 24/100 [04:47<15:47, 12.46s/it]
Trial 24 — 5-fold CV, 10 boosted trees per fold (~20 leaves/tree, max depth 7–9; binning ~0.040 GB train / ~0.004 GB validation per fold):
Fold 1: Train = 20663 (0 = 10533, 1 = 10130, 0/1 = 1.0398) | Val = 5175 (0 = 2592, 1 = 2583, 0/1 = 1.0035) | Log loss = 0.64062, Average precision = 0.81093, ROC-AUC = 0.86002, 0.33 s
Fold 2: Train = 20701 (0 = 10471, 1 = 10230, 0/1 = 1.0236) | Val = 5137 (0 = 2654, 1 = 2483, 0/1 = 1.0689) | Log loss = 0.64096, Average precision = 0.81982, ROC-AUC = 0.86718, 0.34 s
Fold 3: Train = 20682 (0 = 10517, 1 = 10165, 0/1 = 1.0346) | Val = 5156 (0 = 2608, 1 = 2548, 0/1 = 1.0235) | Log loss = 0.63962, Average precision = 0.82528, ROC-AUC = 0.87164, 0.33 s
Fold 4: Train = 20656 (0 = 10479, 1 = 10177, 0/1 = 1.0297) | Val = 5182 (0 = 2646, 1 = 2536, 0/1 = 1.0434) | Log loss = 0.64093, Average precision = 0.81858, ROC-AUC = 0.86488, 0.34 s
Fold 5: Train = 20650 (0 = 10500, 1 = 10150, 0/1 = 1.0345) | Val = 5188 (0 = 2625, 1 = 2563, 0/1 = 1.0242) | Log loss = 0.64242, Average precision = 0.80470, ROC-AUC = 0.85519, 0.35 s
(Per-round fitting logs trimmed: in every fold the validation loss fell monotonically from ≈0.687 to ≈0.638 over the 10 rounds.)
Optimization Progress: 25%|##5 | 25/100 [04:55<13:56, 11.15s/it]
Trial 25 — 5-fold CV, 29 boosted trees per fold (4–28 leaves/tree, max depth 3–13, 364–422 total leaves; fold splits identical to Trial 24):
Fold 1: Log loss = 0.43043, Average precision = 0.90421, ROC-AUC = 0.91044, 0.58 s
Fold 2: Log loss = 0.43183, Average precision = 0.89942, ROC-AUC = 0.91325, 0.55 s
Fold 3: Log loss = 0.42543, Average precision = 0.91092, ROC-AUC = 0.92088, 0.52 s
Fold 4: Log loss = 0.43170, Average precision = 0.90316, ROC-AUC = 0.91347, 0.52 s
Fold 5: Log loss = 0.43764, Average precision = 0.90007, ROC-AUC = 0.90883, 0.55 s
(Per-round fitting logs trimmed: validation loss fell from ≈0.668 to ≈0.410–0.425 over the 29 rounds in every fold.)
Optimization Progress: 26%|##6 | 26/100 [05:05<13:24, 10.88s/it]
Trial 26 — 5-fold CV, 20 boosted trees per fold (190 leaves/tree, 3800 total leaves, max depth 13–19; fold splits identical to Trial 24):
Fold 1: Log loss = 0.56911, Average precision = 0.92028, ROC-AUC = 0.93155, 0.80 s
Fold 2: Log loss = 0.56819, Average precision = 0.91777, ROC-AUC = 0.93225, 0.85 s
Fold 3: Log loss = 0.56598, Average precision = 0.92170, ROC-AUC = 0.93654, 0.86 s
Fold 4: Log loss = 0.56675, Average precision = 0.92414, ROC-AUC = 0.93510, 0.87 s
Fold 5: Log loss = 0.56854, Average precision = 0.91941, ROC-AUC = 0.93280, 0.86 s
(Per-round fitting logs trimmed: validation loss fell from ≈0.685 to ≈0.561–0.567 over the 20 rounds in every fold.)
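Each trial's per-fold validation metrics like those above are ultimately condensed into a single cross-validation score for Optuna to compare across trials. A minimal sketch using Trial 26's fold results; the mean/std aggregation here is illustrative and not necessarily the notebook's exact objective function:

```python
import statistics

# Validation metrics reported above for Trial 26, one value per CV fold.
fold_log_loss = [0.56911, 0.56819, 0.56598, 0.56675, 0.56854]
fold_roc_auc = [0.93155, 0.93225, 0.93654, 0.93510, 0.93280]

# A typical CV objective: minimize mean log loss / maximize mean ROC-AUC,
# with the std across folds serving as a stability check.
mean_ll, std_ll = statistics.mean(fold_log_loss), statistics.stdev(fold_log_loss)
mean_auc, std_auc = statistics.mean(fold_roc_auc), statistics.stdev(fold_roc_auc)

print(f"log loss: {mean_ll:.5f} +/- {std_ll:.5f}")
print(f"ROC-AUC:  {mean_auc:.5f} +/- {std_auc:.5f}")
```

The small fold-to-fold spread (std on the order of 1e-3) suggests the splits are well balanced, consistent with the near-1.0 class ratios logged per fold.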
Optimization Progress: 27%|##7 | 27/100 [05:16<13:20, 10.96s/it]
Trial 27, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 27, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Binning 0.040 GB of training data: 0.142 s
Binning 0.004 GB of validation data: 0.016 s
Fitting gradient boosted rounds: [per-round fit log elided; val loss 0.66498 → 0.28080]
Fit 48 trees in 1.018 s, (2200 total leaves)
Time spent computing histograms: 0.277s; finding best splits: 0.044s; applying splits: 0.041s; predicting: 0.016s
Trial 27, Fold 1: Log loss = 0.2806973281608503, Average precision = 0.9612056199159554, ROC-AUC = 0.9552867932779856, Elapsed Time = 1.0278072000000975 seconds
Trial 27, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 27, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Binning 0.040 GB of training data: 0.143 s
Binning 0.004 GB of validation data: 0.016 s
Fitting gradient boosted rounds: [per-round fit log elided; val loss 0.66419 → 0.27416]
Fit 48 trees in 1.096 s, (2224 total leaves)
Time spent computing histograms: 0.310s; finding best splits: 0.046s; applying splits: 0.044s; predicting: 0.000s
Trial 27, Fold 2: Log loss = 0.2768179417669449, Average precision = 0.9623801790052878, ROC-AUC = 0.9598748505663682, Elapsed Time = 1.1042353000011644 seconds
Trial 27, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 27, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Binning 0.040 GB of training data: 0.158 s
Binning 0.004 GB of validation data: 0.000 s
Fitting gradient boosted rounds: [per-round fit log elided; val loss 0.66509 → 0.28520]
Fit 48 trees in 1.142 s, (2236 total leaves)
Time spent computing histograms: 0.324s; finding best splits: 0.047s; applying splits: 0.045s; predicting: 0.000s
Trial 27, Fold 3: Log loss = 0.27394828088236733, Average precision = 0.9627584780797169, ROC-AUC = 0.9596843368069268, Elapsed Time = 1.136641499999314 seconds
Trial 27, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 27, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
Binning 0.040 GB of training data: 0.174 s
Binning 0.004 GB of validation data: 0.016 s
Fitting gradient boosted rounds: [per-round fit log elided; val loss 0.66404 → 0.25988]
Fit 48 trees in 1.143 s, (2224 total leaves)
Time spent computing histograms: 0.302s; finding best splits: 0.046s; applying splits: 0.044s; predicting: 0.000s
Trial 27, Fold 4: Log loss = 0.274923146793898, Average precision = 0.9643716512099001, ROC-AUC = 0.9600594224721084, Elapsed Time = 1.1375859999989189 seconds
Trial 27, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 27, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
Binning 0.040 GB of training data: 0.157 s
Binning 0.004 GB of validation data: 0.000 s
Fitting gradient boosted rounds: [per-round fit log elided; val loss 0.66373 → 0.26259]
Fit 48 trees in 1.110 s, (2258 total leaves)
Time spent computing histograms: 0.307s; finding best splits: 0.046s; applying splits: 0.043s; predicting: 0.000s
Trial 27, Fold 5: Log loss = 0.28149502590144165, Average precision = 0.9592432659407226, ROC-AUC = 0.9550485108596697, Elapsed Time = 1.121284700000615 seconds
Optimization Progress: 28%|##8 | 28/100 [05:28<13:31, 11.27s/it]
Trial 28, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 28, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Binning 0.040 GB of training data: 0.142 s
Binning 0.004 GB of validation data: 0.016 s
Fitting gradient boosted rounds: [per-round fit log elided; val loss 0.67336 → 0.28381]
Fit 64 trees in 1.283 s, (2240 total leaves)
Time spent computing histograms: 0.388s; finding best splits: 0.073s; applying splits: 0.044s; predicting: 0.000s
Trial 28, Fold 1: Log loss = 0.28464131570100243, Average precision = 0.9620374566234541, ROC-AUC = 0.9565089491834071, Elapsed Time = 1.2902052999997977 seconds
Trial 28, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 28, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Binning 0.040 GB of training data: 0.143 s
Binning 0.004 GB of validation data: 0.016 s
Fitting gradient boosted rounds: [per-round fit log elided; val loss 0.67290 → 0.28030]
Fit 64 trees in 1.346 s, (2240 total leaves)
Time spent computing histograms: 0.406s; finding best splits: 0.076s; applying splits: 0.047s; predicting: 0.000s
Trial 28, Fold 2: Log loss = 0.2808506028525122, Average precision = 0.9628806869606636, ROC-AUC = 0.959936611915054, Elapsed Time = 1.3568864999997459 seconds
Trial 28, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 28, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Binning 0.040 GB of training data: 0.173 s
Binning 0.004 GB of validation data: 0.000 s
Fitting gradient boosted rounds: [per-round fit log elided; val loss 0.67322 → 0.46871 before the log cuts off]
1 tree, 35 leaves, max depth = 8, train loss: 0.46114,
val loss: 0.46155, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.45418, val loss: 0.45461, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.44479, val loss: 0.44593, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.43843, val loss: 0.43961, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.43237, val loss: 0.43361, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.42387, val loss: 0.42581, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.41833, val loss: 0.42038, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.41314, val loss: 0.41534, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.40556, val loss: 0.40838, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.40102, val loss: 0.40397, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.39398, val loss: 0.39757, in 0.031s 1 tree, 35 leaves, max depth = 8, train loss: 0.38959, val loss: 0.39324, in 0.000s 1 tree, 35 leaves, max depth = 12, train loss: 0.38312, val loss: 0.38740, in 0.031s 1 tree, 35 leaves, max depth = 8, train loss: 0.37916, val loss: 0.38347, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.37314, val loss: 0.37806, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.36928, val loss: 0.37430, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.36558, val loss: 0.37072, in 0.016s 1 tree, 35 leaves, max depth = 14, train loss: 0.36040, val loss: 0.36602, in 0.016s 1 tree, 35 leaves, max depth = 14, train loss: 0.35554, val loss: 0.36158, in 0.016s 1 tree, 35 leaves, max depth = 14, train loss: 0.35094, val loss: 0.35742, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.34637, val loss: 0.35360, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.34272, val loss: 0.34984, in 0.016s 1 tree, 35 leaves, max depth = 14, train loss: 0.33865, val loss: 0.34613, in 0.016s 1 tree, 35 leaves, max depth = 7, train loss: 0.33525, val loss: 0.34265, in 0.016s 1 tree, 35 leaves, max 
depth = 11, train loss: 0.33123, val loss: 0.33931, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.32805, val loss: 0.33601, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.32502, val loss: 0.33286, in 0.000s 1 tree, 35 leaves, max depth = 8, train loss: 0.32212, val loss: 0.32984, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.31851, val loss: 0.32691, in 0.016s 1 tree, 35 leaves, max depth = 14, train loss: 0.31520, val loss: 0.32396, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.31189, val loss: 0.32129, in 0.016s 1 tree, 35 leaves, max depth = 13, train loss: 0.30888, val loss: 0.31861, in 0.031s 1 tree, 35 leaves, max depth = 10, train loss: 0.30586, val loss: 0.31613, in 0.016s 1 tree, 35 leaves, max depth = 13, train loss: 0.30311, val loss: 0.31366, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.30030, val loss: 0.31132, in 0.016s 1 tree, 35 leaves, max depth = 18, train loss: 0.29782, val loss: 0.30907, in 0.016s 1 tree, 35 leaves, max depth = 14, train loss: 0.29523, val loss: 0.30696, in 0.016s 1 tree, 35 leaves, max depth = 13, train loss: 0.29294, val loss: 0.30489, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.29063, val loss: 0.30252, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.28842, val loss: 0.30019, in 0.016s 1 tree, 35 leaves, max depth = 14, train loss: 0.28612, val loss: 0.29834, in 0.016s 1 tree, 35 leaves, max depth = 14, train loss: 0.28394, val loss: 0.29656, in 0.016s 1 tree, 35 leaves, max depth = 15, train loss: 0.28198, val loss: 0.29480, in 0.016s 1 tree, 35 leaves, max depth = 14, train loss: 0.27996, val loss: 0.29315, in 0.016s 1 tree, 35 leaves, max depth = 15, train loss: 0.27816, val loss: 0.29153, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.27609, val loss: 0.28921, in 0.016s Fit 64 trees in 1.408 s, (2240 total leaves) Time spent computing histograms: 0.420s Time spent finding best splits: 0.080s Time spent applying splits: 0.046s 
Time spent predicting: 0.000s Trial 28, Fold 3: Log loss = 0.2772330542065684, Average precision = 0.9641706060782602, ROC-AUC = 0.96079649562751, Elapsed Time = 1.4219327000009798 seconds Trial 28, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 28, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 Binning 0.040 GB of training data: 0.174 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 35 leaves, max depth = 9, train loss: 0.67344, val loss: 0.67260, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.65470, val loss: 0.65308, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.63721, val loss: 0.63482, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.62083, val loss: 0.61772, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.60547, val loss: 0.60166, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.59105, val loss: 0.58656, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.57827, val loss: 0.57307, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.56518, val loss: 0.55936, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.55289, val loss: 0.54650, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.54198, val loss: 0.53512, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.53099, val loss: 0.52358, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.52104, val loss: 0.51329, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.51112, val loss: 0.50285, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.50178, val loss: 0.49304, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.49299, val loss: 0.48372, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.48455, val loss: 0.47481, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.47658, val loss: 0.46648, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.46896, val loss: 0.45847, in 0.016s 1 
tree, 35 leaves, max depth = 8, train loss: 0.46185, val loss: 0.45095, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.45498, val loss: 0.44370, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.44575, val loss: 0.43430, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.43946, val loss: 0.42765, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.43348, val loss: 0.42132, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.42512, val loss: 0.41287, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.41965, val loss: 0.40707, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.41457, val loss: 0.40162, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.40962, val loss: 0.39639, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.40227, val loss: 0.38904, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.39770, val loss: 0.38417, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.39105, val loss: 0.37742, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.38459, val loss: 0.37089, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.38056, val loss: 0.36662, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.37459, val loss: 0.36057, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.37104, val loss: 0.35676, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.36743, val loss: 0.35294, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.36205, val loss: 0.34758, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.35725, val loss: 0.34257, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.35270, val loss: 0.33784, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.34900, val loss: 0.33410, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.34479, val loss: 0.32973, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.34133, val loss: 0.32624, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 
0.33731, val loss: 0.32252, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.33357, val loss: 0.31862, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.33040, val loss: 0.31543, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.32737, val loss: 0.31237, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.32447, val loss: 0.30944, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.32101, val loss: 0.30625, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.31779, val loss: 0.30292, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.31463, val loss: 0.30003, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.31170, val loss: 0.29695, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.30878, val loss: 0.29425, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.30610, val loss: 0.29146, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.30342, val loss: 0.28900, in 0.016s 1 tree, 35 leaves, max depth = 13, train loss: 0.30096, val loss: 0.28642, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.29844, val loss: 0.28373, in 0.016s 1 tree, 35 leaves, max depth = 16, train loss: 0.29603, val loss: 0.28147, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.29368, val loss: 0.27894, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.29143, val loss: 0.27652, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.28923, val loss: 0.27448, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.28718, val loss: 0.27241, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.28516, val loss: 0.27032, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.28317, val loss: 0.26852, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.28131, val loss: 0.26657, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.27948, val loss: 0.26491, in 0.016s Fit 64 trees in 1.361 s, (2240 total leaves) Time spent computing histograms: 0.400s Time spent 
finding best splits: 0.075s Time spent applying splits: 0.045s Time spent predicting: 0.000s Trial 28, Fold 4: Log loss = 0.28037042522834454, Average precision = 0.9633028006312216, ROC-AUC = 0.959027792680339, Elapsed Time = 1.3770490999995673 seconds Trial 28, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 28, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.174 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 35 leaves, max depth = 9, train loss: 0.67321, val loss: 0.67246, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.65418, val loss: 0.65279, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.63649, val loss: 0.63452, in 0.031s 1 tree, 35 leaves, max depth = 9, train loss: 0.61979, val loss: 0.61728, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.60436, val loss: 0.60127, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.59016, val loss: 0.58647, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.57704, val loss: 0.57269, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.56379, val loss: 0.55892, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.55130, val loss: 0.54598, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.54029, val loss: 0.53455, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.52912, val loss: 0.52297, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.51929, val loss: 0.51277, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.50935, val loss: 0.50248, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.49985, val loss: 0.49262, in 0.031s 1 tree, 35 leaves, max depth = 8, train loss: 0.49115, val loss: 0.48341, in 0.000s 1 tree, 35 leaves, max depth = 9, train loss: 0.48261, val loss: 0.47454, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.47451, val loss: 0.46617, in 0.016s 1 tree, 35 leaves, 
max depth = 8, train loss: 0.46682, val loss: 0.45823, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.45966, val loss: 0.45079, in 0.031s 1 tree, 35 leaves, max depth = 8, train loss: 0.45272, val loss: 0.44362, in 0.000s 1 tree, 35 leaves, max depth = 8, train loss: 0.44343, val loss: 0.43433, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.43708, val loss: 0.42779, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.43104, val loss: 0.42157, in 0.031s 1 tree, 35 leaves, max depth = 11, train loss: 0.42260, val loss: 0.41323, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.41706, val loss: 0.40753, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.41194, val loss: 0.40224, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.40694, val loss: 0.39717, in 0.016s 1 tree, 35 leaves, max depth = 14, train loss: 0.39956, val loss: 0.38992, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.39492, val loss: 0.38517, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.38826, val loss: 0.37860, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.38404, val loss: 0.37429, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.37795, val loss: 0.36815, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.37403, val loss: 0.36422, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.36833, val loss: 0.35861, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.36466, val loss: 0.35495, in 0.016s 1 tree, 35 leaves, max depth = 13, train loss: 0.35925, val loss: 0.34973, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.35406, val loss: 0.34471, in 0.031s 1 tree, 35 leaves, max depth = 11, train loss: 0.34920, val loss: 0.33996, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.34601, val loss: 0.33673, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.34178, val loss: 0.33236, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.33825, val loss: 0.32891, 
in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.33415, val loss: 0.32520, in 0.016s 1 tree, 35 leaves, max depth = 13, train loss: 0.33034, val loss: 0.32137, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.32711, val loss: 0.31823, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.32398, val loss: 0.31516, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.32027, val loss: 0.31158, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.31679, val loss: 0.30846, in 0.016s 1 tree, 35 leaves, max depth = 13, train loss: 0.31363, val loss: 0.30530, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.31045, val loss: 0.30245, in 0.016s 1 tree, 35 leaves, max depth = 13, train loss: 0.30756, val loss: 0.29956, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.30477, val loss: 0.29668, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.30190, val loss: 0.29414, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.29920, val loss: 0.29170, in 0.016s 1 tree, 35 leaves, max depth = 16, train loss: 0.29671, val loss: 0.28919, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.29422, val loss: 0.28685, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.29176, val loss: 0.28466, in 0.000s 1 tree, 35 leaves, max depth = 11, train loss: 0.28952, val loss: 0.28240, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.28716, val loss: 0.27998, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.28520, val loss: 0.27822, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.28301, val loss: 0.27628, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.28104, val loss: 0.27423, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.27902, val loss: 0.27249, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.27697, val loss: 0.27057, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.27509, val loss: 0.26888, in 0.016s Fit 64 trees in 1.424 s, (2240 total 
leaves) Time spent computing histograms: 0.441s Time spent finding best splits: 0.080s Time spent applying splits: 0.047s Time spent predicting: 0.000s Trial 28, Fold 5: Log loss = 0.2856709964198882, Average precision = 0.9600743059726313, ROC-AUC = 0.9556647529866413, Elapsed Time = 1.4378190000006725 seconds
Optimization Progress: 29%|##9 | 29/100 [05:42<14:05, 11.91s/it]
[verbose per-tree fitting log omitted; each fold fit 40 trees in ~0.5-0.6 s]
Trial 29, Fold 1: Train size = 20663 (0 = 10533, 1 = 10130), Validation size = 5175 (0 = 2592, 1 = 2583)
Trial 29, Fold 1: Log loss = 0.55832, Average precision = 0.81761, ROC-AUC = 0.86188, Elapsed Time = 0.528 s
Trial 29, Fold 2: Train size = 20701 (0 = 10471, 1 = 10230), Validation size = 5137 (0 = 2654, 1 = 2483)
Trial 29, Fold 2: Log loss = 0.55838, Average precision = 0.87172, ROC-AUC = 0.89980, Elapsed Time = 0.549 s
Trial 29, Fold 3: Train size = 20682 (0 = 10517, 1 = 10165), Validation size = 5156 (0 = 2608, 1 = 2548)
Trial 29, Fold 3: Log loss = 0.55565, Average precision = 0.87306, ROC-AUC = 0.90110, Elapsed Time = 0.614 s
Trial 29, Fold 4: Train size = 20656 (0 = 10479, 1 = 10177), Validation size = 5182 (0 = 2646, 1 = 2536)
0.61348, val loss: 0.60842, in 0.016s 1 tree, 13 leaves, max depth = 6, train loss: 0.61025, val loss: 0.60496, in 0.000s 1 tree, 13 leaves, max depth = 6, train loss: 0.60707, val loss: 0.60155, in 0.016s 1 tree, 16 leaves, max depth = 8, train loss: 0.60396, val loss: 0.59821, in 0.000s 1 tree, 13 leaves, max depth = 6, train loss: 0.60090, val loss: 0.59493, in 0.016s 1 tree, 19 leaves, max depth = 10, train loss: 0.59792, val loss: 0.59175, in 0.000s 1 tree, 14 leaves, max depth = 6, train loss: 0.59498, val loss: 0.58858, in 0.016s 1 tree, 19 leaves, max depth = 10, train loss: 0.59211, val loss: 0.58551, in 0.000s 1 tree, 25 leaves, max depth = 14, train loss: 0.58930, val loss: 0.58249, in 0.016s 1 tree, 14 leaves, max depth = 6, train loss: 0.58651, val loss: 0.57949, in 0.000s 1 tree, 17 leaves, max depth = 8, train loss: 0.58378, val loss: 0.57654, in 0.016s 1 tree, 14 leaves, max depth = 6, train loss: 0.58109, val loss: 0.57364, in 0.000s 1 tree, 14 leaves, max depth = 6, train loss: 0.57845, val loss: 0.57078, in 0.000s 1 tree, 20 leaves, max depth = 8, train loss: 0.57586, val loss: 0.56798, in 0.016s 1 tree, 27 leaves, max depth = 15, train loss: 0.57335, val loss: 0.56527, in 0.000s 1 tree, 14 leaves, max depth = 6, train loss: 0.57085, val loss: 0.56256, in 0.016s 1 tree, 14 leaves, max depth = 6, train loss: 0.56840, val loss: 0.55990, in 0.000s 1 tree, 14 leaves, max depth = 6, train loss: 0.56599, val loss: 0.55729, in 0.016s 1 tree, 14 leaves, max depth = 6, train loss: 0.56362, val loss: 0.55472, in 0.000s 1 tree, 19 leaves, max depth = 10, train loss: 0.56131, val loss: 0.55223, in 0.016s 1 tree, 19 leaves, max depth = 10, train loss: 0.55904, val loss: 0.54978, in 0.000s Fit 40 trees in 0.579 s, (608 total leaves) Time spent computing histograms: 0.166s Time spent finding best splits: 0.019s Time spent applying splits: 0.014s Time spent predicting: 0.000s Trial 29, Fold 4: Log loss = 0.557560007433454, Average precision = 0.8233037899458493, 
ROC-AUC = 0.8688659568278767, Elapsed Time = 0.5799826999991637 seconds Trial 29, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 29, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.143 s 0.016 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 14 leaves, max depth = 6, train loss: 0.68817, val loss: 0.68786, in 0.000s 1 tree, 14 leaves, max depth = 6, train loss: 0.68343, val loss: 0.68281, in 0.016s 1 tree, 14 leaves, max depth = 6, train loss: 0.67879, val loss: 0.67787, in 0.000s 1 tree, 17 leaves, max depth = 8, train loss: 0.67429, val loss: 0.67308, in 0.000s 1 tree, 15 leaves, max depth = 7, train loss: 0.66983, val loss: 0.66832, in 0.016s 1 tree, 15 leaves, max depth = 7, train loss: 0.66546, val loss: 0.66366, in 0.000s 1 tree, 15 leaves, max depth = 7, train loss: 0.66118, val loss: 0.65909, in 0.016s 1 tree, 15 leaves, max depth = 7, train loss: 0.65698, val loss: 0.65460, in 0.000s 1 tree, 15 leaves, max depth = 7, train loss: 0.65287, val loss: 0.65020, in 0.000s 1 tree, 15 leaves, max depth = 7, train loss: 0.64883, val loss: 0.64589, in 0.016s 1 tree, 15 leaves, max depth = 7, train loss: 0.64487, val loss: 0.64165, in 0.000s 1 tree, 15 leaves, max depth = 7, train loss: 0.64099, val loss: 0.63749, in 0.016s 1 tree, 18 leaves, max depth = 9, train loss: 0.63723, val loss: 0.63347, in 0.000s 1 tree, 15 leaves, max depth = 7, train loss: 0.63349, val loss: 0.62947, in 0.016s 1 tree, 15 leaves, max depth = 7, train loss: 0.62983, val loss: 0.62554, in 0.000s 1 tree, 23 leaves, max depth = 13, train loss: 0.62627, val loss: 0.62173, in 0.016s 1 tree, 15 leaves, max depth = 7, train loss: 0.62275, val loss: 0.61794, in 0.000s 1 tree, 15 leaves, max depth = 7, train loss: 0.61929, val loss: 0.61422, in 0.016s 1 tree, 24 leaves, max depth = 9, train loss: 0.61593, val loss: 0.61062, in 0.000s 1 tree, 15 leaves, max depth = 7, 
train loss: 0.61260, val loss: 0.60703, in 0.016s 1 tree, 15 leaves, max depth = 7, train loss: 0.60933, val loss: 0.60351, in 0.000s 1 tree, 15 leaves, max depth = 7, train loss: 0.60612, val loss: 0.60006, in 0.016s 1 tree, 15 leaves, max depth = 7, train loss: 0.60297, val loss: 0.59666, in 0.000s 1 tree, 15 leaves, max depth = 7, train loss: 0.59988, val loss: 0.59333, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.59688, val loss: 0.59010, in 0.000s 1 tree, 15 leaves, max depth = 7, train loss: 0.59390, val loss: 0.58688, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.59101, val loss: 0.58376, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.58817, val loss: 0.58070, in 0.016s 1 tree, 15 leaves, max depth = 7, train loss: 0.58535, val loss: 0.57764, in 0.000s 1 tree, 15 leaves, max depth = 7, train loss: 0.58258, val loss: 0.57464, in 0.000s 1 tree, 15 leaves, max depth = 7, train loss: 0.57986, val loss: 0.57170, in 0.016s 1 tree, 15 leaves, max depth = 7, train loss: 0.57719, val loss: 0.56880, in 0.000s 1 tree, 15 leaves, max depth = 7, train loss: 0.57457, val loss: 0.56595, in 0.016s 1 tree, 27 leaves, max depth = 12, train loss: 0.57203, val loss: 0.56320, in 0.000s 1 tree, 16 leaves, max depth = 7, train loss: 0.56950, val loss: 0.56045, in 0.016s 1 tree, 16 leaves, max depth = 7, train loss: 0.56702, val loss: 0.55775, in 0.000s 1 tree, 16 leaves, max depth = 7, train loss: 0.56458, val loss: 0.55509, in 0.000s 1 tree, 16 leaves, max depth = 7, train loss: 0.56218, val loss: 0.55248, in 0.016s 1 tree, 26 leaves, max depth = 12, train loss: 0.55986, val loss: 0.54996, in 0.000s 1 tree, 27 leaves, max depth = 12, train loss: 0.55757, val loss: 0.54748, in 0.016s Fit 40 trees in 0.565 s, (688 total leaves) Time spent computing histograms: 0.165s Time spent finding best splits: 0.021s Time spent applying splits: 0.015s Time spent predicting: 0.000s Trial 29, Fold 5: Log loss = 0.5614857329703491, Average precision = 
0.816707956377658, ROC-AUC = 0.8589935157832154, Elapsed Time = 0.5744513999998162 seconds
Optimization Progress: 30%|### | 30/100 [05:51<12:56, 11.09s/it]
Trial 30, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 30, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Binning 0.040 GB of training data: 0.155 s; 0.004 GB of validation data: 0.003 s
[per-round boosting log omitted: 100 rounds, train loss 0.68638 → 0.42074, val loss 0.68620 → 0.41770]
Fit 100 trees in 1.485 s (9287 total leaves). Time spent computing histograms: 0.510 s; finding best splits: 0.125 s; applying splits: 0.152 s; predicting: 0.000 s
Trial 30, Fold 1: Log loss = 0.4232796007217628, Average precision = 0.9459918060041705, ROC-AUC = 0.9419506190762966, Elapsed Time = 1.4879868999996688 seconds

Trial 30, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 30, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Binning 0.040 GB of training data: 0.158 s; 0.004 GB of validation data: 0.016 s
[per-round boosting log omitted: 100 rounds, train loss 0.68647 → 0.42178, val loss 0.68619 → 0.41294]
Fit 100 trees in 1.534 s (9315 total leaves). Time spent computing histograms: 0.519 s; finding best splits: 0.128 s; applying splits: 0.154 s; predicting: 0.016 s
Trial 30, Fold 2: Log loss = 0.42380552050116543, Average precision = 0.9431906991089813, ROC-AUC = 0.9425675603903075, Elapsed Time = 1.5441061000001355 seconds

Trial 30, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 30, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Binning 0.040 GB of training data: 0.159 s; 0.004 GB of validation data: 0.016 s
[per-round boosting log omitted: train loss 0.68646 → 0.43787, val loss 0.68625 → 0.43490; output truncated before the fold summary]
0.016s 1 tree, 125 leaves, max depth = 15, train loss: 0.43632, val loss: 0.43350, in 0.016s 1 tree, 84 leaves, max depth = 14, train loss: 0.43517, val loss: 0.43227, in 0.016s 1 tree, 84 leaves, max depth = 14, train loss: 0.43404, val loss: 0.43107, in 0.016s 1 tree, 125 leaves, max depth = 15, train loss: 0.43253, val loss: 0.42971, in 0.016s 1 tree, 84 leaves, max depth = 14, train loss: 0.43143, val loss: 0.42854, in 0.016s 1 tree, 125 leaves, max depth = 15, train loss: 0.42996, val loss: 0.42722, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.42894, val loss: 0.42627, in 0.016s 1 tree, 83 leaves, max depth = 16, train loss: 0.42789, val loss: 0.42515, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.42690, val loss: 0.42423, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.42592, val loss: 0.42333, in 0.000s 1 tree, 83 leaves, max depth = 16, train loss: 0.42491, val loss: 0.42225, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.42397, val loss: 0.42138, in 0.016s 1 tree, 124 leaves, max depth = 15, train loss: 0.42254, val loss: 0.42010, in 0.016s 1 tree, 83 leaves, max depth = 16, train loss: 0.42157, val loss: 0.41906, in 0.000s Fit 100 trees in 1.674 s, (9225 total leaves) Time spent computing histograms: 0.570s Time spent finding best splits: 0.144s Time spent applying splits: 0.172s Time spent predicting: 0.031s Trial 30, Fold 3: Log loss = 0.4193375609257467, Average precision = 0.9465947262409895, ROC-AUC = 0.945489996966224, Elapsed Time = 1.6884934999998222 seconds Trial 30, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 30, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 Binning 0.040 GB of training data: 0.173 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 79 leaves, max depth = 13, train loss: 0.68647, val loss: 0.68614, in 0.016s 1 tree, 81 leaves, max depth = 19, train loss: 0.68000, val loss: 0.67930, 
in 0.016s 1 tree, 81 leaves, max depth = 19, train loss: 0.67370, val loss: 0.67264, in 0.000s 1 tree, 81 leaves, max depth = 16, train loss: 0.66762, val loss: 0.66623, in 0.016s 1 tree, 84 leaves, max depth = 14, train loss: 0.66171, val loss: 0.66000, in 0.016s 1 tree, 84 leaves, max depth = 14, train loss: 0.65597, val loss: 0.65394, in 0.000s 1 tree, 82 leaves, max depth = 13, train loss: 0.65042, val loss: 0.64808, in 0.016s 1 tree, 80 leaves, max depth = 14, train loss: 0.64503, val loss: 0.64239, in 0.016s 1 tree, 84 leaves, max depth = 14, train loss: 0.63973, val loss: 0.63678, in 0.000s 1 tree, 84 leaves, max depth = 20, train loss: 0.63453, val loss: 0.63125, in 0.016s 1 tree, 82 leaves, max depth = 19, train loss: 0.62953, val loss: 0.62593, in 0.016s 1 tree, 81 leaves, max depth = 14, train loss: 0.62464, val loss: 0.62074, in 0.016s 1 tree, 84 leaves, max depth = 14, train loss: 0.61988, val loss: 0.61569, in 0.016s 1 tree, 84 leaves, max depth = 20, train loss: 0.61519, val loss: 0.61069, in 0.000s 1 tree, 84 leaves, max depth = 14, train loss: 0.61068, val loss: 0.60590, in 0.016s 1 tree, 82 leaves, max depth = 14, train loss: 0.60632, val loss: 0.60125, in 0.016s 1 tree, 81 leaves, max depth = 14, train loss: 0.60202, val loss: 0.59668, in 0.016s 1 tree, 81 leaves, max depth = 14, train loss: 0.59784, val loss: 0.59223, in 0.000s 1 tree, 80 leaves, max depth = 13, train loss: 0.59380, val loss: 0.58793, in 0.016s 1 tree, 83 leaves, max depth = 20, train loss: 0.58979, val loss: 0.58363, in 0.016s 1 tree, 83 leaves, max depth = 20, train loss: 0.58587, val loss: 0.57943, in 0.000s 1 tree, 83 leaves, max depth = 20, train loss: 0.58206, val loss: 0.57533, in 0.016s 1 tree, 83 leaves, max depth = 20, train loss: 0.57834, val loss: 0.57133, in 0.016s 1 tree, 83 leaves, max depth = 20, train loss: 0.57472, val loss: 0.56742, in 0.000s 1 tree, 81 leaves, max depth = 14, train loss: 0.57122, val loss: 0.56368, in 0.016s 1 tree, 81 leaves, max depth = 14, 
train loss: 0.56781, val loss: 0.56002, in 0.016s 1 tree, 83 leaves, max depth = 19, train loss: 0.56449, val loss: 0.55644, in 0.016s 1 tree, 83 leaves, max depth = 19, train loss: 0.56126, val loss: 0.55294, in 0.000s 1 tree, 128 leaves, max depth = 13, train loss: 0.55781, val loss: 0.54959, in 0.031s 1 tree, 81 leaves, max depth = 14, train loss: 0.55468, val loss: 0.54622, in 0.016s 1 tree, 80 leaves, max depth = 13, train loss: 0.55164, val loss: 0.54296, in 0.000s 1 tree, 83 leaves, max depth = 17, train loss: 0.54868, val loss: 0.53974, in 0.016s 1 tree, 84 leaves, max depth = 17, train loss: 0.54573, val loss: 0.53654, in 0.016s 1 tree, 128 leaves, max depth = 14, train loss: 0.54249, val loss: 0.53340, in 0.016s 1 tree, 84 leaves, max depth = 20, train loss: 0.53966, val loss: 0.53031, in 0.016s 1 tree, 81 leaves, max depth = 14, train loss: 0.53692, val loss: 0.52736, in 0.016s 1 tree, 84 leaves, max depth = 20, train loss: 0.53422, val loss: 0.52441, in 0.000s 1 tree, 128 leaves, max depth = 14, train loss: 0.53114, val loss: 0.52142, in 0.016s 1 tree, 81 leaves, max depth = 15, train loss: 0.52856, val loss: 0.51863, in 0.016s 1 tree, 83 leaves, max depth = 17, train loss: 0.52605, val loss: 0.51589, in 0.016s 1 tree, 81 leaves, max depth = 14, train loss: 0.52360, val loss: 0.51323, in 0.000s [42/100] 1 tree, 128 leaves, max depth = 16, train loss: 0.52066, val loss: 0.51038, in 0.016s 1 tree, 81 leaves, max depth = 15, train loss: 0.51829, val loss: 0.50781, in 0.016s 1 tree, 128 leaves, max depth = 16, train loss: 0.51544, val loss: 0.50505, in 0.016s 1 tree, 83 leaves, max depth = 17, train loss: 0.51316, val loss: 0.50254, in 0.016s 1 tree, 128 leaves, max depth = 16, train loss: 0.51040, val loss: 0.49987, in 0.016s 1 tree, 128 leaves, max depth = 16, train loss: 0.50770, val loss: 0.49726, in 0.016s 1 tree, 84 leaves, max depth = 20, train loss: 0.50549, val loss: 0.49483, in 0.000s 1 tree, 128 leaves, max depth = 16, train loss: 0.50288, val 
loss: 0.49230, in 0.031s 1 tree, 128 leaves, max depth = 16, train loss: 0.50032, val loss: 0.48983, in 0.016s 1 tree, 83 leaves, max depth = 17, train loss: 0.49824, val loss: 0.48754, in 0.000s 1 tree, 128 leaves, max depth = 16, train loss: 0.49576, val loss: 0.48514, in 0.016s 1 tree, 128 leaves, max depth = 16, train loss: 0.49334, val loss: 0.48281, in 0.016s 1 tree, 84 leaves, max depth = 20, train loss: 0.49132, val loss: 0.48058, in 0.000s 1 tree, 128 leaves, max depth = 14, train loss: 0.48898, val loss: 0.47831, in 0.016s 1 tree, 128 leaves, max depth = 14, train loss: 0.48668, val loss: 0.47610, in 0.016s 1 tree, 128 leaves, max depth = 14, train loss: 0.48445, val loss: 0.47393, in 0.016s 1 tree, 82 leaves, max depth = 18, train loss: 0.48255, val loss: 0.47184, in 0.016s 1 tree, 128 leaves, max depth = 15, train loss: 0.48038, val loss: 0.46974, in 0.016s 1 tree, 80 leaves, max depth = 13, train loss: 0.47857, val loss: 0.46776, in 0.016s 1 tree, 83 leaves, max depth = 18, train loss: 0.47678, val loss: 0.46578, in 0.000s 1 tree, 128 leaves, max depth = 15, train loss: 0.47467, val loss: 0.46375, in 0.016s 1 tree, 128 leaves, max depth = 15, train loss: 0.47262, val loss: 0.46177, in 0.031s 1 tree, 83 leaves, max depth = 12, train loss: 0.47085, val loss: 0.45984, in 0.000s 1 tree, 128 leaves, max depth = 17, train loss: 0.46886, val loss: 0.45791, in 0.016s 1 tree, 81 leaves, max depth = 17, train loss: 0.46720, val loss: 0.45609, in 0.016s 1 tree, 81 leaves, max depth = 17, train loss: 0.46558, val loss: 0.45430, in 0.016s 1 tree, 83 leaves, max depth = 12, train loss: 0.46394, val loss: 0.45250, in 0.016s 1 tree, 128 leaves, max depth = 16, train loss: 0.46202, val loss: 0.45066, in 0.016s 1 tree, 128 leaves, max depth = 16, train loss: 0.46015, val loss: 0.44886, in 0.016s 1 tree, 128 leaves, max depth = 16, train loss: 0.45832, val loss: 0.44709, in 0.016s 1 tree, 80 leaves, max depth = 16, train loss: 0.45682, val loss: 0.44543, in 0.016s 1 
tree, 83 leaves, max depth = 13, train loss: 0.45529, val loss: 0.44374, in 0.016s 1 tree, 81 leaves, max depth = 16, train loss: 0.45386, val loss: 0.44214, in 0.016s 1 tree, 80 leaves, max depth = 16, train loss: 0.45246, val loss: 0.44058, in 0.031s 1 tree, 83 leaves, max depth = 15, train loss: 0.45108, val loss: 0.43904, in 0.031s 1 tree, 84 leaves, max depth = 17, train loss: 0.44971, val loss: 0.43750, in 0.016s 1 tree, 83 leaves, max depth = 12, train loss: 0.44834, val loss: 0.43599, in 0.031s 1 tree, 81 leaves, max depth = 16, train loss: 0.44707, val loss: 0.43457, in 0.016s 1 tree, 84 leaves, max depth = 17, train loss: 0.44579, val loss: 0.43312, in 0.016s 1 tree, 84 leaves, max depth = 17, train loss: 0.44454, val loss: 0.43171, in 0.016s 1 tree, 128 leaves, max depth = 17, train loss: 0.44283, val loss: 0.43007, in 0.031s 1 tree, 84 leaves, max depth = 17, train loss: 0.44162, val loss: 0.42870, in 0.016s 1 tree, 128 leaves, max depth = 17, train loss: 0.43995, val loss: 0.42710, in 0.016s 1 tree, 128 leaves, max depth = 17, train loss: 0.43832, val loss: 0.42554, in 0.016s 1 tree, 84 leaves, max depth = 17, train loss: 0.43716, val loss: 0.42422, in 0.016s 1 tree, 128 leaves, max depth = 17, train loss: 0.43556, val loss: 0.42270, in 0.031s 1 tree, 84 leaves, max depth = 17, train loss: 0.43444, val loss: 0.42142, in 0.016s 1 tree, 84 leaves, max depth = 17, train loss: 0.43334, val loss: 0.42017, in 0.016s 1 tree, 128 leaves, max depth = 16, train loss: 0.43179, val loss: 0.41869, in 0.016s 1 tree, 84 leaves, max depth = 17, train loss: 0.43073, val loss: 0.41748, in 0.016s 1 tree, 128 leaves, max depth = 17, train loss: 0.42922, val loss: 0.41604, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.42823, val loss: 0.41498, in 0.000s 1 tree, 83 leaves, max depth = 15, train loss: 0.42721, val loss: 0.41382, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.42625, val loss: 0.41279, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 
0.42531, val loss: 0.41178, in 0.000s 1 tree, 84 leaves, max depth = 17, train loss: 0.42432, val loss: 0.41065, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.42340, val loss: 0.40967, in 0.016s 1 tree, 128 leaves, max depth = 16, train loss: 0.42194, val loss: 0.40828, in 0.016s 1 tree, 84 leaves, max depth = 17, train loss: 0.42098, val loss: 0.40718, in 0.000s Fit 100 trees in 1.720 s, (9210 total leaves) Time spent computing histograms: 0.582s Time spent finding best splits: 0.158s Time spent applying splits: 0.190s Time spent predicting: 0.031s Trial 30, Fold 4: Log loss = 0.4216082511273579, Average precision = 0.9466523855487519, ROC-AUC = 0.9441516091189367, Elapsed Time = 1.738201800000752 seconds Trial 30, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 30, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.173 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 83 leaves, max depth = 14, train loss: 0.68635, val loss: 0.68594, in 0.016s 1 tree, 80 leaves, max depth = 16, train loss: 0.67980, val loss: 0.67897, in 0.016s 1 tree, 80 leaves, max depth = 16, train loss: 0.67344, val loss: 0.67218, in 0.016s 1 tree, 81 leaves, max depth = 18, train loss: 0.66733, val loss: 0.66569, in 0.000s 1 tree, 83 leaves, max depth = 15, train loss: 0.66137, val loss: 0.65935, in 0.016s 1 tree, 83 leaves, max depth = 15, train loss: 0.65558, val loss: 0.65317, in 0.016s 1 tree, 83 leaves, max depth = 15, train loss: 0.64995, val loss: 0.64719, in 0.000s 1 tree, 83 leaves, max depth = 13, train loss: 0.64448, val loss: 0.64137, in 0.016s 1 tree, 7 leaves, max depth = 5, train loss: 0.63913, val loss: 0.63565, in 0.016s 1 tree, 84 leaves, max depth = 16, train loss: 0.63387, val loss: 0.63001, in 0.000s 1 tree, 83 leaves, max depth = 15, train loss: 0.62881, val loss: 0.62459, in 0.016s 1 tree, 84 leaves, max depth = 14, train 
loss: 0.62387, val loss: 0.61928, in 0.016s 1 tree, 83 leaves, max depth = 16, train loss: 0.61908, val loss: 0.61417, in 0.000s 1 tree, 84 leaves, max depth = 16, train loss: 0.61434, val loss: 0.60908, in 0.000s 1 tree, 83 leaves, max depth = 16, train loss: 0.60979, val loss: 0.60422, in 0.016s 1 tree, 83 leaves, max depth = 16, train loss: 0.60536, val loss: 0.59949, in 0.000s 1 tree, 84 leaves, max depth = 14, train loss: 0.60103, val loss: 0.59480, in 0.016s 1 tree, 84 leaves, max depth = 14, train loss: 0.59681, val loss: 0.59024, in 0.016s 1 tree, 82 leaves, max depth = 15, train loss: 0.59271, val loss: 0.58584, in 0.016s 1 tree, 84 leaves, max depth = 15, train loss: 0.58865, val loss: 0.58145, in 0.000s 1 tree, 84 leaves, max depth = 15, train loss: 0.58468, val loss: 0.57716, in 0.016s 1 tree, 84 leaves, max depth = 15, train loss: 0.58082, val loss: 0.57298, in 0.016s 1 tree, 84 leaves, max depth = 15, train loss: 0.57706, val loss: 0.56890, in 0.016s 1 tree, 84 leaves, max depth = 15, train loss: 0.57339, val loss: 0.56492, in 0.016s 1 tree, 83 leaves, max depth = 15, train loss: 0.56986, val loss: 0.56110, in 0.000s 1 tree, 84 leaves, max depth = 14, train loss: 0.56641, val loss: 0.55735, in 0.000s 1 tree, 83 leaves, max depth = 15, train loss: 0.56305, val loss: 0.55371, in 0.016s 1 tree, 83 leaves, max depth = 15, train loss: 0.55977, val loss: 0.55015, in 0.016s 1 tree, 83 leaves, max depth = 16, train loss: 0.55658, val loss: 0.54671, in 0.000s 1 tree, 128 leaves, max depth = 20, train loss: 0.55323, val loss: 0.54354, in 0.016s 1 tree, 83 leaves, max depth = 14, train loss: 0.55016, val loss: 0.54019, in 0.000s 1 tree, 83 leaves, max depth = 17, train loss: 0.54715, val loss: 0.53692, in 0.016s 1 tree, 84 leaves, max depth = 15, train loss: 0.54417, val loss: 0.53365, in 0.016s 1 tree, 128 leaves, max depth = 20, train loss: 0.54098, val loss: 0.53066, in 0.016s 1 tree, 84 leaves, max depth = 15, train loss: 0.53810, val loss: 0.52750, in 
0.016s 1 tree, 84 leaves, max depth = 14, train loss: 0.53533, val loss: 0.52447, in 0.016s 1 tree, 84 leaves, max depth = 15, train loss: 0.53259, val loss: 0.52146, in 0.000s 1 tree, 128 leaves, max depth = 18, train loss: 0.52956, val loss: 0.51862, in 0.016s 1 tree, 84 leaves, max depth = 14, train loss: 0.52695, val loss: 0.51575, in 0.000s 1 tree, 82 leaves, max depth = 17, train loss: 0.52441, val loss: 0.51297, in 0.016s 1 tree, 84 leaves, max depth = 14, train loss: 0.52193, val loss: 0.51023, in 0.016s 1 tree, 128 leaves, max depth = 18, train loss: 0.51904, val loss: 0.50753, in 0.016s 1 tree, 84 leaves, max depth = 14, train loss: 0.51664, val loss: 0.50489, in 0.016s 1 tree, 128 leaves, max depth = 19, train loss: 0.51384, val loss: 0.50227, in 0.016s 1 tree, 83 leaves, max depth = 17, train loss: 0.51153, val loss: 0.49973, in 0.016s 1 tree, 128 leaves, max depth = 19, train loss: 0.50881, val loss: 0.49719, in 0.000s 1 tree, 128 leaves, max depth = 19, train loss: 0.50616, val loss: 0.49473, in 0.031s 1 tree, 84 leaves, max depth = 15, train loss: 0.50391, val loss: 0.49224, in 0.000s 1 tree, 128 leaves, max depth = 20, train loss: 0.50134, val loss: 0.48985, in 0.016s 1 tree, 128 leaves, max depth = 20, train loss: 0.49883, val loss: 0.48752, in 0.016s 1 tree, 83 leaves, max depth = 17, train loss: 0.49672, val loss: 0.48519, in 0.016s 1 tree, 128 leaves, max depth = 21, train loss: 0.49428, val loss: 0.48293, in 0.016s 1 tree, 128 leaves, max depth = 21, train loss: 0.49190, val loss: 0.48073, in 0.016s 1 tree, 84 leaves, max depth = 15, train loss: 0.48984, val loss: 0.47844, in 0.016s 1 tree, 128 leaves, max depth = 19, train loss: 0.48753, val loss: 0.47631, in 0.016s 1 tree, 128 leaves, max depth = 19, train loss: 0.48528, val loss: 0.47422, in 0.016s 1 tree, 128 leaves, max depth = 19, train loss: 0.48308, val loss: 0.47219, in 0.016s 1 tree, 82 leaves, max depth = 17, train loss: 0.48116, val loss: 0.47006, in 0.016s 1 tree, 128 leaves, max 
depth = 18, train loss: 0.47902, val loss: 0.46809, in 0.016s 1 tree, 83 leaves, max depth = 14, train loss: 0.47717, val loss: 0.46605, in 0.016s 1 tree, 83 leaves, max depth = 17, train loss: 0.47535, val loss: 0.46403, in 0.016s 1 tree, 128 leaves, max depth = 19, train loss: 0.47328, val loss: 0.46213, in 0.000s 1 tree, 128 leaves, max depth = 19, train loss: 0.47127, val loss: 0.46028, in 0.016s 1 tree, 84 leaves, max depth = 14, train loss: 0.46952, val loss: 0.45833, in 0.016s 1 tree, 128 leaves, max depth = 19, train loss: 0.46756, val loss: 0.45654, in 0.016s 1 tree, 83 leaves, max depth = 17, train loss: 0.46587, val loss: 0.45465, in 0.016s 1 tree, 83 leaves, max depth = 17, train loss: 0.46422, val loss: 0.45281, in 0.000s 1 tree, 81 leaves, max depth = 14, train loss: 0.46262, val loss: 0.45101, in 0.016s 1 tree, 128 leaves, max depth = 16, train loss: 0.46073, val loss: 0.44929, in 0.031s 1 tree, 128 leaves, max depth = 16, train loss: 0.45889, val loss: 0.44761, in 0.016s 1 tree, 128 leaves, max depth = 16, train loss: 0.45708, val loss: 0.44596, in 0.016s 1 tree, 84 leaves, max depth = 14, train loss: 0.45555, val loss: 0.44424, in 0.016s 1 tree, 83 leaves, max depth = 14, train loss: 0.45405, val loss: 0.44255, in 0.016s 1 tree, 83 leaves, max depth = 15, train loss: 0.45259, val loss: 0.44093, in 0.000s 1 tree, 81 leaves, max depth = 14, train loss: 0.45117, val loss: 0.43933, in 0.016s 1 tree, 83 leaves, max depth = 14, train loss: 0.44977, val loss: 0.43775, in 0.016s 1 tree, 84 leaves, max depth = 15, train loss: 0.44837, val loss: 0.43617, in 0.016s 1 tree, 81 leaves, max depth = 14, train loss: 0.44705, val loss: 0.43467, in 0.016s 1 tree, 83 leaves, max depth = 14, train loss: 0.44575, val loss: 0.43320, in 0.016s 1 tree, 84 leaves, max depth = 15, train loss: 0.44444, val loss: 0.43171, in 0.000s 1 tree, 84 leaves, max depth = 15, train loss: 0.44316, val loss: 0.43026, in 0.016s 1 tree, 128 leaves, max depth = 18, train loss: 0.44147, val 
loss: 0.42873, in 0.016s 1 tree, 84 leaves, max depth = 15, train loss: 0.44023, val loss: 0.42732, in 0.016s 1 tree, 128 leaves, max depth = 18, train loss: 0.43858, val loss: 0.42583, in 0.016s 1 tree, 128 leaves, max depth = 18, train loss: 0.43696, val loss: 0.42438, in 0.016s 1 tree, 84 leaves, max depth = 15, train loss: 0.43577, val loss: 0.42303, in 0.016s 1 tree, 128 leaves, max depth = 18, train loss: 0.43419, val loss: 0.42161, in 0.031s 1 tree, 84 leaves, max depth = 15, train loss: 0.43304, val loss: 0.42030, in 0.016s 1 tree, 84 leaves, max depth = 15, train loss: 0.43192, val loss: 0.41901, in 0.000s 1 tree, 128 leaves, max depth = 18, train loss: 0.43038, val loss: 0.41764, in 0.016s 1 tree, 84 leaves, max depth = 15, train loss: 0.42929, val loss: 0.41639, in 0.016s 1 tree, 128 leaves, max depth = 18, train loss: 0.42780, val loss: 0.41505, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.42680, val loss: 0.41409, in 0.016s 1 tree, 83 leaves, max depth = 15, train loss: 0.42578, val loss: 0.41293, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.42481, val loss: 0.41200, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.42386, val loss: 0.41108, in 0.000s 1 tree, 84 leaves, max depth = 15, train loss: 0.42284, val loss: 0.40991, in 0.016s 1 tree, 84 leaves, max depth = 15, train loss: 0.42185, val loss: 0.40877, in 0.016s 1 tree, 128 leaves, max depth = 17, train loss: 0.42040, val loss: 0.40749, in 0.016s 1 tree, 84 leaves, max depth = 15, train loss: 0.41944, val loss: 0.40637, in 0.016s Fit 100 trees in 1.626 s, (9266 total leaves) Time spent computing histograms: 0.551s Time spent finding best splits: 0.141s Time spent applying splits: 0.168s Time spent predicting: 0.000s Trial 30, Fold 5: Log loss = 0.427324505663281, Average precision = 0.9425507706044726, ROC-AUC = 0.9406792635118816, Elapsed Time = 1.635113199999978 seconds
Optimization Progress: 31%|###1 | 31/100 [06:06<14:02, 12.21s/it]
Trial 31, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 31, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Binning 0.040 GB of training data: 0.142 s
Binning 0.004 GB of validation data: 0.016 s
Fitting gradient boosted rounds:
[… verbose per-tree boosting log omitted …]
Fit 70 trees in 1.845 s, (9063 total leaves)
Time spent computing histograms: 0.544s
Time spent finding best splits: 0.175s
Time spent applying splits: 0.138s
Time spent predicting: 0.016s
Trial 31, Fold 1: Log loss = 0.35161783356055243, Average precision = 0.9569503128917286, ROC-AUC = 0.9514637193329605, Elapsed Time = 1.8531702999989648 seconds
Trial 31, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 31, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Binning 0.040 GB of training data: 0.157 s
Binning 0.004 GB of validation data: 0.016 s
Fitting gradient boosted rounds:
[… verbose per-tree boosting log omitted …]
Fit 70 trees in 2.017 s, (8807 total leaves)
Time spent computing histograms: 0.589s
Time spent finding best splits: 0.188s
Time spent applying splits: 0.147s
Time spent predicting: 0.031s
Trial 31, Fold 2: Log loss = 0.3499749363102381, Average precision = 0.9564690837470304, ROC-AUC = 0.9537399152215471, Elapsed Time = 2.0252207000012277 seconds
Trial 31, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 31, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Binning 0.040 GB of training data: 0.158 s
Binning 0.004 GB of validation data: 0.016 s
Fitting gradient boosted rounds:
[… verbose per-tree boosting log omitted …] 1 tree, 
129 leaves, max depth = 15, train loss: 0.47842, val loss: 0.48042, in 0.031s 1 tree, 115 leaves, max depth = 15, train loss: 0.47366, val loss: 0.47572, in 0.016s 1 tree, 213 leaves, max depth = 17, train loss: 0.46953, val loss: 0.47170, in 0.031s 1 tree, 86 leaves, max depth = 13, train loss: 0.46542, val loss: 0.46762, in 0.031s 1 tree, 14 leaves, max depth = 7, train loss: 0.45968, val loss: 0.46218, in 0.016s 1 tree, 137 leaves, max depth = 17, train loss: 0.45558, val loss: 0.45813, in 0.016s 1 tree, 9 leaves, max depth = 4, train loss: 0.45029, val loss: 0.45313, in 0.016s 1 tree, 163 leaves, max depth = 20, train loss: 0.44659, val loss: 0.44956, in 0.031s 1 tree, 42 leaves, max depth = 10, train loss: 0.44139, val loss: 0.44469, in 0.016s 1 tree, 158 leaves, max depth = 20, train loss: 0.43745, val loss: 0.44083, in 0.016s 1 tree, 139 leaves, max depth = 17, train loss: 0.43400, val loss: 0.43749, in 0.031s 1 tree, 141 leaves, max depth = 16, train loss: 0.43042, val loss: 0.43411, in 0.016s 1 tree, 134 leaves, max depth = 18, train loss: 0.42690, val loss: 0.43064, in 0.031s 1 tree, 18 leaves, max depth = 7, train loss: 0.42242, val loss: 0.42651, in 0.016s 1 tree, 146 leaves, max depth = 17, train loss: 0.41913, val loss: 0.42332, in 0.031s 1 tree, 136 leaves, max depth = 16, train loss: 0.41585, val loss: 0.42009, in 0.016s 1 tree, 155 leaves, max depth = 18, train loss: 0.41270, val loss: 0.41711, in 0.031s 1 tree, 169 leaves, max depth = 19, train loss: 0.40979, val loss: 0.41430, in 0.031s 1 tree, 26 leaves, max depth = 7, train loss: 0.40553, val loss: 0.41039, in 0.016s 1 tree, 46 leaves, max depth = 10, train loss: 0.40282, val loss: 0.40772, in 0.016s 1 tree, 195 leaves, max depth = 24, train loss: 0.40022, val loss: 0.40525, in 0.031s 1 tree, 12 leaves, max depth = 6, train loss: 0.39627, val loss: 0.40160, in 0.016s 1 tree, 162 leaves, max depth = 16, train loss: 0.39358, val loss: 0.39904, in 0.016s 1 tree, 155 leaves, max depth = 18, train 
loss: 0.39078, val loss: 0.39639, in 0.031s 1 tree, 176 leaves, max depth = 18, train loss: 0.38812, val loss: 0.39390, in 0.031s 1 tree, 116 leaves, max depth = 12, train loss: 0.38523, val loss: 0.39097, in 0.016s 1 tree, 177 leaves, max depth = 18, train loss: 0.38296, val loss: 0.38891, in 0.031s 1 tree, 132 leaves, max depth = 12, train loss: 0.38017, val loss: 0.38610, in 0.031s 1 tree, 18 leaves, max depth = 8, train loss: 0.37670, val loss: 0.38291, in 0.016s 1 tree, 157 leaves, max depth = 20, train loss: 0.37439, val loss: 0.38079, in 0.031s 1 tree, 97 leaves, max depth = 14, train loss: 0.37088, val loss: 0.37779, in 0.016s 1 tree, 169 leaves, max depth = 14, train loss: 0.36827, val loss: 0.37520, in 0.031s 1 tree, 94 leaves, max depth = 15, train loss: 0.36492, val loss: 0.37235, in 0.031s 1 tree, 113 leaves, max depth = 13, train loss: 0.36270, val loss: 0.37017, in 0.016s 1 tree, 141 leaves, max depth = 18, train loss: 0.36055, val loss: 0.36812, in 0.031s 1 tree, 132 leaves, max depth = 16, train loss: 0.35822, val loss: 0.36574, in 0.031s 1 tree, 16 leaves, max depth = 5, train loss: 0.35520, val loss: 0.36296, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.35235, val loss: 0.36048, in 0.000s 1 tree, 134 leaves, max depth = 13, train loss: 0.35010, val loss: 0.35817, in 0.016s 1 tree, 7 leaves, max depth = 3, train loss: 0.34749, val loss: 0.35569, in 0.000s 1 tree, 39 leaves, max depth = 9, train loss: 0.34454, val loss: 0.35304, in 0.016s 1 tree, 12 leaves, max depth = 7, train loss: 0.34191, val loss: 0.35063, in 0.016s Fit 70 trees in 2.002 s, (8348 total leaves) Time spent computing histograms: 0.591s Time spent finding best splits: 0.183s Time spent applying splits: 0.143s Time spent predicting: 0.000s Trial 31, Fold 3: Log loss = 0.346050974179654, Average precision = 0.9593904984587104, ROC-AUC = 0.9552786649699992, Elapsed Time = 2.0108632999999827 seconds Trial 31, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 
1.0296747568045592 Trial 31, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 Binning 0.040 GB of training data: 0.158 s 0.016 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 169 leaves, max depth = 15, train loss: 0.68174, val loss: 0.68155, in 0.016s 1 tree, 137 leaves, max depth = 16, train loss: 0.67022, val loss: 0.66958, in 0.031s 1 tree, 207 leaves, max depth = 17, train loss: 0.65982, val loss: 0.65903, in 0.031s 1 tree, 113 leaves, max depth = 13, train loss: 0.64919, val loss: 0.64805, in 0.016s 1 tree, 180 leaves, max depth = 15, train loss: 0.63929, val loss: 0.63789, in 0.032s 1 tree, 93 leaves, max depth = 20, train loss: 0.63000, val loss: 0.62821, in 0.016s 1 tree, 127 leaves, max depth = 20, train loss: 0.62052, val loss: 0.61838, in 0.031s 1 tree, 107 leaves, max depth = 18, train loss: 0.61204, val loss: 0.60955, in 0.016s 1 tree, 84 leaves, max depth = 13, train loss: 0.60386, val loss: 0.60093, in 0.016s 1 tree, 148 leaves, max depth = 17, train loss: 0.59577, val loss: 0.59254, in 0.031s 1 tree, 110 leaves, max depth = 16, train loss: 0.58718, val loss: 0.58365, in 0.016s 1 tree, 135 leaves, max depth = 20, train loss: 0.57895, val loss: 0.57517, in 0.031s 1 tree, 130 leaves, max depth = 14, train loss: 0.57093, val loss: 0.56691, in 0.016s 1 tree, 185 leaves, max depth = 17, train loss: 0.56379, val loss: 0.55948, in 0.031s 1 tree, 133 leaves, max depth = 19, train loss: 0.55711, val loss: 0.55255, in 0.016s 1 tree, 132 leaves, max depth = 15, train loss: 0.55014, val loss: 0.54533, in 0.031s 1 tree, 181 leaves, max depth = 22, train loss: 0.54370, val loss: 0.53868, in 0.016s 1 tree, 127 leaves, max depth = 14, train loss: 0.53749, val loss: 0.53225, in 0.031s 1 tree, 135 leaves, max depth = 19, train loss: 0.53143, val loss: 0.52598, in 0.031s 1 tree, 213 leaves, max depth = 18, train loss: 0.52564, val loss: 0.52013, in 0.031s 1 tree, 126 leaves, max depth = 17, train loss: 0.51993, 
val loss: 0.51418, in 0.016s 1 tree, 106 leaves, max depth = 17, train loss: 0.51396, val loss: 0.50797, in 0.031s 1 tree, 183 leaves, max depth = 15, train loss: 0.50855, val loss: 0.50231, in 0.016s 1 tree, 213 leaves, max depth = 18, train loss: 0.50321, val loss: 0.49709, in 0.031s 1 tree, 123 leaves, max depth = 19, train loss: 0.49842, val loss: 0.49208, in 0.031s 1 tree, 124 leaves, max depth = 16, train loss: 0.49303, val loss: 0.48645, in 0.016s 1 tree, 130 leaves, max depth = 13, train loss: 0.48798, val loss: 0.48118, in 0.031s 1 tree, 17 leaves, max depth = 6, train loss: 0.48171, val loss: 0.47475, in 0.016s 1 tree, 122 leaves, max depth = 19, train loss: 0.47736, val loss: 0.47018, in 0.031s 1 tree, 144 leaves, max depth = 18, train loss: 0.47252, val loss: 0.46525, in 0.016s 1 tree, 128 leaves, max depth = 22, train loss: 0.46833, val loss: 0.46088, in 0.031s 1 tree, 19 leaves, max depth = 10, train loss: 0.46291, val loss: 0.45528, in 0.016s 1 tree, 136 leaves, max depth = 13, train loss: 0.45842, val loss: 0.45070, in 0.031s 1 tree, 178 leaves, max depth = 17, train loss: 0.45435, val loss: 0.44652, in 0.031s 1 tree, 168 leaves, max depth = 15, train loss: 0.45007, val loss: 0.44218, in 0.031s 1 tree, 172 leaves, max depth = 15, train loss: 0.44592, val loss: 0.43795, in 0.016s 1 tree, 32 leaves, max depth = 10, train loss: 0.44232, val loss: 0.43411, in 0.031s 1 tree, 139 leaves, max depth = 20, train loss: 0.43858, val loss: 0.43020, in 0.016s 1 tree, 140 leaves, max depth = 17, train loss: 0.43499, val loss: 0.42641, in 0.031s 1 tree, 138 leaves, max depth = 20, train loss: 0.43146, val loss: 0.42272, in 0.016s 1 tree, 30 leaves, max depth = 8, train loss: 0.42669, val loss: 0.41779, in 0.016s 1 tree, 165 leaves, max depth = 15, train loss: 0.42320, val loss: 0.41421, in 0.031s 1 tree, 14 leaves, max depth = 6, train loss: 0.41871, val loss: 0.40958, in 0.016s 1 tree, 213 leaves, max depth = 18, train loss: 0.41556, val loss: 0.40652, in 0.031s 
1 tree, 154 leaves, max depth = 15, train loss: 0.41238, val loss: 0.40323, in 0.031s 1 tree, 167 leaves, max depth = 17, train loss: 0.40931, val loss: 0.40011, in 0.016s 1 tree, 187 leaves, max depth = 18, train loss: 0.40632, val loss: 0.39700, in 0.031s 1 tree, 171 leaves, max depth = 17, train loss: 0.40345, val loss: 0.39399, in 0.031s 1 tree, 148 leaves, max depth = 15, train loss: 0.40062, val loss: 0.39103, in 0.031s 1 tree, 27 leaves, max depth = 8, train loss: 0.39660, val loss: 0.38688, in 0.016s 1 tree, 19 leaves, max depth = 8, train loss: 0.39275, val loss: 0.38293, in 0.016s 1 tree, 167 leaves, max depth = 15, train loss: 0.39012, val loss: 0.38023, in 0.016s 1 tree, 119 leaves, max depth = 14, train loss: 0.38721, val loss: 0.37743, in 0.016s 1 tree, 155 leaves, max depth = 17, train loss: 0.38456, val loss: 0.37475, in 0.031s 1 tree, 207 leaves, max depth = 17, train loss: 0.38195, val loss: 0.37209, in 0.031s 1 tree, 200 leaves, max depth = 19, train loss: 0.37940, val loss: 0.36954, in 0.031s 1 tree, 150 leaves, max depth = 15, train loss: 0.37709, val loss: 0.36715, in 0.016s 1 tree, 156 leaves, max depth = 15, train loss: 0.37439, val loss: 0.36460, in 0.047s 1 tree, 20 leaves, max depth = 7, train loss: 0.37126, val loss: 0.36125, in 0.000s 1 tree, 142 leaves, max depth = 18, train loss: 0.36870, val loss: 0.35876, in 0.031s 1 tree, 21 leaves, max depth = 10, train loss: 0.36553, val loss: 0.35574, in 0.016s 1 tree, 142 leaves, max depth = 13, train loss: 0.36306, val loss: 0.35335, in 0.016s 1 tree, 129 leaves, max depth = 13, train loss: 0.36075, val loss: 0.35108, in 0.031s 1 tree, 21 leaves, max depth = 8, train loss: 0.35775, val loss: 0.34794, in 0.016s 1 tree, 211 leaves, max depth = 17, train loss: 0.35582, val loss: 0.34606, in 0.031s 1 tree, 171 leaves, max depth = 16, train loss: 0.35351, val loss: 0.34382, in 0.031s 1 tree, 19 leaves, max depth = 10, train loss: 0.35065, val loss: 0.34083, in 0.016s 1 tree, 153 leaves, max depth = 
14, train loss: 0.34850, val loss: 0.33873, in 0.016s 1 tree, 13 leaves, max depth = 5, train loss: 0.34575, val loss: 0.33587, in 0.016s 1 tree, 152 leaves, max depth = 15, train loss: 0.34365, val loss: 0.33382, in 0.031s Fit 70 trees in 2.033 s, (9027 total leaves) Time spent computing histograms: 0.594s Time spent finding best splits: 0.194s Time spent applying splits: 0.153s Time spent predicting: 0.016s Trial 31, Fold 4: Log loss = 0.3492675491284612, Average precision = 0.9588931444376244, ROC-AUC = 0.9539327560677268, Elapsed Time = 2.041462600000159 seconds Trial 31, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 31, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.158 s 0.016 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 176 leaves, max depth = 16, train loss: 0.68165, val loss: 0.68129, in 0.016s 1 tree, 190 leaves, max depth = 19, train loss: 0.67054, val loss: 0.66977, in 0.031s 1 tree, 213 leaves, max depth = 14, train loss: 0.65976, val loss: 0.65893, in 0.031s 1 tree, 119 leaves, max depth = 14, train loss: 0.64884, val loss: 0.64774, in 0.016s 1 tree, 122 leaves, max depth = 16, train loss: 0.63859, val loss: 0.63713, in 0.031s 1 tree, 57 leaves, max depth = 12, train loss: 0.62920, val loss: 0.62738, in 0.016s 1 tree, 67 leaves, max depth = 14, train loss: 0.62012, val loss: 0.61796, in 0.016s 1 tree, 121 leaves, max depth = 14, train loss: 0.61083, val loss: 0.60839, in 0.031s 1 tree, 106 leaves, max depth = 15, train loss: 0.60194, val loss: 0.59918, in 0.016s 1 tree, 213 leaves, max depth = 15, train loss: 0.59378, val loss: 0.59103, in 0.031s 1 tree, 62 leaves, max depth = 16, train loss: 0.58628, val loss: 0.58335, in 0.016s 1 tree, 184 leaves, max depth = 16, train loss: 0.57871, val loss: 0.57548, in 0.031s 1 tree, 119 leaves, max depth = 14, train loss: 0.57071, val loss: 0.56729, in 0.016s 1 tree, 182 
leaves, max depth = 15, train loss: 0.56330, val loss: 0.55971, in 0.031s 1 tree, 126 leaves, max depth = 15, train loss: 0.55579, val loss: 0.55204, in 0.031s 1 tree, 115 leaves, max depth = 13, train loss: 0.54856, val loss: 0.54462, in 0.016s 1 tree, 135 leaves, max depth = 16, train loss: 0.54151, val loss: 0.53752, in 0.016s 1 tree, 138 leaves, max depth = 14, train loss: 0.53472, val loss: 0.53058, in 0.031s 1 tree, 104 leaves, max depth = 13, train loss: 0.52860, val loss: 0.52431, in 0.016s 1 tree, 136 leaves, max depth = 14, train loss: 0.52227, val loss: 0.51788, in 0.016s 1 tree, 103 leaves, max depth = 15, train loss: 0.51659, val loss: 0.51205, in 0.031s 1 tree, 138 leaves, max depth = 18, train loss: 0.51082, val loss: 0.50619, in 0.031s 1 tree, 213 leaves, max depth = 18, train loss: 0.50537, val loss: 0.50078, in 0.016s 1 tree, 112 leaves, max depth = 16, train loss: 0.49974, val loss: 0.49502, in 0.031s 1 tree, 194 leaves, max depth = 17, train loss: 0.49442, val loss: 0.48963, in 0.016s 1 tree, 106 leaves, max depth = 21, train loss: 0.48950, val loss: 0.48461, in 0.016s 1 tree, 24 leaves, max depth = 12, train loss: 0.48323, val loss: 0.47817, in 0.016s 1 tree, 166 leaves, max depth = 19, train loss: 0.47858, val loss: 0.47341, in 0.016s 1 tree, 142 leaves, max depth = 17, train loss: 0.47362, val loss: 0.46843, in 0.031s 1 tree, 127 leaves, max depth = 16, train loss: 0.46888, val loss: 0.46360, in 0.031s 1 tree, 129 leaves, max depth = 16, train loss: 0.46465, val loss: 0.45927, in 0.016s 1 tree, 162 leaves, max depth = 14, train loss: 0.46013, val loss: 0.45471, in 0.031s 1 tree, 18 leaves, max depth = 8, train loss: 0.45465, val loss: 0.44918, in 0.016s 1 tree, 11 leaves, max depth = 6, train loss: 0.44922, val loss: 0.44369, in 0.016s 1 tree, 146 leaves, max depth = 16, train loss: 0.44520, val loss: 0.43961, in 0.031s 1 tree, 136 leaves, max depth = 15, train loss: 0.44127, val loss: 0.43565, in 0.016s 1 tree, 213 leaves, max depth = 16, 
train loss: 0.43770, val loss: 0.43210, in 0.031s 1 tree, 118 leaves, max depth = 13, train loss: 0.43387, val loss: 0.42825, in 0.031s 1 tree, 139 leaves, max depth = 15, train loss: 0.43023, val loss: 0.42451, in 0.016s 1 tree, 212 leaves, max depth = 19, train loss: 0.42679, val loss: 0.42103, in 0.031s 1 tree, 154 leaves, max depth = 15, train loss: 0.42332, val loss: 0.41760, in 0.031s 1 tree, 213 leaves, max depth = 16, train loss: 0.42023, val loss: 0.41459, in 0.031s 1 tree, 139 leaves, max depth = 14, train loss: 0.41682, val loss: 0.41118, in 0.016s 1 tree, 171 leaves, max depth = 18, train loss: 0.41369, val loss: 0.40794, in 0.031s 1 tree, 134 leaves, max depth = 16, train loss: 0.41065, val loss: 0.40492, in 0.016s 1 tree, 118 leaves, max depth = 15, train loss: 0.40759, val loss: 0.40188, in 0.031s 1 tree, 165 leaves, max depth = 18, train loss: 0.40464, val loss: 0.39894, in 0.031s 1 tree, 163 leaves, max depth = 14, train loss: 0.40162, val loss: 0.39604, in 0.031s 1 tree, 204 leaves, max depth = 16, train loss: 0.39872, val loss: 0.39324, in 0.031s 1 tree, 17 leaves, max depth = 6, train loss: 0.39462, val loss: 0.38914, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.39068, val loss: 0.38528, in 0.016s 1 tree, 213 leaves, max depth = 19, train loss: 0.38826, val loss: 0.38304, in 0.031s 1 tree, 196 leaves, max depth = 15, train loss: 0.38574, val loss: 0.38052, in 0.031s 1 tree, 11 leaves, max depth = 5, train loss: 0.38341, val loss: 0.37812, in 0.016s 1 tree, 189 leaves, max depth = 20, train loss: 0.38081, val loss: 0.37563, in 0.031s 1 tree, 213 leaves, max depth = 18, train loss: 0.37858, val loss: 0.37359, in 0.031s 1 tree, 213 leaves, max depth = 21, train loss: 0.37639, val loss: 0.37148, in 0.031s 1 tree, 173 leaves, max depth = 16, train loss: 0.37401, val loss: 0.36919, in 0.016s 1 tree, 20 leaves, max depth = 7, train loss: 0.37053, val loss: 0.36567, in 0.016s 1 tree, 213 leaves, max depth = 20, train loss: 0.36839, val 
loss: 0.36367, in 0.031s 1 tree, 19 leaves, max depth = 7, train loss: 0.36511, val loss: 0.36030, in 0.016s 1 tree, 168 leaves, max depth = 19, train loss: 0.36262, val loss: 0.35791, in 0.031s 1 tree, 11 leaves, max depth = 5, train loss: 0.35948, val loss: 0.35474, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.35658, val loss: 0.35204, in 0.016s 1 tree, 137 leaves, max depth = 13, train loss: 0.35419, val loss: 0.34974, in 0.016s 1 tree, 9 leaves, max depth = 5, train loss: 0.35128, val loss: 0.34681, in 0.016s 1 tree, 158 leaves, max depth = 15, train loss: 0.34899, val loss: 0.34466, in 0.031s 1 tree, 148 leaves, max depth = 16, train loss: 0.34719, val loss: 0.34290, in 0.016s 1 tree, 171 leaves, max depth = 17, train loss: 0.34529, val loss: 0.34113, in 0.031s 1 tree, 109 leaves, max depth = 12, train loss: 0.34322, val loss: 0.33928, in 0.016s Fit 70 trees in 2.033 s, (9185 total leaves) Time spent computing histograms: 0.593s Time spent finding best splits: 0.195s Time spent applying splits: 0.156s Time spent predicting: 0.000s Trial 31, Fold 5: Log loss = 0.3565390565096313, Average precision = 0.9549716549634287, ROC-AUC = 0.9506515801794772, Elapsed Time = 2.0520335999990493 seconds
Optimization Progress: 32%|###2 | 32/100 [06:23<15:39, 13.81s/it]
Trial 32, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 32, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913

[Per-round tree-fitting log lines and binning/timing details trimmed for readability; fold-level summaries retained below.]

Fit 72 trees in 0.845 s, (2787 total leaves)
Trial 32, Fold 1: Log loss = 0.3466410806136022, Average precision = 0.9522885844063388, ROC-AUC = 0.947052009100338, Elapsed Time = 0.8517611000006582 seconds
Trial 32, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 32, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Fit 72 trees in 0.892 s, (2879 total leaves)
Trial 32, Fold 2: Log loss = 0.34737620853550505, Average precision = 0.949150803587019, ROC-AUC = 0.94730262241418, Elapsed Time = 0.9006196000009368 seconds
Trial 32, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 32, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
loss: 0.44870, val loss: 0.44103, in 0.000s 1 tree, 30 leaves, max depth = 12, train loss: 0.44581, val loss: 0.43789, in 0.016s 1 tree, 64 leaves, max depth = 9, train loss: 0.44135, val loss: 0.43389, in 0.000s 1 tree, 54 leaves, max depth = 13, train loss: 0.43511, val loss: 0.42824, in 0.016s 1 tree, 62 leaves, max depth = 9, train loss: 0.43131, val loss: 0.42487, in 0.016s 1 tree, 59 leaves, max depth = 16, train loss: 0.42832, val loss: 0.42231, in 0.000s 1 tree, 60 leaves, max depth = 16, train loss: 0.42558, val loss: 0.41996, in 0.016s 1 tree, 60 leaves, max depth = 16, train loss: 0.42305, val loss: 0.41781, in 0.000s 1 tree, 25 leaves, max depth = 8, train loss: 0.42052, val loss: 0.41509, in 0.016s 1 tree, 60 leaves, max depth = 15, train loss: 0.41819, val loss: 0.41314, in 0.016s 1 tree, 30 leaves, max depth = 8, train loss: 0.41587, val loss: 0.41054, in 0.000s 1 tree, 52 leaves, max depth = 13, train loss: 0.41107, val loss: 0.40626, in 0.016s 1 tree, 29 leaves, max depth = 8, train loss: 0.40899, val loss: 0.40391, in 0.016s 1 tree, 62 leaves, max depth = 9, train loss: 0.40614, val loss: 0.40146, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.40338, val loss: 0.39892, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.40150, val loss: 0.39688, in 0.000s 1 tree, 60 leaves, max depth = 14, train loss: 0.39954, val loss: 0.39530, in 0.016s 1 tree, 53 leaves, max depth = 11, train loss: 0.39542, val loss: 0.39167, in 0.000s 1 tree, 60 leaves, max depth = 10, train loss: 0.39298, val loss: 0.38946, in 0.016s 1 tree, 64 leaves, max depth = 9, train loss: 0.39072, val loss: 0.38758, in 0.016s 1 tree, 52 leaves, max depth = 11, train loss: 0.38723, val loss: 0.38455, in 0.000s 1 tree, 29 leaves, max depth = 9, train loss: 0.38563, val loss: 0.38276, in 0.016s 1 tree, 64 leaves, max depth = 9, train loss: 0.38372, val loss: 0.38120, in 0.016s 1 tree, 52 leaves, max depth = 11, train loss: 0.38061, val loss: 0.37852, in 0.000s 1 tree, 60 
leaves, max depth = 14, train loss: 0.37920, val loss: 0.37746, in 0.016s 1 tree, 54 leaves, max depth = 11, train loss: 0.37640, val loss: 0.37507, in 0.016s 1 tree, 53 leaves, max depth = 11, train loss: 0.37381, val loss: 0.37287, in 0.000s 1 tree, 29 leaves, max depth = 9, train loss: 0.37240, val loss: 0.37127, in 0.016s 1 tree, 29 leaves, max depth = 9, train loss: 0.37110, val loss: 0.36980, in 0.000s 1 tree, 53 leaves, max depth = 11, train loss: 0.36868, val loss: 0.36776, in 0.016s 1 tree, 8 leaves, max depth = 6, train loss: 0.36739, val loss: 0.36633, in 0.000s 1 tree, 31 leaves, max depth = 12, train loss: 0.36632, val loss: 0.36513, in 0.016s 1 tree, 54 leaves, max depth = 13, train loss: 0.36405, val loss: 0.36324, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.36161, val loss: 0.36096, in 0.000s 1 tree, 60 leaves, max depth = 11, train loss: 0.36025, val loss: 0.35980, in 0.016s 1 tree, 53 leaves, max depth = 11, train loss: 0.35823, val loss: 0.35815, in 0.000s 1 tree, 8 leaves, max depth = 5, train loss: 0.35714, val loss: 0.35693, in 0.016s 1 tree, 18 leaves, max depth = 7, train loss: 0.35626, val loss: 0.35602, in 0.000s 1 tree, 36 leaves, max depth = 10, train loss: 0.35535, val loss: 0.35475, in 0.016s 1 tree, 37 leaves, max depth = 11, train loss: 0.35401, val loss: 0.35372, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.35175, val loss: 0.35162, in 0.016s 1 tree, 61 leaves, max depth = 11, train loss: 0.35060, val loss: 0.35065, in 0.000s 1 tree, 53 leaves, max depth = 11, train loss: 0.34884, val loss: 0.34924, in 0.016s 1 tree, 53 leaves, max depth = 11, train loss: 0.34722, val loss: 0.34795, in 0.016s 1 tree, 37 leaves, max depth = 10, train loss: 0.34630, val loss: 0.34666, in 0.000s 1 tree, 37 leaves, max depth = 15, train loss: 0.34545, val loss: 0.34546, in 0.016s Fit 72 trees in 0.924 s, (2769 total leaves) Time spent computing histograms: 0.318s Time spent finding best splits: 0.063s Time spent applying splits: 
0.052s Time spent predicting: 0.016s Trial 32, Fold 3: Log loss = 0.34113213388950736, Average precision = 0.9536000701204155, ROC-AUC = 0.9504115311178742, Elapsed Time = 0.9310433000009652 seconds Trial 32, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 32, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 Binning 0.040 GB of training data: 0.158 s 0.016 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 27 leaves, max depth = 9, train loss: 0.67201, val loss: 0.67076, in 0.000s 1 tree, 31 leaves, max depth = 11, train loss: 0.65271, val loss: 0.65035, in 0.016s 1 tree, 29 leaves, max depth = 11, train loss: 0.63491, val loss: 0.63146, in 0.000s 1 tree, 7 leaves, max depth = 5, train loss: 0.61867, val loss: 0.61423, in 0.000s 1 tree, 26 leaves, max depth = 11, train loss: 0.60396, val loss: 0.59852, in 0.016s 1 tree, 30 leaves, max depth = 14, train loss: 0.59042, val loss: 0.58401, in 0.016s 1 tree, 29 leaves, max depth = 11, train loss: 0.57803, val loss: 0.57073, in 0.000s 1 tree, 26 leaves, max depth = 11, train loss: 0.56668, val loss: 0.55848, in 0.016s 1 tree, 29 leaves, max depth = 9, train loss: 0.55607, val loss: 0.54701, in 0.000s 1 tree, 26 leaves, max depth = 12, train loss: 0.54615, val loss: 0.53620, in 0.016s 1 tree, 30 leaves, max depth = 14, train loss: 0.53726, val loss: 0.52650, in 0.000s 1 tree, 27 leaves, max depth = 11, train loss: 0.52912, val loss: 0.51758, in 0.016s 1 tree, 29 leaves, max depth = 9, train loss: 0.52146, val loss: 0.50918, in 0.000s 1 tree, 29 leaves, max depth = 10, train loss: 0.51449, val loss: 0.50143, in 0.016s 1 tree, 29 leaves, max depth = 9, train loss: 0.50795, val loss: 0.49420, in 0.000s 1 tree, 51 leaves, max depth = 12, train loss: 0.49881, val loss: 0.48525, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.49300, val loss: 0.47880, in 0.000s 1 tree, 31 leaves, max depth = 11, train loss: 0.48765, val loss: 
0.47282, in 0.016s 1 tree, 8 leaves, max depth = 5, train loss: 0.48262, val loss: 0.46722, in 0.000s 1 tree, 25 leaves, max depth = 11, train loss: 0.47791, val loss: 0.46188, in 0.016s 1 tree, 64 leaves, max depth = 11, train loss: 0.47270, val loss: 0.45673, in 0.000s 1 tree, 60 leaves, max depth = 8, train loss: 0.46742, val loss: 0.45163, in 0.016s 1 tree, 26 leaves, max depth = 9, train loss: 0.46333, val loss: 0.44695, in 0.016s 1 tree, 26 leaves, max depth = 9, train loss: 0.45955, val loss: 0.44261, in 0.000s 1 tree, 64 leaves, max depth = 11, train loss: 0.45509, val loss: 0.43822, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.45184, val loss: 0.43476, in 0.000s 1 tree, 31 leaves, max depth = 11, train loss: 0.44865, val loss: 0.43108, in 0.016s 1 tree, 30 leaves, max depth = 15, train loss: 0.44572, val loss: 0.42770, in 0.000s 1 tree, 63 leaves, max depth = 10, train loss: 0.44134, val loss: 0.42355, in 0.016s 1 tree, 51 leaves, max depth = 10, train loss: 0.43509, val loss: 0.41754, in 0.016s 1 tree, 63 leaves, max depth = 10, train loss: 0.43136, val loss: 0.41401, in 0.000s 1 tree, 58 leaves, max depth = 17, train loss: 0.42841, val loss: 0.41103, in 0.016s 1 tree, 58 leaves, max depth = 17, train loss: 0.42569, val loss: 0.40828, in 0.016s 1 tree, 58 leaves, max depth = 17, train loss: 0.42319, val loss: 0.40574, in 0.000s 1 tree, 27 leaves, max depth = 8, train loss: 0.42068, val loss: 0.40279, in 0.016s 1 tree, 58 leaves, max depth = 16, train loss: 0.41838, val loss: 0.40045, in 0.016s 1 tree, 29 leaves, max depth = 8, train loss: 0.41600, val loss: 0.39770, in 0.000s 1 tree, 50 leaves, max depth = 10, train loss: 0.41120, val loss: 0.39313, in 0.016s 1 tree, 30 leaves, max depth = 8, train loss: 0.40907, val loss: 0.39064, in 0.016s 1 tree, 61 leaves, max depth = 10, train loss: 0.40628, val loss: 0.38806, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.40363, val loss: 0.38521, in 0.016s 1 tree, 28 leaves, max depth = 8, 
train loss: 0.40177, val loss: 0.38298, in 0.000s 1 tree, 58 leaves, max depth = 15, train loss: 0.39983, val loss: 0.38104, in 0.016s 1 tree, 50 leaves, max depth = 10, train loss: 0.39574, val loss: 0.37716, in 0.016s 1 tree, 64 leaves, max depth = 13, train loss: 0.39332, val loss: 0.37487, in 0.000s 1 tree, 61 leaves, max depth = 10, train loss: 0.39111, val loss: 0.37286, in 0.016s 1 tree, 51 leaves, max depth = 10, train loss: 0.38763, val loss: 0.36958, in 0.016s 1 tree, 29 leaves, max depth = 9, train loss: 0.38602, val loss: 0.36766, in 0.000s 1 tree, 61 leaves, max depth = 9, train loss: 0.38414, val loss: 0.36597, in 0.016s 1 tree, 53 leaves, max depth = 11, train loss: 0.38105, val loss: 0.36307, in 0.016s 1 tree, 30 leaves, max depth = 9, train loss: 0.37961, val loss: 0.36136, in 0.000s 1 tree, 53 leaves, max depth = 11, train loss: 0.37675, val loss: 0.35866, in 0.016s 1 tree, 53 leaves, max depth = 11, train loss: 0.37410, val loss: 0.35618, in 0.016s 1 tree, 30 leaves, max depth = 10, train loss: 0.37278, val loss: 0.35459, in 0.000s 1 tree, 30 leaves, max depth = 10, train loss: 0.37156, val loss: 0.35312, in 0.016s 1 tree, 53 leaves, max depth = 11, train loss: 0.36910, val loss: 0.35082, in 0.000s 1 tree, 8 leaves, max depth = 6, train loss: 0.36789, val loss: 0.34940, in 0.000s 1 tree, 28 leaves, max depth = 8, train loss: 0.36673, val loss: 0.34846, in 0.000s 1 tree, 53 leaves, max depth = 11, train loss: 0.36450, val loss: 0.34639, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.36215, val loss: 0.34383, in 0.016s 1 tree, 65 leaves, max depth = 15, train loss: 0.36077, val loss: 0.34259, in 0.000s 1 tree, 51 leaves, max depth = 11, train loss: 0.35880, val loss: 0.34075, in 0.016s 1 tree, 8 leaves, max depth = 5, train loss: 0.35771, val loss: 0.33948, in 0.016s 1 tree, 19 leaves, max depth = 7, train loss: 0.35681, val loss: 0.33855, in 0.000s 1 tree, 37 leaves, max depth = 14, train loss: 0.35589, val loss: 0.33752, in 0.016s 1 
tree, 34 leaves, max depth = 9, train loss: 0.35448, val loss: 0.33617, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.35230, val loss: 0.33380, in 0.016s 1 tree, 65 leaves, max depth = 15, train loss: 0.35113, val loss: 0.33276, in 0.016s 1 tree, 51 leaves, max depth = 11, train loss: 0.34942, val loss: 0.33119, in 0.000s 1 tree, 51 leaves, max depth = 11, train loss: 0.34784, val loss: 0.32974, in 0.016s 1 tree, 36 leaves, max depth = 14, train loss: 0.34690, val loss: 0.32871, in 0.016s 1 tree, 38 leaves, max depth = 17, train loss: 0.34603, val loss: 0.32777, in 0.000s Fit 72 trees in 0.955 s, (2725 total leaves) Time spent computing histograms: 0.337s Time spent finding best splits: 0.065s Time spent applying splits: 0.055s Time spent predicting: 0.000s Trial 32, Fold 4: Log loss = 0.3466575263630329, Average precision = 0.9518809856724947, ROC-AUC = 0.9474566246056781, Elapsed Time = 0.9676071000012598 seconds Trial 32, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 32, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.174 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 27 leaves, max depth = 10, train loss: 0.67170, val loss: 0.67020, in 0.016s 1 tree, 30 leaves, max depth = 10, train loss: 0.65213, val loss: 0.64928, in 0.016s 1 tree, 27 leaves, max depth = 10, train loss: 0.63415, val loss: 0.63004, in 0.000s 1 tree, 8 leaves, max depth = 5, train loss: 0.61770, val loss: 0.61238, in 0.016s 1 tree, 27 leaves, max depth = 9, train loss: 0.60282, val loss: 0.59636, in 0.000s 1 tree, 28 leaves, max depth = 9, train loss: 0.58913, val loss: 0.58154, in 0.016s 1 tree, 30 leaves, max depth = 9, train loss: 0.57659, val loss: 0.56794, in 0.016s 1 tree, 27 leaves, max depth = 9, train loss: 0.56511, val loss: 0.55546, in 0.000s 1 tree, 26 leaves, max depth = 9, train loss: 0.55441, val loss: 0.54380, in 0.016s 1 
tree, 26 leaves, max depth = 9, train loss: 0.54442, val loss: 0.53282, in 0.000s 1 tree, 29 leaves, max depth = 10, train loss: 0.53543, val loss: 0.52290, in 0.016s 1 tree, 27 leaves, max depth = 9, train loss: 0.52719, val loss: 0.51381, in 0.016s 1 tree, 29 leaves, max depth = 9, train loss: 0.51959, val loss: 0.50540, in 0.000s 1 tree, 29 leaves, max depth = 10, train loss: 0.51254, val loss: 0.49748, in 0.016s 1 tree, 28 leaves, max depth = 9, train loss: 0.50595, val loss: 0.49013, in 0.016s 1 tree, 55 leaves, max depth = 10, train loss: 0.49695, val loss: 0.48163, in 0.016s 1 tree, 28 leaves, max depth = 10, train loss: 0.49108, val loss: 0.47504, in 0.000s 1 tree, 28 leaves, max depth = 10, train loss: 0.48566, val loss: 0.46892, in 0.016s 1 tree, 9 leaves, max depth = 6, train loss: 0.48057, val loss: 0.46315, in 0.000s 1 tree, 30 leaves, max depth = 9, train loss: 0.47581, val loss: 0.45770, in 0.016s 1 tree, 30 leaves, max depth = 10, train loss: 0.47142, val loss: 0.45264, in 0.016s 1 tree, 63 leaves, max depth = 8, train loss: 0.46604, val loss: 0.44799, in 0.016s 1 tree, 29 leaves, max depth = 9, train loss: 0.46211, val loss: 0.44343, in 0.000s 1 tree, 29 leaves, max depth = 9, train loss: 0.45847, val loss: 0.43919, in 0.016s 1 tree, 64 leaves, max depth = 15, train loss: 0.45379, val loss: 0.43466, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.45053, val loss: 0.43152, in 0.000s 1 tree, 30 leaves, max depth = 10, train loss: 0.44747, val loss: 0.42791, in 0.016s 1 tree, 52 leaves, max depth = 10, train loss: 0.44405, val loss: 0.42437, in 0.016s 1 tree, 63 leaves, max depth = 8, train loss: 0.43972, val loss: 0.42076, in 0.016s 1 tree, 57 leaves, max depth = 12, train loss: 0.43350, val loss: 0.41510, in 0.016s 1 tree, 63 leaves, max depth = 8, train loss: 0.42981, val loss: 0.41209, in 0.000s 1 tree, 60 leaves, max depth = 21, train loss: 0.42690, val loss: 0.40919, in 0.016s 1 tree, 60 leaves, max depth = 21, train loss: 0.42422, val 
loss: 0.40651, in 0.016s 1 tree, 31 leaves, max depth = 15, train loss: 0.42175, val loss: 0.40359, in 0.016s 1 tree, 29 leaves, max depth = 10, train loss: 0.41935, val loss: 0.40072, in 0.000s 1 tree, 59 leaves, max depth = 19, train loss: 0.41688, val loss: 0.39827, in 0.016s 1 tree, 31 leaves, max depth = 15, train loss: 0.41479, val loss: 0.39576, in 0.016s 1 tree, 55 leaves, max depth = 11, train loss: 0.40986, val loss: 0.39136, in 0.016s 1 tree, 30 leaves, max depth = 10, train loss: 0.40785, val loss: 0.38893, in 0.000s 1 tree, 64 leaves, max depth = 11, train loss: 0.40498, val loss: 0.38629, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.40232, val loss: 0.38374, in 0.016s 1 tree, 28 leaves, max depth = 8, train loss: 0.40036, val loss: 0.38221, in 0.000s 1 tree, 61 leaves, max depth = 18, train loss: 0.39835, val loss: 0.38024, in 0.016s 1 tree, 54 leaves, max depth = 10, train loss: 0.39425, val loss: 0.37663, in 0.016s 1 tree, 64 leaves, max depth = 12, train loss: 0.39186, val loss: 0.37446, in 0.016s 1 tree, 64 leaves, max depth = 9, train loss: 0.38963, val loss: 0.37280, in 0.016s 1 tree, 55 leaves, max depth = 10, train loss: 0.38615, val loss: 0.36977, in 0.016s 1 tree, 26 leaves, max depth = 9, train loss: 0.38452, val loss: 0.36781, in 0.000s 1 tree, 63 leaves, max depth = 8, train loss: 0.38264, val loss: 0.36645, in 0.016s 1 tree, 55 leaves, max depth = 10, train loss: 0.37955, val loss: 0.36378, in 0.016s 1 tree, 61 leaves, max depth = 20, train loss: 0.37812, val loss: 0.36237, in 0.016s 1 tree, 55 leaves, max depth = 10, train loss: 0.37535, val loss: 0.35999, in 0.016s 1 tree, 55 leaves, max depth = 10, train loss: 0.37279, val loss: 0.35780, in 0.016s 1 tree, 28 leaves, max depth = 14, train loss: 0.37141, val loss: 0.35609, in 0.000s 1 tree, 30 leaves, max depth = 9, train loss: 0.37008, val loss: 0.35444, in 0.016s 1 tree, 55 leaves, max depth = 10, train loss: 0.36768, val loss: 0.35241, in 0.016s 1 tree, 8 leaves, max 
depth = 5, train loss: 0.36638, val loss: 0.35081, in 0.016s 1 tree, 30 leaves, max depth = 11, train loss: 0.36525, val loss: 0.34937, in 0.000s 1 tree, 54 leaves, max depth = 10, train loss: 0.36301, val loss: 0.34749, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.36063, val loss: 0.34523, in 0.016s 1 tree, 64 leaves, max depth = 12, train loss: 0.35929, val loss: 0.34409, in 0.016s 1 tree, 55 leaves, max depth = 10, train loss: 0.35730, val loss: 0.34245, in 0.016s 1 tree, 37 leaves, max depth = 10, train loss: 0.35617, val loss: 0.34121, in 0.000s 1 tree, 37 leaves, max depth = 10, train loss: 0.35514, val loss: 0.34006, in 0.016s 1 tree, 30 leaves, max depth = 9, train loss: 0.35422, val loss: 0.33889, in 0.016s 1 tree, 36 leaves, max depth = 12, train loss: 0.35284, val loss: 0.33775, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.35065, val loss: 0.33565, in 0.000s 1 tree, 65 leaves, max depth = 11, train loss: 0.34947, val loss: 0.33469, in 0.016s 1 tree, 54 leaves, max depth = 10, train loss: 0.34772, val loss: 0.33327, in 0.016s 1 tree, 54 leaves, max depth = 10, train loss: 0.34611, val loss: 0.33197, in 0.016s 1 tree, 39 leaves, max depth = 11, train loss: 0.34516, val loss: 0.33092, in 0.016s 1 tree, 18 leaves, max depth = 7, train loss: 0.34438, val loss: 0.33003, in 0.000s Fit 72 trees in 1.127 s, (2802 total leaves) Time spent computing histograms: 0.405s Time spent finding best splits: 0.086s Time spent applying splits: 0.070s Time spent predicting: 0.000s Trial 32, Fold 5: Log loss = 0.35100856288112636, Average precision = 0.950646473005743, ROC-AUC = 0.9457785704996006, Elapsed Time = 1.1419028999989678 seconds
Optimization Progress: 33%|###3 | 33/100 [06:35<14:38, 13.11s/it]
[Trial 33 per-round boosting log omitted; each round printed tree count, leaves, max depth, train/validation loss, and timing.]
Trial 33, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 33, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Trial 33, Fold 1: Fit 13 trees in 0.361 s (377 total leaves)
Trial 33, Fold 1: Log loss = 0.5523366704044703, Average precision = 0.8140478710455946, ROC-AUC = 0.8619023571739245, Elapsed Time = 0.36222819999966305 seconds
Trial 33, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 33, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Trial 33, Fold 2: Fit 13 trees in 0.377 s (407 total leaves)
Trial 33, Fold 2: Log loss = 0.5521814942210168, Average precision = 0.8878644760545134, ROC-AUC = 0.9063182770192243, Elapsed Time = 0.3772525999993377 seconds
Trial 33, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 33, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Trial 33, Fold 3: Fit 13 trees in 0.377 s (357 total leaves)
Trial 33, Fold 3: Log loss = 0.5495966806223601, Average precision = 0.8902275823415812, ROC-AUC = 0.9079231666120907, Elapsed Time = 0.3846448000003875 seconds
Trial 33, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 33, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
Trial 33, Fold 4: Fit 13 trees in 0.376 s (378 total leaves)
Trial 33, Fold 4: Log loss = 0.5517180277152086, Average precision = 0.8204335615774272, ROC-AUC = 0.8677278482370867, Elapsed Time = 0.38067810000029567 seconds
Trial 33, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 33, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
Trial 33, Fold 5: Fit 13 trees in 0.377 s (378 total leaves)
Trial 33, Fold 5: Log loss = 0.5551145966827552, Average precision = 0.8855353230294833, ROC-AUC = 0.9000196941827843, Elapsed Time = 0.3723337999999785 seconds
Optimization Progress: 34%|###4 | 34/100 [06:44<12:59, 11.81s/it]
Trial 34, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 34, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[Trial 34, Fold 1 per-round boosting log omitted; the captured output is truncated mid-round, so no fold summary metrics are available here.]
leaves, max depth = 24, train loss: 0.28595, val loss: 0.28798, in 0.016s 1 tree, 64 leaves, max depth = 15, train loss: 0.28497, val loss: 0.28696, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.28404, val loss: 0.28575, in 0.000s 1 tree, 140 leaves, max depth = 18, train loss: 0.28243, val loss: 0.28485, in 0.016s 1 tree, 140 leaves, max depth = 20, train loss: 0.28082, val loss: 0.28362, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.27996, val loss: 0.28251, in 0.000s 1 tree, 83 leaves, max depth = 17, train loss: 0.27896, val loss: 0.28156, in 0.016s 1 tree, 140 leaves, max depth = 18, train loss: 0.27753, val loss: 0.28080, in 0.016s 1 tree, 131 leaves, max depth = 24, train loss: 0.27623, val loss: 0.28017, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.27541, val loss: 0.27910, in 0.000s 1 tree, 40 leaves, max depth = 10, train loss: 0.27456, val loss: 0.27836, in 0.000s 1 tree, 4 leaves, max depth = 3, train loss: 0.27383, val loss: 0.27738, in 0.016s 1 tree, 85 leaves, max depth = 17, train loss: 0.27293, val loss: 0.27635, in 0.016s 1 tree, 84 leaves, max depth = 13, train loss: 0.27138, val loss: 0.27547, in 0.000s 1 tree, 140 leaves, max depth = 17, train loss: 0.27018, val loss: 0.27490, in 0.016s 1 tree, 140 leaves, max depth = 17, train loss: 0.26887, val loss: 0.27393, in 0.016s 1 tree, 85 leaves, max depth = 14, train loss: 0.26804, val loss: 0.27291, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.26732, val loss: 0.27195, in 0.000s Fit 85 trees in 1.064 s, (5840 total leaves) Time spent computing histograms: 0.358s Time spent finding best splits: 0.077s Time spent applying splits: 0.088s Time spent predicting: 0.000s Trial 34, Fold 1: Log loss = 0.2807773485256574, Average precision = 0.9522178452405305, ROC-AUC = 0.9524693001008493, Elapsed Time = 1.0698346000008314 seconds Trial 34, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 34, Fold 2: Validation size = 5137 
where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986 Binning 0.040 GB of training data: 0.142 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 65 leaves, max depth = 15, train loss: 0.65638, val loss: 0.65475, in 0.016s 1 tree, 68 leaves, max depth = 11, train loss: 0.62534, val loss: 0.62225, in 0.000s 1 tree, 67 leaves, max depth = 14, train loss: 0.59839, val loss: 0.59387, in 0.016s 1 tree, 67 leaves, max depth = 14, train loss: 0.57553, val loss: 0.56976, in 0.000s 1 tree, 67 leaves, max depth = 14, train loss: 0.55565, val loss: 0.54871, in 0.016s 1 tree, 107 leaves, max depth = 15, train loss: 0.53740, val loss: 0.53119, in 0.016s 1 tree, 67 leaves, max depth = 20, train loss: 0.52158, val loss: 0.51443, in 0.016s 1 tree, 107 leaves, max depth = 15, train loss: 0.50660, val loss: 0.50012, in 0.000s 1 tree, 68 leaves, max depth = 12, train loss: 0.49338, val loss: 0.48599, in 0.016s 1 tree, 107 leaves, max depth = 16, train loss: 0.48096, val loss: 0.47419, in 0.016s 1 tree, 68 leaves, max depth = 14, train loss: 0.47029, val loss: 0.46280, in 0.000s 1 tree, 107 leaves, max depth = 15, train loss: 0.45981, val loss: 0.45289, in 0.016s 1 tree, 68 leaves, max depth = 17, train loss: 0.45088, val loss: 0.44338, in 0.000s 1 tree, 68 leaves, max depth = 15, train loss: 0.44292, val loss: 0.43475, in 0.016s 1 tree, 68 leaves, max depth = 18, train loss: 0.43616, val loss: 0.42749, in 0.016s 1 tree, 107 leaves, max depth = 14, train loss: 0.42762, val loss: 0.41952, in 0.000s 1 tree, 4 leaves, max depth = 3, train loss: 0.42217, val loss: 0.41385, in 0.016s 1 tree, 106 leaves, max depth = 14, train loss: 0.41477, val loss: 0.40700, in 0.000s 1 tree, 68 leaves, max depth = 14, train loss: 0.40943, val loss: 0.40129, in 0.016s 1 tree, 107 leaves, max depth = 14, train loss: 0.40301, val loss: 0.39541, in 0.016s 1 tree, 107 leaves, max depth = 14, train loss: 0.39738, val loss: 0.39024, in 0.016s 1 tree, 106 leaves, max depth = 14, train 
loss: 0.39249, val loss: 0.38583, in 0.000s 1 tree, 4 leaves, max depth = 3, train loss: 0.38798, val loss: 0.38112, in 0.016s 1 tree, 67 leaves, max depth = 12, train loss: 0.38358, val loss: 0.37632, in 0.000s 1 tree, 4 leaves, max depth = 3, train loss: 0.37970, val loss: 0.37226, in 0.016s 1 tree, 107 leaves, max depth = 15, train loss: 0.37539, val loss: 0.36839, in 0.000s 1 tree, 107 leaves, max depth = 15, train loss: 0.37166, val loss: 0.36510, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.36824, val loss: 0.36151, in 0.016s 1 tree, 67 leaves, max depth = 11, train loss: 0.36446, val loss: 0.35744, in 0.000s 1 tree, 107 leaves, max depth = 17, train loss: 0.36117, val loss: 0.35458, in 0.016s 1 tree, 67 leaves, max depth = 14, train loss: 0.35795, val loss: 0.35107, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.35495, val loss: 0.34791, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.35233, val loss: 0.34513, in 0.000s 1 tree, 67 leaves, max depth = 15, train loss: 0.34955, val loss: 0.34213, in 0.000s 1 tree, 107 leaves, max depth = 18, train loss: 0.34648, val loss: 0.33949, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.34413, val loss: 0.33700, in 0.016s 1 tree, 86 leaves, max depth = 16, train loss: 0.34166, val loss: 0.33427, in 0.000s 1 tree, 107 leaves, max depth = 18, train loss: 0.33894, val loss: 0.33197, in 0.016s 1 tree, 67 leaves, max depth = 14, train loss: 0.33661, val loss: 0.32957, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.33448, val loss: 0.32730, in 0.000s 1 tree, 83 leaves, max depth = 15, train loss: 0.33225, val loss: 0.32487, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.33036, val loss: 0.32284, in 0.000s 1 tree, 83 leaves, max depth = 15, train loss: 0.32843, val loss: 0.32074, in 0.016s 1 tree, 107 leaves, max depth = 17, train loss: 0.32585, val loss: 0.31858, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.32412, val loss: 0.31673, in 0.000s 1 tree, 107 
leaves, max depth = 18, train loss: 0.32184, val loss: 0.31484, in 0.016s 1 tree, 67 leaves, max depth = 13, train loss: 0.32009, val loss: 0.31302, in 0.016s 1 tree, 106 leaves, max depth = 17, train loss: 0.31808, val loss: 0.31138, in 0.000s 1 tree, 4 leaves, max depth = 3, train loss: 0.31646, val loss: 0.30964, in 0.000s 1 tree, 83 leaves, max depth = 16, train loss: 0.31474, val loss: 0.30775, in 0.016s 1 tree, 106 leaves, max depth = 16, train loss: 0.31286, val loss: 0.30621, in 0.000s 1 tree, 4 leaves, max depth = 3, train loss: 0.31137, val loss: 0.30461, in 0.016s 1 tree, 86 leaves, max depth = 16, train loss: 0.30985, val loss: 0.30298, in 0.016s 1 tree, 107 leaves, max depth = 16, train loss: 0.30814, val loss: 0.30158, in 0.000s 1 tree, 67 leaves, max depth = 14, train loss: 0.30675, val loss: 0.30017, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.30537, val loss: 0.29869, in 0.000s 1 tree, 83 leaves, max depth = 16, train loss: 0.30406, val loss: 0.29725, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.30283, val loss: 0.29591, in 0.000s 1 tree, 107 leaves, max depth = 17, train loss: 0.30122, val loss: 0.29464, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.30010, val loss: 0.29341, in 0.000s 1 tree, 84 leaves, max depth = 14, train loss: 0.29813, val loss: 0.29203, in 0.016s 1 tree, 83 leaves, max depth = 16, train loss: 0.29687, val loss: 0.29064, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.29578, val loss: 0.28960, in 0.000s 1 tree, 4 leaves, max depth = 3, train loss: 0.29473, val loss: 0.28845, in 0.016s 1 tree, 106 leaves, max depth = 17, train loss: 0.29333, val loss: 0.28733, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.29238, val loss: 0.28628, in 0.000s 1 tree, 83 leaves, max depth = 16, train loss: 0.29129, val loss: 0.28509, in 0.016s 1 tree, 131 leaves, max depth = 21, train loss: 0.28970, val loss: 0.28436, in 0.016s 1 tree, 65 leaves, max depth = 18, train loss: 0.28867, val 
loss: 0.28342, in 0.000s 1 tree, 4 leaves, max depth = 3, train loss: 0.28777, val loss: 0.28242, in 0.016s 1 tree, 106 leaves, max depth = 17, train loss: 0.28655, val loss: 0.28151, in 0.016s 1 tree, 140 leaves, max depth = 18, train loss: 0.28494, val loss: 0.28081, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.28409, val loss: 0.27987, in 0.000s 1 tree, 83 leaves, max depth = 17, train loss: 0.28304, val loss: 0.27882, in 0.000s 1 tree, 84 leaves, max depth = 14, train loss: 0.28152, val loss: 0.27780, in 0.016s 1 tree, 86 leaves, max depth = 21, train loss: 0.28056, val loss: 0.27673, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.27976, val loss: 0.27584, in 0.000s 1 tree, 140 leaves, max depth = 17, train loss: 0.27823, val loss: 0.27522, in 0.016s 1 tree, 131 leaves, max depth = 21, train loss: 0.27700, val loss: 0.27475, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.27626, val loss: 0.27392, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.27544, val loss: 0.27315, in 0.000s 1 tree, 140 leaves, max depth = 18, train loss: 0.27406, val loss: 0.27262, in 0.016s 1 tree, 86 leaves, max depth = 21, train loss: 0.27322, val loss: 0.27167, in 0.016s 1 tree, 140 leaves, max depth = 18, train loss: 0.27199, val loss: 0.27124, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.27128, val loss: 0.27043, in 0.000s Fit 85 trees in 1.126 s, (5776 total leaves) Time spent computing histograms: 0.381s Time spent finding best splits: 0.081s Time spent applying splits: 0.093s Time spent predicting: 0.000s Trial 34, Fold 2: Log loss = 0.2800543717383729, Average precision = 0.9508008089697524, ROC-AUC = 0.9531672190791882, Elapsed Time = 1.1384725000007165 seconds Trial 34, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 34, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 Binning 0.040 GB of training data: 0.158 s 0.016 s 0.004 GB of validation data: Fitting 
gradient boosted rounds: 1 tree, 65 leaves, max depth = 13, train loss: 0.65601, val loss: 0.65491, in 0.000s 1 tree, 65 leaves, max depth = 13, train loss: 0.62471, val loss: 0.62261, in 0.016s 1 tree, 65 leaves, max depth = 13, train loss: 0.59805, val loss: 0.59502, in 0.016s 1 tree, 67 leaves, max depth = 16, train loss: 0.57547, val loss: 0.57137, in 0.000s 1 tree, 66 leaves, max depth = 13, train loss: 0.55579, val loss: 0.55095, in 0.016s 1 tree, 106 leaves, max depth = 13, train loss: 0.53741, val loss: 0.53400, in 0.016s 1 tree, 66 leaves, max depth = 17, train loss: 0.52169, val loss: 0.51748, in 0.000s 1 tree, 106 leaves, max depth = 13, train loss: 0.50658, val loss: 0.50363, in 0.016s 1 tree, 66 leaves, max depth = 17, train loss: 0.49376, val loss: 0.49006, in 0.016s 1 tree, 106 leaves, max depth = 13, train loss: 0.48118, val loss: 0.47860, in 0.016s 1 tree, 66 leaves, max depth = 13, train loss: 0.47049, val loss: 0.46743, in 0.000s 1 tree, 106 leaves, max depth = 14, train loss: 0.45989, val loss: 0.45783, in 0.016s 1 tree, 67 leaves, max depth = 13, train loss: 0.45098, val loss: 0.44844, in 0.016s 1 tree, 67 leaves, max depth = 13, train loss: 0.44313, val loss: 0.44007, in 0.000s 1 tree, 67 leaves, max depth = 13, train loss: 0.43639, val loss: 0.43281, in 0.016s 1 tree, 106 leaves, max depth = 12, train loss: 0.42773, val loss: 0.42502, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.42206, val loss: 0.41978, in 0.000s 1 tree, 106 leaves, max depth = 12, train loss: 0.41456, val loss: 0.41311, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.40969, val loss: 0.40860, in 0.016s 1 tree, 106 leaves, max depth = 12, train loss: 0.40315, val loss: 0.40285, in 0.000s 1 tree, 106 leaves, max depth = 12, train loss: 0.39742, val loss: 0.39788, in 0.016s 1 tree, 106 leaves, max depth = 12, train loss: 0.39241, val loss: 0.39358, in 0.016s 1 tree, 67 leaves, max depth = 13, train loss: 0.38754, val loss: 0.38822, in 0.016s 1 tree, 67 
leaves, max depth = 14, train loss: 0.38330, val loss: 0.38357, in 0.000s 1 tree, 4 leaves, max depth = 3, train loss: 0.37922, val loss: 0.37978, in 0.016s 1 tree, 106 leaves, max depth = 15, train loss: 0.37483, val loss: 0.37605, in 0.016s 1 tree, 106 leaves, max depth = 15, train loss: 0.37099, val loss: 0.37285, in 0.000s 1 tree, 4 leaves, max depth = 3, train loss: 0.36740, val loss: 0.36951, in 0.016s 1 tree, 67 leaves, max depth = 14, train loss: 0.36374, val loss: 0.36541, in 0.000s 1 tree, 106 leaves, max depth = 17, train loss: 0.36033, val loss: 0.36262, in 0.016s 1 tree, 66 leaves, max depth = 13, train loss: 0.35724, val loss: 0.35919, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.35410, val loss: 0.35626, in 0.000s 1 tree, 4 leaves, max depth = 3, train loss: 0.35135, val loss: 0.35370, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.34895, val loss: 0.35144, in 0.000s 1 tree, 67 leaves, max depth = 13, train loss: 0.34628, val loss: 0.34854, in 0.016s 1 tree, 105 leaves, max depth = 17, train loss: 0.34308, val loss: 0.34598, in 0.016s 1 tree, 85 leaves, max depth = 15, train loss: 0.34066, val loss: 0.34293, in 0.000s 1 tree, 106 leaves, max depth = 17, train loss: 0.33783, val loss: 0.34070, in 0.016s 1 tree, 67 leaves, max depth = 12, train loss: 0.33553, val loss: 0.33832, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.33330, val loss: 0.33623, in 0.000s 1 tree, 4 leaves, max depth = 3, train loss: 0.33133, val loss: 0.33438, in 0.016s 1 tree, 83 leaves, max depth = 14, train loss: 0.32915, val loss: 0.33179, in 0.000s 1 tree, 83 leaves, max depth = 14, train loss: 0.32729, val loss: 0.32954, in 0.016s 1 tree, 103 leaves, max depth = 14, train loss: 0.32460, val loss: 0.32744, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.32278, val loss: 0.32572, in 0.000s 1 tree, 105 leaves, max depth = 14, train loss: 0.32039, val loss: 0.32389, in 0.016s [47/85] 1 tree, 67 leaves, max depth = 13, train loss: 0.31874, 
val loss: 0.32211, in 0.016s 1 tree, 106 leaves, max depth = 15, train loss: 0.31662, val loss: 0.32052, in 0.000s 1 tree, 4 leaves, max depth = 3, train loss: 0.31492, val loss: 0.31893, in 0.016s 1 tree, 83 leaves, max depth = 15, train loss: 0.31326, val loss: 0.31689, in 0.000s 1 tree, 106 leaves, max depth = 15, train loss: 0.31133, val loss: 0.31547, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.30977, val loss: 0.31399, in 0.016s 1 tree, 67 leaves, max depth = 13, train loss: 0.30838, val loss: 0.31250, in 0.000s 1 tree, 106 leaves, max depth = 15, train loss: 0.30662, val loss: 0.31123, in 0.016s 1 tree, 83 leaves, max depth = 15, train loss: 0.30519, val loss: 0.30943, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.30374, val loss: 0.30805, in 0.000s 1 tree, 106 leaves, max depth = 13, train loss: 0.30213, val loss: 0.30692, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.30083, val loss: 0.30569, in 0.016s 1 tree, 83 leaves, max depth = 15, train loss: 0.29952, val loss: 0.30404, in 0.000s 1 tree, 4 leaves, max depth = 3, train loss: 0.29835, val loss: 0.30292, in 0.016s 1 tree, 129 leaves, max depth = 19, train loss: 0.29654, val loss: 0.30210, in 0.016s 1 tree, 67 leaves, max depth = 14, train loss: 0.29534, val loss: 0.30085, in 0.016s 1 tree, 105 leaves, max depth = 15, train loss: 0.29392, val loss: 0.29988, in 0.000s 1 tree, 4 leaves, max depth = 3, train loss: 0.29281, val loss: 0.29883, in 0.016s 1 tree, 83 leaves, max depth = 17, train loss: 0.29169, val loss: 0.29740, in 0.000s 1 tree, 4 leaves, max depth = 3, train loss: 0.29069, val loss: 0.29644, in 0.016s 1 tree, 106 leaves, max depth = 15, train loss: 0.28934, val loss: 0.29554, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.28843, val loss: 0.29467, in 0.000s 1 tree, 64 leaves, max depth = 16, train loss: 0.28739, val loss: 0.29361, in 0.016s 1 tree, 106 leaves, max depth = 15, train loss: 0.28617, val loss: 0.29282, in 0.000s 1 tree, 83 leaves, max 
depth = 17, train loss: 0.28518, val loss: 0.29153, in 0.016s 1 tree, 130 leaves, max depth = 20, train loss: 0.28374, val loss: 0.29104, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.28285, val loss: 0.29020, in 0.000s 1 tree, 40 leaves, max depth = 10, train loss: 0.28192, val loss: 0.28939, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.28112, val loss: 0.28862, in 0.000s 1 tree, 83 leaves, max depth = 13, train loss: 0.27949, val loss: 0.28740, in 0.016s 1 tree, 85 leaves, max depth = 15, train loss: 0.27847, val loss: 0.28630, in 0.000s 1 tree, 140 leaves, max depth = 15, train loss: 0.27708, val loss: 0.28525, in 0.031s 1 tree, 4 leaves, max depth = 3, train loss: 0.27632, val loss: 0.28451, in 0.000s 1 tree, 140 leaves, max depth = 17, train loss: 0.27491, val loss: 0.28359, in 0.016s 1 tree, 85 leaves, max depth = 15, train loss: 0.27401, val loss: 0.28259, in 0.016s 1 tree, 140 leaves, max depth = 15, train loss: 0.27277, val loss: 0.28166, in 0.016s 1 tree, 131 leaves, max depth = 18, train loss: 0.27160, val loss: 0.28135, in 0.016s 1 tree, 83 leaves, max depth = 21, train loss: 0.27085, val loss: 0.28032, in 0.000s 1 tree, 4 leaves, max depth = 3, train loss: 0.27010, val loss: 0.27961, in 0.016s Fit 85 trees in 1.189 s, (5765 total leaves) Time spent computing histograms: 0.402s Time spent finding best splits: 0.084s Time spent applying splits: 0.096s Time spent predicting: 0.000s Trial 34, Fold 3: Log loss = 0.2744742454659397, Average precision = 0.9557721692213328, ROC-AUC = 0.9553627860417409, Elapsed Time = 1.1863267999997333 seconds Trial 34, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 34, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 Binning 0.040 GB of training data: 0.158 s 0.016 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 67 leaves, max depth = 13, train loss: 0.65633, val loss: 0.65431, in 0.000s 1 tree, 67 leaves, 
max depth = 13, train loss: 0.62527, val loss: 0.62136, in 0.016s 1 tree, 67 leaves, max depth = 15, train loss: 0.59856, val loss: 0.59275, in 0.016s 1 tree, 67 leaves, max depth = 13, train loss: 0.57590, val loss: 0.56845, in 0.000s 1 tree, 67 leaves, max depth = 15, train loss: 0.55621, val loss: 0.54711, in 0.000s 1 tree, 101 leaves, max depth = 14, train loss: 0.53808, val loss: 0.52926, in 0.016s 1 tree, 67 leaves, max depth = 15, train loss: 0.52227, val loss: 0.51210, in 0.016s 1 tree, 104 leaves, max depth = 15, train loss: 0.50739, val loss: 0.49748, in 0.000s 1 tree, 67 leaves, max depth = 14, train loss: 0.49449, val loss: 0.48334, in 0.016s 1 tree, 104 leaves, max depth = 15, train loss: 0.48211, val loss: 0.47120, in 0.016s 1 tree, 67 leaves, max depth = 14, train loss: 0.47129, val loss: 0.45916, in 0.000s 1 tree, 104 leaves, max depth = 15, train loss: 0.46088, val loss: 0.44898, in 0.016s 1 tree, 67 leaves, max depth = 13, train loss: 0.45197, val loss: 0.43904, in 0.016s 1 tree, 67 leaves, max depth = 14, train loss: 0.44410, val loss: 0.43012, in 0.000s 1 tree, 67 leaves, max depth = 13, train loss: 0.43733, val loss: 0.42244, in 0.016s 1 tree, 104 leaves, max depth = 14, train loss: 0.42888, val loss: 0.41423, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.42340, val loss: 0.40837, in 0.000s 1 tree, 104 leaves, max depth = 14, train loss: 0.41607, val loss: 0.40125, in 0.016s 1 tree, 68 leaves, max depth = 16, train loss: 0.41067, val loss: 0.39502, in 0.016s 1 tree, 104 leaves, max depth = 13, train loss: 0.40432, val loss: 0.38885, in 0.000s 1 tree, 104 leaves, max depth = 13, train loss: 0.39876, val loss: 0.38346, in 0.016s 1 tree, 104 leaves, max depth = 13, train loss: 0.39390, val loss: 0.37877, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.38935, val loss: 0.37387, in 0.000s 1 tree, 67 leaves, max depth = 13, train loss: 0.38507, val loss: 0.36895, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.38117, val 
loss: 0.36474, in 0.000s 1 tree, 104 leaves, max depth = 14, train loss: 0.37690, val loss: 0.36070, in 0.016s 1 tree, 104 leaves, max depth = 14, train loss: 0.37317, val loss: 0.35713, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.36973, val loss: 0.35340, in 0.016s 1 tree, 67 leaves, max depth = 12, train loss: 0.36603, val loss: 0.34919, in 0.000s 1 tree, 104 leaves, max depth = 15, train loss: 0.36274, val loss: 0.34606, in 0.016s 1 tree, 67 leaves, max depth = 14, train loss: 0.35961, val loss: 0.34245, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.35659, val loss: 0.33918, in 0.000s 1 tree, 4 leaves, max depth = 3, train loss: 0.35395, val loss: 0.33632, in 0.016s 1 tree, 67 leaves, max depth = 14, train loss: 0.35126, val loss: 0.33324, in 0.000s 1 tree, 104 leaves, max depth = 14, train loss: 0.34821, val loss: 0.33045, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.34585, val loss: 0.32788, in 0.000s 1 tree, 86 leaves, max depth = 17, train loss: 0.34338, val loss: 0.32537, in 0.016s 1 tree, 104 leaves, max depth = 14, train loss: 0.34062, val loss: 0.32287, in 0.016s 1 tree, 67 leaves, max depth = 13, train loss: 0.33836, val loss: 0.32036, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.33622, val loss: 0.31803, in 0.000s 1 tree, 85 leaves, max depth = 15, train loss: 0.33404, val loss: 0.31583, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.33214, val loss: 0.31374, in 0.000s 1 tree, 85 leaves, max depth = 15, train loss: 0.33025, val loss: 0.31185, in 0.016s 1 tree, 104 leaves, max depth = 14, train loss: 0.32763, val loss: 0.30951, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.32590, val loss: 0.30762, in 0.000s 1 tree, 104 leaves, max depth = 14, train loss: 0.32355, val loss: 0.30553, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.32190, val loss: 0.30399, in 0.000s 1 tree, 104 leaves, max depth = 14, train loss: 0.31988, val loss: 0.30222, in 0.000s 1 tree, 4 leaves, max 
depth = 3, train loss: 0.31825, val loss: 0.30042, in 0.016s 1 tree, 85 leaves, max depth = 18, train loss: 0.31654, val loss: 0.29874, in 0.016s 1 tree, 64 leaves, max depth = 12, train loss: 0.31507, val loss: 0.29701, in 0.000s 1 tree, 4 leaves, max depth = 3, train loss: 0.31360, val loss: 0.29541, in 0.016s 1 tree, 104 leaves, max depth = 16, train loss: 0.31167, val loss: 0.29378, in 0.000s 1 tree, 86 leaves, max depth = 15, train loss: 0.31021, val loss: 0.29233, in 0.016s 1 tree, 104 leaves, max depth = 15, train loss: 0.30851, val loss: 0.29084, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.30712, val loss: 0.28930, in 0.000s 1 tree, 85 leaves, max depth = 18, train loss: 0.30580, val loss: 0.28806, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.30456, val loss: 0.28669, in 0.016s 1 tree, 104 leaves, max depth = 15, train loss: 0.30295, val loss: 0.28536, in 0.000s 1 tree, 4 leaves, max depth = 3, train loss: 0.30182, val loss: 0.28411, in 0.016s 1 tree, 67 leaves, max depth = 13, train loss: 0.30057, val loss: 0.28289, in 0.000s 1 tree, 83 leaves, max depth = 16, train loss: 0.29856, val loss: 0.28111, in 0.016s 1 tree, 85 leaves, max depth = 18, train loss: 0.29740, val loss: 0.28004, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.29635, val loss: 0.27889, in 0.000s 1 tree, 103 leaves, max depth = 15, train loss: 0.29495, val loss: 0.27771, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.29400, val loss: 0.27664, in 0.000s 1 tree, 41 leaves, max depth = 11, train loss: 0.29294, val loss: 0.27581, in 0.000s 1 tree, 140 leaves, max depth = 19, train loss: 0.29124, val loss: 0.27474, in 0.016s 1 tree, 85 leaves, max depth = 18, train loss: 0.29019, val loss: 0.27380, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.28928, val loss: 0.27279, in 0.000s 1 tree, 83 leaves, max depth = 13, train loss: 0.28759, val loss: 0.27131, in 0.016s 1 tree, 140 leaves, max depth = 18, train loss: 0.28592, val loss: 0.27038, 
in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.28509, val loss: 0.26946, in 0.000s 1 tree, 86 leaves, max depth = 17, train loss: 0.28409, val loss: 0.26859, in 0.016s 1 tree, 129 leaves, max depth = 23, train loss: 0.28256, val loss: 0.26746, in 0.016s 1 tree, 64 leaves, max depth = 12, train loss: 0.28172, val loss: 0.26646, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.28093, val loss: 0.26559, in 0.000s 1 tree, 140 leaves, max depth = 20, train loss: 0.27946, val loss: 0.26469, in 0.016s 1 tree, 85 leaves, max depth = 17, train loss: 0.27862, val loss: 0.26397, in 0.016s 1 tree, 140 leaves, max depth = 18, train loss: 0.27731, val loss: 0.26317, in 0.016s 1 tree, 85 leaves, max depth = 17, train loss: 0.27655, val loss: 0.26251, in 0.016s 1 tree, 129 leaves, max depth = 22, train loss: 0.27523, val loss: 0.26151, in 0.016s 1 tree, 140 leaves, max depth = 18, train loss: 0.27410, val loss: 0.26084, in 0.016s 1 tree, 140 leaves, max depth = 20, train loss: 0.27280, val loss: 0.26014, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.27198, val loss: 0.25924, in 0.000s Fit 85 trees in 1.190 s, (5883 total leaves) Time spent computing histograms: 0.410s Time spent finding best splits: 0.086s Time spent applying splits: 0.100s Time spent predicting: 0.016s Trial 34, Fold 4: Log loss = 0.2763670443208256, Average precision = 0.9559006771424874, ROC-AUC = 0.9546079165981148, Elapsed Time = 1.2050694999998086 seconds Trial 34, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 34, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.158 s 0.016 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 64 leaves, max depth = 14, train loss: 0.65604, val loss: 0.65344, in 0.000s 1 tree, 67 leaves, max depth = 14, train loss: 0.62471, val loss: 0.61997, in 0.016s 1 tree, 67 leaves, max depth = 14, train loss: 0.59802, val loss: 
0.59130, in 0.016s
[… per-round fitting log for the remaining boosting rounds elided …]
Fit 85 trees in 1.330 s, (5901 total leaves) Time spent computing histograms: 0.461s Time spent finding best splits: 0.103s Time spent applying splits: 0.131s Time spent predicting: 0.000s
Trial 34, Fold 5: Log loss = 0.2834021484954921, Average precision = 0.9528156004014571, ROC-AUC = 0.9512490199357152, Elapsed Time = 1.3338655999996263 seconds
Optimization Progress: 35%|###5 | 35/100 [06:56<12:58, 11.97s/it]
Trial 35, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 35, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[… per-round fitting log elided; Fit 65 trees in 1.002 s, (1831 total leaves) …]
Trial 35, Fold 1: Log loss = 0.4928565760604082, Average precision = 0.9120125022297894, ROC-AUC = 0.9211213185213862, Elapsed Time = 1.0007270999994944 seconds
Trial 35, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 35, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[… per-round fitting log elided; Fit 65 trees in 1.017 s, (2086 total leaves) …]
Trial 35, Fold 2: Log loss = 0.49437818976459447, Average precision = 0.9123666380898829, ROC-AUC = 0.927085492577864, Elapsed Time = 1.021703800000978 seconds
Trial 35, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 35, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[… per-round fitting log elided; Fit 65 trees in 0.970 s, (1674 total leaves) …]
Trial 35, Fold 3: Log loss = 0.4913007819756683, Average precision = 0.916790820488213, ROC-AUC = 0.9291073053808592, Elapsed Time = 0.9722562999995716 seconds
Trial 35, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 35, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[… per-round fitting log elided; Fit 65 trees in 0.892 s, (1763 total leaves) …]
Trial 35, Fold 4: Log loss = 0.4927859316438387, Average precision = 0.916245593193003, ROC-AUC = 0.9274143192152431, Elapsed Time = 0.8943703999993886 seconds
Trial 35, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 35, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[… per-round fitting log elided; Fit 65 trees in 0.861 s, (1892 total leaves) …]
Trial 35, Fold 5: Log loss = 0.49773541235725716, Average precision = 0.9113811010579826, ROC-AUC = 0.9217625922003606, Elapsed Time = 0.8786968999993405 seconds
Optimization Progress: 36%|###6 | 36/100 [07:08<12:41, 11.90s/it]
Trial 36, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371; Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Trial 36, Fold 1: Fit 21 trees in 0.595 s (317 total leaves); Log loss = 0.4301857673216879, Average precision = 0.9524029040859325, ROC-AUC = 0.9449978312613816, Elapsed Time = 0.6013696999998501 seconds
Trial 36, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396; Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Trial 36, Fold 2: Fit 21 trees in 0.643 s (287 total leaves); Log loss = 0.42915409712140934, Average precision = 0.9483329192762742, ROC-AUC = 0.9447445341206413, Elapsed Time = 0.6612169999989419 seconds
Trial 36, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876; Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Trial 36, Fold 3: Fit 21 trees in 0.643 s (297 total leaves); Log loss = 0.4271048823779159, Average precision = 0.9534852152384319, ROC-AUC = 0.9482975189249838, Elapsed Time = 0.6425328000004811 seconds
Trial 36, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592; Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
Trial 36, Fold 4: Fit 21 trees in 0.643 s (316 total leaves); Log loss = 0.4373566537181545, Average precision = 0.9497832966419613, ROC-AUC = 0.9433571535869868, Elapsed Time = 0.6570773000003101 seconds
Trial 36, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897; Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
Trial 36, Fold 5: Fit 21 trees in 0.642 s (333 total leaves); Log loss = 0.44078513005362435, Average precision = 0.9495169038635765, ROC-AUC = 0.9432683425301451, Elapsed Time = 0.6378953999992518 seconds
Optimization Progress: 37%|###7 | 37/100 [07:18<11:52, 11.31s/it]
Trial 37, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371; Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Trial 37, Fold 1: Fit 11 trees in 0.532 s (726 total leaves); Log loss = 0.5447501958689264, Average precision = 0.923759226037229, ROC-AUC = 0.932276730450285, Elapsed Time = 0.5382616000006237 seconds
Trial 37, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396; Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Trial 37, Fold 2: Fit 11 trees in 0.580 s (726 total leaves); Log loss = 0.5415810970872361, Average precision = 0.9166716562949201, ROC-AUC = 0.9333092003771843, Elapsed Time = 0.6004226000004564 seconds
Trial 37, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876; Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Trial 37, Fold 3: Fit 11 trees in 0.596 s (726 total leaves); Log loss = 0.5414732280783043, Average precision = 0.9260159673414275, ROC-AUC = 0.9363839285714284, Elapsed Time = 0.597768299998279 seconds
Trial 37, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592; Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
Trial 37, Fold 4: Fit 11 trees in 0.612 s (726 total leaves); Log loss = 0.5420898621380901, Average precision = 0.9209019511537924, ROC-AUC = 0.9337531980896108, Elapsed Time = 0.6224643000005017 seconds
Trial 37, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897; Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
Trial 37, Fold 5: Fit 11 trees in 0.596 s (726 total leaves); Log loss = 0.5459636127053025, Average precision = 0.9183070200349506, ROC-AUC = 0.9307324378053993, Elapsed Time = 0.6025019000007887 seconds
Optimization Progress: 38%|###8 | 38/100 [07:27<11:05, 10.74s/it]
Trial 38, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371; Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Trial 38, Fold 1: Fit 51 trees in 0.689 s (1237 total leaves); Log loss = 0.4815927550429081, Average precision = 0.9019773409073615, ROC-AUC = 0.9093855599049819, Elapsed Time = 0.7017431000003853 seconds
Trial 38, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396; Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Trial 38, Fold 2: Fit 51 trees in 0.737 s (1237 total leaves); Log loss = 0.4818755510950665, Average precision = 0.899719366512129, ROC-AUC = 0.9126473736555525, Elapsed Time = 0.7471654999990278 seconds
Trial 38, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876; Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Trial 38, Fold 3: Fit 51 trees in 0.799 s (1237 total leaves); Log loss = 0.4776605345559353, Average precision = 0.908762631514317, ROC-AUC = 0.9183635095732487, Elapsed Time = 0.7995752000006178 seconds
Trial 38, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592; Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
loss: 0.62191, in 0.016s 1 tree, 25 leaves, max depth = 13, train loss: 0.62024, val loss: 0.61592, in 0.000s 1 tree, 25 leaves, max depth = 11, train loss: 0.61483, val loss: 0.61016, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.60954, val loss: 0.60453, in 0.000s 1 tree, 25 leaves, max depth = 12, train loss: 0.60446, val loss: 0.59907, in 0.016s 1 tree, 25 leaves, max depth = 11, train loss: 0.59949, val loss: 0.59376, in 0.000s 1 tree, 25 leaves, max depth = 10, train loss: 0.59470, val loss: 0.58864, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.59000, val loss: 0.58363, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.58544, val loss: 0.57876, in 0.000s 1 tree, 6 leaves, max depth = 4, train loss: 0.58101, val loss: 0.57403, in 0.016s 1 tree, 25 leaves, max depth = 11, train loss: 0.57667, val loss: 0.56933, in 0.000s 1 tree, 25 leaves, max depth = 11, train loss: 0.57245, val loss: 0.56477, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.56836, val loss: 0.56034, in 0.016s 1 tree, 25 leaves, max depth = 11, train loss: 0.56440, val loss: 0.55603, in 0.000s 1 tree, 25 leaves, max depth = 10, train loss: 0.56054, val loss: 0.55184, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.55684, val loss: 0.54787, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.55300, val loss: 0.54408, in 0.016s 1 tree, 25 leaves, max depth = 12, train loss: 0.54947, val loss: 0.54024, in 0.000s 1 tree, 25 leaves, max depth = 10, train loss: 0.54605, val loss: 0.53655, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.54274, val loss: 0.53296, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.53911, val loss: 0.52938, in 0.000s 1 tree, 25 leaves, max depth = 10, train loss: 0.53592, val loss: 0.52594, in 0.016s 1 tree, 25 leaves, max depth = 12, train loss: 0.53282, val loss: 0.52254, in 0.000s 1 tree, 25 leaves, max depth = 10, train loss: 0.52975, val loss: 0.51917, in 0.016s 1 tree, 25 leaves, max 
depth = 9, train loss: 0.52632, val loss: 0.51579, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.52338, val loss: 0.51256, in 0.000s 1 tree, 25 leaves, max depth = 10, train loss: 0.52055, val loss: 0.50949, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.51781, val loss: 0.50650, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.51456, val loss: 0.50330, in 0.000s 1 tree, 25 leaves, max depth = 10, train loss: 0.51193, val loss: 0.50043, in 0.016s 1 tree, 25 leaves, max depth = 12, train loss: 0.50938, val loss: 0.49762, in 0.000s 1 tree, 25 leaves, max depth = 10, train loss: 0.50690, val loss: 0.49490, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.50444, val loss: 0.49217, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.50210, val loss: 0.48960, in 0.000s 1 tree, 25 leaves, max depth = 8, train loss: 0.49906, val loss: 0.48662, in 0.016s 1 tree, 25 leaves, max depth = 12, train loss: 0.49682, val loss: 0.48415, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.49463, val loss: 0.48174, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.49172, val loss: 0.47890, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.48958, val loss: 0.47651, in 0.016s 1 tree, 25 leaves, max depth = 11, train loss: 0.48750, val loss: 0.47418, in 0.000s 1 tree, 25 leaves, max depth = 8, train loss: 0.48470, val loss: 0.47145, in 0.016s 1 tree, 25 leaves, max depth = 12, train loss: 0.48275, val loss: 0.46925, in 0.016s Fit 51 trees in 0.799 s, (1237 total leaves) Time spent computing histograms: 0.257s Time spent finding best splits: 0.031s Time spent applying splits: 0.032s Time spent predicting: 0.047s Trial 38, Fold 4: Log loss = 0.4817802276329943, Average precision = 0.9027976986375478, ROC-AUC = 0.9135190222250835, Elapsed Time = 0.7933245000003808 seconds Trial 38, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 38, Fold 5: Validation size = 5188 where 0 = 2625, 
1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.172 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 25 leaves, max depth = 10, train loss: 0.68526, val loss: 0.68471, in 0.016s 1 tree, 25 leaves, max depth = 11, train loss: 0.67770, val loss: 0.67668, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.67035, val loss: 0.66885, in 0.016s 1 tree, 6 leaves, max depth = 4, train loss: 0.66324, val loss: 0.66126, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.65640, val loss: 0.65396, in 0.016s 1 tree, 25 leaves, max depth = 12, train loss: 0.64976, val loss: 0.64687, in 0.000s 1 tree, 25 leaves, max depth = 11, train loss: 0.64334, val loss: 0.64001, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.63714, val loss: 0.63336, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.63108, val loss: 0.62688, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.62516, val loss: 0.62052, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.61950, val loss: 0.61439, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.61403, val loss: 0.60851, in 0.016s 1 tree, 25 leaves, max depth = 11, train loss: 0.60868, val loss: 0.60277, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.60356, val loss: 0.59720, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.59852, val loss: 0.59178, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.59367, val loss: 0.58654, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.58893, val loss: 0.58143, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.58436, val loss: 0.57643, in 0.000s 1 tree, 6 leaves, max depth = 4, train loss: 0.57989, val loss: 0.57159, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.57551, val loss: 0.56683, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.57126, val loss: 0.56221, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.56713, val loss: 0.55772, in 
0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.56313, val loss: 0.55335, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.55924, val loss: 0.54911, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.55552, val loss: 0.54505, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.55167, val loss: 0.54135, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.54810, val loss: 0.53745, in 0.016s 1 tree, 25 leaves, max depth = 11, train loss: 0.54465, val loss: 0.53367, in 0.000s 1 tree, 25 leaves, max depth = 13, train loss: 0.54129, val loss: 0.52999, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.53765, val loss: 0.52651, in 0.000s 1 tree, 25 leaves, max depth = 11, train loss: 0.53444, val loss: 0.52298, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.53130, val loss: 0.51951, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.52820, val loss: 0.51610, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.52476, val loss: 0.51283, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.52178, val loss: 0.50956, in 0.016s 1 tree, 25 leaves, max depth = 11, train loss: 0.51895, val loss: 0.50644, in 0.016s 1 tree, 25 leaves, max depth = 11, train loss: 0.51620, val loss: 0.50340, in 0.000s 1 tree, 25 leaves, max depth = 10, train loss: 0.51295, val loss: 0.50031, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.51029, val loss: 0.49738, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.50772, val loss: 0.49449, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.50520, val loss: 0.49171, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.50271, val loss: 0.48895, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.50035, val loss: 0.48632, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.49730, val loss: 0.48344, in 0.000s 1 tree, 25 leaves, max depth = 11, train loss: 0.49504, val loss: 0.48093, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 
0.49282, val loss: 0.47846, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.48990, val loss: 0.47571, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.48773, val loss: 0.47329, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.48561, val loss: 0.47094, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.48281, val loss: 0.46831, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.48083, val loss: 0.46608, in 0.000s Fit 51 trees in 0.813 s, (1237 total leaves) Time spent computing histograms: 0.269s Time spent finding best splits: 0.033s Time spent applying splits: 0.033s Time spent predicting: 0.000s Trial 38, Fold 5: Log loss = 0.4873302464752492, Average precision = 0.8985605879198201, ROC-AUC = 0.9079279861769133, Elapsed Time = 0.8260637000003044 seconds
Optimization Progress: 39%|###9 | 39/100 [07:38<11:00, 10.82s/it]
Trial 39, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 39, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[Per-round boosting log omitted.] Fit 82 trees in 1.173 s (6143 total leaves).
Trial 39, Fold 1: Log loss = 0.3873480272135086, Average precision = 0.9476433109414608, ROC-AUC = 0.9442173392743628, Elapsed Time = 1.1750902000003407 seconds
Trial 39, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 39, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[Per-round boosting log omitted.] Fit 82 trees in 1.284 s (6181 total leaves).
Trial 39, Fold 2: Log loss = 0.38897657752043585, Average precision = 0.9443297070994459, ROC-AUC = 0.9444997649426803, Elapsed Time = 1.2912431999993714 seconds
Trial 39, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 39, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[Per-round boosting log continues; output truncated.]
= 12, train loss: 0.39557, val loss: 0.39549, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.39430, val loss: 0.39431, in 0.000s 1 tree, 4 leaves, max depth = 3, train loss: 0.39308, val loss: 0.39318, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.39191, val loss: 0.39209, in 0.000s 1 tree, 66 leaves, max depth = 11, train loss: 0.39057, val loss: 0.39062, in 0.016s 1 tree, 67 leaves, max depth = 15, train loss: 0.38924, val loss: 0.38916, in 0.016s 1 tree, 102 leaves, max depth = 15, train loss: 0.38777, val loss: 0.38791, in 0.016s Fit 82 trees in 1.298 s, (6025 total leaves) Time spent computing histograms: 0.445s Time spent finding best splits: 0.100s Time spent applying splits: 0.117s Time spent predicting: 0.000s Trial 39, Fold 3: Log loss = 0.38644798890290544, Average precision = 0.9486453359713937, ROC-AUC = 0.9473633085253923, Elapsed Time = 1.3085611999995308 seconds Trial 39, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 39, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 Binning 0.040 GB of training data: 0.158 s 0.016 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 65 leaves, max depth = 16, train loss: 0.68206, val loss: 0.68143, in 0.000s 1 tree, 65 leaves, max depth = 16, train loss: 0.67159, val loss: 0.67034, in 0.016s 1 tree, 65 leaves, max depth = 16, train loss: 0.66160, val loss: 0.65975, in 0.016s 1 tree, 65 leaves, max depth = 16, train loss: 0.65206, val loss: 0.64962, in 0.000s 1 tree, 65 leaves, max depth = 16, train loss: 0.64295, val loss: 0.63994, in 0.016s 1 tree, 65 leaves, max depth = 16, train loss: 0.63425, val loss: 0.63067, in 0.016s 1 tree, 65 leaves, max depth = 16, train loss: 0.62593, val loss: 0.62179, in 0.000s 1 tree, 65 leaves, max depth = 16, train loss: 0.61797, val loss: 0.61329, in 0.016s 1 tree, 65 leaves, max depth = 16, train loss: 0.61035, val loss: 0.60514, in 0.016s 1 tree, 65 leaves, max 
depth = 16, train loss: 0.60306, val loss: 0.59733, in 0.016s 1 tree, 65 leaves, max depth = 16, train loss: 0.59608, val loss: 0.58984, in 0.000s 1 tree, 65 leaves, max depth = 16, train loss: 0.58940, val loss: 0.58266, in 0.016s 1 tree, 68 leaves, max depth = 15, train loss: 0.58303, val loss: 0.57579, in 0.016s 1 tree, 65 leaves, max depth = 16, train loss: 0.57688, val loss: 0.56917, in 0.016s 1 tree, 65 leaves, max depth = 16, train loss: 0.57100, val loss: 0.56281, in 0.000s 1 tree, 68 leaves, max depth = 18, train loss: 0.56536, val loss: 0.55678, in 0.016s 1 tree, 66 leaves, max depth = 16, train loss: 0.55994, val loss: 0.55091, in 0.000s 1 tree, 107 leaves, max depth = 16, train loss: 0.55436, val loss: 0.54549, in 0.031s 1 tree, 68 leaves, max depth = 15, train loss: 0.54929, val loss: 0.53998, in 0.000s 1 tree, 107 leaves, max depth = 16, train loss: 0.54402, val loss: 0.53486, in 0.016s 1 tree, 66 leaves, max depth = 15, train loss: 0.53922, val loss: 0.52964, in 0.016s 1 tree, 107 leaves, max depth = 14, train loss: 0.53424, val loss: 0.52481, in 0.016s 1 tree, 68 leaves, max depth = 15, train loss: 0.52973, val loss: 0.51989, in 0.016s 1 tree, 67 leaves, max depth = 14, train loss: 0.52539, val loss: 0.51513, in 0.000s 1 tree, 107 leaves, max depth = 14, train loss: 0.52073, val loss: 0.51063, in 0.016s 1 tree, 107 leaves, max depth = 14, train loss: 0.51626, val loss: 0.50631, in 0.016s 1 tree, 107 leaves, max depth = 14, train loss: 0.51197, val loss: 0.50216, in 0.016s 1 tree, 68 leaves, max depth = 15, train loss: 0.50803, val loss: 0.49782, in 0.016s 1 tree, 107 leaves, max depth = 14, train loss: 0.50395, val loss: 0.49389, in 0.016s 1 tree, 68 leaves, max depth = 15, train loss: 0.50022, val loss: 0.48978, in 0.016s 1 tree, 64 leaves, max depth = 13, train loss: 0.49662, val loss: 0.48588, in 0.000s 1 tree, 107 leaves, max depth = 16, train loss: 0.49279, val loss: 0.48219, in 0.016s 1 tree, 107 leaves, max depth = 16, train loss: 0.48912, 
val loss: 0.47865, in 0.016s 1 tree, 107 leaves, max depth = 16, train loss: 0.48558, val loss: 0.47525, in 0.016s 1 tree, 68 leaves, max depth = 14, train loss: 0.48229, val loss: 0.47160, in 0.016s 1 tree, 106 leaves, max depth = 17, train loss: 0.47893, val loss: 0.46836, in 0.016s 1 tree, 63 leaves, max depth = 13, train loss: 0.47579, val loss: 0.46494, in 0.016s 1 tree, 107 leaves, max depth = 17, train loss: 0.47259, val loss: 0.46186, in 0.016s 1 tree, 68 leaves, max depth = 15, train loss: 0.46963, val loss: 0.45857, in 0.000s 1 tree, 106 leaves, max depth = 16, train loss: 0.46657, val loss: 0.45564, in 0.000s 1 tree, 68 leaves, max depth = 15, train loss: 0.46377, val loss: 0.45250, in 0.016s 1 tree, 107 leaves, max depth = 17, train loss: 0.46085, val loss: 0.44971, in 0.016s 1 tree, 68 leaves, max depth = 19, train loss: 0.45817, val loss: 0.44675, in 0.016s 1 tree, 107 leaves, max depth = 16, train loss: 0.45539, val loss: 0.44409, in 0.016s 1 tree, 107 leaves, max depth = 16, train loss: 0.45271, val loss: 0.44153, in 0.016s 1 tree, 67 leaves, max depth = 14, train loss: 0.45019, val loss: 0.43872, in 0.000s 1 tree, 107 leaves, max depth = 16, train loss: 0.44764, val loss: 0.43627, in 0.016s 1 tree, 67 leaves, max depth = 14, train loss: 0.44525, val loss: 0.43360, in 0.016s 1 tree, 106 leaves, max depth = 16, train loss: 0.44281, val loss: 0.43126, in 0.016s 1 tree, 67 leaves, max depth = 15, train loss: 0.44054, val loss: 0.42874, in 0.016s 1 tree, 107 leaves, max depth = 14, train loss: 0.43820, val loss: 0.42652, in 0.000s 1 tree, 107 leaves, max depth = 14, train loss: 0.43596, val loss: 0.42437, in 0.016s 1 tree, 67 leaves, max depth = 15, train loss: 0.43382, val loss: 0.42199, in 0.016s 1 tree, 107 leaves, max depth = 15, train loss: 0.43167, val loss: 0.41994, in 0.016s 1 tree, 68 leaves, max depth = 17, train loss: 0.42962, val loss: 0.41762, in 0.016s 1 tree, 107 leaves, max depth = 17, train loss: 0.42756, val loss: 0.41566, in 0.016s 1 
tree, 67 leaves, max depth = 14, train loss: 0.42561, val loss: 0.41345, in 0.000s 1 tree, 107 leaves, max depth = 16, train loss: 0.42363, val loss: 0.41157, in 0.016s 1 tree, 68 leaves, max depth = 17, train loss: 0.42176, val loss: 0.40949, in 0.016s 1 tree, 106 leaves, max depth = 15, train loss: 0.41987, val loss: 0.40769, in 0.016s 1 tree, 67 leaves, max depth = 11, train loss: 0.41805, val loss: 0.40563, in 0.016s 1 tree, 106 leaves, max depth = 16, train loss: 0.41624, val loss: 0.40391, in 0.016s 1 tree, 106 leaves, max depth = 16, train loss: 0.41449, val loss: 0.40226, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.41292, val loss: 0.40059, in 0.000s 1 tree, 68 leaves, max depth = 18, train loss: 0.41124, val loss: 0.39867, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.40973, val loss: 0.39707, in 0.016s 1 tree, 107 leaves, max depth = 15, train loss: 0.40807, val loss: 0.39550, in 0.000s 1 tree, 3 leaves, max depth = 2, train loss: 0.40663, val loss: 0.39396, in 0.016s 1 tree, 106 leaves, max depth = 15, train loss: 0.40502, val loss: 0.39246, in 0.016s 1 tree, 67 leaves, max depth = 13, train loss: 0.40346, val loss: 0.39067, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.40208, val loss: 0.38921, in 0.000s 1 tree, 107 leaves, max depth = 15, train loss: 0.40055, val loss: 0.38777, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.39923, val loss: 0.38636, in 0.000s 1 tree, 107 leaves, max depth = 15, train loss: 0.39774, val loss: 0.38498, in 0.016s 1 tree, 67 leaves, max depth = 13, train loss: 0.39627, val loss: 0.38330, in 0.016s 1 tree, 67 leaves, max depth = 13, train loss: 0.39486, val loss: 0.38169, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.39360, val loss: 0.38035, in 0.000s 1 tree, 3 leaves, max depth = 2, train loss: 0.39240, val loss: 0.37906, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.39128, val loss: 0.37787, in 0.000s 1 tree, 67 leaves, max depth = 13, train loss: 
0.38994, val loss: 0.37634, in 0.016s 1 tree, 67 leaves, max depth = 15, train loss: 0.38860, val loss: 0.37484, in 0.016s 1 tree, 106 leaves, max depth = 14, train loss: 0.38719, val loss: 0.37352, in 0.016s Fit 82 trees in 1.299 s, (6190 total leaves) Time spent computing histograms: 0.448s Time spent finding best splits: 0.103s Time spent applying splits: 0.117s Time spent predicting: 0.031s Trial 39, Fold 4: Log loss = 0.3883433454778819, Average precision = 0.9477614116286428, ROC-AUC = 0.9447609003292871, Elapsed Time = 1.3085276000001613 seconds Trial 39, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 39, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.158 s 0.016 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 64 leaves, max depth = 13, train loss: 0.68203, val loss: 0.68127, in 0.000s 1 tree, 64 leaves, max depth = 13, train loss: 0.67156, val loss: 0.67007, in 0.016s 1 tree, 64 leaves, max depth = 13, train loss: 0.66158, val loss: 0.65936, in 0.016s 1 tree, 68 leaves, max depth = 15, train loss: 0.65192, val loss: 0.64903, in 0.000s 1 tree, 64 leaves, max depth = 13, train loss: 0.64281, val loss: 0.63924, in 0.016s 1 tree, 68 leaves, max depth = 15, train loss: 0.63398, val loss: 0.62977, in 0.000s 1 tree, 68 leaves, max depth = 15, train loss: 0.62554, val loss: 0.62071, in 0.016s 1 tree, 68 leaves, max depth = 15, train loss: 0.61746, val loss: 0.61203, in 0.016s 1 tree, 68 leaves, max depth = 15, train loss: 0.60974, val loss: 0.60371, in 0.016s 1 tree, 68 leaves, max depth = 15, train loss: 0.60234, val loss: 0.59574, in 0.000s 1 tree, 65 leaves, max depth = 13, train loss: 0.59535, val loss: 0.58814, in 0.016s 1 tree, 67 leaves, max depth = 15, train loss: 0.58856, val loss: 0.58080, in 0.016s 1 tree, 65 leaves, max depth = 13, train loss: 0.58214, val loss: 0.57381, in 0.000s 1 tree, 68 leaves, max depth = 15, train 
loss: 0.57590, val loss: 0.56704, in 0.016s 1 tree, 68 leaves, max depth = 15, train loss: 0.56991, val loss: 0.56054, in 0.016s 1 tree, 65 leaves, max depth = 19, train loss: 0.56425, val loss: 0.55443, in 0.000s 1 tree, 68 leaves, max depth = 15, train loss: 0.55875, val loss: 0.54842, in 0.016s 1 tree, 68 leaves, max depth = 15, train loss: 0.55346, val loss: 0.54265, in 0.016s 1 tree, 67 leaves, max depth = 12, train loss: 0.54844, val loss: 0.53719, in 0.016s 1 tree, 105 leaves, max depth = 13, train loss: 0.54317, val loss: 0.53217, in 0.016s 1 tree, 105 leaves, max depth = 13, train loss: 0.53812, val loss: 0.52737, in 0.000s 1 tree, 65 leaves, max depth = 13, train loss: 0.53349, val loss: 0.52226, in 0.016s 1 tree, 104 leaves, max depth = 13, train loss: 0.52871, val loss: 0.51773, in 0.016s 1 tree, 68 leaves, max depth = 14, train loss: 0.52428, val loss: 0.51286, in 0.000s 1 tree, 105 leaves, max depth = 13, train loss: 0.51975, val loss: 0.50858, in 0.016s 1 tree, 104 leaves, max depth = 13, train loss: 0.51541, val loss: 0.50448, in 0.016s 1 tree, 65 leaves, max depth = 12, train loss: 0.51136, val loss: 0.49998, in 0.016s 1 tree, 105 leaves, max depth = 13, train loss: 0.50724, val loss: 0.49610, in 0.016s 1 tree, 65 leaves, max depth = 13, train loss: 0.50341, val loss: 0.49185, in 0.000s 1 tree, 105 leaves, max depth = 14, train loss: 0.49950, val loss: 0.48818, in 0.016s 1 tree, 65 leaves, max depth = 14, train loss: 0.49587, val loss: 0.48419, in 0.016s 1 tree, 105 leaves, max depth = 14, train loss: 0.49215, val loss: 0.48071, in 0.016s 1 tree, 67 leaves, max depth = 13, train loss: 0.48869, val loss: 0.47690, in 0.000s 1 tree, 105 leaves, max depth = 14, train loss: 0.48516, val loss: 0.47360, in 0.000s 1 tree, 105 leaves, max depth = 14, train loss: 0.48177, val loss: 0.47043, in 0.016s 1 tree, 67 leaves, max depth = 13, train loss: 0.47853, val loss: 0.46686, in 0.016s [37/82] 1 tree, 105 leaves, max depth = 14, train loss: 0.47530, val loss: 
0.46385, in 0.016s 1 tree, 66 leaves, max depth = 13, train loss: 0.47223, val loss: 0.46045, in 0.000s 1 tree, 105 leaves, max depth = 14, train loss: 0.46915, val loss: 0.45760, in 0.032s 1 tree, 67 leaves, max depth = 13, train loss: 0.46623, val loss: 0.45436, in 0.000s 1 tree, 67 leaves, max depth = 13, train loss: 0.46343, val loss: 0.45125, in 0.016s 1 tree, 105 leaves, max depth = 14, train loss: 0.46052, val loss: 0.44856, in 0.016s 1 tree, 67 leaves, max depth = 13, train loss: 0.45786, val loss: 0.44559, in 0.016s 1 tree, 105 leaves, max depth = 15, train loss: 0.45508, val loss: 0.44303, in 0.016s 1 tree, 105 leaves, max depth = 15, train loss: 0.45240, val loss: 0.44057, in 0.016s 1 tree, 105 leaves, max depth = 15, train loss: 0.44982, val loss: 0.43821, in 0.000s 1 tree, 65 leaves, max depth = 17, train loss: 0.44737, val loss: 0.43548, in 0.000s 1 tree, 105 leaves, max depth = 14, train loss: 0.44491, val loss: 0.43323, in 0.016s 1 tree, 67 leaves, max depth = 12, train loss: 0.44255, val loss: 0.43059, in 0.016s 1 tree, 105 leaves, max depth = 13, train loss: 0.44020, val loss: 0.42845, in 0.016s 1 tree, 67 leaves, max depth = 12, train loss: 0.43795, val loss: 0.42593, in 0.016s 1 tree, 105 leaves, max depth = 17, train loss: 0.43570, val loss: 0.42388, in 0.016s 1 tree, 66 leaves, max depth = 19, train loss: 0.43358, val loss: 0.42151, in 0.000s 1 tree, 105 leaves, max depth = 16, train loss: 0.43142, val loss: 0.41956, in 0.016s 1 tree, 105 leaves, max depth = 16, train loss: 0.42935, val loss: 0.41769, in 0.016s 1 tree, 64 leaves, max depth = 17, train loss: 0.42735, val loss: 0.41545, in 0.016s 1 tree, 105 leaves, max depth = 16, train loss: 0.42536, val loss: 0.41366, in 0.016s 1 tree, 67 leaves, max depth = 12, train loss: 0.42343, val loss: 0.41147, in 0.000s 1 tree, 104 leaves, max depth = 15, train loss: 0.42152, val loss: 0.40976, in 0.016s 1 tree, 64 leaves, max depth = 19, train loss: 0.41970, val loss: 0.40771, in 0.016s 1 tree, 67 
leaves, max depth = 11, train loss: 0.41793, val loss: 0.40569, in 0.016s 1 tree, 105 leaves, max depth = 15, train loss: 0.41611, val loss: 0.40407, in 0.016s 1 tree, 105 leaves, max depth = 15, train loss: 0.41435, val loss: 0.40251, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.41275, val loss: 0.40096, in 0.000s 1 tree, 105 leaves, max depth = 15, train loss: 0.41106, val loss: 0.39947, in 0.016s 1 tree, 67 leaves, max depth = 11, train loss: 0.40940, val loss: 0.39758, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.40787, val loss: 0.39611, in 0.016s 1 tree, 105 leaves, max depth = 15, train loss: 0.40626, val loss: 0.39469, in 0.000s 1 tree, 4 leaves, max depth = 3, train loss: 0.40480, val loss: 0.39327, in 0.016s 1 tree, 105 leaves, max depth = 14, train loss: 0.40325, val loss: 0.39191, in 0.016s 1 tree, 67 leaves, max depth = 11, train loss: 0.40169, val loss: 0.39013, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.40029, val loss: 0.38879, in 0.000s 1 tree, 105 leaves, max depth = 14, train loss: 0.39880, val loss: 0.38749, in 0.016s 1 tree, 67 leaves, max depth = 11, train loss: 0.39732, val loss: 0.38579, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.39599, val loss: 0.38451, in 0.000s 1 tree, 67 leaves, max depth = 13, train loss: 0.39457, val loss: 0.38286, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.39330, val loss: 0.38163, in 0.000s 1 tree, 4 leaves, max depth = 3, train loss: 0.39207, val loss: 0.38046, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.39089, val loss: 0.37932, in 0.000s 1 tree, 66 leaves, max depth = 12, train loss: 0.38955, val loss: 0.37776, in 0.016s 1 tree, 67 leaves, max depth = 13, train loss: 0.38823, val loss: 0.37619, in 0.016s 1 tree, 105 leaves, max depth = 16, train loss: 0.38680, val loss: 0.37496, in 0.016s Fit 82 trees in 1.283 s, (6101 total leaves) Time spent computing histograms: 0.427s Time spent finding best splits: 0.098s Time spent applying 
splits: 0.112s Time spent predicting: 0.016s Trial 39, Fold 5: Log loss = 0.3938249430487527, Average precision = 0.9446614020697768, ROC-AUC = 0.941758430410791, Elapsed Time = 1.2814244000001054 seconds
Optimization Progress: 40%|#### | 40/100 [07:52<11:48, 11.80s/it]
Trial 40, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 40, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[per-round boosting log elided: 35 rounds]
Fit 35 trees in 0.533 s, (1386 total leaves) Time spent computing histograms: 0.144s Time spent finding best splits: 0.024s Time spent applying splits: 0.025s Time spent predicting: 0.000s
Trial 40, Fold 1: Log loss = 0.4573582800681732, Average precision = 0.9182890017667679, ROC-AUC = 0.9282691942329476, Elapsed Time = 0.5440377000013541 seconds
Trial 40, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 40, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[per-round boosting log elided: 35 rounds]
Fit 35 trees in 0.610 s, (1341 total leaves) Time spent computing histograms: 0.147s Time spent finding best splits: 0.025s Time spent applying splits: 0.025s Time spent predicting: 0.000s
Trial 40, Fold 2: Log loss = 0.4593806726958846, Average precision = 0.9127169139881373, ROC-AUC = 0.9285808911297653, Elapsed Time = 0.6046727999982977 seconds
Trial 40, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 40, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[per-round boosting log elided: 35 rounds]
Fit 35 trees in 0.627 s, (1343 total leaves) Time spent computing histograms: 0.159s Time spent finding best splits: 0.026s Time spent applying splits: 0.026s Time spent predicting: 0.000s
Trial 40, Fold 3: Log loss = 0.4559647888321185, Average precision = 0.9192760922522056, ROC-AUC = 0.9317236362454374, Elapsed Time = 0.6254176999991614 seconds
Trial 40, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 40, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[per-round boosting log elided: 35 rounds]
Fit 35 trees in 0.596 s, (1392 total leaves) Time spent computing histograms: 0.151s Time spent finding best splits: 0.026s Time spent applying splits: 0.026s Time spent predicting: 0.000s
Trial 40, Fold 4: Log loss = 0.45723244672666946, Average precision = 0.917938386877304, ROC-AUC = 0.9294905440269343, Elapsed Time = 0.6093828999983089 seconds
Trial 40, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 40, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
Fitting gradient boosted rounds: [early rounds elided] 1 tree, 32 leaves, max depth
= 12, train loss: 0.55073, val loss: 0.53997, in 0.016s 1 tree, 53 leaves, max depth = 12, train loss: 0.54421, val loss: 0.53375, in 0.016s 1 tree, 32 leaves, max depth = 12, train loss: 0.53827, val loss: 0.52724, in 0.000s 1 tree, 53 leaves, max depth = 12, train loss: 0.53219, val loss: 0.52146, in 0.016s 1 tree, 32 leaves, max depth = 10, train loss: 0.52674, val loss: 0.51547, in 0.000s 1 tree, 54 leaves, max depth = 12, train loss: 0.52105, val loss: 0.51005, in 0.016s 1 tree, 54 leaves, max depth = 12, train loss: 0.51566, val loss: 0.50492, in 0.000s 1 tree, 33 leaves, max depth = 12, train loss: 0.51061, val loss: 0.49936, in 0.016s 1 tree, 54 leaves, max depth = 12, train loss: 0.50556, val loss: 0.49459, in 0.016s 1 tree, 33 leaves, max depth = 12, train loss: 0.50090, val loss: 0.48945, in 0.000s 1 tree, 53 leaves, max depth = 12, train loss: 0.49617, val loss: 0.48497, in 0.016s 1 tree, 32 leaves, max depth = 10, train loss: 0.49182, val loss: 0.48016, in 0.000s 1 tree, 54 leaves, max depth = 13, train loss: 0.48737, val loss: 0.47596, in 0.016s 1 tree, 32 leaves, max depth = 10, train loss: 0.48331, val loss: 0.47144, in 0.016s 1 tree, 54 leaves, max depth = 13, train loss: 0.47912, val loss: 0.46751, in 0.000s 1 tree, 54 leaves, max depth = 13, train loss: 0.47513, val loss: 0.46377, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.47139, val loss: 0.45960, in 0.000s 1 tree, 54 leaves, max depth = 13, train loss: 0.46764, val loss: 0.45609, in 0.016s 1 tree, 32 leaves, max depth = 9, train loss: 0.46408, val loss: 0.45211, in 0.016s 1 tree, 54 leaves, max depth = 12, train loss: 0.46053, val loss: 0.44880, in 0.000s 1 tree, 30 leaves, max depth = 11, train loss: 0.45725, val loss: 0.44509, in 0.016s Fit 35 trees in 0.626 s, (1355 total leaves) Time spent computing histograms: 0.164s Time spent finding best splits: 0.026s Time spent applying splits: 0.026s Time spent predicting: 0.000s Trial 40, Fold 5: Log loss = 0.4623714122151564, 
Average precision = 0.9148543808130059, ROC-AUC = 0.925672444865578, Elapsed Time = 0.6308587999992596 seconds
Optimization Progress: 41%|####1 | 41/100 [08:02<10:59, 11.18s/it]
Trial 41, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 41, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Binning 0.040 GB of training data: 0.142 s
Binning 0.004 GB of validation data: 0.000 s
Fitting gradient boosted rounds: 38 rounds, val loss 0.65670 → 0.33887
Fit 38 trees in 0.642 s (1301 total leaves); computing histograms: 0.187 s, finding best splits: 0.030 s, applying splits: 0.031 s, predicting: 0.000 s
Trial 41, Fold 1: Log loss = 0.34574342825086174, Average precision = 0.9481375195266801, ROC-AUC = 0.945188492063492, Elapsed Time = 0.6465646999986348 seconds
Trial 41, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 41, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Binning 0.040 GB of training data: 0.173 s
Binning 0.004 GB of validation data: 0.000 s
Fitting gradient boosted rounds: 38 rounds, val loss 0.65594 → 0.33388
Fit 38 trees in 0.751 s (1278 total leaves); computing histograms: 0.214 s, finding best splits: 0.034 s, applying splits: 0.035 s, predicting: 0.000 s
Trial 41, Fold 2: Log loss = 0.34458307528002785, Average precision = 0.9450730202217352, ROC-AUC = 0.9462417536459682, Elapsed Time = 0.7594802000003256 seconds
Trial 41, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 41, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Binning 0.040 GB of training data: 0.205 s
Binning 0.004 GB of validation data: 0.000 s
Fitting gradient boosted rounds: 38 rounds, val loss 0.65641 → 0.34369
Fit 38 trees in 0.846 s (1241 total leaves); computing histograms: 0.240 s, finding best splits: 0.039 s, applying splits: 0.039 s, predicting: 0.000 s
Trial 41, Fold 3: Log loss = 0.3399037601455915, Average precision = 0.9499262049311503, ROC-AUC = 0.9491582926823396, Elapsed Time = 0.8536236999989342 seconds
Trial 41, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 41, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
Binning 0.040 GB of training data: 0.204 s
Binning 0.004 GB of validation data: 0.000 s
Fitting gradient boosted rounds: 38 rounds, val loss 0.65584 → 0.32754
Fit 38 trees in 0.860 s (1267 total leaves); computing histograms: 0.238 s, finding best splits: 0.039 s, applying splits: 0.039 s, predicting: 0.031 s
Trial 41, Fold 4: Log loss = 0.3451491793687913, Average precision = 0.948128226802262, ROC-AUC = 0.9461292534889876, Elapsed Time = 0.857237800000803 seconds
Trial 41, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 41, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
Binning 0.040 GB of training data: 0.189 s
Binning 0.004 GB of validation data: 0.016 s
Fitting gradient boosted rounds: 38 rounds, val loss 0.65527 → 0.33111
Fit 38 trees in 0.862 s (1298 total leaves); computing histograms: 0.233 s, finding best splits: 0.036 s, applying splits: 0.037 s, predicting: 0.000 s
Trial 41, Fold 5: Log loss = 0.34752194627777483, Average precision = 0.947238785766397, ROC-AUC = 0.9448383033275738, Elapsed Time = 0.8585003999996843 seconds
Optimization Progress: 42%|####2 | 42/100 [08:13<10:41, 11.06s/it]
Trial 42, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 42, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Binning 0.040 GB of training data: 0.142 s
Binning 0.004 GB of validation data: 0.000 s
Fitting gradient boosted rounds: 17 rounds, val loss 0.68317 → 0.56104
Fit 17 trees in 0.532 s (262 total leaves); computing histograms: 0.101 s, finding best splits: 0.008 s, applying splits: 0.006 s, predicting: 0.000 s
Trial 42, Fold 1: Log loss = 0.5622161983440589, Average precision = 0.9122596955287137, ROC-AUC = 0.9279834644135683, Elapsed Time = 0.5323338999987755 seconds
Trial 42, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 42, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Binning 0.040 GB of training data: 0.158 s
Binning 0.004 GB of validation data: 0.000 s
Fitting gradient boosted rounds: 17 rounds, val loss 0.68285 → 0.55709
Fit 17 trees in 0.580 s (244 total leaves); computing histograms: 0.120 s, finding best splits: 0.009 s, applying splits: 0.006 s, predicting: 0.016 s
Trial 42, Fold 2: Log loss = 0.5607065966168168, Average precision = 0.9051654203427068, ROC-AUC = 0.9282734501164058, Elapsed Time = 0.5888044000002992 seconds
Trial 42, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 42, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Binning 0.040 GB of training data: 0.142 s
Binning 0.004 GB of validation data: 0.000 s
Fitting gradient boosted rounds: 17 rounds, val loss 0.68310 → 0.56016
Fit 17 trees in 0.563 s (279 total leaves); computing histograms: 0.109 s, finding best splits: 0.009 s, applying splits: 0.007 s, predicting: 0.000 s
Trial 42, Fold 3: Log loss = 0.5586542076405022, Average precision = 0.9144956673682674, ROC-AUC = 0.9326203758992979, Elapsed Time = 0.5713990999993257 seconds
Trial 42, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 42, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
Binning 0.040 GB of training data: 0.141 s
Binning 0.004 GB of validation data: 0.000 s
Fitting gradient boosted rounds: 17 rounds, val loss 0.68287 → 0.55534
Fit 17 trees in 0.578 s (259 total leaves); computing histograms: 0.110 s, finding best splits: 0.009 s, applying splits: 0.007 s, predicting: 0.000 s
Trial 42, Fold 4: Log loss = 0.5609712921412567, Average precision = 0.9131328401100529, ROC-AUC = 0.9301575826615258, Elapsed Time = 0.5790532999999414 seconds
Trial 42, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 42, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
Binning 0.040 GB of training data: 0.142 s
Binning 0.004 GB of validation data: 0.016 s
Fitting gradient boosted rounds: 17 rounds, val loss 0.68258 → 0.55437
Fit 17 trees in 0.595 s (272 total leaves); computing histograms: 0.122 s, finding best splits: 0.010 s, applying splits: 0.007 s, predicting: 0.000 s
Trial 42, Fold 5: Log loss = 0.5621848627885069, Average precision = 0.9096137397135172, ROC-AUC = 0.9274200248964197, Elapsed Time = 0.605163199999879 seconds
Optimization Progress: 43%|####3 | 43/100 [08:22<10:07, 10.65s/it]
Trial 43, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 43, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[verbose per-iteration fitting log condensed: fit 79 trees in 0.877 s, 2337 total leaves]
Trial 43, Fold 1: Log loss = 0.49087564799547156, Average precision = 0.9016494024047659, ROC-AUC = 0.9077731953465921, Elapsed Time = 0.8890620000001945 seconds
Trial 43, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 43, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[verbose per-iteration fitting log condensed: fit 79 trees in 0.970 s, 2345 total leaves]
Trial 43, Fold 2: Log loss = 0.49183921891558047, Average precision = 0.8987425486197136, ROC-AUC = 0.9109339894098256, Elapsed Time = 0.9627939000001788 seconds
Trial 43, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 43, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[verbose per-iteration fitting log condensed: fit 79 trees in 0.955 s, 2315 total leaves]
Trial 43, Fold 3: Log loss = 0.48784348922773413, Average precision = 0.9067597255312809, ROC-AUC = 0.9150396738449983, Elapsed Time = 0.9549069000004238 seconds
Trial 43, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 43, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[verbose per-iteration fitting log condensed: fit 79 trees in 1.033 s, 2401 total leaves]
Trial 43, Fold 4: Log loss = 0.4912654405101837, Average precision = 0.8998129762974338, ROC-AUC = 0.9107013204861334, Elapsed Time = 1.0364403000003222 seconds
Trial 43, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 43, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[verbose per-iteration fitting log condensed; Fold 5 output truncated mid-fit in the source]
train loss: 0.49608, val loss: 0.48212, in 0.000s 1 tree, 7 leaves, max depth = 5, train loss: 0.49472, val loss: 0.48060, in 0.016s 1 tree, 30 leaves, max depth = 11, train loss: 0.49341, val loss: 0.47913, in 0.000s 1 tree, 29 leaves, max depth = 13, train loss: 0.49213, val loss: 0.47769, in 0.016s 1 tree, 29 leaves, max depth = 10, train loss: 0.49084, val loss: 0.47625, in 0.000s Fit 79 trees in 1.112 s, (2344 total leaves) Time spent computing histograms: 0.421s Time spent finding best splits: 0.074s Time spent applying splits: 0.072s Time spent predicting: 0.031s Trial 43, Fold 5: Log loss = 0.49696672240271467, Average precision = 0.8970724235639036, ROC-AUC = 0.9054165691247237, Elapsed Time = 1.1303874000004726 seconds
Optimization Progress: 44%|####4 | 44/100 [08:34<10:11, 10.92s/it]
[HistGradientBoosting verbose fitting log elided: per-round tree counts, leaf counts, and train/val losses]
Trial 44, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 44, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Trial 44, Fold 1: Fit 85 trees in 1.721 s (3900 total leaves)
Trial 44, Fold 1: Log loss = 0.2527531781390743, Average precision = 0.9657645417180376, ROC-AUC = 0.9605238041467716, Elapsed Time = 1.7215439999999944 seconds
Trial 44, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 44, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Trial 44, Fold 2: Fit 85 trees in 1.908 s (3644 total leaves)
Trial 44, Fold 2: Log loss = 0.24867228386667295, Average precision = 0.9662767909458208, ROC-AUC = 0.9632831058279951, Elapsed Time = 1.924355299999661 seconds
Trial 44, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 44, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Trial 44, Fold 3: Fit 85 trees in 2.029 s (3817 total leaves)
Trial 44, Fold 3: Log loss = 0.24909530956709083, Average precision = 0.9658513285518209, ROC-AUC = 0.9626928313798384, Elapsed Time = 2.0412577999995847 seconds
Trial 44, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 44, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
Trial 44, Fold 4: Fit 85 trees in 2.049 s (3880 total leaves)
Trial 44, Fold 4: Log loss = 0.24959408851622875, Average precision = 0.9669935233028454, ROC-AUC = 0.9628425502693191, Elapsed Time = 2.0559398000004876 seconds
Trial 44, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 44, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
loss: 0.63152, in 0.000s 1 tree, 30 leaves, max depth = 7, train loss: 0.61643, val loss: 0.61361, in 0.031s 1 tree, 35 leaves, max depth = 10, train loss: 0.59999, val loss: 0.59658, in 0.016s 1 tree, 32 leaves, max depth = 10, train loss: 0.58547, val loss: 0.58147, in 0.016s 1 tree, 34 leaves, max depth = 10, train loss: 0.57106, val loss: 0.56656, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.55842, val loss: 0.55332, in 0.016s 1 tree, 36 leaves, max depth = 10, train loss: 0.54642, val loss: 0.54085, in 0.016s 1 tree, 31 leaves, max depth = 7, train loss: 0.53459, val loss: 0.52842, in 0.031s 1 tree, 39 leaves, max depth = 12, train loss: 0.52314, val loss: 0.51659, in 0.016s 1 tree, 32 leaves, max depth = 9, train loss: 0.51321, val loss: 0.50617, in 0.016s 1 tree, 37 leaves, max depth = 12, train loss: 0.50364, val loss: 0.49622, in 0.016s 1 tree, 44 leaves, max depth = 12, train loss: 0.49477, val loss: 0.48718, in 0.016s 1 tree, 38 leaves, max depth = 13, train loss: 0.48636, val loss: 0.47846, in 0.031s 1 tree, 34 leaves, max depth = 7, train loss: 0.47785, val loss: 0.46962, in 0.016s 1 tree, 37 leaves, max depth = 9, train loss: 0.46982, val loss: 0.46128, in 0.016s 1 tree, 36 leaves, max depth = 9, train loss: 0.46216, val loss: 0.45333, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.45491, val loss: 0.44580, in 0.016s 1 tree, 39 leaves, max depth = 12, train loss: 0.44781, val loss: 0.43852, in 0.031s 1 tree, 39 leaves, max depth = 12, train loss: 0.44113, val loss: 0.43165, in 0.016s 1 tree, 37 leaves, max depth = 12, train loss: 0.43474, val loss: 0.42512, in 0.016s 1 tree, 36 leaves, max depth = 8, train loss: 0.42896, val loss: 0.41897, in 0.016s 1 tree, 40 leaves, max depth = 9, train loss: 0.42021, val loss: 0.41023, in 0.031s 1 tree, 38 leaves, max depth = 12, train loss: 0.41177, val loss: 0.40187, in 0.016s 1 tree, 48 leaves, max depth = 9, train loss: 0.40709, val loss: 0.39711, in 0.016s 1 tree, 38 leaves, max depth = 
12, train loss: 0.40202, val loss: 0.39197, in 0.016s 1 tree, 40 leaves, max depth = 12, train loss: 0.39447, val loss: 0.38453, in 0.031s 1 tree, 38 leaves, max depth = 12, train loss: 0.38977, val loss: 0.37975, in 0.016s 1 tree, 40 leaves, max depth = 12, train loss: 0.38572, val loss: 0.37561, in 0.016s 1 tree, 48 leaves, max depth = 12, train loss: 0.37917, val loss: 0.36918, in 0.016s 1 tree, 40 leaves, max depth = 12, train loss: 0.37501, val loss: 0.36499, in 0.031s 1 tree, 26 leaves, max depth = 11, train loss: 0.36909, val loss: 0.35903, in 0.016s 1 tree, 19 leaves, max depth = 7, train loss: 0.36353, val loss: 0.35338, in 0.016s 1 tree, 42 leaves, max depth = 15, train loss: 0.35927, val loss: 0.34917, in 0.016s 1 tree, 43 leaves, max depth = 15, train loss: 0.35521, val loss: 0.34516, in 0.016s 1 tree, 63 leaves, max depth = 15, train loss: 0.34999, val loss: 0.34050, in 0.031s 1 tree, 22 leaves, max depth = 7, train loss: 0.34520, val loss: 0.33565, in 0.016s 1 tree, 62 leaves, max depth = 15, train loss: 0.34052, val loss: 0.33149, in 0.016s 1 tree, 27 leaves, max depth = 11, train loss: 0.33621, val loss: 0.32712, in 0.016s 1 tree, 43 leaves, max depth = 15, train loss: 0.33270, val loss: 0.32365, in 0.016s 1 tree, 27 leaves, max depth = 11, train loss: 0.32873, val loss: 0.31963, in 0.016s 1 tree, 43 leaves, max depth = 15, train loss: 0.32540, val loss: 0.31639, in 0.016s 1 tree, 41 leaves, max depth = 15, train loss: 0.32226, val loss: 0.31335, in 0.016s 1 tree, 62 leaves, max depth = 15, train loss: 0.31836, val loss: 0.30997, in 0.016s 1 tree, 63 leaves, max depth = 15, train loss: 0.31469, val loss: 0.30672, in 0.031s 1 tree, 27 leaves, max depth = 12, train loss: 0.31137, val loss: 0.30337, in 0.016s 1 tree, 64 leaves, max depth = 15, train loss: 0.30801, val loss: 0.30049, in 0.016s 1 tree, 50 leaves, max depth = 12, train loss: 0.30568, val loss: 0.29821, in 0.031s 1 tree, 63 leaves, max depth = 15, train loss: 0.30256, val loss: 0.29548, in 
0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.29974, val loss: 0.29256, in 0.016s 1 tree, 65 leaves, max depth = 14, train loss: 0.29689, val loss: 0.29009, in 0.016s 1 tree, 47 leaves, max depth = 14, train loss: 0.29420, val loss: 0.28738, in 0.031s 1 tree, 35 leaves, max depth = 12, train loss: 0.29169, val loss: 0.28476, in 0.016s 1 tree, 34 leaves, max depth = 15, train loss: 0.28932, val loss: 0.28230, in 0.016s 1 tree, 44 leaves, max depth = 11, train loss: 0.28749, val loss: 0.28057, in 0.031s 1 tree, 63 leaves, max depth = 12, train loss: 0.28505, val loss: 0.27849, in 0.016s 1 tree, 61 leaves, max depth = 12, train loss: 0.28274, val loss: 0.27658, in 0.016s 1 tree, 28 leaves, max depth = 10, train loss: 0.28068, val loss: 0.27443, in 0.016s 1 tree, 53 leaves, max depth = 13, train loss: 0.27903, val loss: 0.27288, in 0.031s 1 tree, 52 leaves, max depth = 14, train loss: 0.27681, val loss: 0.27064, in 0.016s 1 tree, 65 leaves, max depth = 15, train loss: 0.27476, val loss: 0.26892, in 0.016s 1 tree, 29 leaves, max depth = 10, train loss: 0.27295, val loss: 0.26704, in 0.016s 1 tree, 67 leaves, max depth = 15, train loss: 0.27107, val loss: 0.26554, in 0.016s 1 tree, 53 leaves, max depth = 16, train loss: 0.26907, val loss: 0.26357, in 0.031s 1 tree, 37 leaves, max depth = 12, train loss: 0.26748, val loss: 0.26190, in 0.016s 1 tree, 67 leaves, max depth = 15, train loss: 0.26578, val loss: 0.26050, in 0.016s 1 tree, 29 leaves, max depth = 11, train loss: 0.26429, val loss: 0.25895, in 0.016s 1 tree, 66 leaves, max depth = 15, train loss: 0.26272, val loss: 0.25767, in 0.031s 1 tree, 44 leaves, max depth = 13, train loss: 0.26130, val loss: 0.25624, in 0.016s 1 tree, 45 leaves, max depth = 14, train loss: 0.25961, val loss: 0.25467, in 0.016s 1 tree, 68 leaves, max depth = 16, train loss: 0.25816, val loss: 0.25351, in 0.031s 1 tree, 46 leaves, max depth = 13, train loss: 0.25684, val loss: 0.25220, in 0.016s 1 tree, 45 leaves, max depth = 13, 
train loss: 0.25529, val loss: 0.25075, in 0.016s 1 tree, 40 leaves, max depth = 11, train loss: 0.25344, val loss: 0.24885, in 0.016s 1 tree, 63 leaves, max depth = 18, train loss: 0.25214, val loss: 0.24778, in 0.016s 1 tree, 30 leaves, max depth = 15, train loss: 0.25100, val loss: 0.24657, in 0.031s 1 tree, 55 leaves, max depth = 14, train loss: 0.24945, val loss: 0.24504, in 0.016s 1 tree, 68 leaves, max depth = 16, train loss: 0.24828, val loss: 0.24413, in 0.031s 1 tree, 33 leaves, max depth = 17, train loss: 0.24725, val loss: 0.24305, in 0.016s 1 tree, 42 leaves, max depth = 13, train loss: 0.24624, val loss: 0.24204, in 0.016s 1 tree, 67 leaves, max depth = 16, train loss: 0.24519, val loss: 0.24119, in 0.016s 1 tree, 84 leaves, max depth = 15, train loss: 0.24418, val loss: 0.24057, in 0.031s 1 tree, 57 leaves, max depth = 14, train loss: 0.24280, val loss: 0.23921, in 0.016s 1 tree, 49 leaves, max depth = 14, train loss: 0.24161, val loss: 0.23814, in 0.031s Fit 85 trees in 2.018 s, (3727 total leaves) Time spent computing histograms: 0.640s Time spent finding best splits: 0.129s Time spent applying splits: 0.093s Time spent predicting: 0.031s Trial 44, Fold 5: Log loss = 0.25348226199119334, Average precision = 0.964379013424272, ROC-AUC = 0.9602610779778162, Elapsed Time = 2.019451999998637 seconds
Optimization Progress: 45%|####5 | 45/100 [08:50<11:33, 12.62s/it]
Trial 45, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 45, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[binning and ~30 rounds of verbose per-round fitting output elided]
Fit 30 trees in 0.721 s, (1383 total leaves)
Trial 45, Fold 1: Log loss = 0.287727311305024, Average precision = 0.9614434610463919, ROC-AUC = 0.9557613168724279, Elapsed Time = 0.7307570999983 seconds
Trial 45, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 45, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[binning and ~30 rounds of verbose per-round fitting output elided]
Fit 30 trees in 0.799 s, (1390 total leaves)
Trial 45, Fold 2: Log loss = 0.28666063128091607, Average precision = 0.9612289400368368, ROC-AUC = 0.9578255725975062, Elapsed Time = 0.8044456999996328 seconds
Trial 45, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 45, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[binning and ~30 rounds of verbose per-round fitting output elided]
Fit 30 trees in 0.815 s, (1345 total leaves)
Trial 45, Fold 3: Log loss = 0.2838568494431637, Average precision = 0.9616779725832294, ROC-AUC = 0.957759032706995, Elapsed Time = 0.8189179000000877 seconds
Trial 45, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 45, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[binning and ~30 rounds of verbose per-round fitting output elided]
Fit 30 trees in 0.830 s, (1371 total leaves)
Trial 45, Fold 4: Log loss = 0.2852568290057377, Average precision = 0.9622366021799464, ROC-AUC = 0.9579074926500568, Elapsed Time = 0.8325492999993003 seconds
Trial 45, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 45, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[binning and ~30 rounds of verbose per-round fitting output elided]
Fit 30 trees in 0.830 s, (1348 total leaves)
Trial 45, Fold 5: Log loss = 0.2918554722452596, Average precision = 0.9588748949249177, ROC-AUC = 0.9535981271946937, Elapsed Time = 0.8407050000005256 seconds
Optimization Progress: 46%|####6 | 46/100 [09:01<10:47, 11.99s/it]
Trial 46, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 46, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[binning and ~37 rounds of verbose per-round fitting output elided]
Fit 37 trees in 1.035 s, (1517 total leaves)
Trial 46, Fold 1: Log loss = 0.44786320029330523, Average precision = 0.9199227180329279, ROC-AUC = 0.9306717593190043, Elapsed Time = 1.0404746000003797 seconds
Trial 46, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 46, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[binning and ~37 rounds of verbose per-round fitting output elided]
Fit 37 trees in 1.002 s, (1517 total leaves)
Trial 46, Fold 2: Log loss = 0.4479834310936089, Average precision = 0.9135194459769991, ROC-AUC = 0.9308501275136641, Elapsed Time = 1.01119479999943 seconds
Trial 46, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 46, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[binning and verbose per-round fitting output elided; log truncated mid-fold]
depth = 13, train loss: 0.49169, val loss: 0.49237, in 0.031s 1 tree, 41 leaves, max depth = 9, train loss: 0.48693, val loss: 0.48761, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.48226, val loss: 0.48293, in 0.016s 1 tree, 41 leaves, max depth = 9, train loss: 0.47778, val loss: 0.47844, in 0.016s 1 tree, 41 leaves, max depth = 9, train loss: 0.47343, val loss: 0.47412, in 0.016s 1 tree, 41 leaves, max depth = 13, train loss: 0.46916, val loss: 0.46985, in 0.016s 1 tree, 41 leaves, max depth = 9, train loss: 0.46506, val loss: 0.46574, in 0.016s 1 tree, 41 leaves, max depth = 14, train loss: 0.46104, val loss: 0.46171, in 0.016s 1 tree, 41 leaves, max depth = 9, train loss: 0.45717, val loss: 0.45788, in 0.016s 1 tree, 41 leaves, max depth = 9, train loss: 0.45335, val loss: 0.45412, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.44785, val loss: 0.44901, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.44429, val loss: 0.44547, in 0.016s Fit 37 trees in 1.079 s, (1517 total leaves) Time spent computing histograms: 0.312s Time spent finding best splits: 0.063s Time spent applying splits: 0.050s Time spent predicting: 0.016s Trial 46, Fold 3: Log loss = 0.441979098882662, Average precision = 0.9501071518342736, ROC-AUC = 0.9485726806059848, Elapsed Time = 1.092893199998798 seconds Trial 46, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 46, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 Binning 0.040 GB of training data: 0.173 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 41 leaves, max depth = 11, train loss: 0.68149, val loss: 0.68109, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.67029, val loss: 0.66950, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.65952, val loss: 0.65836, in 0.031s 1 tree, 41 leaves, max depth = 11, train loss: 0.64917, val loss: 0.64763, in 0.016s 1 tree, 41 leaves, max 
depth = 11, train loss: 0.63921, val loss: 0.63731, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.62962, val loss: 0.62735, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.62039, val loss: 0.61776, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.61149, val loss: 0.60852, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.60291, val loss: 0.59959, in 0.031s 1 tree, 41 leaves, max depth = 11, train loss: 0.59463, val loss: 0.59099, in 0.000s 1 tree, 41 leaves, max depth = 10, train loss: 0.58648, val loss: 0.58249, in 0.031s 1 tree, 41 leaves, max depth = 10, train loss: 0.57861, val loss: 0.57428, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.57102, val loss: 0.56636, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.56368, val loss: 0.55869, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.55658, val loss: 0.55127, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.54973, val loss: 0.54410, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.54309, val loss: 0.53718, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.53668, val loss: 0.53046, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.53047, val loss: 0.52397, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.52446, val loss: 0.51768, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.51864, val loss: 0.51159, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.51300, val loss: 0.50569, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.50753, val loss: 0.49999, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.50223, val loss: 0.49446, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.49710, val loss: 0.48908, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.49212, val loss: 0.48386, in 0.031s 1 tree, 41 leaves, max depth = 11, train loss: 0.48729, val loss: 0.47881, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.48261, val loss: 
0.47390, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.47807, val loss: 0.46914, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.47367, val loss: 0.46451, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.46940, val loss: 0.46003, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.46525, val loss: 0.45566, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.46123, val loss: 0.45140, in 0.016s 1 tree, 41 leaves, max depth = 9, train loss: 0.45732, val loss: 0.44727, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.45353, val loss: 0.44325, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.44984, val loss: 0.43933, in 0.016s 1 tree, 41 leaves, max depth = 9, train loss: 0.44627, val loss: 0.43554, in 0.016s Fit 37 trees in 1.064 s, (1517 total leaves) Time spent computing histograms: 0.276s Time spent finding best splits: 0.044s Time spent applying splits: 0.032s Time spent predicting: 0.000s Trial 46, Fold 4: Log loss = 0.44626257917322193, Average precision = 0.9205991301635942, ROC-AUC = 0.9328650054483764, Elapsed Time = 1.0834144000000379 seconds Trial 46, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 46, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.157 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 41 leaves, max depth = 12, train loss: 0.68147, val loss: 0.68108, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.67026, val loss: 0.66948, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.65948, val loss: 0.65835, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.64896, val loss: 0.64744, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.63883, val loss: 0.63695, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.62908, val loss: 0.62684, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.61969, val 
loss: 0.61710, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.61064, val loss: 0.60771, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.60191, val loss: 0.59865, in 0.016s 1 tree, 41 leaves, max depth = 13, train loss: 0.59349, val loss: 0.58991, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.58536, val loss: 0.58147, in 0.016s 1 tree, 41 leaves, max depth = 13, train loss: 0.57752, val loss: 0.57333, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.56994, val loss: 0.56546, in 0.016s 1 tree, 41 leaves, max depth = 13, train loss: 0.56259, val loss: 0.55781, in 0.016s 1 tree, 41 leaves, max depth = 13, train loss: 0.55552, val loss: 0.55045, in 0.016s 1 tree, 41 leaves, max depth = 13, train loss: 0.54865, val loss: 0.54330, in 0.016s 1 tree, 41 leaves, max depth = 13, train loss: 0.54201, val loss: 0.53638, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.53561, val loss: 0.52973, in 0.016s 1 tree, 41 leaves, max depth = 13, train loss: 0.52939, val loss: 0.52325, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.52340, val loss: 0.51701, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.51757, val loss: 0.51092, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.51191, val loss: 0.50504, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.50645, val loss: 0.49938, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.50115, val loss: 0.49384, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.49602, val loss: 0.48851, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.49102, val loss: 0.48329, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.48621, val loss: 0.47827, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.48153, val loss: 0.47341, in 0.031s [29/37] 1 tree, 41 leaves, max depth = 11, train loss: 0.47697, val loss: 0.46865, in 0.000s 1 tree, 41 leaves, max depth = 11, train loss: 0.47254, val loss: 0.46402, in 0.016s 1 tree, 41 
leaves, max depth = 11, train loss: 0.46828, val loss: 0.45957, in 0.031s 1 tree, 41 leaves, max depth = 11, train loss: 0.46411, val loss: 0.45521, in 0.016s 1 tree, 41 leaves, max depth = 14, train loss: 0.46008, val loss: 0.45101, in 0.031s 1 tree, 41 leaves, max depth = 11, train loss: 0.45615, val loss: 0.44690, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.45235, val loss: 0.44297, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.44864, val loss: 0.43908, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.44506, val loss: 0.43536, in 0.031s Fit 37 trees in 0.985 s, (1517 total leaves) Time spent computing histograms: 0.252s Time spent finding best splits: 0.045s Time spent applying splits: 0.035s Time spent predicting: 0.000s Trial 46, Fold 5: Log loss = 0.45017790444983935, Average precision = 0.9146362790173244, ROC-AUC = 0.9284197462051541, Elapsed Time = 0.9993957999995473 seconds
Optimization Progress: 47%|####6 | 47/100 [09:14<10:56, 12.39s/it]
Trial 47, Fold 1: Train size = 20663 (0 = 10533, 1 = 10130), Validation size = 5175 (0 = 2592, 1 = 2583)
[per-iteration boosting output elided: each fold fits 71 trees (~4000 total leaves) in about 1 s]
Trial 47, Fold 1: Log loss = 0.33538, Average precision = 0.94616, ROC-AUC = 0.94491, Elapsed Time = 0.87 s
Trial 47, Fold 2: Train size = 20701 (0 = 10471, 1 = 10230), Validation size = 5137 (0 = 2654, 1 = 2483)
Trial 47, Fold 2: Log loss = 0.33349, Average precision = 0.94434, ROC-AUC = 0.94690, Elapsed Time = 0.96 s
Trial 47, Fold 3: Train size = 20682 (0 = 10517, 1 = 10165), Validation size = 5156 (0 = 2608, 1 = 2548)
Trial 47, Fold 3: Log loss = 0.32984, Average precision = 0.95006, ROC-AUC = 0.95024, Elapsed Time = 1.01 s
Trial 47, Fold 4: Train size = 20656 (0 = 10479, 1 = 10177), Validation size = 5182 (0 = 2646, 1 = 2536)
[Fold 4 boosting output truncated here]
1 tree, 5 leaves, max depth = 3, train loss: 0.37348, val loss: 0.35810, in 0.000s 1 tree, 66 leaves, max depth = 13, train loss: 0.37114, val loss: 0.35544, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.36923, val loss: 0.35336, in 0.000s 1 tree, 75 leaves, max depth = 12, train loss: 0.36691, val loss: 0.35122, in 0.016s 1 tree, 66 leaves, max depth = 13, train loss: 0.36477, val loss: 0.34879, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.36299, val loss: 0.34685, in 0.000s 1 tree, 75 leaves, max depth = 15, train loss: 0.36082, val loss: 0.34486, in 0.016s 1 tree, 66 leaves, max depth = 13, train loss: 0.35886, val loss: 0.34264, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.35721, val loss: 0.34083, in 0.000s 1 tree, 67 leaves, max depth = 13, train loss: 0.35537, val loss: 0.33870, in 0.016s 1 tree, 75 leaves, max depth = 14, train loss: 0.35333, val loss: 0.33684, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.35178, val loss: 0.33515, in 0.016s 1 tree, 66 leaves, max depth = 13, train loss: 0.35012, val loss: 0.33326, in 0.000s 1 tree, 75 leaves, max depth = 14, train loss: 0.34820, val loss: 0.33152, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.34675, val loss: 0.32994, in 0.016s 1 tree, 75 leaves, max depth = 12, train loss: 0.34507, val loss: 0.32821, in 0.016s 1 tree, 75 leaves, max depth = 15, train loss: 0.34325, val loss: 0.32658, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.34189, val loss: 0.32509, in 0.016s 1 tree, 75 leaves, max depth = 13, train loss: 0.34032, val loss: 0.32349, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.33906, val loss: 0.32210, in 0.016s 1 tree, 66 leaves, max depth = 13, train loss: 0.33767, val loss: 0.32056, in 0.000s 1 tree, 75 leaves, max depth = 14, train loss: 0.33593, val loss: 0.31900, in 0.016s 1 tree, 75 leaves, max depth = 13, train loss: 0.33453, val loss: 0.31759, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.33333, val 
loss: 0.31627, in 0.000s 1 tree, 75 leaves, max depth = 12, train loss: 0.33168, val loss: 0.31480, in 0.016s 1 tree, 75 leaves, max depth = 14, train loss: 0.33037, val loss: 0.31349, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.32924, val loss: 0.31224, in 0.000s Fit 71 trees in 1.002 s, (4034 total leaves) Time spent computing histograms: 0.330s Time spent finding best splits: 0.071s Time spent applying splits: 0.070s Time spent predicting: 0.000s Trial 47, Fold 4: Log loss = 0.3291262557638443, Average precision = 0.9527007277982728, ROC-AUC = 0.9509306351352318, Elapsed Time = 1.0164748999995936 seconds Trial 47, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 47, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.157 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 67 leaves, max depth = 14, train loss: 0.67071, val loss: 0.66926, in 0.016s 1 tree, 67 leaves, max depth = 14, train loss: 0.65037, val loss: 0.64754, in 0.016s 1 tree, 67 leaves, max depth = 14, train loss: 0.63187, val loss: 0.62774, in 0.016s 1 tree, 67 leaves, max depth = 14, train loss: 0.61518, val loss: 0.60975, in 0.000s 1 tree, 67 leaves, max depth = 14, train loss: 0.59978, val loss: 0.59316, in 0.016s 1 tree, 67 leaves, max depth = 14, train loss: 0.58570, val loss: 0.57796, in 0.016s 1 tree, 67 leaves, max depth = 14, train loss: 0.57282, val loss: 0.56400, in 0.000s 1 tree, 67 leaves, max depth = 12, train loss: 0.56110, val loss: 0.55116, in 0.016s 1 tree, 67 leaves, max depth = 14, train loss: 0.55024, val loss: 0.53931, in 0.000s 1 tree, 75 leaves, max depth = 17, train loss: 0.53943, val loss: 0.52898, in 0.016s 1 tree, 67 leaves, max depth = 14, train loss: 0.52993, val loss: 0.51860, in 0.000s 1 tree, 75 leaves, max depth = 15, train loss: 0.52029, val loss: 0.50942, in 0.016s 1 tree, 75 leaves, max depth = 15, train loss: 0.51144, 
val loss: 0.50102, in 0.016s 1 tree, 67 leaves, max depth = 13, train loss: 0.50328, val loss: 0.49205, in 0.000s 1 tree, 75 leaves, max depth = 15, train loss: 0.49533, val loss: 0.48455, in 0.016s 1 tree, 67 leaves, max depth = 13, train loss: 0.48805, val loss: 0.47650, in 0.016s 1 tree, 75 leaves, max depth = 15, train loss: 0.48089, val loss: 0.46978, in 0.000s 1 tree, 67 leaves, max depth = 13, train loss: 0.47438, val loss: 0.46249, in 0.016s 1 tree, 75 leaves, max depth = 15, train loss: 0.46791, val loss: 0.45645, in 0.016s 1 tree, 67 leaves, max depth = 13, train loss: 0.46206, val loss: 0.44992, in 0.000s 1 tree, 75 leaves, max depth = 15, train loss: 0.45619, val loss: 0.44446, in 0.016s 1 tree, 75 leaves, max depth = 16, train loss: 0.45077, val loss: 0.43944, in 0.016s 1 tree, 67 leaves, max depth = 13, train loss: 0.44562, val loss: 0.43367, in 0.016s 1 tree, 75 leaves, max depth = 16, train loss: 0.44068, val loss: 0.42912, in 0.000s 1 tree, 67 leaves, max depth = 13, train loss: 0.43603, val loss: 0.42387, in 0.016s 1 tree, 75 leaves, max depth = 16, train loss: 0.43151, val loss: 0.41974, in 0.016s [27/71] 1 tree, 67 leaves, max depth = 13, train loss: 0.42731, val loss: 0.41496, in 0.000s 1 tree, 75 leaves, max depth = 15, train loss: 0.42317, val loss: 0.41121, in 0.016s 1 tree, 67 leaves, max depth = 13, train loss: 0.41936, val loss: 0.40685, in 0.016s 1 tree, 75 leaves, max depth = 11, train loss: 0.41557, val loss: 0.40343, in 0.000s 1 tree, 67 leaves, max depth = 12, train loss: 0.41210, val loss: 0.39944, in 0.016s [32/71] 1 tree, 75 leaves, max depth = 16, train loss: 0.40861, val loss: 0.39631, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.40543, val loss: 0.39325, in 0.000s 1 tree, 75 leaves, max depth = 16, train loss: 0.40221, val loss: 0.39039, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.39930, val loss: 0.38759, in 0.000s 1 tree, 67 leaves, max depth = 12, train loss: 0.39627, val loss: 0.38409, in 0.016s 1 
tree, 75 leaves, max depth = 17, train loss: 0.39330, val loss: 0.38149, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.39065, val loss: 0.37894, in 0.000s 1 tree, 75 leaves, max depth = 17, train loss: 0.38791, val loss: 0.37655, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.38546, val loss: 0.37421, in 0.000s 1 tree, 67 leaves, max depth = 13, train loss: 0.38274, val loss: 0.37104, in 0.016s 1 tree, 75 leaves, max depth = 16, train loss: 0.38020, val loss: 0.36885, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.37796, val loss: 0.36670, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.37588, val loss: 0.36471, in 0.016s 1 tree, 67 leaves, max depth = 12, train loss: 0.37338, val loss: 0.36185, in 0.000s 1 tree, 75 leaves, max depth = 16, train loss: 0.37101, val loss: 0.35983, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.36909, val loss: 0.35800, in 0.000s 1 tree, 67 leaves, max depth = 12, train loss: 0.36684, val loss: 0.35535, in 0.016s 1 tree, 75 leaves, max depth = 16, train loss: 0.36462, val loss: 0.35349, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.36284, val loss: 0.35179, in 0.000s 1 tree, 67 leaves, max depth = 12, train loss: 0.36078, val loss: 0.34935, in 0.016s 1 tree, 75 leaves, max depth = 16, train loss: 0.35871, val loss: 0.34764, in 0.000s 1 tree, 67 leaves, max depth = 12, train loss: 0.35676, val loss: 0.34539, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.35511, val loss: 0.34381, in 0.000s 1 tree, 75 leaves, max depth = 16, train loss: 0.35317, val loss: 0.34222, in 0.016s 1 tree, 67 leaves, max depth = 12, train loss: 0.35142, val loss: 0.34012, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.34987, val loss: 0.33864, in 0.000s 1 tree, 75 leaves, max depth = 13, train loss: 0.34815, val loss: 0.33677, in 0.016s 1 tree, 75 leaves, max depth = 17, train loss: 0.34631, val loss: 0.33529, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.34487, val 
loss: 0.33390, in 0.000s 1 tree, 67 leaves, max depth = 13, train loss: 0.34333, val loss: 0.33206, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.34199, val loss: 0.33077, in 0.000s 1 tree, 75 leaves, max depth = 13, train loss: 0.34045, val loss: 0.32910, in 0.016s 1 tree, 75 leaves, max depth = 17, train loss: 0.33869, val loss: 0.32770, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.33742, val loss: 0.32649, in 0.000s 1 tree, 75 leaves, max depth = 15, train loss: 0.33599, val loss: 0.32493, in 0.016s 1 tree, 75 leaves, max depth = 17, train loss: 0.33433, val loss: 0.32361, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.33313, val loss: 0.32247, in 0.016s 1 tree, 67 leaves, max depth = 14, train loss: 0.33184, val loss: 0.32090, in 0.000s 1 tree, 75 leaves, max depth = 17, train loss: 0.33028, val loss: 0.31967, in 0.016s 1 tree, 75 leaves, max depth = 14, train loss: 0.32897, val loss: 0.31826, in 0.016s Fit 71 trees in 1.001 s, (4121 total leaves) Time spent computing histograms: 0.319s Time spent finding best splits: 0.069s Time spent applying splits: 0.070s Time spent predicting: 0.000s Trial 47, Fold 5: Log loss = 0.33656045996488004, Average precision = 0.9484047206945048, ROC-AUC = 0.9457466882187912, Elapsed Time = 1.0071942000013223 seconds
Optimization Progress: 48%|####8 | 48/100 [09:26<10:32, 12.16s/it]
Trial 48, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371 Trial 48, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913 0.143 s 0.040 GB of training data: 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 21 leaves, max depth = 8, train loss: 0.68062, val loss: 0.68020, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.66875, val loss: 0.66793, in 0.000s 1 tree, 21 leaves, max depth = 8, train loss: 0.65749, val loss: 0.65628, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.64681, val loss: 0.64522, in 0.000s 1 tree, 21 leaves, max depth = 8, train loss: 0.63667, val loss: 0.63471, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.62704, val loss: 0.62471, in 0.000s 1 tree, 21 leaves, max depth = 8, train loss: 0.61789, val loss: 0.61520, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.60918, val loss: 0.60615, in 0.000s 1 tree, 21 leaves, max depth = 8, train loss: 0.60090, val loss: 0.59753, in 0.000s 1 tree, 21 leaves, max depth = 8, train loss: 0.59301, val loss: 0.58931, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.58550, val loss: 0.58148, in 0.000s 1 tree, 21 leaves, max depth = 8, train loss: 0.57835, val loss: 0.57401, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.57153, val loss: 0.56689, in 0.000s 1 tree, 21 leaves, max depth = 8, train loss: 0.56503, val loss: 0.56009, in 0.000s 1 tree, 21 leaves, max depth = 9, train loss: 0.55888, val loss: 0.55368, in 0.016s 1 tree, 21 leaves, max depth = 9, train loss: 0.55301, val loss: 0.54757, in 0.000s 1 tree, 38 leaves, max depth = 9, train loss: 0.54710, val loss: 0.54203, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.54157, val loss: 0.53622, in 0.000s 1 tree, 21 leaves, max depth = 9, train loss: 0.53634, val loss: 0.53076, in 0.016s 1 tree, 39 leaves, max depth = 9, train loss: 0.53088, val loss: 0.52566, in 0.000s 1 tree, 39 leaves, max depth = 
9, train loss: 0.52568, val loss: 0.52081, in 0.016s 1 tree, 39 leaves, max depth = 9, train loss: 0.52072, val loss: 0.51620, in 0.000s 1 tree, 21 leaves, max depth = 8, train loss: 0.51596, val loss: 0.51117, in 0.016s 1 tree, 39 leaves, max depth = 9, train loss: 0.51130, val loss: 0.50685, in 0.000s 1 tree, 21 leaves, max depth = 11, train loss: 0.50691, val loss: 0.50221, in 0.016s 1 tree, 39 leaves, max depth = 9, train loss: 0.50252, val loss: 0.49814, in 0.000s 1 tree, 21 leaves, max depth = 11, train loss: 0.49840, val loss: 0.49378, in 0.016s 1 tree, 39 leaves, max depth = 9, train loss: 0.49426, val loss: 0.48996, in 0.000s 1 tree, 21 leaves, max depth = 10, train loss: 0.49038, val loss: 0.48584, in 0.016s 1 tree, 39 leaves, max depth = 9, train loss: 0.48647, val loss: 0.48225, in 0.000s 1 tree, 21 leaves, max depth = 11, train loss: 0.48282, val loss: 0.47841, in 0.016s 1 tree, 39 leaves, max depth = 9, train loss: 0.47913, val loss: 0.47503, in 0.000s 1 tree, 39 leaves, max depth = 9, train loss: 0.47560, val loss: 0.47180, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.47221, val loss: 0.46819, in 0.000s 1 tree, 39 leaves, max depth = 9, train loss: 0.46887, val loss: 0.46514, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.46567, val loss: 0.46173, in 0.000s 1 tree, 39 leaves, max depth = 9, train loss: 0.46250, val loss: 0.45885, in 0.016s 1 tree, 20 leaves, max depth = 10, train loss: 0.45946, val loss: 0.45562, in 0.000s 1 tree, 39 leaves, max depth = 9, train loss: 0.45646, val loss: 0.45291, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.45361, val loss: 0.44985, in 0.000s 1 tree, 21 leaves, max depth = 8, train loss: 0.45089, val loss: 0.44692, in 0.016s 1 tree, 39 leaves, max depth = 9, train loss: 0.44806, val loss: 0.44438, in 0.000s 1 tree, 39 leaves, max depth = 9, train loss: 0.44536, val loss: 0.44196, in 0.016s 1 tree, 20 leaves, max depth = 10, train loss: 0.44279, val loss: 0.43922, in 0.000s 1 tree, 
39 leaves, max depth = 9, train loss: 0.44023, val loss: 0.43693, in 0.016s 1 tree, 39 leaves, max depth = 9, train loss: 0.43778, val loss: 0.43476, in 0.016s 1 tree, 21 leaves, max depth = 7, train loss: 0.43538, val loss: 0.43217, in 0.000s 1 tree, 39 leaves, max depth = 9, train loss: 0.43305, val loss: 0.43010, in 0.016s 1 tree, 20 leaves, max depth = 10, train loss: 0.43077, val loss: 0.42766, in 0.000s 1 tree, 39 leaves, max depth = 9, train loss: 0.42856, val loss: 0.42571, in 0.016s 1 tree, 21 leaves, max depth = 7, train loss: 0.42641, val loss: 0.42337, in 0.000s 1 tree, 39 leaves, max depth = 9, train loss: 0.42430, val loss: 0.42152, in 0.016s 1 tree, 20 leaves, max depth = 10, train loss: 0.42225, val loss: 0.41931, in 0.000s 1 tree, 39 leaves, max depth = 9, train loss: 0.42024, val loss: 0.41757, in 0.000s 1 tree, 21 leaves, max depth = 8, train loss: 0.41830, val loss: 0.41544, in 0.016s [56/80] 1 tree, 39 leaves, max depth = 9, train loss: 0.41638, val loss: 0.41379, in 0.000s 1 tree, 21 leaves, max depth = 8, train loss: 0.41454, val loss: 0.41176, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.41273, val loss: 0.40976, in 0.000s 1 tree, 39 leaves, max depth = 9, train loss: 0.41091, val loss: 0.40820, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.40918, val loss: 0.40629, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.40753, val loss: 0.40445, in 0.000s 1 tree, 39 leaves, max depth = 9, train loss: 0.40579, val loss: 0.40299, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.40422, val loss: 0.40124, in 0.000s 1 tree, 39 leaves, max depth = 9, train loss: 0.40256, val loss: 0.39985, in 0.016s 1 tree, 20 leaves, max depth = 10, train loss: 0.40088, val loss: 0.39805, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.39938, val loss: 0.39638, in 0.016s 1 tree, 39 leaves, max depth = 9, train loss: 0.39780, val loss: 0.39507, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.39637, val loss: 
0.39348, in 0.000s 1 tree, 20 leaves, max depth = 9, train loss: 0.39479, val loss: 0.39179, in 0.016s 1 tree, 39 leaves, max depth = 9, train loss: 0.39328, val loss: 0.39055, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.39192, val loss: 0.38903, in 0.016s 1 tree, 20 leaves, max depth = 8, train loss: 0.39039, val loss: 0.38742, in 0.000s 1 tree, 39 leaves, max depth = 9, train loss: 0.38895, val loss: 0.38626, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.38765, val loss: 0.38480, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.38640, val loss: 0.38341, in 0.000s 1 tree, 39 leaves, max depth = 9, train loss: 0.38502, val loss: 0.38231, in 0.016s 1 tree, 20 leaves, max depth = 9, train loss: 0.38358, val loss: 0.38078, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.38239, val loss: 0.37944, in 0.016s 1 tree, 39 leaves, max depth = 9, train loss: 0.38107, val loss: 0.37840, in 0.000s 1 tree, 20 leaves, max depth = 9, train loss: 0.37972, val loss: 0.37697, in 0.016s Fit 80 trees in 0.846 s, (2050 total leaves) Time spent computing histograms: 0.312s Time spent finding best splits: 0.046s Time spent applying splits: 0.039s Time spent predicting: 0.016s Trial 48, Fold 1: Log loss = 0.380684813906018, Average precision = 0.9458659296414029, ROC-AUC = 0.9424333575897487, Elapsed Time = 0.8453929999996035 seconds Trial 48, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 48, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986 Binning 0.040 GB of training data: 0.142 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 21 leaves, max depth = 8, train loss: 0.68065, val loss: 0.68003, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.66881, val loss: 0.66760, in 0.000s 1 tree, 21 leaves, max depth = 9, train loss: 0.65775, val loss: 0.65597, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.64709, val loss: 0.64476, in 
0.000s 1 tree, 21 leaves, max depth = 8, train loss: 0.63697, val loss: 0.63411, in 0.000s 1 tree, 21 leaves, max depth = 8, train loss: 0.62736, val loss: 0.62397, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.61823, val loss: 0.61434, in 0.000s 1 tree, 21 leaves, max depth = 9, train loss: 0.60967, val loss: 0.60530, in 0.000s 1 tree, 21 leaves, max depth = 8, train loss: 0.60140, val loss: 0.59656, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.59353, val loss: 0.58822, in 0.000s 1 tree, 21 leaves, max depth = 8, train loss: 0.58603, val loss: 0.58028, in 0.016s 1 tree, 21 leaves, max depth = 9, train loss: 0.57900, val loss: 0.57283, in 0.000s 1 tree, 21 leaves, max depth = 8, train loss: 0.57219, val loss: 0.56560, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.56570, val loss: 0.55870, in 0.000s 1 tree, 21 leaves, max depth = 8, train loss: 0.55951, val loss: 0.55212, in 0.000s 1 tree, 21 leaves, max depth = 9, train loss: 0.55371, val loss: 0.54594, in 0.016s 1 tree, 39 leaves, max depth = 10, train loss: 0.54783, val loss: 0.54020, in 0.000s 1 tree, 21 leaves, max depth = 8, train loss: 0.54241, val loss: 0.53442, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.53724, val loss: 0.52889, in 0.000s 1 tree, 39 leaves, max depth = 11, train loss: 0.53179, val loss: 0.52360, in 0.016s 1 tree, 39 leaves, max depth = 11, train loss: 0.52660, val loss: 0.51856, in 0.000s 1 tree, 39 leaves, max depth = 11, train loss: 0.52166, val loss: 0.51376, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.51699, val loss: 0.50876, in 0.000s 1 tree, 39 leaves, max depth = 11, train loss: 0.51234, val loss: 0.50425, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.50796, val loss: 0.49956, in 0.000s 1 tree, 39 leaves, max depth = 11, train loss: 0.50358, val loss: 0.49531, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.49947, val loss: 0.49089, in 0.000s 1 tree, 39 leaves, max depth = 11, train loss: 
0.49533, val loss: 0.48689, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.49146, val loss: 0.48273, in 0.000s 1 tree, 39 leaves, max depth = 11, train loss: 0.48756, val loss: 0.47896, in 0.016s 1 tree, 21 leaves, max depth = 10, train loss: 0.48391, val loss: 0.47504, in 0.016s 1 tree, 39 leaves, max depth = 11, train loss: 0.48022, val loss: 0.47148, in 0.000s 1 tree, 39 leaves, max depth = 11, train loss: 0.47670, val loss: 0.46808, in 0.016s [34/80] 1 tree, 21 leaves, max depth = 8, train loss: 0.47332, val loss: 0.46443, in 0.000s 1 tree, 39 leaves, max depth = 10, train loss: 0.46998, val loss: 0.46123, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.46680, val loss: 0.45778, in 0.000s 1 tree, 39 leaves, max depth = 10, train loss: 0.46364, val loss: 0.45475, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.46063, val loss: 0.45149, in 0.000s 1 tree, 39 leaves, max depth = 9, train loss: 0.45764, val loss: 0.44862, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.45480, val loss: 0.44554, in 0.000s 1 tree, 21 leaves, max depth = 8, train loss: 0.45208, val loss: 0.44259, in 0.016s 1 tree, 39 leaves, max depth = 9, train loss: 0.44926, val loss: 0.43989, in 0.016s 1 tree, 39 leaves, max depth = 9, train loss: 0.44656, val loss: 0.43731, in 0.000s 1 tree, 39 leaves, max depth = 9, train loss: 0.44398, val loss: 0.43484, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.44147, val loss: 0.43211, in 0.000s 1 tree, 39 leaves, max depth = 11, train loss: 0.43902, val loss: 0.42977, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.43664, val loss: 0.42718, in 0.000s 1 tree, 39 leaves, max depth = 10, train loss: 0.43431, val loss: 0.42496, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.43205, val loss: 0.42250, in 0.000s 1 tree, 38 leaves, max depth = 10, train loss: 0.42983, val loss: 0.42039, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.42770, val loss: 0.41805, in 0.000s 1 tree, 39 
leaves, max depth = 10, train loss: 0.42558, val loss: 0.41605, in 0.016s 1 tree, 21 leaves, max depth = 11, train loss: 0.42356, val loss: 0.41383, in 0.000s 1 tree, 39 leaves, max depth = 10, train loss: 0.42154, val loss: 0.41192, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.41962, val loss: 0.40982, in 0.016s 1 tree, 39 leaves, max depth = 10, train loss: 0.41770, val loss: 0.40800, in 0.000s 1 tree, 39 leaves, max depth = 10, train loss: 0.41586, val loss: 0.40627, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.41408, val loss: 0.40443, in 0.000s 1 tree, 21 leaves, max depth = 10, train loss: 0.41229, val loss: 0.40246, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.41061, val loss: 0.40072, in 0.000s 1 tree, 39 leaves, max depth = 9, train loss: 0.40887, val loss: 0.39908, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.40726, val loss: 0.39741, in 0.000s 1 tree, 39 leaves, max depth = 9, train loss: 0.40560, val loss: 0.39586, in 0.016s 1 tree, 21 leaves, max depth = 7, train loss: 0.40392, val loss: 0.39402, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.40241, val loss: 0.39243, in 0.016s 1 tree, 39 leaves, max depth = 9, train loss: 0.40082, val loss: 0.39095, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.39937, val loss: 0.38944, in 0.016s 1 tree, 21 leaves, max depth = 10, train loss: 0.39776, val loss: 0.38767, in 0.000s 1 tree, 39 leaves, max depth = 9, train loss: 0.39625, val loss: 0.38626, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.39487, val loss: 0.38482, in 0.000s 1 tree, 39 leaves, max depth = 10, train loss: 0.39342, val loss: 0.38348, in 0.016s 1 tree, 21 leaves, max depth = 9, train loss: 0.39189, val loss: 0.38182, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.39058, val loss: 0.38044, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.38932, val loss: 0.37913, in 0.000s 1 tree, 39 leaves, max depth = 10, train loss: 0.38793, val loss: 0.37785, in 
0.016s 1 tree, 21 leaves, max depth = 10, train loss: 0.38651, val loss: 0.37627, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.38530, val loss: 0.37501, in 0.016s 1 tree, 39 leaves, max depth = 10, train loss: 0.38398, val loss: 0.37379, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.38283, val loss: 0.37259, in 0.016s 1 tree, 21 leaves, max depth = 9, train loss: 0.38147, val loss: 0.37110, in 0.000s Fit 80 trees in 0.876 s, (2059 total leaves) Time spent computing histograms: 0.341s Time spent finding best splits: 0.050s Time spent applying splits: 0.043s Time spent predicting: 0.000s Trial 48, Fold 2: Log loss = 0.3832620626880714, Average precision = 0.9423748606646112, ROC-AUC = 0.9432571023274772, Elapsed Time = 0.8830567999993946 seconds Trial 48, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 48, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 Binning 0.040 GB of training data: 0.157 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 21 leaves, max depth = 7, train loss: 0.68072, val loss: 0.68033, in 0.016s 1 tree, 21 leaves, max depth = 7, train loss: 0.66895, val loss: 0.66819, in 0.000s 1 tree, 21 leaves, max depth = 7, train loss: 0.65779, val loss: 0.65666, in 0.016s 1 tree, 21 leaves, max depth = 7, train loss: 0.64720, val loss: 0.64572, in 0.000s 1 tree, 21 leaves, max depth = 7, train loss: 0.63715, val loss: 0.63533, in 0.016s 1 tree, 21 leaves, max depth = 7, train loss: 0.62760, val loss: 0.62544, in 0.000s 1 tree, 21 leaves, max depth = 7, train loss: 0.61852, val loss: 0.61605, in 0.016s 1 tree, 21 leaves, max depth = 7, train loss: 0.60989, val loss: 0.60710, in 0.000s 1 tree, 21 leaves, max depth = 7, train loss: 0.60168, val loss: 0.59858, in 0.000s 1 tree, 21 leaves, max depth = 7, train loss: 0.59386, val loss: 0.59047, in 0.000s 1 tree, 21 leaves, max depth = 7, train loss: 0.58642, val loss: 0.58274, in 0.000s 1 
tree, 21 leaves, max depth = 7, train loss: 0.57932, val loss: 0.57537, in 0.016s 1 tree, 21 leaves, max depth = 7, train loss: 0.57257, val loss: 0.56834, in 0.000s 1 tree, 21 leaves, max depth = 7, train loss: 0.56613, val loss: 0.56163, in 0.016s 1 tree, 21 leaves, max depth = 7, train loss: 0.55999, val loss: 0.55523, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.55385, val loss: 0.54946, in 0.000s 1 tree, 21 leaves, max depth = 8, train loss: 0.54818, val loss: 0.54354, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.54243, val loss: 0.53814, in 0.000s 1 tree, 21 leaves, max depth = 8, train loss: 0.53713, val loss: 0.53259, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.53173, val loss: 0.52753, in 0.000s 1 tree, 21 leaves, max depth = 8, train loss: 0.52677, val loss: 0.52232, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.52170, val loss: 0.51757, in 0.000s 1 tree, 41 leaves, max depth = 10, train loss: 0.51685, val loss: 0.51304, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.51229, val loss: 0.50824, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.50773, val loss: 0.50398, in 0.000s 1 tree, 21 leaves, max depth = 8, train loss: 0.50345, val loss: 0.49947, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.49915, val loss: 0.49546, in 0.000s 1 tree, 21 leaves, max depth = 8, train loss: 0.49514, val loss: 0.49121, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.49108, val loss: 0.48744, in 0.000s 1 tree, 21 leaves, max depth = 8, train loss: 0.48730, val loss: 0.48343, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.48346, val loss: 0.47987, in 0.000s 1 tree, 21 leaves, max depth = 8, train loss: 0.47991, val loss: 0.47610, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.47628, val loss: 0.47274, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.47281, val loss: 0.46953, in 0.000s 1 tree, 21 leaves, max depth = 8, train loss: 0.46951, val 
[Verbose HistGradientBoosting fitting log elided: one line per boosting round reporting trees, leaves, max depth, train loss, and val loss. Per-fold summaries retained below.]

Trial 48, Fold 3: Fit 80 trees in 0.938 s (2110 total leaves). Log loss = 0.37662, Average precision = 0.94776, ROC-AUC = 0.94750, Elapsed Time = 0.94 s
Trial 48, Fold 4: Train size = 20656 (0 = 10479, 1 = 10177); Validation size = 5182 (0 = 2646, 1 = 2536). Fit 80 trees in 0.925 s (2004 total leaves). Log loss = 0.38315, Average precision = 0.94596, ROC-AUC = 0.94263, Elapsed Time = 0.94 s
Trial 48, Fold 5: Train size = 20650 (0 = 10500, 1 = 10150); Validation size = 5188 (0 = 2625, 1 = 2563). Fit 80 trees in 0.925 s (2055 total leaves). Log loss = 0.38547, Average precision = 0.94505, ROC-AUC = 0.94150, Elapsed Time = 0.94 s
Optimization Progress: 49% | 49/100 trials [09:37 elapsed, ~10:02 remaining, 11.81 s/it]
[Verbose fitting log elided; per-fold summaries retained.]

Trial 49, Fold 1: Train size = 20663 (0 = 10533, 1 = 10130); Validation size = 5175 (0 = 2592, 1 = 2583). Fit 11 trees in 0.376 s (194 total leaves). Log loss = 0.44881, Average precision = 0.91698, ROC-AUC = 0.92793, Elapsed Time = 0.39 s
Trial 49, Fold 2: Train size = 20701 (0 = 10471, 1 = 10230); Validation size = 5137 (0 = 2654, 1 = 2483). Fit 11 trees in 0.423 s (233 total leaves). Log loss = 0.45073, Average precision = 0.90860, ROC-AUC = 0.92576, Elapsed Time = 0.43 s
Trial 49, Fold 3: Train size = 20682 (0 = 10517, 1 = 10165); Validation size = 5156 (0 = 2608, 1 = 2548). Fit 11 trees in 0.423 s (244 total leaves). Log loss = 0.44611, Average precision = 0.92033, ROC-AUC = 0.93218, Elapsed Time = 0.42 s
Trial 49, Fold 4: Train size = 20656 (0 = 10479, 1 = 10177); Validation size = 5182 (0 = 2646, 1 = 2536). Fit 11 trees in 0.407 s (228 total leaves). Log loss = 0.45057, Average precision = 0.91449, ROC-AUC = 0.92721, Elapsed Time = 0.42 s
Trial 49, Fold 5: Train size = 20650 (0 = 10500, 1 = 10150); Validation size = 5188 (0 = 2625, 1 = 2563). Fit 11 trees in 0.424 s (228 total leaves). Log loss = 0.45311, Average precision = 0.91364, ROC-AUC = 0.92546, Elapsed Time = 0.42 s
Optimization Progress: 50% | 50/100 trials [09:46 elapsed, ~09:11 remaining, 11.02 s/it]
[Verbose fitting log elided; per-fold summaries retained.]

Trial 50, Fold 1: Train size = 20663 (0 = 10533, 1 = 10130); Validation size = 5175 (0 = 2592, 1 = 2583). Fit 58 trees in 1.283 s (4131 total leaves). Log loss = 0.33254, Average precision = 0.96044, ROC-AUC = 0.95152, Elapsed Time = 1.30 s
Trial 50, Fold 2: Train size = 20701 (0 = 10471, 1 = 10230); Validation size = 5137 (0 = 2654, 1 = 2483). (Fitting log continues.)
in 0.016s 1 tree, 93 leaves, max depth = 15, train loss: 0.33713, val loss: 0.33602, in 0.016s 1 tree, 93 leaves, max depth = 20, train loss: 0.33436, val loss: 0.33322, in 0.016s 1 tree, 93 leaves, max depth = 11, train loss: 0.33118, val loss: 0.33035, in 0.031s 1 tree, 93 leaves, max depth = 13, train loss: 0.32926, val loss: 0.32849, in 0.016s 1 tree, 16 leaves, max depth = 6, train loss: 0.32614, val loss: 0.32572, in 0.016s 1 tree, 82 leaves, max depth = 12, train loss: 0.32423, val loss: 0.32381, in 0.016s 1 tree, 93 leaves, max depth = 20, train loss: 0.32235, val loss: 0.32212, in 0.016s 1 tree, 7 leaves, max depth = 4, train loss: 0.31964, val loss: 0.31937, in 0.016s 1 tree, 93 leaves, max depth = 12, train loss: 0.31793, val loss: 0.31775, in 0.016s Fit 58 trees in 1.361 s, (4202 total leaves) Time spent computing histograms: 0.402s Time spent finding best splits: 0.078s Time spent applying splits: 0.070s Time spent predicting: 0.016s Trial 50, Fold 2: Log loss = 0.322366115525635, Average precision = 0.9586962201836953, ROC-AUC = 0.9532182063351058, Elapsed Time = 1.3756073000004108 seconds Trial 50, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 50, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 Binning 0.040 GB of training data: 0.174 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 93 leaves, max depth = 16, train loss: 0.67699, val loss: 0.67687, in 0.016s 1 tree, 93 leaves, max depth = 14, train loss: 0.66059, val loss: 0.66047, in 0.016s 1 tree, 93 leaves, max depth = 17, train loss: 0.64392, val loss: 0.64371, in 0.016s 1 tree, 93 leaves, max depth = 12, train loss: 0.63036, val loss: 0.63012, in 0.016s 1 tree, 93 leaves, max depth = 12, train loss: 0.61582, val loss: 0.61557, in 0.016s 1 tree, 93 leaves, max depth = 13, train loss: 0.60270, val loss: 0.60208, in 0.016s 1 tree, 93 leaves, max depth = 13, train loss: 0.58979, val loss: 0.58909, 
in 0.016s 1 tree, 93 leaves, max depth = 14, train loss: 0.57838, val loss: 0.57759, in 0.031s 1 tree, 93 leaves, max depth = 13, train loss: 0.56866, val loss: 0.56741, in 0.016s 1 tree, 93 leaves, max depth = 16, train loss: 0.55778, val loss: 0.55665, in 0.031s 1 tree, 93 leaves, max depth = 18, train loss: 0.54750, val loss: 0.54645, in 0.016s 1 tree, 93 leaves, max depth = 12, train loss: 0.53805, val loss: 0.53703, in 0.016s 1 tree, 93 leaves, max depth = 15, train loss: 0.52802, val loss: 0.52715, in 0.016s 1 tree, 67 leaves, max depth = 14, train loss: 0.51983, val loss: 0.51872, in 0.016s 1 tree, 45 leaves, max depth = 12, train loss: 0.51229, val loss: 0.51097, in 0.016s 1 tree, 93 leaves, max depth = 14, train loss: 0.50429, val loss: 0.50288, in 0.016s 1 tree, 93 leaves, max depth = 19, train loss: 0.49686, val loss: 0.49552, in 0.031s 1 tree, 93 leaves, max depth = 18, train loss: 0.48960, val loss: 0.48841, in 0.016s 1 tree, 93 leaves, max depth = 11, train loss: 0.48313, val loss: 0.48199, in 0.031s 1 tree, 93 leaves, max depth = 15, train loss: 0.47729, val loss: 0.47620, in 0.016s 1 tree, 93 leaves, max depth = 18, train loss: 0.47177, val loss: 0.47068, in 0.016s 1 tree, 93 leaves, max depth = 18, train loss: 0.46518, val loss: 0.46392, in 0.031s 1 tree, 93 leaves, max depth = 14, train loss: 0.45945, val loss: 0.45830, in 0.016s 1 tree, 93 leaves, max depth = 13, train loss: 0.45456, val loss: 0.45366, in 0.016s 1 tree, 9 leaves, max depth = 4, train loss: 0.44924, val loss: 0.44841, in 0.016s 1 tree, 93 leaves, max depth = 15, train loss: 0.44406, val loss: 0.44327, in 0.016s 1 tree, 14 leaves, max depth = 8, train loss: 0.43627, val loss: 0.43604, in 0.016s 1 tree, 93 leaves, max depth = 15, train loss: 0.43106, val loss: 0.43091, in 0.031s 1 tree, 93 leaves, max depth = 16, train loss: 0.42579, val loss: 0.42578, in 0.016s 1 tree, 93 leaves, max depth = 17, train loss: 0.42053, val loss: 0.42059, in 0.016s 1 tree, 93 leaves, max depth = 15, 
train loss: 0.41531, val loss: 0.41527, in 0.016s 1 tree, 93 leaves, max depth = 16, train loss: 0.41015, val loss: 0.41007, in 0.031s 1 tree, 93 leaves, max depth = 20, train loss: 0.40523, val loss: 0.40504, in 0.016s 1 tree, 93 leaves, max depth = 15, train loss: 0.40136, val loss: 0.40132, in 0.016s 1 tree, 93 leaves, max depth = 18, train loss: 0.39694, val loss: 0.39681, in 0.016s 1 tree, 50 leaves, max depth = 10, train loss: 0.39388, val loss: 0.39364, in 0.016s 1 tree, 26 leaves, max depth = 10, train loss: 0.39107, val loss: 0.39071, in 0.016s 1 tree, 93 leaves, max depth = 15, train loss: 0.38772, val loss: 0.38755, in 0.016s 1 tree, 93 leaves, max depth = 14, train loss: 0.38493, val loss: 0.38493, in 0.016s 1 tree, 11 leaves, max depth = 4, train loss: 0.37915, val loss: 0.37970, in 0.016s 1 tree, 28 leaves, max depth = 13, train loss: 0.37680, val loss: 0.37723, in 0.016s 1 tree, 93 leaves, max depth = 16, train loss: 0.37407, val loss: 0.37460, in 0.016s 1 tree, 11 leaves, max depth = 4, train loss: 0.36892, val loss: 0.36985, in 0.016s 1 tree, 93 leaves, max depth = 12, train loss: 0.36651, val loss: 0.36755, in 0.016s 1 tree, 93 leaves, max depth = 16, train loss: 0.36409, val loss: 0.36527, in 0.016s 1 tree, 93 leaves, max depth = 14, train loss: 0.36140, val loss: 0.36232, in 0.031s 1 tree, 93 leaves, max depth = 19, train loss: 0.35951, val loss: 0.36054, in 0.016s 1 tree, 93 leaves, max depth = 13, train loss: 0.35618, val loss: 0.35716, in 0.016s 1 tree, 93 leaves, max depth = 15, train loss: 0.35396, val loss: 0.35499, in 0.031s 1 tree, 93 leaves, max depth = 14, train loss: 0.35196, val loss: 0.35315, in 0.016s 1 tree, 93 leaves, max depth = 16, train loss: 0.34994, val loss: 0.35125, in 0.016s 1 tree, 93 leaves, max depth = 13, train loss: 0.34801, val loss: 0.34936, in 0.031s 1 tree, 93 leaves, max depth = 17, train loss: 0.34655, val loss: 0.34796, in 0.016s 1 tree, 16 leaves, max depth = 5, train loss: 0.34205, val loss: 0.34429, in 
0.016s 1 tree, 93 leaves, max depth = 14, train loss: 0.34021, val loss: 0.34257, in 0.016s 1 tree, 93 leaves, max depth = 19, train loss: 0.33858, val loss: 0.34114, in 0.016s 1 tree, 93 leaves, max depth = 19, train loss: 0.33663, val loss: 0.33936, in 0.031s 1 tree, 93 leaves, max depth = 19, train loss: 0.33458, val loss: 0.33749, in 0.016s Fit 58 trees in 1.456 s, (4741 total leaves) Time spent computing histograms: 0.423s Time spent finding best splits: 0.089s Time spent applying splits: 0.080s Time spent predicting: 0.000s Trial 50, Fold 3: Log loss = 0.33476056901817486, Average precision = 0.9590247945597814, ROC-AUC = 0.9532110773757355, Elapsed Time = 1.46114309999939 seconds Trial 50, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 50, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 Binning 0.040 GB of training data: 0.158 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 93 leaves, max depth = 16, train loss: 0.67706, val loss: 0.67662, in 0.031s 1 tree, 93 leaves, max depth = 15, train loss: 0.66065, val loss: 0.65963, in 0.016s 1 tree, 93 leaves, max depth = 16, train loss: 0.64404, val loss: 0.64234, in 0.031s 1 tree, 93 leaves, max depth = 23, train loss: 0.63095, val loss: 0.62891, in 0.016s 1 tree, 93 leaves, max depth = 12, train loss: 0.61819, val loss: 0.61587, in 0.016s 1 tree, 93 leaves, max depth = 15, train loss: 0.60411, val loss: 0.60139, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.59182, val loss: 0.58841, in 0.016s 1 tree, 93 leaves, max depth = 14, train loss: 0.57956, val loss: 0.57559, in 0.031s 1 tree, 86 leaves, max depth = 14, train loss: 0.56887, val loss: 0.56458, in 0.016s 1 tree, 93 leaves, max depth = 12, train loss: 0.55849, val loss: 0.55400, in 0.016s 1 tree, 93 leaves, max depth = 17, train loss: 0.54768, val loss: 0.54265, in 0.031s 1 tree, 93 leaves, max depth = 18, train loss: 0.53934, val loss: 
0.53406, in 0.016s 1 tree, 93 leaves, max depth = 17, train loss: 0.52996, val loss: 0.52418, in 0.016s 1 tree, 93 leaves, max depth = 14, train loss: 0.52248, val loss: 0.51605, in 0.016s 1 tree, 93 leaves, max depth = 15, train loss: 0.51347, val loss: 0.50652, in 0.031s 1 tree, 93 leaves, max depth = 12, train loss: 0.50528, val loss: 0.49792, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.49812, val loss: 0.49043, in 0.016s 1 tree, 93 leaves, max depth = 14, train loss: 0.49035, val loss: 0.48221, in 0.016s 1 tree, 93 leaves, max depth = 13, train loss: 0.48311, val loss: 0.47458, in 0.016s 1 tree, 93 leaves, max depth = 15, train loss: 0.47595, val loss: 0.46706, in 0.031s 1 tree, 93 leaves, max depth = 16, train loss: 0.46919, val loss: 0.45981, in 0.016s 1 tree, 93 leaves, max depth = 13, train loss: 0.46239, val loss: 0.45270, in 0.016s 1 tree, 31 leaves, max depth = 12, train loss: 0.45693, val loss: 0.44701, in 0.016s 1 tree, 93 leaves, max depth = 19, train loss: 0.45220, val loss: 0.44209, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.44424, val loss: 0.43376, in 0.016s 1 tree, 93 leaves, max depth = 17, train loss: 0.43880, val loss: 0.42793, in 0.016s 1 tree, 93 leaves, max depth = 21, train loss: 0.43397, val loss: 0.42281, in 0.016s 1 tree, 93 leaves, max depth = 15, train loss: 0.42901, val loss: 0.41748, in 0.016s 1 tree, 79 leaves, max depth = 13, train loss: 0.42493, val loss: 0.41326, in 0.016s 1 tree, 93 leaves, max depth = 15, train loss: 0.42092, val loss: 0.40902, in 0.016s 1 tree, 93 leaves, max depth = 12, train loss: 0.41696, val loss: 0.40488, in 0.016s 1 tree, 9 leaves, max depth = 4, train loss: 0.41041, val loss: 0.39815, in 0.016s 1 tree, 36 leaves, max depth = 12, train loss: 0.40405, val loss: 0.39168, in 0.016s 1 tree, 93 leaves, max depth = 14, train loss: 0.40064, val loss: 0.38802, in 0.016s 1 tree, 93 leaves, max depth = 13, train loss: 0.39686, val loss: 0.38430, in 0.016s 1 tree, 93 leaves, max depth 
= 16, train loss: 0.39228, val loss: 0.37961, in 0.016s 1 tree, 93 leaves, max depth = 15, train loss: 0.38937, val loss: 0.37661, in 0.031s 1 tree, 8 leaves, max depth = 4, train loss: 0.38389, val loss: 0.37111, in 0.016s 1 tree, 29 leaves, max depth = 9, train loss: 0.37882, val loss: 0.36581, in 0.016s 1 tree, 10 leaves, max depth = 4, train loss: 0.37444, val loss: 0.36104, in 0.000s 1 tree, 34 leaves, max depth = 11, train loss: 0.36958, val loss: 0.35611, in 0.016s 1 tree, 93 leaves, max depth = 16, train loss: 0.36581, val loss: 0.35236, in 0.031s 1 tree, 17 leaves, max depth = 6, train loss: 0.36170, val loss: 0.34789, in 0.000s 1 tree, 93 leaves, max depth = 15, train loss: 0.35880, val loss: 0.34478, in 0.031s 1 tree, 93 leaves, max depth = 17, train loss: 0.35558, val loss: 0.34145, in 0.016s 1 tree, 93 leaves, max depth = 18, train loss: 0.35367, val loss: 0.33964, in 0.016s 1 tree, 93 leaves, max depth = 19, train loss: 0.35128, val loss: 0.33703, in 0.016s 1 tree, 25 leaves, max depth = 11, train loss: 0.34947, val loss: 0.33508, in 0.016s 1 tree, 93 leaves, max depth = 16, train loss: 0.34721, val loss: 0.33300, in 0.016s 1 tree, 93 leaves, max depth = 14, train loss: 0.34421, val loss: 0.33030, in 0.016s 1 tree, 93 leaves, max depth = 14, train loss: 0.34245, val loss: 0.32862, in 0.016s 1 tree, 93 leaves, max depth = 17, train loss: 0.34063, val loss: 0.32694, in 0.031s 1 tree, 84 leaves, max depth = 13, train loss: 0.33614, val loss: 0.32258, in 0.016s 1 tree, 11 leaves, max depth = 4, train loss: 0.33288, val loss: 0.31933, in 0.016s 1 tree, 17 leaves, max depth = 8, train loss: 0.32976, val loss: 0.31607, in 0.000s 1 tree, 16 leaves, max depth = 6, train loss: 0.32660, val loss: 0.31279, in 0.016s 1 tree, 93 leaves, max depth = 13, train loss: 0.32374, val loss: 0.30998, in 0.016s 1 tree, 9 leaves, max depth = 4, train loss: 0.32194, val loss: 0.30798, in 0.016s Fit 58 trees in 1.393 s, (4225 total leaves) Time spent computing histograms: 
0.404s Time spent finding best splits: 0.080s Time spent applying splits: 0.071s Time spent predicting: 0.000s Trial 50, Fold 4: Log loss = 0.3254950150277615, Average precision = 0.9597598144707746, ROC-AUC = 0.9523395232611097, Elapsed Time = 1.4036365000010846 seconds Trial 50, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 50, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.142 s 0.016 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 93 leaves, max depth = 18, train loss: 0.67693, val loss: 0.67613, in 0.016s 1 tree, 93 leaves, max depth = 16, train loss: 0.65889, val loss: 0.65740, in 0.016s 1 tree, 93 leaves, max depth = 15, train loss: 0.64240, val loss: 0.64030, in 0.016s 1 tree, 93 leaves, max depth = 22, train loss: 0.62800, val loss: 0.62499, in 0.032s 1 tree, 93 leaves, max depth = 16, train loss: 0.61531, val loss: 0.61168, in 0.016s 1 tree, 93 leaves, max depth = 15, train loss: 0.60245, val loss: 0.59855, in 0.016s 1 tree, 93 leaves, max depth = 15, train loss: 0.59016, val loss: 0.58574, in 0.016s 1 tree, 29 leaves, max depth = 8, train loss: 0.57771, val loss: 0.57269, in 0.016s 1 tree, 93 leaves, max depth = 11, train loss: 0.56684, val loss: 0.56124, in 0.031s 1 tree, 93 leaves, max depth = 17, train loss: 0.55727, val loss: 0.55110, in 0.016s 1 tree, 93 leaves, max depth = 14, train loss: 0.54609, val loss: 0.53956, in 0.016s 1 tree, 93 leaves, max depth = 17, train loss: 0.53762, val loss: 0.53042, in 0.019s 1 tree, 93 leaves, max depth = 13, train loss: 0.52816, val loss: 0.52042, in 0.029s 1 tree, 93 leaves, max depth = 13, train loss: 0.52061, val loss: 0.51222, in 0.016s 1 tree, 93 leaves, max depth = 15, train loss: 0.51164, val loss: 0.50288, in 0.016s 1 tree, 19 leaves, max depth = 9, train loss: 0.50482, val loss: 0.49561, in 0.016s 1 tree, 93 leaves, max depth = 16, train loss: 0.49646, val loss: 0.48679, 
in 0.016s 1 tree, 93 leaves, max depth = 13, train loss: 0.48951, val loss: 0.47963, in 0.016s 1 tree, 93 leaves, max depth = 14, train loss: 0.48216, val loss: 0.47189, in 0.031s 1 tree, 93 leaves, max depth = 15, train loss: 0.47498, val loss: 0.46430, in 0.016s 1 tree, 93 leaves, max depth = 19, train loss: 0.46818, val loss: 0.45719, in 0.016s 1 tree, 93 leaves, max depth = 13, train loss: 0.46127, val loss: 0.45012, in 0.016s 1 tree, 29 leaves, max depth = 10, train loss: 0.45575, val loss: 0.44449, in 0.016s 1 tree, 93 leaves, max depth = 14, train loss: 0.45080, val loss: 0.43934, in 0.031s 1 tree, 25 leaves, max depth = 8, train loss: 0.44602, val loss: 0.43428, in 0.016s 1 tree, 21 leaves, max depth = 9, train loss: 0.44189, val loss: 0.42987, in 0.000s 1 tree, 43 leaves, max depth = 11, train loss: 0.43739, val loss: 0.42532, in 0.000s 1 tree, 9 leaves, max depth = 5, train loss: 0.43228, val loss: 0.42012, in 0.016s 1 tree, 93 leaves, max depth = 12, train loss: 0.42779, val loss: 0.41536, in 0.031s 1 tree, 93 leaves, max depth = 16, train loss: 0.42416, val loss: 0.41150, in 0.016s 1 tree, 93 leaves, max depth = 13, train loss: 0.41970, val loss: 0.40683, in 0.016s 1 tree, 93 leaves, max depth = 18, train loss: 0.41632, val loss: 0.40326, in 0.016s 1 tree, 93 leaves, max depth = 19, train loss: 0.41188, val loss: 0.39883, in 0.031s 1 tree, 11 leaves, max depth = 4, train loss: 0.40569, val loss: 0.39258, in 0.016s 1 tree, 93 leaves, max depth = 12, train loss: 0.40210, val loss: 0.38896, in 0.016s 1 tree, 20 leaves, max depth = 6, train loss: 0.39591, val loss: 0.38266, in 0.016s 1 tree, 87 leaves, max depth = 12, train loss: 0.39288, val loss: 0.37956, in 0.016s 1 tree, 80 leaves, max depth = 13, train loss: 0.39003, val loss: 0.37664, in 0.016s 1 tree, 16 leaves, max depth = 6, train loss: 0.38470, val loss: 0.37127, in 0.016s 1 tree, 10 leaves, max depth = 5, train loss: 0.37984, val loss: 0.36636, in 0.016s 1 tree, 39 leaves, max depth = 10, train 
loss: 0.37727, val loss: 0.36380, in 0.016s 1 tree, 93 leaves, max depth = 15, train loss: 0.37367, val loss: 0.36030, in 0.016s 1 tree, 9 leaves, max depth = 4, train loss: 0.36912, val loss: 0.35571, in 0.016s 1 tree, 93 leaves, max depth = 13, train loss: 0.36596, val loss: 0.35251, in 0.016s 1 tree, 93 leaves, max depth = 18, train loss: 0.36289, val loss: 0.34947, in 0.016s 1 tree, 93 leaves, max depth = 21, train loss: 0.36098, val loss: 0.34743, in 0.016s 1 tree, 93 leaves, max depth = 13, train loss: 0.35840, val loss: 0.34480, in 0.031s 1 tree, 27 leaves, max depth = 10, train loss: 0.35658, val loss: 0.34275, in 0.016s 1 tree, 93 leaves, max depth = 11, train loss: 0.35402, val loss: 0.34021, in 0.016s 1 tree, 14 leaves, max depth = 6, train loss: 0.35000, val loss: 0.33615, in 0.016s 1 tree, 93 leaves, max depth = 17, train loss: 0.34782, val loss: 0.33397, in 0.016s 1 tree, 93 leaves, max depth = 12, train loss: 0.34502, val loss: 0.33118, in 0.016s 1 tree, 93 leaves, max depth = 16, train loss: 0.34276, val loss: 0.32882, in 0.016s 1 tree, 93 leaves, max depth = 16, train loss: 0.34009, val loss: 0.32605, in 0.031s 1 tree, 93 leaves, max depth = 20, train loss: 0.33788, val loss: 0.32392, in 0.016s 1 tree, 93 leaves, max depth = 13, train loss: 0.33598, val loss: 0.32193, in 0.016s 1 tree, 13 leaves, max depth = 7, train loss: 0.33236, val loss: 0.31829, in 0.016s 1 tree, 93 leaves, max depth = 16, train loss: 0.32964, val loss: 0.31577, in 0.016s Fit 58 trees in 1.377 s, (4221 total leaves) Time spent computing histograms: 0.400s Time spent finding best splits: 0.082s Time spent applying splits: 0.072s Time spent predicting: 0.016s Trial 50, Fold 5: Log loss = 0.34125740585444037, Average precision = 0.9561758490398942, ROC-AUC = 0.9489523066347101, Elapsed Time = 1.3802941999983886 seconds
Optimization Progress: 51%|#####1 | 51/100 [10:01<09:53, 12.12s/it]
[Verbose per-round fitting log condensed. Per-fold summaries retained below.]

Trial 51, Fold 1: Train size = 20663 (0 = 10533, 1 = 10130, 0/1 = 1.0398); Validation size = 5175 (0 = 2592, 1 = 2583, 0/1 = 1.0035) | Fit 11 trees in 0.360 s (959 total leaves) | Log loss = 0.44821, Average precision = 0.91728, ROC-AUC = 0.92783, Elapsed Time = 0.367 s
Trial 51, Fold 2: Train size = 20701 (0 = 10471, 1 = 10230, 0/1 = 1.0236); Validation size = 5137 (0 = 2654, 1 = 2483, 0/1 = 1.0689) | Fit 11 trees in 0.377 s (954 total leaves) | Log loss = 0.44854, Average precision = 0.91151, ROC-AUC = 0.92715, Elapsed Time = 0.387 s
Trial 51, Fold 3: Train size = 20682 (0 = 10517, 1 = 10165, 0/1 = 1.0346); Validation size = 5156 (0 = 2608, 1 = 2548, 0/1 = 1.0235) | Fit 11 trees in 0.360 s (937 total leaves) | Log loss = 0.44682, Average precision = 0.91918, ROC-AUC = 0.93066, Elapsed Time = 0.365 s
Trial 51, Fold 4: Train size = 20656 (0 = 10479, 1 = 10177, 0/1 = 1.0297); Validation size = 5182 (0 = 2646, 1 = 2536, 0/1 = 1.0434) | Fit 11 trees in 0.378 s (939 total leaves) | Log loss = 0.45075, Average precision = 0.91309, ROC-AUC = 0.92598, Elapsed Time = 0.376 s
Trial 51, Fold 5: Train size = 20650 (0 = 10500, 1 = 10150, 0/1 = 1.0345); Validation size = 5188 (0 = 2625, 1 = 2563, 0/1 = 1.0242) | Fit 11 trees in 0.361 s (957 total leaves) | Log loss = 0.45271, Average precision = 0.91072, ROC-AUC = 0.92446, Elapsed Time = 0.372 s
Optimization Progress: 52%|#####2 | 52/100 [10:09<08:46, 10.97s/it]
[Verbose per-round fitting log condensed. Per-fold summaries retained below.]

Trial 52, Fold 1: Train size = 20663 (0 = 10533, 1 = 10130, 0/1 = 1.0398); Validation size = 5175 (0 = 2592, 1 = 2583, 0/1 = 1.0035) | Fit 48 trees in 1.237 s (4395 total leaves) | Log loss = 0.24883, Average precision = 0.96620, ROC-AUC = 0.96091, Elapsed Time = 1.252 s
Trial 52, Fold 2: Train size = 20701 (0 = 10471, 1 = 10230, 0/1 = 1.0236); Validation size = 5137 (0 = 2654, 1 = 2483, 0/1 = 1.0689) | Fit 48 trees in 1.454 s (4229 total leaves) | Log loss = 0.24190, Average precision = 0.96749, ROC-AUC = 0.96457, Elapsed Time = 1.457 s
Trial 52, Fold 3: Train size = 20682 (0 = 10517, 1 = 10165, 0/1 = 1.0346); Validation size = 5156 (0 = 2608, 1 = 2548, 0/1 = 1.0235) | Fitting gradient boosted rounds …
14, train loss: 0.58817, val loss: 0.58829, in 0.031s 1 tree, 97 leaves, max depth = 12, train loss: 0.56106, val loss: 0.56143, in 0.016s 1 tree, 97 leaves, max depth = 14, train loss: 0.53743, val loss: 0.53784, in 0.031s 1 tree, 103 leaves, max depth = 14, train loss: 0.51604, val loss: 0.51667, in 0.016s 1 tree, 103 leaves, max depth = 14, train loss: 0.49697, val loss: 0.49789, in 0.031s 1 tree, 102 leaves, max depth = 12, train loss: 0.47989, val loss: 0.48103, in 0.016s 1 tree, 120 leaves, max depth = 13, train loss: 0.46470, val loss: 0.46613, in 0.031s 1 tree, 105 leaves, max depth = 14, train loss: 0.45084, val loss: 0.45248, in 0.031s 1 tree, 93 leaves, max depth = 14, train loss: 0.43230, val loss: 0.43529, in 0.016s 1 tree, 91 leaves, max depth = 14, train loss: 0.41575, val loss: 0.42007, in 0.031s 1 tree, 83 leaves, max depth = 14, train loss: 0.40546, val loss: 0.40995, in 0.031s 1 tree, 121 leaves, max depth = 14, train loss: 0.39589, val loss: 0.40075, in 0.016s 1 tree, 120 leaves, max depth = 16, train loss: 0.38687, val loss: 0.39214, in 0.031s 1 tree, 121 leaves, max depth = 15, train loss: 0.37904, val loss: 0.38461, in 0.031s 1 tree, 42 leaves, max depth = 12, train loss: 0.36747, val loss: 0.37405, in 0.016s 1 tree, 45 leaves, max depth = 11, train loss: 0.35730, val loss: 0.36475, in 0.016s 1 tree, 66 leaves, max depth = 13, train loss: 0.34794, val loss: 0.35642, in 0.031s 1 tree, 121 leaves, max depth = 15, train loss: 0.34064, val loss: 0.34895, in 0.016s 1 tree, 115 leaves, max depth = 14, train loss: 0.33398, val loss: 0.34215, in 0.031s 1 tree, 70 leaves, max depth = 12, train loss: 0.32622, val loss: 0.33586, in 0.031s 1 tree, 45 leaves, max depth = 12, train loss: 0.31933, val loss: 0.32969, in 0.016s 1 tree, 117 leaves, max depth = 16, train loss: 0.31375, val loss: 0.32390, in 0.031s 1 tree, 70 leaves, max depth = 13, train loss: 0.30754, val loss: 0.31897, in 0.016s 1 tree, 47 leaves, max depth = 13, train loss: 0.30205, val 
loss: 0.31401, in 0.031s 1 tree, 121 leaves, max depth = 13, train loss: 0.29798, val loss: 0.31049, in 0.016s 1 tree, 119 leaves, max depth = 18, train loss: 0.29352, val loss: 0.30590, in 0.031s 1 tree, 71 leaves, max depth = 14, train loss: 0.28871, val loss: 0.30207, in 0.016s 1 tree, 120 leaves, max depth = 18, train loss: 0.28478, val loss: 0.29809, in 0.031s 1 tree, 73 leaves, max depth = 13, train loss: 0.28057, val loss: 0.29494, in 0.031s 1 tree, 43 leaves, max depth = 12, train loss: 0.27678, val loss: 0.29154, in 0.016s 1 tree, 123 leaves, max depth = 15, train loss: 0.27339, val loss: 0.28809, in 0.031s 1 tree, 131 leaves, max depth = 16, train loss: 0.27004, val loss: 0.28451, in 0.031s 1 tree, 71 leaves, max depth = 13, train loss: 0.26672, val loss: 0.28211, in 0.031s 1 tree, 94 leaves, max depth = 17, train loss: 0.26325, val loss: 0.27918, in 0.016s 1 tree, 71 leaves, max depth = 12, train loss: 0.26046, val loss: 0.27703, in 0.031s 1 tree, 44 leaves, max depth = 14, train loss: 0.25796, val loss: 0.27482, in 0.016s 1 tree, 72 leaves, max depth = 13, train loss: 0.25561, val loss: 0.27314, in 0.016s 1 tree, 41 leaves, max depth = 13, train loss: 0.25346, val loss: 0.27114, in 0.016s 1 tree, 131 leaves, max depth = 17, train loss: 0.25081, val loss: 0.26820, in 0.016s 1 tree, 40 leaves, max depth = 13, train loss: 0.24891, val loss: 0.26662, in 0.031s 1 tree, 73 leaves, max depth = 13, train loss: 0.24705, val loss: 0.26531, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.24545, val loss: 0.26382, in 0.016s 1 tree, 81 leaves, max depth = 15, train loss: 0.24373, val loss: 0.26283, in 0.031s 1 tree, 123 leaves, max depth = 15, train loss: 0.24210, val loss: 0.26127, in 0.031s 1 tree, 58 leaves, max depth = 14, train loss: 0.24061, val loss: 0.25983, in 0.016s 1 tree, 131 leaves, max depth = 18, train loss: 0.23803, val loss: 0.25698, in 0.031s Fit 48 trees in 1.549 s, (4221 total leaves) Time spent computing histograms: 0.394s Time spent 
finding best splits: 0.114s Time spent applying splits: 0.098s Time spent predicting: 0.000s Trial 52, Fold 3: Log loss = 0.24269607928834366, Average precision = 0.9655662806077301, ROC-AUC = 0.9624913320684575, Elapsed Time = 1.555675900001006 seconds Trial 52, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 52, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 Binning 0.040 GB of training data: 0.174 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 71 leaves, max depth = 12, train loss: 0.65354, val loss: 0.65202, in 0.016s 1 tree, 103 leaves, max depth = 16, train loss: 0.61862, val loss: 0.61578, in 0.031s 1 tree, 101 leaves, max depth = 16, train loss: 0.58794, val loss: 0.58395, in 0.016s 1 tree, 99 leaves, max depth = 13, train loss: 0.56067, val loss: 0.55556, in 0.031s 1 tree, 103 leaves, max depth = 14, train loss: 0.53660, val loss: 0.53051, in 0.016s 1 tree, 101 leaves, max depth = 13, train loss: 0.51512, val loss: 0.50806, in 0.031s 1 tree, 102 leaves, max depth = 14, train loss: 0.49603, val loss: 0.48810, in 0.016s 1 tree, 108 leaves, max depth = 14, train loss: 0.47889, val loss: 0.47012, in 0.031s 1 tree, 114 leaves, max depth = 13, train loss: 0.46372, val loss: 0.45401, in 0.031s 1 tree, 95 leaves, max depth = 14, train loss: 0.44419, val loss: 0.43419, in 0.016s 1 tree, 110 leaves, max depth = 14, train loss: 0.43143, val loss: 0.42076, in 0.016s 1 tree, 93 leaves, max depth = 14, train loss: 0.41517, val loss: 0.40437, in 0.031s 1 tree, 87 leaves, max depth = 13, train loss: 0.40487, val loss: 0.39337, in 0.016s 1 tree, 122 leaves, max depth = 13, train loss: 0.39539, val loss: 0.38323, in 0.031s 1 tree, 123 leaves, max depth = 15, train loss: 0.38657, val loss: 0.37421, in 0.031s 1 tree, 131 leaves, max depth = 15, train loss: 0.37909, val loss: 0.36632, in 0.031s 1 tree, 91 leaves, max depth = 17, train loss: 0.36712, val loss: 0.35433, in 
0.016s 1 tree, 126 leaves, max depth = 18, train loss: 0.36058, val loss: 0.34755, in 0.031s 1 tree, 96 leaves, max depth = 18, train loss: 0.35030, val loss: 0.33728, in 0.031s 1 tree, 46 leaves, max depth = 14, train loss: 0.34168, val loss: 0.32832, in 0.016s 1 tree, 121 leaves, max depth = 14, train loss: 0.33481, val loss: 0.32139, in 0.016s 1 tree, 71 leaves, max depth = 12, train loss: 0.32704, val loss: 0.31409, in 0.031s 1 tree, 131 leaves, max depth = 15, train loss: 0.32071, val loss: 0.30766, in 0.016s 1 tree, 96 leaves, max depth = 18, train loss: 0.31363, val loss: 0.30059, in 0.031s 1 tree, 113 leaves, max depth = 18, train loss: 0.30927, val loss: 0.29607, in 0.016s 1 tree, 71 leaves, max depth = 13, train loss: 0.30334, val loss: 0.29055, in 0.016s 1 tree, 120 leaves, max depth = 15, train loss: 0.29863, val loss: 0.28578, in 0.016s 1 tree, 131 leaves, max depth = 15, train loss: 0.29402, val loss: 0.28107, in 0.016s 1 tree, 44 leaves, max depth = 11, train loss: 0.28916, val loss: 0.27598, in 0.016s 1 tree, 71 leaves, max depth = 14, train loss: 0.28458, val loss: 0.27171, in 0.016s 1 tree, 45 leaves, max depth = 13, train loss: 0.28050, val loss: 0.26750, in 0.016s 1 tree, 69 leaves, max depth = 14, train loss: 0.27661, val loss: 0.26397, in 0.016s 1 tree, 131 leaves, max depth = 14, train loss: 0.27286, val loss: 0.26014, in 0.031s 1 tree, 72 leaves, max depth = 14, train loss: 0.26949, val loss: 0.25709, in 0.016s 1 tree, 122 leaves, max depth = 16, train loss: 0.26685, val loss: 0.25468, in 0.016s 1 tree, 46 leaves, max depth = 12, train loss: 0.26390, val loss: 0.25162, in 0.016s 1 tree, 131 leaves, max depth = 14, train loss: 0.26072, val loss: 0.24837, in 0.031s 1 tree, 72 leaves, max depth = 16, train loss: 0.25797, val loss: 0.24582, in 0.016s 1 tree, 131 leaves, max depth = 20, train loss: 0.25428, val loss: 0.24258, in 0.031s 1 tree, 55 leaves, max depth = 12, train loss: 0.25180, val loss: 0.24015, in 0.016s 1 tree, 74 leaves, max 
depth = 12, train loss: 0.24951, val loss: 0.23802, in 0.016s 1 tree, 131 leaves, max depth = 21, train loss: 0.24631, val loss: 0.23524, in 0.016s 1 tree, 56 leaves, max depth = 16, train loss: 0.24432, val loss: 0.23338, in 0.016s 1 tree, 74 leaves, max depth = 13, train loss: 0.24241, val loss: 0.23162, in 0.016s 1 tree, 47 leaves, max depth = 11, train loss: 0.24071, val loss: 0.22979, in 0.016s 1 tree, 126 leaves, max depth = 15, train loss: 0.23912, val loss: 0.22855, in 0.031s 1 tree, 75 leaves, max depth = 13, train loss: 0.23749, val loss: 0.22706, in 0.031s 1 tree, 51 leaves, max depth = 12, train loss: 0.23581, val loss: 0.22576, in 0.016s Fit 48 trees in 1.425 s, (4499 total leaves) Time spent computing histograms: 0.369s Time spent finding best splits: 0.101s Time spent applying splits: 0.088s Time spent predicting: 0.000s Trial 52, Fold 4: Log loss = 0.24011756494161798, Average precision = 0.9684737999022944, ROC-AUC = 0.9645331713126891, Elapsed Time = 1.4310083000000304 seconds Trial 52, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 52, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.158 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 64 leaves, max depth = 12, train loss: 0.65348, val loss: 0.65181, in 0.016s 1 tree, 90 leaves, max depth = 14, train loss: 0.61798, val loss: 0.61507, in 0.016s 1 tree, 92 leaves, max depth = 14, train loss: 0.58694, val loss: 0.58286, in 0.016s 1 tree, 110 leaves, max depth = 13, train loss: 0.56174, val loss: 0.55693, in 0.016s 1 tree, 101 leaves, max depth = 13, train loss: 0.53773, val loss: 0.53183, in 0.031s 1 tree, 103 leaves, max depth = 14, train loss: 0.51594, val loss: 0.50921, in 0.016s 1 tree, 118 leaves, max depth = 15, train loss: 0.49745, val loss: 0.48993, in 0.016s 1 tree, 103 leaves, max depth = 14, train loss: 0.47993, val loss: 0.47181, in 0.016s 1 tree, 
106 leaves, max depth = 12, train loss: 0.46458, val loss: 0.45581, in 0.031s 1 tree, 99 leaves, max depth = 12, train loss: 0.44531, val loss: 0.43637, in 0.016s 1 tree, 107 leaves, max depth = 13, train loss: 0.43220, val loss: 0.42288, in 0.016s 1 tree, 97 leaves, max depth = 15, train loss: 0.41571, val loss: 0.40652, in 0.016s 1 tree, 114 leaves, max depth = 14, train loss: 0.40467, val loss: 0.39516, in 0.031s 1 tree, 111 leaves, max depth = 13, train loss: 0.39470, val loss: 0.38490, in 0.016s 1 tree, 116 leaves, max depth = 13, train loss: 0.38600, val loss: 0.37581, in 0.016s 1 tree, 98 leaves, max depth = 15, train loss: 0.37336, val loss: 0.36330, in 0.016s 1 tree, 108 leaves, max depth = 13, train loss: 0.36559, val loss: 0.35539, in 0.031s 1 tree, 98 leaves, max depth = 15, train loss: 0.35482, val loss: 0.34476, in 0.016s 1 tree, 110 leaves, max depth = 13, train loss: 0.34822, val loss: 0.33808, in 0.016s 1 tree, 117 leaves, max depth = 16, train loss: 0.34101, val loss: 0.33094, in 0.031s 1 tree, 98 leaves, max depth = 14, train loss: 0.33208, val loss: 0.32221, in 0.016s 1 tree, 119 leaves, max depth = 16, train loss: 0.32582, val loss: 0.31605, in 0.016s 1 tree, 47 leaves, max depth = 11, train loss: 0.31869, val loss: 0.30875, in 0.016s 1 tree, 131 leaves, max depth = 17, train loss: 0.31291, val loss: 0.30299, in 0.016s 1 tree, 75 leaves, max depth = 12, train loss: 0.30637, val loss: 0.29700, in 0.016s 1 tree, 46 leaves, max depth = 14, train loss: 0.30066, val loss: 0.29120, in 0.016s 1 tree, 45 leaves, max depth = 13, train loss: 0.29554, val loss: 0.28598, in 0.016s 1 tree, 46 leaves, max depth = 14, train loss: 0.29099, val loss: 0.28134, in 0.016s 1 tree, 131 leaves, max depth = 16, train loss: 0.28627, val loss: 0.27669, in 0.031s 1 tree, 73 leaves, max depth = 12, train loss: 0.28163, val loss: 0.27267, in 0.016s 1 tree, 73 leaves, max depth = 12, train loss: 0.27746, val loss: 0.26897, in 0.016s 1 tree, 46 leaves, max depth = 13, train 
loss: 0.27405, val loss: 0.26547, in 0.016s 1 tree, 120 leaves, max depth = 17, train loss: 0.27040, val loss: 0.26210, in 0.031s 1 tree, 75 leaves, max depth = 14, train loss: 0.26697, val loss: 0.25905, in 0.016s 1 tree, 131 leaves, max depth = 16, train loss: 0.26348, val loss: 0.25561, in 0.016s 1 tree, 131 leaves, max depth = 23, train loss: 0.25932, val loss: 0.25186, in 0.031s 1 tree, 123 leaves, max depth = 18, train loss: 0.25652, val loss: 0.24925, in 0.016s 1 tree, 73 leaves, max depth = 12, train loss: 0.25376, val loss: 0.24683, in 0.031s 1 tree, 46 leaves, max depth = 12, train loss: 0.25129, val loss: 0.24437, in 0.016s 1 tree, 54 leaves, max depth = 13, train loss: 0.24884, val loss: 0.24223, in 0.016s 1 tree, 75 leaves, max depth = 13, train loss: 0.24663, val loss: 0.24028, in 0.016s 1 tree, 56 leaves, max depth = 15, train loss: 0.24458, val loss: 0.23826, in 0.016s 1 tree, 57 leaves, max depth = 15, train loss: 0.24274, val loss: 0.23640, in 0.016s 1 tree, 73 leaves, max depth = 12, train loss: 0.24097, val loss: 0.23499, in 0.016s 1 tree, 131 leaves, max depth = 18, train loss: 0.23903, val loss: 0.23380, in 0.016s 1 tree, 54 leaves, max depth = 11, train loss: 0.23751, val loss: 0.23233, in 0.016s 1 tree, 121 leaves, max depth = 18, train loss: 0.23545, val loss: 0.23050, in 0.031s 1 tree, 127 leaves, max depth = 17, train loss: 0.23394, val loss: 0.22978, in 0.016s Fit 48 trees in 1.267 s, (4439 total leaves) Time spent computing histograms: 0.329s Time spent finding best splits: 0.089s Time spent applying splits: 0.079s Time spent predicting: 0.000s Trial 52, Fold 5: Log loss = 0.24758770280829395, Average precision = 0.9640340686720158, ROC-AUC = 0.9600119651450124, Elapsed Time = 1.2826208999995288 seconds
Optimization Progress: 53%|#####3 | 53/100 [10:22<09:09, 11.70s/it]
[Per-round tree-growth verbose output omitted; per-fold summaries retained below. Fold 4's log is truncated mid-round in the source.]

Trial 53, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 53, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Trial 53, Fold 1: Fit 59 trees in 1.236 s (3029 total leaves)
Trial 53, Fold 1: Log loss = 0.37719282831701495, Average precision = 0.9540985323839766, ROC-AUC = 0.9484556400347954, Elapsed Time = 1.235247900000104 seconds
Trial 53, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 53, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Trial 53, Fold 2: Fit 59 trees in 1.251 s (3004 total leaves)
Trial 53, Fold 2: Log loss = 0.37552071858201236, Average precision = 0.9503976093764905, ROC-AUC = 0.9489623638177436, Elapsed Time = 1.2549022000002878 seconds
Trial 53, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 53, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Trial 53, Fold 3: Fit 59 trees in 1.345 s (2943 total leaves)
Trial 53, Fold 3: Log loss = 0.37057968492038945, Average precision = 0.9563776470250097, ROC-AUC = 0.9524284052932168, Elapsed Time = 1.3416933999997127 seconds
Trial 53, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 53, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[Trial 53, Fold 4: remainder of per-round log truncated in source.]
depth = 13, train loss: 0.46049, val loss: 0.45035, in 0.031s 1 tree, 54 leaves, max depth = 13, train loss: 0.45503, val loss: 0.44483, in 0.016s 1 tree, 56 leaves, max depth = 11, train loss: 0.45129, val loss: 0.44098, in 0.016s 1 tree, 51 leaves, max depth = 11, train loss: 0.44731, val loss: 0.43681, in 0.016s 1 tree, 52 leaves, max depth = 10, train loss: 0.44387, val loss: 0.43309, in 0.016s 1 tree, 65 leaves, max depth = 17, train loss: 0.44049, val loss: 0.42953, in 0.016s 1 tree, 50 leaves, max depth = 11, train loss: 0.43682, val loss: 0.42568, in 0.031s 1 tree, 52 leaves, max depth = 11, train loss: 0.43326, val loss: 0.42193, in 0.000s 1 tree, 49 leaves, max depth = 11, train loss: 0.42981, val loss: 0.41829, in 0.031s 1 tree, 60 leaves, max depth = 12, train loss: 0.42510, val loss: 0.41352, in 0.016s 1 tree, 65 leaves, max depth = 10, train loss: 0.42190, val loss: 0.41012, in 0.016s 1 tree, 56 leaves, max depth = 13, train loss: 0.41740, val loss: 0.40557, in 0.016s 1 tree, 66 leaves, max depth = 10, train loss: 0.41436, val loss: 0.40233, in 0.016s 1 tree, 74 leaves, max depth = 13, train loss: 0.41018, val loss: 0.39809, in 0.031s 1 tree, 72 leaves, max depth = 12, train loss: 0.40613, val loss: 0.39398, in 0.016s 1 tree, 53 leaves, max depth = 11, train loss: 0.40320, val loss: 0.39089, in 0.016s 1 tree, 53 leaves, max depth = 11, train loss: 0.40037, val loss: 0.38790, in 0.016s 1 tree, 50 leaves, max depth = 11, train loss: 0.39760, val loss: 0.38498, in 0.016s 1 tree, 45 leaves, max depth = 9, train loss: 0.39504, val loss: 0.38224, in 0.016s 1 tree, 51 leaves, max depth = 11, train loss: 0.39242, val loss: 0.37948, in 0.016s 1 tree, 49 leaves, max depth = 11, train loss: 0.38988, val loss: 0.37677, in 0.016s 1 tree, 56 leaves, max depth = 13, train loss: 0.38631, val loss: 0.37315, in 0.016s 1 tree, 56 leaves, max depth = 11, train loss: 0.38409, val loss: 0.37091, in 0.016s 1 tree, 56 leaves, max depth = 11, train loss: 0.38172, val loss: 
0.36839, in 0.031s 1 tree, 69 leaves, max depth = 15, train loss: 0.37965, val loss: 0.36620, in 0.016s 1 tree, 60 leaves, max depth = 11, train loss: 0.37738, val loss: 0.36381, in 0.016s 1 tree, 65 leaves, max depth = 14, train loss: 0.37398, val loss: 0.36039, in 0.016s Fit 59 trees in 1.376 s, (3193 total leaves) Time spent computing histograms: 0.434s Time spent finding best splits: 0.074s Time spent applying splits: 0.061s Time spent predicting: 0.000s Trial 53, Fold 4: Log loss = 0.3745417032972952, Average precision = 0.9548845195369, ROC-AUC = 0.9501723183139361, Elapsed Time = 1.3895826000007219 seconds Trial 53, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 53, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.158 s 0.016 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 41 leaves, max depth = 9, train loss: 0.68162, val loss: 0.68100, in 0.016s 1 tree, 44 leaves, max depth = 13, train loss: 0.67017, val loss: 0.66914, in 0.016s 1 tree, 43 leaves, max depth = 12, train loss: 0.65905, val loss: 0.65762, in 0.016s 1 tree, 51 leaves, max depth = 12, train loss: 0.64873, val loss: 0.64679, in 0.016s 1 tree, 54 leaves, max depth = 10, train loss: 0.63915, val loss: 0.63675, in 0.016s 1 tree, 42 leaves, max depth = 12, train loss: 0.62918, val loss: 0.62645, in 0.016s 1 tree, 38 leaves, max depth = 9, train loss: 0.61979, val loss: 0.61659, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.61058, val loss: 0.60702, in 0.016s 1 tree, 47 leaves, max depth = 12, train loss: 0.60218, val loss: 0.59827, in 0.016s 1 tree, 48 leaves, max depth = 12, train loss: 0.59403, val loss: 0.58976, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.58569, val loss: 0.58114, in 0.016s 1 tree, 55 leaves, max depth = 12, train loss: 0.57815, val loss: 0.57328, in 0.016s 1 tree, 49 leaves, max depth = 10, train loss: 0.57047, val loss: 
0.56529, in 0.016s 1 tree, 58 leaves, max depth = 10, train loss: 0.56349, val loss: 0.55799, in 0.016s 1 tree, 42 leaves, max depth = 13, train loss: 0.55618, val loss: 0.55042, in 0.016s 1 tree, 49 leaves, max depth = 12, train loss: 0.54953, val loss: 0.54350, in 0.016s 1 tree, 62 leaves, max depth = 10, train loss: 0.54329, val loss: 0.53697, in 0.016s 1 tree, 54 leaves, max depth = 13, train loss: 0.53732, val loss: 0.53088, in 0.016s 1 tree, 55 leaves, max depth = 12, train loss: 0.53136, val loss: 0.52467, in 0.016s 1 tree, 43 leaves, max depth = 12, train loss: 0.52510, val loss: 0.51820, in 0.016s 1 tree, 60 leaves, max depth = 12, train loss: 0.51951, val loss: 0.51238, in 0.000s 1 tree, 41 leaves, max depth = 12, train loss: 0.51363, val loss: 0.50630, in 0.016s 1 tree, 60 leaves, max depth = 12, train loss: 0.50839, val loss: 0.50085, in 0.016s 1 tree, 51 leaves, max depth = 12, train loss: 0.50324, val loss: 0.49562, in 0.016s 1 tree, 41 leaves, max depth = 8, train loss: 0.49850, val loss: 0.49073, in 0.016s 1 tree, 44 leaves, max depth = 8, train loss: 0.49391, val loss: 0.48599, in 0.016s 1 tree, 59 leaves, max depth = 13, train loss: 0.48748, val loss: 0.47952, in 0.016s 1 tree, 36 leaves, max depth = 7, train loss: 0.48268, val loss: 0.47448, in 0.016s 1 tree, 46 leaves, max depth = 12, train loss: 0.47802, val loss: 0.46958, in 0.016s 1 tree, 48 leaves, max depth = 12, train loss: 0.47371, val loss: 0.46519, in 0.016s 1 tree, 45 leaves, max depth = 12, train loss: 0.46915, val loss: 0.46049, in 0.016s 1 tree, 56 leaves, max depth = 12, train loss: 0.46505, val loss: 0.45626, in 0.016s 1 tree, 49 leaves, max depth = 13, train loss: 0.46074, val loss: 0.45182, in 0.016s 1 tree, 51 leaves, max depth = 13, train loss: 0.45517, val loss: 0.44622, in 0.016s 1 tree, 48 leaves, max depth = 13, train loss: 0.45108, val loss: 0.44202, in 0.016s 1 tree, 54 leaves, max depth = 13, train loss: 0.44579, val loss: 0.43674, in 0.016s 1 tree, 57 leaves, max depth 
= 13, train loss: 0.44069, val loss: 0.43165, in 0.016s 1 tree, 47 leaves, max depth = 10, train loss: 0.43712, val loss: 0.42787, in 0.031s 1 tree, 47 leaves, max depth = 12, train loss: 0.43344, val loss: 0.42407, in 0.016s 1 tree, 56 leaves, max depth = 14, train loss: 0.42865, val loss: 0.41929, in 0.016s 1 tree, 52 leaves, max depth = 13, train loss: 0.42401, val loss: 0.41465, in 0.016s 1 tree, 54 leaves, max depth = 13, train loss: 0.41948, val loss: 0.41013, in 0.016s 1 tree, 47 leaves, max depth = 12, train loss: 0.41611, val loss: 0.40668, in 0.016s 1 tree, 50 leaves, max depth = 13, train loss: 0.41283, val loss: 0.40334, in 0.016s 1 tree, 53 leaves, max depth = 15, train loss: 0.40878, val loss: 0.39917, in 0.016s 1 tree, 50 leaves, max depth = 13, train loss: 0.40566, val loss: 0.39599, in 0.031s 1 tree, 49 leaves, max depth = 13, train loss: 0.40263, val loss: 0.39289, in 0.016s 1 tree, 49 leaves, max depth = 12, train loss: 0.39968, val loss: 0.38989, in 0.016s 1 tree, 51 leaves, max depth = 13, train loss: 0.39684, val loss: 0.38696, in 0.016s 1 tree, 50 leaves, max depth = 12, train loss: 0.39406, val loss: 0.38411, in 0.016s 1 tree, 48 leaves, max depth = 11, train loss: 0.39149, val loss: 0.38139, in 0.016s 1 tree, 48 leaves, max depth = 13, train loss: 0.38886, val loss: 0.37871, in 0.016s 1 tree, 50 leaves, max depth = 12, train loss: 0.38630, val loss: 0.37609, in 0.016s 1 tree, 49 leaves, max depth = 12, train loss: 0.38282, val loss: 0.37257, in 0.016s 1 tree, 56 leaves, max depth = 13, train loss: 0.38060, val loss: 0.37036, in 0.016s 1 tree, 48 leaves, max depth = 11, train loss: 0.37836, val loss: 0.36797, in 0.016s 1 tree, 77 leaves, max depth = 11, train loss: 0.37626, val loss: 0.36584, in 0.016s 1 tree, 53 leaves, max depth = 10, train loss: 0.37400, val loss: 0.36352, in 0.016s 1 tree, 72 leaves, max depth = 13, train loss: 0.37200, val loss: 0.36151, in 0.016s Fit 59 trees in 1.330 s, (2959 total leaves) Time spent computing 
histograms: 0.412s Time spent finding best splits: 0.067s Time spent applying splits: 0.054s Time spent predicting: 0.000s Trial 53, Fold 5: Log loss = 0.3786991673644685, Average precision = 0.9511013074293633, ROC-AUC = 0.9479524366906341, Elapsed Time = 1.3397103999996034 seconds
Optimization Progress: 54% | 54/100 [10:36<09:18, 12.14s/it]
Trial 54, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 54, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[Trial 54, Fold 1 per-round fitting log elided: 74 trees fit in 1.236 s, 6131 total leaves]
Trial 54, Fold 1: Log loss = 0.2760671375760067, Average precision = 0.9524626419368991, ROC-AUC = 0.9524715405333066, Elapsed Time = 1.2402317999985826 seconds
Trial 54, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 54, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[Trial 54, Fold 2 per-round fitting log elided: 74 trees fit in 1.330 s, 6067 total leaves]
Trial 54, Fold 2: Log loss = 0.27454362553156353, Average precision = 0.9524217169185072, ROC-AUC = 0.9541073724840596, Elapsed Time = 1.341459000001123 seconds
Trial 54, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 54, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[Trial 54, Fold 3 per-round fitting log elided: 74 trees fit in 1.549 s, 6333 total leaves]
Trial 54, Fold 3: Log loss = 0.26791224175750544, Average precision = 0.9581761600803326, ROC-AUC = 0.9569774140189347, Elapsed Time = 1.5581058999996458 seconds
Trial 54, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 54, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[Trial 54, Fold 4 per-round fitting log elided]
1 tree, 126 leaves, max depth = 16, train loss: 0.44649, val loss: 0.43609, in 0.016s 1 tree, 84 leaves, max depth = 18, train loss: 0.43687, val loss: 0.42531, in 0.016s 1 tree, 126 leaves, max depth = 16, train loss: 0.42731, val loss: 0.41629, in 0.016s 1 tree, 128 leaves, max depth = 17, train loss: 0.41917, val loss: 0.40864, in 0.031s 1 tree, 84 leaves, max depth = 15, train loss: 0.41166, val loss: 0.40008, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.40529, val loss: 0.39321, in 0.016s 1 tree, 84 leaves, max depth = 16, train loss: 0.39910, val loss: 0.38612, in 0.016s 1 tree, 127 leaves, max depth = 15, train loss: 0.39237, val loss: 0.37990, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.38713, val loss: 0.37423, in 0.000s 1 tree, 127 leaves, max depth = 16, train loss: 0.38138, val loss: 0.36899, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.37693, val loss: 0.36417, in 0.016s 1 tree, 86 leaves, max depth = 15, train loss: 0.37201, val loss: 0.35853, in 0.016s 1 tree, 126 leaves, max depth = 16, train loss: 0.36705, val loss: 0.35407, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.36328, val loss: 0.34997, in 0.016s 1 tree, 82 leaves, max depth = 13, train loss: 0.35903, val loss: 0.34528, in 0.000s 1 tree, 124 leaves, max depth = 15, train loss: 0.35471, val loss: 0.34149, in 0.016s 1 tree, 84 leaves, max depth = 16, train loss: 0.35127, val loss: 0.33750, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.34801, val loss: 0.33396, in 0.016s 1 tree, 126 leaves, max depth = 17, train loss: 0.34423, val loss: 0.33067, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.34139, val loss: 0.32757, in 0.016s 1 tree, 106 leaves, max depth = 16, train loss: 0.33808, val loss: 0.32419, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.33564, val loss: 0.32152, in 0.000s 1 tree, 83 leaves, max depth = 16, train loss: 0.33300, val loss: 0.31855, in 0.016s 1 tree, 122 leaves, max depth = 16, train loss: 
0.32952, val loss: 0.31558, in 0.031s 1 tree, 106 leaves, max depth = 17, train loss: 0.32690, val loss: 0.31293, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.32469, val loss: 0.31051, in 0.000s 1 tree, 122 leaves, max depth = 18, train loss: 0.32158, val loss: 0.30790, in 0.031s 1 tree, 104 leaves, max depth = 16, train loss: 0.31927, val loss: 0.30560, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.31729, val loss: 0.30343, in 0.000s 1 tree, 119 leaves, max depth = 18, train loss: 0.31451, val loss: 0.30115, in 0.016s 1 tree, 104 leaves, max depth = 16, train loss: 0.31249, val loss: 0.29916, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.31071, val loss: 0.29720, in 0.016s 1 tree, 87 leaves, max depth = 15, train loss: 0.30897, val loss: 0.29531, in 0.016s 1 tree, 126 leaves, max depth = 19, train loss: 0.30645, val loss: 0.29323, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.30485, val loss: 0.29146, in 0.016s 1 tree, 123 leaves, max depth = 19, train loss: 0.30259, val loss: 0.28973, in 0.016s 1 tree, 105 leaves, max depth = 16, train loss: 0.30086, val loss: 0.28808, in 0.016s 1 tree, 48 leaves, max depth = 11, train loss: 0.29942, val loss: 0.28677, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.29794, val loss: 0.28513, in 0.000s 1 tree, 126 leaves, max depth = 15, train loss: 0.29594, val loss: 0.28356, in 0.016s 1 tree, 105 leaves, max depth = 16, train loss: 0.29452, val loss: 0.28223, in 0.031s 1 tree, 5 leaves, max depth = 3, train loss: 0.29318, val loss: 0.28074, in 0.000s 1 tree, 108 leaves, max depth = 17, train loss: 0.29069, val loss: 0.27878, in 0.016s 1 tree, 176 leaves, max depth = 26, train loss: 0.28861, val loss: 0.27773, in 0.031s 1 tree, 106 leaves, max depth = 17, train loss: 0.28728, val loss: 0.27648, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.28603, val loss: 0.27508, in 0.000s 1 tree, 174 leaves, max depth = 22, train loss: 0.28391, val loss: 0.27410, in 0.031s 1 
tree, 5 leaves, max depth = 3, train loss: 0.28281, val loss: 0.27288, in 0.000s 1 tree, 128 leaves, max depth = 14, train loss: 0.28121, val loss: 0.27174, in 0.031s 1 tree, 111 leaves, max depth = 16, train loss: 0.27988, val loss: 0.27062, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.27888, val loss: 0.26950, in 0.000s 1 tree, 177 leaves, max depth = 26, train loss: 0.27705, val loss: 0.26864, in 0.031s 1 tree, 82 leaves, max depth = 15, train loss: 0.27593, val loss: 0.26727, in 0.016s 1 tree, 158 leaves, max depth = 28, train loss: 0.27411, val loss: 0.26609, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.27316, val loss: 0.26503, in 0.016s 1 tree, 113 leaves, max depth = 16, train loss: 0.27198, val loss: 0.26419, in 0.016s 1 tree, 175 leaves, max depth = 23, train loss: 0.27041, val loss: 0.26348, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.26954, val loss: 0.26252, in 0.016s 1 tree, 107 leaves, max depth = 19, train loss: 0.26861, val loss: 0.26171, in 0.016s 1 tree, 176 leaves, max depth = 23, train loss: 0.26720, val loss: 0.26108, in 0.016s 1 tree, 161 leaves, max depth = 26, train loss: 0.26567, val loss: 0.26010, in 0.031s 1 tree, 175 leaves, max depth = 21, train loss: 0.26405, val loss: 0.25945, in 0.031s 1 tree, 5 leaves, max depth = 3, train loss: 0.26322, val loss: 0.25852, in 0.000s 1 tree, 52 leaves, max depth = 10, train loss: 0.26233, val loss: 0.25781, in 0.016s Fit 74 trees in 1.424 s, (6314 total leaves) Time spent computing histograms: 0.459s Time spent finding best splits: 0.118s Time spent applying splits: 0.130s Time spent predicting: 0.016s Trial 54, Fold 4: Log loss = 0.27029483261290455, Average precision = 0.9573415530068022, ROC-AUC = 0.9565255036469547, Elapsed Time = 1.4256296999992628 seconds Trial 54, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 54, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of 
training data: 0.172 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 79 leaves, max depth = 16, train loss: 0.64882, val loss: 0.64587, in 0.016s 1 tree, 79 leaves, max depth = 16, train loss: 0.61270, val loss: 0.60711, in 0.016s 1 tree, 82 leaves, max depth = 21, train loss: 0.58241, val loss: 0.57442, in 0.016s 1 tree, 81 leaves, max depth = 15, train loss: 0.55762, val loss: 0.54747, in 0.000s 1 tree, 133 leaves, max depth = 16, train loss: 0.53517, val loss: 0.52612, in 0.016s 1 tree, 81 leaves, max depth = 21, train loss: 0.51564, val loss: 0.50472, in 0.031s 1 tree, 132 leaves, max depth = 16, train loss: 0.49794, val loss: 0.48806, in 0.016s 1 tree, 81 leaves, max depth = 15, train loss: 0.48261, val loss: 0.47111, in 0.016s 1 tree, 132 leaves, max depth = 20, train loss: 0.46838, val loss: 0.45786, in 0.016s 1 tree, 83 leaves, max depth = 16, train loss: 0.45638, val loss: 0.44452, in 0.016s 1 tree, 134 leaves, max depth = 17, train loss: 0.44474, val loss: 0.43381, in 0.016s 1 tree, 83 leaves, max depth = 16, train loss: 0.43510, val loss: 0.42297, in 0.016s 1 tree, 131 leaves, max depth = 19, train loss: 0.42544, val loss: 0.41420, in 0.016s 1 tree, 132 leaves, max depth = 19, train loss: 0.41722, val loss: 0.40677, in 0.016s 1 tree, 79 leaves, max depth = 15, train loss: 0.40957, val loss: 0.39811, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.40314, val loss: 0.39191, in 0.000s 1 tree, 82 leaves, max depth = 18, train loss: 0.39675, val loss: 0.38456, in 0.016s 1 tree, 131 leaves, max depth = 19, train loss: 0.38994, val loss: 0.37860, in 0.031s 1 tree, 5 leaves, max depth = 3, train loss: 0.38468, val loss: 0.37355, in 0.000s 1 tree, 129 leaves, max depth = 20, train loss: 0.37886, val loss: 0.36851, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.37438, val loss: 0.36423, in 0.016s 1 tree, 130 leaves, max depth = 20, train loss: 0.36937, val loss: 0.35994, in 0.016s 1 tree, 84 leaves, max depth = 
15, train loss: 0.36440, val loss: 0.35410, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.36059, val loss: 0.35046, in 0.000s 1 tree, 83 leaves, max depth = 14, train loss: 0.35629, val loss: 0.34534, in 0.016s 1 tree, 131 leaves, max depth = 18, train loss: 0.35193, val loss: 0.34173, in 0.031s 1 tree, 81 leaves, max depth = 13, train loss: 0.34841, val loss: 0.33773, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.34514, val loss: 0.33459, in 0.016s 1 tree, 130 leaves, max depth = 19, train loss: 0.34130, val loss: 0.33147, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.33845, val loss: 0.32874, in 0.016s 1 tree, 103 leaves, max depth = 15, train loss: 0.33493, val loss: 0.32510, in 0.016s 1 tree, 130 leaves, max depth = 22, train loss: 0.33154, val loss: 0.32237, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.32901, val loss: 0.31996, in 0.016s 1 tree, 102 leaves, max depth = 15, train loss: 0.32597, val loss: 0.31680, in 0.016s 1 tree, 87 leaves, max depth = 15, train loss: 0.32351, val loss: 0.31381, in 0.016s 1 tree, 130 leaves, max depth = 23, train loss: 0.32044, val loss: 0.31141, in 0.016s 1 tree, 110 leaves, max depth = 15, train loss: 0.31806, val loss: 0.30900, in 0.016s 1 tree, 129 leaves, max depth = 24, train loss: 0.31541, val loss: 0.30694, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.31306, val loss: 0.30467, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.31105, val loss: 0.30274, in 0.000s 1 tree, 104 leaves, max depth = 16, train loss: 0.30887, val loss: 0.30053, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.30713, val loss: 0.29887, in 0.016s 1 tree, 88 leaves, max depth = 13, train loss: 0.30530, val loss: 0.29665, in 0.016s 1 tree, 124 leaves, max depth = 22, train loss: 0.30271, val loss: 0.29467, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.30113, val loss: 0.29316, in 0.000s 1 tree, 126 leaves, max depth = 22, train loss: 0.29887, val loss: 0.29145, in 
0.016s 1 tree, 103 leaves, max depth = 15, train loss: 0.29701, val loss: 0.28960, in 0.031s 1 tree, 104 leaves, max depth = 14, train loss: 0.29547, val loss: 0.28806, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.29400, val loss: 0.28665, in 0.000s 1 tree, 125 leaves, max depth = 22, train loss: 0.29193, val loss: 0.28509, in 0.016s 1 tree, 83 leaves, max depth = 13, train loss: 0.29044, val loss: 0.28340, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.28910, val loss: 0.28211, in 0.016s 1 tree, 126 leaves, max depth = 23, train loss: 0.28725, val loss: 0.28076, in 0.016s 1 tree, 110 leaves, max depth = 18, train loss: 0.28585, val loss: 0.27942, in 0.016s 1 tree, 179 leaves, max depth = 21, train loss: 0.28379, val loss: 0.27799, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.28253, val loss: 0.27676, in 0.016s 1 tree, 184 leaves, max depth = 21, train loss: 0.28038, val loss: 0.27615, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.27927, val loss: 0.27508, in 0.016s 1 tree, 105 leaves, max depth = 16, train loss: 0.27803, val loss: 0.27389, in 0.016s 1 tree, 177 leaves, max depth = 22, train loss: 0.27617, val loss: 0.27259, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.27516, val loss: 0.27162, in 0.016s 1 tree, 129 leaves, max depth = 22, train loss: 0.27371, val loss: 0.27055, in 0.016s 1 tree, 102 leaves, max depth = 16, train loss: 0.27255, val loss: 0.26944, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.27163, val loss: 0.26856, in 0.016s 1 tree, 158 leaves, max depth = 21, train loss: 0.26980, val loss: 0.26755, in 0.016s 1 tree, 83 leaves, max depth = 15, train loss: 0.26868, val loss: 0.26629, in 0.016s 1 tree, 178 leaves, max depth = 22, train loss: 0.26716, val loss: 0.26528, in 0.031s 1 tree, 5 leaves, max depth = 3, train loss: 0.26628, val loss: 0.26443, in 0.000s 1 tree, 101 leaves, max depth = 17, train loss: 0.26529, val loss: 0.26352, in 0.016s 1 tree, 186 leaves, max depth = 21, 
train loss: 0.26360, val loss: 0.26318, in 0.031s 1 tree, 101 leaves, max depth = 16, train loss: 0.26181, val loss: 0.26204, in 0.016s 1 tree, 50 leaves, max depth = 13, train loss: 0.26095, val loss: 0.26113, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.26012, val loss: 0.26033, in 0.000s 1 tree, 160 leaves, max depth = 26, train loss: 0.25868, val loss: 0.25963, in 0.016s Fit 74 trees in 1.391 s, (6375 total leaves) Time spent computing histograms: 0.444s Time spent finding best splits: 0.117s Time spent applying splits: 0.131s Time spent predicting: 0.016s Trial 54, Fold 5: Log loss = 0.27748913248953944, Average precision = 0.9528118582715788, ROC-AUC = 0.9521970904631849, Elapsed Time = 1.414216099999976 seconds
Optimization Progress: 55%|#####5 | 55/100 [10:49<09:26, 12.59s/it]
Trial 55, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 55, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Binning 0.040 GB of training data: 0.205 s
Binning 0.004 GB of validation data: 0.000 s
[per-round fitting logs for Trial 55, Fold 1 omitted (15 boosting rounds)]
Fit 15 trees in 0.705 s, (720 total leaves)
Time spent computing histograms: 0.120s
Time spent finding best splits: 0.025s
Time spent applying splits: 0.016s
Time spent predicting: 0.000s
Trial 55, Fold 1: Log loss = 0.4936647614237157, Average precision = 0.9200493818973322, ROC-AUC = 0.9309645838411647, Elapsed Time = 0.7165882999997848 seconds
Trial 55, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 55, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Binning 0.040 GB of training data: 0.189 s
Binning 0.004 GB of validation data: 0.000 s
[per-round fitting logs for Trial 55, Fold 2 omitted (15 boosting rounds)]
Fit 15 trees in 0.689 s, (720 total leaves)
Time spent computing histograms: 0.119s
Time spent finding best splits: 0.026s
Time spent applying splits: 0.016s
Time spent predicting: 0.016s
Trial 55, Fold 2: Log loss = 0.4922320066760393, Average precision = 0.9170846482239512, ROC-AUC = 0.9323047059112742, Elapsed Time = 0.700506299999688 seconds
Trial 55, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 55, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Binning 0.040 GB of training data: 0.142 s
Binning 0.004 GB of validation data: 0.016 s
[per-round fitting logs for Trial 55, Fold 3 omitted (15 boosting rounds)]
Fit 15 trees in 0.626 s, (720 total leaves)
Time spent computing histograms: 0.114s
Time spent finding best splits: 0.023s
Time spent applying splits: 0.015s
Time spent predicting: 0.016s
Trial 55, Fold 3: Log loss = 0.49051110730915715, Average precision = 0.9220649795799083, ROC-AUC = 0.9346053472710462, Elapsed Time = 0.6382448000003933 seconds
Trial 55, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 55, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
Binning 0.040 GB of training data: 0.158 s
Binning 0.004 GB of validation data: 0.016 s
[per-round fitting logs for Trial 55, Fold 4 omitted (15 boosting rounds)]
Fit 15 trees in 0.643 s, (720 total leaves)
Time spent computing histograms: 0.117s
Time spent finding best splits: 0.023s
Time spent applying splits: 0.015s
Time spent predicting: 0.000s
Trial 55, Fold 4: Log loss = 0.4929778956507077, Average precision = 0.9227369323898403, ROC-AUC = 0.9334807047599973, Elapsed Time = 0.6573267000003398 seconds
Trial 55, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 55, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
Binning 0.040 GB of training data: 0.158 s
Binning 0.004 GB of validation data: 0.016 s
[per-round fitting logs for Trial 55, Fold 5 omitted (15 boosting rounds)]
Fit 15 trees in 0.658 s, (720 total leaves)
Time spent computing histograms: 0.122s
Time spent finding best splits: 0.023s
Time spent applying splits: 0.015s
Time spent predicting: 0.000s
Trial 55, Fold 5: Log loss = 0.49450128925862646, Average precision = 0.9170951877765856, ROC-AUC = 0.930760752838006, Elapsed Time = 0.6636056999996072 seconds
Optimization Progress: 56%|#####6 | 56/100 [11:00<08:50, 12.06s/it]
Trial 56, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 56, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Binning 0.040 GB of training data: 0.158 s
Binning 0.004 GB of validation data: 0.016 s
[per-round fitting logs for Trial 56, Fold 1 omitted (80 boosting rounds)]
Fit 80 trees in 0.994 s, (2050 total leaves)
Time spent computing histograms: 0.374s
Time spent finding best splits: 0.059s
Time spent applying splits: 0.049s
Time spent predicting: 0.000s
Trial 56, Fold 1: Log loss = 0.380684813906018, Average precision = 0.9458659296414029, ROC-AUC = 0.9424333575897487, Elapsed Time = 1.0017719000006764 seconds
Trial 56, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 56, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Binning 0.040 GB of training data: 0.173 s
Binning 0.004 GB of validation data: 0.016 s
[per-round fitting logs for Trial 56, Fold 2 truncated]
train loss: 0.42558, val loss: 0.41605, in 0.016s 1 tree, 21 leaves, max depth = 11, train loss: 0.42356, val loss: 0.41383, in 0.000s 1 tree, 39 leaves, max depth = 10, train loss: 0.42154, val loss: 0.41192, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.41962, val loss: 0.40982, in 0.016s 1 tree, 39 leaves, max depth = 10, train loss: 0.41770, val loss: 0.40800, in 0.016s 1 tree, 39 leaves, max depth = 10, train loss: 0.41586, val loss: 0.40627, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.41408, val loss: 0.40443, in 0.016s 1 tree, 21 leaves, max depth = 10, train loss: 0.41229, val loss: 0.40246, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.41061, val loss: 0.40072, in 0.000s 1 tree, 39 leaves, max depth = 9, train loss: 0.40887, val loss: 0.39908, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.40726, val loss: 0.39741, in 0.000s 1 tree, 39 leaves, max depth = 9, train loss: 0.40560, val loss: 0.39586, in 0.016s 1 tree, 21 leaves, max depth = 7, train loss: 0.40392, val loss: 0.39402, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.40241, val loss: 0.39243, in 0.000s 1 tree, 39 leaves, max depth = 9, train loss: 0.40082, val loss: 0.39095, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.39937, val loss: 0.38944, in 0.016s 1 tree, 21 leaves, max depth = 10, train loss: 0.39776, val loss: 0.38767, in 0.000s 1 tree, 39 leaves, max depth = 9, train loss: 0.39625, val loss: 0.38626, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.39487, val loss: 0.38482, in 0.016s 1 tree, 39 leaves, max depth = 10, train loss: 0.39342, val loss: 0.38348, in 0.000s 1 tree, 21 leaves, max depth = 9, train loss: 0.39189, val loss: 0.38182, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.39058, val loss: 0.38044, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.38932, val loss: 0.37913, in 0.000s 1 tree, 39 leaves, max depth = 10, train loss: 0.38793, val loss: 0.37785, in 0.016s 1 tree, 21 
leaves, max depth = 10, train loss: 0.38651, val loss: 0.37627, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.38530, val loss: 0.37501, in 0.016s 1 tree, 39 leaves, max depth = 10, train loss: 0.38398, val loss: 0.37379, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.38283, val loss: 0.37259, in 0.000s 1 tree, 21 leaves, max depth = 9, train loss: 0.38147, val loss: 0.37110, in 0.016s Fit 80 trees in 1.095 s, (2059 total leaves) Time spent computing histograms: 0.404s Time spent finding best splits: 0.063s Time spent applying splits: 0.053s Time spent predicting: 0.000s Trial 56, Fold 2: Log loss = 0.3832620626880714, Average precision = 0.9423748606646112, ROC-AUC = 0.9432571023274772, Elapsed Time = 1.0971567000015057 seconds Trial 56, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 56, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 Binning 0.040 GB of training data: 0.174 s 0.016 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 21 leaves, max depth = 7, train loss: 0.68072, val loss: 0.68033, in 0.000s 1 tree, 21 leaves, max depth = 7, train loss: 0.66895, val loss: 0.66819, in 0.016s 1 tree, 21 leaves, max depth = 7, train loss: 0.65779, val loss: 0.65666, in 0.016s 1 tree, 21 leaves, max depth = 7, train loss: 0.64720, val loss: 0.64572, in 0.000s 1 tree, 21 leaves, max depth = 7, train loss: 0.63715, val loss: 0.63533, in 0.016s 1 tree, 21 leaves, max depth = 7, train loss: 0.62760, val loss: 0.62544, in 0.000s 1 tree, 21 leaves, max depth = 7, train loss: 0.61852, val loss: 0.61605, in 0.016s 1 tree, 21 leaves, max depth = 7, train loss: 0.60989, val loss: 0.60710, in 0.016s 1 tree, 21 leaves, max depth = 7, train loss: 0.60168, val loss: 0.59858, in 0.000s 1 tree, 21 leaves, max depth = 7, train loss: 0.59386, val loss: 0.59047, in 0.016s 1 tree, 21 leaves, max depth = 7, train loss: 0.58642, val loss: 0.58274, in 0.000s 1 tree, 21 leaves, max 
depth = 7, train loss: 0.57932, val loss: 0.57537, in 0.016s 1 tree, 21 leaves, max depth = 7, train loss: 0.57257, val loss: 0.56834, in 0.000s 1 tree, 21 leaves, max depth = 7, train loss: 0.56613, val loss: 0.56163, in 0.016s 1 tree, 21 leaves, max depth = 7, train loss: 0.55999, val loss: 0.55523, in 0.000s 1 tree, 41 leaves, max depth = 10, train loss: 0.55385, val loss: 0.54946, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.54818, val loss: 0.54354, in 0.000s 1 tree, 41 leaves, max depth = 10, train loss: 0.54243, val loss: 0.53814, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.53713, val loss: 0.53259, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.53173, val loss: 0.52753, in 0.000s 1 tree, 21 leaves, max depth = 8, train loss: 0.52677, val loss: 0.52232, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.52170, val loss: 0.51757, in 0.000s 1 tree, 41 leaves, max depth = 10, train loss: 0.51685, val loss: 0.51304, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.51229, val loss: 0.50824, in 0.016s [25/80] 1 tree, 41 leaves, max depth = 10, train loss: 0.50773, val loss: 0.50398, in 0.000s 1 tree, 21 leaves, max depth = 8, train loss: 0.50345, val loss: 0.49947, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.49915, val loss: 0.49546, in 0.000s 1 tree, 21 leaves, max depth = 8, train loss: 0.49514, val loss: 0.49121, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.49108, val loss: 0.48744, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.48730, val loss: 0.48343, in 0.000s 1 tree, 41 leaves, max depth = 10, train loss: 0.48346, val loss: 0.47987, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.47991, val loss: 0.47610, in 0.000s 1 tree, 41 leaves, max depth = 10, train loss: 0.47628, val loss: 0.47274, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.47281, val loss: 0.46953, in 0.031s 1 tree, 21 leaves, max depth = 8, train loss: 0.46951, val loss: 
0.46601, in 0.031s 1 tree, 41 leaves, max depth = 10, train loss: 0.46623, val loss: 0.46298, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.46312, val loss: 0.45966, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.46000, val loss: 0.45679, in 0.016s 1 tree, 20 leaves, max depth = 7, train loss: 0.45707, val loss: 0.45365, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.45411, val loss: 0.45093, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.45134, val loss: 0.44796, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.44854, val loss: 0.44539, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.44585, val loss: 0.44293, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.44326, val loss: 0.44015, in 0.000s 1 tree, 41 leaves, max depth = 10, train loss: 0.44071, val loss: 0.43782, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.43826, val loss: 0.43517, in 0.016s 1 tree, 40 leaves, max depth = 10, train loss: 0.43584, val loss: 0.43296, in 0.016s 1 tree, 19 leaves, max depth = 9, train loss: 0.43351, val loss: 0.43041, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.43120, val loss: 0.42831, in 0.016s 1 tree, 21 leaves, max depth = 10, train loss: 0.42903, val loss: 0.42591, in 0.000s 1 tree, 41 leaves, max depth = 10, train loss: 0.42683, val loss: 0.42392, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.42474, val loss: 0.42166, in 0.000s 1 tree, 41 leaves, max depth = 10, train loss: 0.42265, val loss: 0.41977, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.42065, val loss: 0.41797, in 0.016s 1 tree, 20 leaves, max depth = 10, train loss: 0.41868, val loss: 0.41579, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.41683, val loss: 0.41407, in 0.000s 1 tree, 41 leaves, max depth = 10, train loss: 0.41493, val loss: 0.41236, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.41316, val loss: 0.41072, in 0.016s 1 tree, 20 leaves, max depth = 10, 
train loss: 0.41133, val loss: 0.40868, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.40952, val loss: 0.40707, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.40785, val loss: 0.40552, in 0.016s 1 tree, 40 leaves, max depth = 10, train loss: 0.40613, val loss: 0.40398, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.40453, val loss: 0.40250, in 0.000s 1 tree, 20 leaves, max depth = 10, train loss: 0.40283, val loss: 0.40060, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.40118, val loss: 0.39915, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.39967, val loss: 0.39774, in 0.000s 1 tree, 20 leaves, max depth = 10, train loss: 0.39806, val loss: 0.39594, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.39649, val loss: 0.39456, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.39505, val loss: 0.39322, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.39367, val loss: 0.39194, in 0.016s 1 tree, 40 leaves, max depth = 9, train loss: 0.39216, val loss: 0.39062, in 0.000s 1 tree, 20 leaves, max depth = 8, train loss: 0.39063, val loss: 0.38890, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.38932, val loss: 0.38768, in 0.000s 1 tree, 40 leaves, max depth = 9, train loss: 0.38788, val loss: 0.38643, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.38662, val loss: 0.38526, in 0.016s 1 tree, 40 leaves, max depth = 9, train loss: 0.38525, val loss: 0.38408, in 0.000s 1 tree, 20 leaves, max depth = 7, train loss: 0.38380, val loss: 0.38245, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.38259, val loss: 0.38133, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.38128, val loss: 0.38020, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.38012, val loss: 0.37913, in 0.016s Fit 80 trees in 1.220 s, (2110 total leaves) Time spent computing histograms: 0.450s Time spent finding best splits: 0.082s Time spent applying splits: 0.069s Time spent predicting: 
0.016s Trial 56, Fold 3: Log loss = 0.3766158102262939, Average precision = 0.9477626147986153, ROC-AUC = 0.9475018298966589, Elapsed Time = 1.228315799999109 seconds Trial 56, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 56, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 Binning 0.040 GB of training data: 0.174 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 20 leaves, max depth = 9, train loss: 0.68079, val loss: 0.68008, in 0.016s 1 tree, 20 leaves, max depth = 9, train loss: 0.66909, val loss: 0.66768, in 0.000s 1 tree, 20 leaves, max depth = 9, train loss: 0.65800, val loss: 0.65590, in 0.016s 1 tree, 20 leaves, max depth = 9, train loss: 0.64747, val loss: 0.64471, in 0.016s 1 tree, 20 leaves, max depth = 9, train loss: 0.63748, val loss: 0.63407, in 0.000s 1 tree, 20 leaves, max depth = 9, train loss: 0.62799, val loss: 0.62394, in 0.016s 1 tree, 20 leaves, max depth = 9, train loss: 0.61897, val loss: 0.61429, in 0.000s 1 tree, 20 leaves, max depth = 9, train loss: 0.61039, val loss: 0.60510, in 0.016s 1 tree, 20 leaves, max depth = 9, train loss: 0.60223, val loss: 0.59634, in 0.000s 1 tree, 20 leaves, max depth = 9, train loss: 0.59446, val loss: 0.58799, in 0.016s 1 tree, 20 leaves, max depth = 9, train loss: 0.58707, val loss: 0.58002, in 0.016s 1 tree, 20 leaves, max depth = 9, train loss: 0.58003, val loss: 0.57241, in 0.000s 1 tree, 20 leaves, max depth = 9, train loss: 0.57332, val loss: 0.56515, in 0.016s 1 tree, 20 leaves, max depth = 9, train loss: 0.56693, val loss: 0.55822, in 0.000s 1 tree, 20 leaves, max depth = 9, train loss: 0.56083, val loss: 0.55159, in 0.016s 1 tree, 20 leaves, max depth = 9, train loss: 0.55502, val loss: 0.54525, in 0.016s 1 tree, 38 leaves, max depth = 10, train loss: 0.54914, val loss: 0.53933, in 0.000s 1 tree, 21 leaves, max depth = 9, train loss: 0.54371, val loss: 0.53340, in 0.016s 1 tree, 21 leaves, max 
depth = 9, train loss: 0.53854, val loss: 0.52773, in 0.016s 1 tree, 38 leaves, max depth = 10, train loss: 0.53311, val loss: 0.52227, in 0.000s 1 tree, 38 leaves, max depth = 10, train loss: 0.52794, val loss: 0.51706, in 0.016s 1 tree, 38 leaves, max depth = 10, train loss: 0.52302, val loss: 0.51210, in 0.016s 1 tree, 21 leaves, max depth = 11, train loss: 0.51835, val loss: 0.50697, in 0.000s 1 tree, 38 leaves, max depth = 10, train loss: 0.51371, val loss: 0.50230, in 0.000s 1 tree, 21 leaves, max depth = 11, train loss: 0.50933, val loss: 0.49747, in 0.016s 1 tree, 38 leaves, max depth = 11, train loss: 0.50496, val loss: 0.49307, in 0.016s 1 tree, 21 leaves, max depth = 11, train loss: 0.50084, val loss: 0.48852, in 0.000s 1 tree, 38 leaves, max depth = 10, train loss: 0.49673, val loss: 0.48438, in 0.016s 1 tree, 21 leaves, max depth = 11, train loss: 0.49286, val loss: 0.48009, in 0.016s 1 tree, 38 leaves, max depth = 11, train loss: 0.48897, val loss: 0.47618, in 0.016s 1 tree, 21 leaves, max depth = 9, train loss: 0.48534, val loss: 0.47218, in 0.000s 1 tree, 38 leaves, max depth = 10, train loss: 0.48167, val loss: 0.46849, in 0.016s 1 tree, 38 leaves, max depth = 10, train loss: 0.47817, val loss: 0.46496, in 0.016s 1 tree, 21 leaves, max depth = 10, train loss: 0.47478, val loss: 0.46119, in 0.000s 1 tree, 38 leaves, max depth = 11, train loss: 0.47146, val loss: 0.45785, in 0.016s 1 tree, 21 leaves, max depth = 10, train loss: 0.46827, val loss: 0.45429, in 0.016s 1 tree, 38 leaves, max depth = 10, train loss: 0.46513, val loss: 0.45113, in 0.016s 1 tree, 20 leaves, max depth = 12, train loss: 0.46212, val loss: 0.44776, in 0.000s 1 tree, 38 leaves, max depth = 10, train loss: 0.45915, val loss: 0.44477, in 0.016s 1 tree, 21 leaves, max depth = 10, train loss: 0.45630, val loss: 0.44157, in 0.000s 1 tree, 20 leaves, max depth = 12, train loss: 0.45358, val loss: 0.43851, in 0.016s 1 tree, 38 leaves, max depth = 10, train loss: 0.45078, val loss: 
0.43570, in 0.016s 1 tree, 38 leaves, max depth = 10, train loss: 0.44811, val loss: 0.43301, in 0.000s 1 tree, 38 leaves, max depth = 10, train loss: 0.44555, val loss: 0.43044, in 0.016s 1 tree, 21 leaves, max depth = 9, train loss: 0.44303, val loss: 0.42759, in 0.016s 1 tree, 38 leaves, max depth = 11, train loss: 0.44060, val loss: 0.42514, in 0.000s 1 tree, 21 leaves, max depth = 11, train loss: 0.43821, val loss: 0.42244, in 0.016s 1 tree, 38 leaves, max depth = 11, train loss: 0.43590, val loss: 0.42011, in 0.016s 1 tree, 20 leaves, max depth = 11, train loss: 0.43364, val loss: 0.41756, in 0.000s 1 tree, 38 leaves, max depth = 11, train loss: 0.43145, val loss: 0.41535, in 0.016s 1 tree, 21 leaves, max depth = 11, train loss: 0.42930, val loss: 0.41290, in 0.016s 1 tree, 38 leaves, max depth = 11, train loss: 0.42721, val loss: 0.41080, in 0.000s 1 tree, 21 leaves, max depth = 9, train loss: 0.42519, val loss: 0.40851, in 0.016s 1 tree, 38 leaves, max depth = 11, train loss: 0.42320, val loss: 0.40651, in 0.016s 1 tree, 20 leaves, max depth = 10, train loss: 0.42128, val loss: 0.40431, in 0.000s 1 tree, 38 leaves, max depth = 10, train loss: 0.41939, val loss: 0.40240, in 0.016s 1 tree, 38 leaves, max depth = 11, train loss: 0.41758, val loss: 0.40058, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.41580, val loss: 0.39866, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.41411, val loss: 0.39683, in 0.016s 1 tree, 20 leaves, max depth = 10, train loss: 0.41232, val loss: 0.39479, in 0.016s 1 tree, 38 leaves, max depth = 10, train loss: 0.41061, val loss: 0.39307, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.40900, val loss: 0.39134, in 0.016s 1 tree, 19 leaves, max depth = 10, train loss: 0.40732, val loss: 0.38941, in 0.000s 1 tree, 38 leaves, max depth = 10, train loss: 0.40569, val loss: 0.38777, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.40416, val loss: 0.38613, in 0.016s 1 tree, 5 leaves, max depth = 3, 
train loss: 0.40270, val loss: 0.38455, in 0.000s 1 tree, 38 leaves, max depth = 10, train loss: 0.40115, val loss: 0.38300, in 0.016s 1 tree, 20 leaves, max depth = 9, train loss: 0.39957, val loss: 0.38119, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.39819, val loss: 0.37969, in 0.000s 1 tree, 38 leaves, max depth = 11, train loss: 0.39670, val loss: 0.37821, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.39537, val loss: 0.37678, in 0.000s 1 tree, 38 leaves, max depth = 11, train loss: 0.39395, val loss: 0.37536, in 0.016s 1 tree, 20 leaves, max depth = 9, train loss: 0.39247, val loss: 0.37367, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.39121, val loss: 0.37230, in 0.000s 1 tree, 38 leaves, max depth = 11, train loss: 0.38985, val loss: 0.37095, in 0.016s 1 tree, 20 leaves, max depth = 9, train loss: 0.38845, val loss: 0.36934, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.38724, val loss: 0.36803, in 0.000s 1 tree, 38 leaves, max depth = 11, train loss: 0.38593, val loss: 0.36674, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.38478, val loss: 0.36548, in 0.016s 1 tree, 21 leaves, max depth = 7, train loss: 0.38339, val loss: 0.36388, in 0.000s Fit 80 trees in 1.158 s, (2004 total leaves) Time spent computing histograms: 0.430s Time spent finding best splits: 0.067s Time spent applying splits: 0.056s Time spent predicting: 0.000s Trial 56, Fold 4: Log loss = 0.38315428635904075, Average precision = 0.945962665443955, ROC-AUC = 0.9426270026061598, Elapsed Time = 1.1737682000002678 seconds Trial 56, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 56, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.173 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 21 leaves, max depth = 8, train loss: 0.68073, val loss: 0.67990, in 0.000s 1 tree, 21 leaves, max depth = 8, train 
loss: 0.66897, val loss: 0.66733, in 0.016s 1 tree, 21 leaves, max depth = 9, train loss: 0.65776, val loss: 0.65533, in 0.000s 1 tree, 21 leaves, max depth = 9, train loss: 0.64712, val loss: 0.64392, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.63707, val loss: 0.63313, in 0.000s 1 tree, 21 leaves, max depth = 9, train loss: 0.62748, val loss: 0.62280, in 0.016s 1 tree, 21 leaves, max depth = 9, train loss: 0.61836, val loss: 0.61297, in 0.000s 1 tree, 21 leaves, max depth = 8, train loss: 0.60973, val loss: 0.60365, in 0.016s 1 tree, 21 leaves, max depth = 9, train loss: 0.60147, val loss: 0.59471, in 0.000s 1 tree, 21 leaves, max depth = 9, train loss: 0.59361, val loss: 0.58620, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.58617, val loss: 0.57811, in 0.016s 1 tree, 21 leaves, max depth = 9, train loss: 0.57904, val loss: 0.57035, in 0.000s 1 tree, 21 leaves, max depth = 9, train loss: 0.57225, val loss: 0.56294, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.56581, val loss: 0.55590, in 0.000s 1 tree, 21 leaves, max depth = 9, train loss: 0.55964, val loss: 0.54914, in 0.016s 1 tree, 21 leaves, max depth = 9, train loss: 0.55375, val loss: 0.54268, in 0.000s 1 tree, 39 leaves, max depth = 10, train loss: 0.54790, val loss: 0.53706, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.54242, val loss: 0.53103, in 0.000s 1 tree, 21 leaves, max depth = 8, train loss: 0.53720, val loss: 0.52527, in 0.016s 1 tree, 39 leaves, max depth = 10, train loss: 0.53179, val loss: 0.52010, in 0.000s 1 tree, 39 leaves, max depth = 10, train loss: 0.52664, val loss: 0.51518, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.52184, val loss: 0.50986, in 0.016s 1 tree, 39 leaves, max depth = 10, train loss: 0.51699, val loss: 0.50525, in 0.000s 1 tree, 21 leaves, max depth = 8, train loss: 0.51249, val loss: 0.50025, in 0.016s 1 tree, 39 leaves, max depth = 10, train loss: 0.50792, val loss: 0.49592, in 0.016s 1 tree, 39 leaves, 
max depth = 10, train loss: 0.50357, val loss: 0.49179, in 0.000s 1 tree, 21 leaves, max depth = 8, train loss: 0.49941, val loss: 0.48716, in 0.016s 1 tree, 39 leaves, max depth = 10, train loss: 0.49530, val loss: 0.48328, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.49139, val loss: 0.47892, in 0.000s 1 tree, 39 leaves, max depth = 11, train loss: 0.48751, val loss: 0.47527, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.48383, val loss: 0.47115, in 0.016s 1 tree, 39 leaves, max depth = 11, train loss: 0.48017, val loss: 0.46771, in 0.000s 1 tree, 21 leaves, max depth = 8, train loss: 0.47670, val loss: 0.46382, in 0.016s 1 tree, 39 leaves, max depth = 11, train loss: 0.47324, val loss: 0.46058, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.46996, val loss: 0.45690, in 0.000s 1 tree, 39 leaves, max depth = 10, train loss: 0.46668, val loss: 0.45384, in 0.000s 1 tree, 39 leaves, max depth = 10, train loss: 0.46355, val loss: 0.45093, in 0.016s 1 tree, 21 leaves, max depth = 7, train loss: 0.46050, val loss: 0.44749, in 0.016s 1 tree, 39 leaves, max depth = 9, train loss: 0.45752, val loss: 0.44473, in 0.000s 1 tree, 21 leaves, max depth = 7, train loss: 0.45464, val loss: 0.44147, in 0.016s 1 tree, 21 leaves, max depth = 7, train loss: 0.45189, val loss: 0.43835, in 0.016s 1 tree, 39 leaves, max depth = 11, train loss: 0.44909, val loss: 0.43577, in 0.000s 1 tree, 39 leaves, max depth = 9, train loss: 0.44641, val loss: 0.43330, in 0.016s 1 tree, 21 leaves, max depth = 7, train loss: 0.44383, val loss: 0.43037, in 0.016s 1 tree, 39 leaves, max depth = 11, train loss: 0.44129, val loss: 0.42804, in 0.000s 1 tree, 21 leaves, max depth = 7, train loss: 0.43885, val loss: 0.42526, in 0.016s 1 tree, 39 leaves, max depth = 10, train loss: 0.43643, val loss: 0.42305, in 0.016s 1 tree, 39 leaves, max depth = 10, train loss: 0.43411, val loss: 0.42094, in 0.016s 1 tree, 21 leaves, max depth = 7, train loss: 0.43182, val loss: 0.41832, 
in 0.000s 1 tree, 39 leaves, max depth = 10, train loss: 0.42962, val loss: 0.41633, in 0.016s 1 tree, 21 leaves, max depth = 7, train loss: 0.42745, val loss: 0.41384, in 0.016s 1 tree, 39 leaves, max depth = 10, train loss: 0.42535, val loss: 0.41194, in 0.000s 1 tree, 21 leaves, max depth = 8, train loss: 0.42330, val loss: 0.40958, in 0.016s 1 tree, 39 leaves, max depth = 10, train loss: 0.42130, val loss: 0.40779, in 0.016s 1 tree, 21 leaves, max depth = 7, train loss: 0.41935, val loss: 0.40553, in 0.000s 1 tree, 39 leaves, max depth = 10, train loss: 0.41744, val loss: 0.40382, in 0.016s 1 tree, 21 leaves, max depth = 7, train loss: 0.41558, val loss: 0.40167, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.41381, val loss: 0.39997, in 0.000s 1 tree, 39 leaves, max depth = 10, train loss: 0.41199, val loss: 0.39836, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.41030, val loss: 0.39674, in 0.016s 1 tree, 39 leaves, max depth = 9, train loss: 0.40857, val loss: 0.39521, in 0.000s 1 tree, 21 leaves, max depth = 9, train loss: 0.40686, val loss: 0.39322, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.40526, val loss: 0.39169, in 0.000s 1 tree, 39 leaves, max depth = 9, train loss: 0.40361, val loss: 0.39024, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.40208, val loss: 0.38878, in 0.016s 1 tree, 39 leaves, max depth = 9, train loss: 0.40050, val loss: 0.38741, in 0.016s 1 tree, 20 leaves, max depth = 10, train loss: 0.39889, val loss: 0.38550, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.39744, val loss: 0.38411, in 0.016s 1 tree, 38 leaves, max depth = 10, train loss: 0.39593, val loss: 0.38281, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.39455, val loss: 0.38148, in 0.000s 1 tree, 20 leaves, max depth = 10, train loss: 0.39303, val loss: 0.37968, in 0.016s 1 tree, 39 leaves, max depth = 10, train loss: 0.39159, val loss: 0.37844, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.39027, 
val loss: 0.37718, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.38900, val loss: 0.37597, in 0.000s 1 tree, 39 leaves, max depth = 10, train loss: 0.38763, val loss: 0.37480, in 0.016s 1 tree, 20 leaves, max depth = 9, train loss: 0.38620, val loss: 0.37310, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.38499, val loss: 0.37195, in 0.000s 1 tree, 39 leaves, max depth = 10, train loss: 0.38367, val loss: 0.37083, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.38251, val loss: 0.36973, in 0.016s 1 tree, 20 leaves, max depth = 8, train loss: 0.38116, val loss: 0.36811, in 0.000s Fit 80 trees in 1.110 s, (2055 total leaves) Time spent computing histograms: 0.422s Time spent finding best splits: 0.065s Time spent applying splits: 0.055s Time spent predicting: 0.000s Trial 56, Fold 5: Log loss = 0.3854740829711175, Average precision = 0.9450464468313988, ROC-AUC = 0.9415032978466454, Elapsed Time = 1.122230999999374 seconds
Optimization Progress: 57%|#####6 | 57/100 [11:13<08:49, 12.32s/it]
Trial 57, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 57, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[per-round fitting output condensed]
Fit 71 trees in 0.877 s, (2340 total leaves)
Trial 57, Fold 1: Log loss = 0.3367816199749105, Average precision = 0.9455012029625787, ROC-AUC = 0.9441351154031821, Elapsed Time = 0.8882864000006521 seconds
Trial 57, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 57, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[per-round fitting output condensed; log truncated mid-round in the source]
train loss: 0.51330, val loss: 0.50645, in 0.000s 1 tree, 32 leaves, max depth = 11, train loss: 0.50521, val loss: 0.49786, in 0.016s 1 tree, 29 leaves, max depth = 15, train loss: 0.49779, val loss: 0.48992, in 0.016s 1 tree, 55 leaves, max depth = 12, train loss: 0.48991, val loss: 0.48237, in 0.000s 1 tree, 29 leaves, max depth = 8, train loss: 0.48328, val loss: 0.47523, in 0.016s 1 tree, 54 leaves, max depth = 12, train loss: 0.47617, val loss: 0.46846, in 0.000s 1 tree, 32 leaves, max depth = 11, train loss: 0.47022, val loss: 0.46210, in 0.000s 1 tree, 54 leaves, max depth = 14, train loss: 0.46377, val loss: 0.45598, in 0.016s 1 tree, 55 leaves, max depth = 15, train loss: 0.45782, val loss: 0.45036, in 0.000s 1 tree, 54 leaves, max depth = 13, train loss: 0.45232, val loss: 0.44516, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.44724, val loss: 0.43969, in 0.000s 1 tree, 31 leaves, max depth = 15, train loss: 0.44259, val loss: 0.43465, in 0.016s 1 tree, 55 leaves, max depth = 10, train loss: 0.43764, val loss: 0.43001, in 0.000s 1 tree, 55 leaves, max depth = 10, train loss: 0.43307, val loss: 0.42574, in 0.016s 1 tree, 32 leaves, max depth = 11, train loss: 0.42891, val loss: 0.42125, in 0.016s 1 tree, 56 leaves, max depth = 13, train loss: 0.42471, val loss: 0.41733, in 0.000s 1 tree, 55 leaves, max depth = 13, train loss: 0.42084, val loss: 0.41372, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.41712, val loss: 0.40970, in 0.000s 1 tree, 32 leaves, max depth = 13, train loss: 0.41373, val loss: 0.40598, in 0.016s 1 tree, 56 leaves, max depth = 13, train loss: 0.41020, val loss: 0.40271, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.40709, val loss: 0.39949, in 0.000s 1 tree, 56 leaves, max depth = 11, train loss: 0.40382, val loss: 0.39649, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.40097, val loss: 0.39354, in 0.000s 1 tree, 56 leaves, max depth = 12, train loss: 0.39796, val loss: 0.39078, in 0.016s 
1 tree, 31 leaves, max depth = 13, train loss: 0.39505, val loss: 0.38758, in 0.000s 1 tree, 56 leaves, max depth = 11, train loss: 0.39227, val loss: 0.38506, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.38967, val loss: 0.38237, in 0.000s 1 tree, 3 leaves, max depth = 2, train loss: 0.38728, val loss: 0.37988, in 0.016s 1 tree, 30 leaves, max depth = 10, train loss: 0.38459, val loss: 0.37694, in 0.000s 1 tree, 55 leaves, max depth = 13, train loss: 0.38202, val loss: 0.37464, in 0.016s 1 tree, 30 leaves, max depth = 9, train loss: 0.37956, val loss: 0.37194, in 0.000s 1 tree, 2 leaves, max depth = 1, train loss: 0.37746, val loss: 0.36977, in 0.016s 1 tree, 55 leaves, max depth = 11, train loss: 0.37507, val loss: 0.36763, in 0.000s 1 tree, 2 leaves, max depth = 1, train loss: 0.37312, val loss: 0.36561, in 0.016s 1 tree, 55 leaves, max depth = 11, train loss: 0.37089, val loss: 0.36364, in 0.000s 1 tree, 32 leaves, max depth = 10, train loss: 0.36859, val loss: 0.36106, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.36678, val loss: 0.35919, in 0.000s 1 tree, 2 leaves, max depth = 1, train loss: 0.36511, val loss: 0.35745, in 0.016s 1 tree, 30 leaves, max depth = 9, train loss: 0.36306, val loss: 0.35522, in 0.000s 1 tree, 56 leaves, max depth = 13, train loss: 0.36095, val loss: 0.35340, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.35939, val loss: 0.35177, in 0.000s 1 tree, 33 leaves, max depth = 10, train loss: 0.35745, val loss: 0.34961, in 0.016s 1 tree, 54 leaves, max depth = 12, train loss: 0.35547, val loss: 0.34789, in 0.000s 1 tree, 3 leaves, max depth = 2, train loss: 0.35401, val loss: 0.34637, in 0.016s 1 tree, 30 leaves, max depth = 9, train loss: 0.35227, val loss: 0.34447, in 0.000s 1 tree, 55 leaves, max depth = 12, train loss: 0.35041, val loss: 0.34286, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.34903, val loss: 0.34142, in 0.000s 1 tree, 29 leaves, max depth = 9, train loss: 0.34738, val loss: 
0.33965, in 0.016s 1 tree, 40 leaves, max depth = 10, train loss: 0.34580, val loss: 0.33788, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.34452, val loss: 0.33654, in 0.000s 1 tree, 55 leaves, max depth = 12, train loss: 0.34274, val loss: 0.33503, in 0.016s 1 tree, 40 leaves, max depth = 10, train loss: 0.34127, val loss: 0.33338, in 0.000s 1 tree, 2 leaves, max depth = 1, train loss: 0.34006, val loss: 0.33211, in 0.016s 1 tree, 32 leaves, max depth = 10, train loss: 0.33866, val loss: 0.33056, in 0.000s 1 tree, 57 leaves, max depth = 13, train loss: 0.33698, val loss: 0.32912, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.33583, val loss: 0.32791, in 0.000s 1 tree, 39 leaves, max depth = 10, train loss: 0.33451, val loss: 0.32643, in 0.016s 1 tree, 56 leaves, max depth = 12, train loss: 0.33292, val loss: 0.32510, in 0.000s 1 tree, 3 leaves, max depth = 2, train loss: 0.33183, val loss: 0.32395, in 0.000s Fit 71 trees in 0.907 s, (2402 total leaves) Time spent computing histograms: 0.308s Time spent finding best splits: 0.048s Time spent applying splits: 0.046s Time spent predicting: 0.016s Trial 57, Fold 2: Log loss = 0.3348763084371926, Average precision = 0.9444874552828766, ROC-AUC = 0.9469416144325498, Elapsed Time = 0.906026099999508 seconds Trial 57, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 57, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 Binning 0.040 GB of training data: 0.158 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 31 leaves, max depth = 10, train loss: 0.67125, val loss: 0.67053, in 0.016s 1 tree, 30 leaves, max depth = 8, train loss: 0.65119, val loss: 0.64980, in 0.000s 1 tree, 31 leaves, max depth = 10, train loss: 0.63312, val loss: 0.63109, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.61667, val loss: 0.61402, in 0.000s 1 tree, 30 leaves, max depth = 9, train loss: 0.60145, val loss: 
0.59824, in 0.016s 1 tree, 30 leaves, max depth = 8, train loss: 0.58755, val loss: 0.58379, in 0.000s 1 tree, 30 leaves, max depth = 10, train loss: 0.57497, val loss: 0.57068, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.56345, val loss: 0.55866, in 0.000s 1 tree, 55 leaves, max depth = 12, train loss: 0.55209, val loss: 0.54807, in 0.016s 1 tree, 30 leaves, max depth = 9, train loss: 0.54177, val loss: 0.53728, in 0.000s 1 tree, 54 leaves, max depth = 11, train loss: 0.53168, val loss: 0.52789, in 0.016s 1 tree, 30 leaves, max depth = 10, train loss: 0.52266, val loss: 0.51841, in 0.016s 1 tree, 54 leaves, max depth = 11, train loss: 0.51365, val loss: 0.51005, in 0.000s 1 tree, 31 leaves, max depth = 10, train loss: 0.50564, val loss: 0.50159, in 0.016s 1 tree, 30 leaves, max depth = 8, train loss: 0.49810, val loss: 0.49364, in 0.000s 1 tree, 56 leaves, max depth = 13, train loss: 0.49018, val loss: 0.48632, in 0.016s 1 tree, 30 leaves, max depth = 8, train loss: 0.48346, val loss: 0.47920, in 0.000s 1 tree, 56 leaves, max depth = 12, train loss: 0.47632, val loss: 0.47261, in 0.016s 1 tree, 30 leaves, max depth = 10, train loss: 0.47045, val loss: 0.46636, in 0.016s 1 tree, 55 leaves, max depth = 12, train loss: 0.46399, val loss: 0.46041, in 0.000s 1 tree, 56 leaves, max depth = 12, train loss: 0.45802, val loss: 0.45494, in 0.016s 1 tree, 56 leaves, max depth = 12, train loss: 0.45251, val loss: 0.44990, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.44749, val loss: 0.44452, in 0.000s 1 tree, 32 leaves, max depth = 12, train loss: 0.44290, val loss: 0.43955, in 0.016s 1 tree, 54 leaves, max depth = 10, train loss: 0.43794, val loss: 0.43503, in 0.016s 1 tree, 53 leaves, max depth = 10, train loss: 0.43337, val loss: 0.43088, in 0.000s 1 tree, 54 leaves, max depth = 11, train loss: 0.42913, val loss: 0.42705, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.42511, val loss: 0.42268, in 0.000s 1 tree, 54 leaves, max depth 
= 13, train loss: 0.42122, val loss: 0.41918, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.41757, val loss: 0.41520, in 0.000s 1 tree, 30 leaves, max depth = 10, train loss: 0.41417, val loss: 0.41149, in 0.016s 1 tree, 54 leaves, max depth = 13, train loss: 0.41063, val loss: 0.40831, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.40743, val loss: 0.40536, in 0.000s 1 tree, 52 leaves, max depth = 13, train loss: 0.40418, val loss: 0.40245, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.40124, val loss: 0.39974, in 0.000s 1 tree, 3 leaves, max depth = 2, train loss: 0.39854, val loss: 0.39724, in 0.016s 1 tree, 54 leaves, max depth = 12, train loss: 0.39555, val loss: 0.39460, in 0.000s 1 tree, 30 leaves, max depth = 10, train loss: 0.39265, val loss: 0.39142, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.39017, val loss: 0.38913, in 0.000s 1 tree, 53 leaves, max depth = 13, train loss: 0.38741, val loss: 0.38672, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.38512, val loss: 0.38461, in 0.000s 1 tree, 53 leaves, max depth = 13, train loss: 0.38256, val loss: 0.38239, in 0.016s 1 tree, 30 leaves, max depth = 10, train loss: 0.37997, val loss: 0.37954, in 0.000s 1 tree, 3 leaves, max depth = 2, train loss: 0.37787, val loss: 0.37759, in 0.016s 1 tree, 29 leaves, max depth = 9, train loss: 0.37542, val loss: 0.37492, in 0.000s 1 tree, 54 leaves, max depth = 13, train loss: 0.37304, val loss: 0.37286, in 0.016s 1 tree, 53 leaves, max depth = 12, train loss: 0.37084, val loss: 0.37099, in 0.016s 1 tree, 29 leaves, max depth = 9, train loss: 0.36860, val loss: 0.36852, in 0.000s 1 tree, 3 leaves, max depth = 2, train loss: 0.36673, val loss: 0.36678, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.36500, val loss: 0.36517, in 0.000s 1 tree, 30 leaves, max depth = 9, train loss: 0.36301, val loss: 0.36298, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.36141, val loss: 0.36149, in 0.000s 1 tree, 
53 leaves, max depth = 15, train loss: 0.35931, val loss: 0.35972, in 0.016s 1 tree, 30 leaves, max depth = 9, train loss: 0.35748, val loss: 0.35771, in 0.000s 1 tree, 54 leaves, max depth = 12, train loss: 0.35554, val loss: 0.35609, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.35403, val loss: 0.35469, in 0.000s 1 tree, 29 leaves, max depth = 9, train loss: 0.35228, val loss: 0.35278, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.35088, val loss: 0.35147, in 0.000s 1 tree, 29 leaves, max depth = 8, train loss: 0.34926, val loss: 0.34966, in 0.016s [60/71] 1 tree, 3 leaves, max depth = 2, train loss: 0.34795, val loss: 0.34844, in 0.000s 1 tree, 39 leaves, max depth = 11, train loss: 0.34644, val loss: 0.34664, in 0.000s 1 tree, 54 leaves, max depth = 13, train loss: 0.34455, val loss: 0.34509, in 0.000s 1 tree, 54 leaves, max depth = 15, train loss: 0.34281, val loss: 0.34366, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.34157, val loss: 0.34249, in 0.000s 1 tree, 39 leaves, max depth = 12, train loss: 0.34014, val loss: 0.34078, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.33897, val loss: 0.33969, in 0.000s 1 tree, 30 leaves, max depth = 8, train loss: 0.33760, val loss: 0.33817, in 0.016s 1 tree, 55 leaves, max depth = 13, train loss: 0.33594, val loss: 0.33682, in 0.000s 1 tree, 3 leaves, max depth = 2, train loss: 0.33483, val loss: 0.33578, in 0.016s 1 tree, 40 leaves, max depth = 12, train loss: 0.33355, val loss: 0.33425, in 0.000s 1 tree, 55 leaves, max depth = 13, train loss: 0.33198, val loss: 0.33299, in 0.016s Fit 71 trees in 0.908 s, (2362 total leaves) Time spent computing histograms: 0.316s Time spent finding best splits: 0.049s Time spent applying splits: 0.046s Time spent predicting: 0.000s Trial 57, Fold 3: Log loss = 0.3302480489271217, Average precision = 0.9506856079438344, ROC-AUC = 0.950528232175362, Elapsed Time = 0.9109050000006391 seconds Trial 57, Fold 4: Train size = 20656 where 0 = 
10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 57, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 Binning 0.040 GB of training data: 0.158 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 30 leaves, max depth = 11, train loss: 0.67097, val loss: 0.66966, in 0.016s 1 tree, 30 leaves, max depth = 11, train loss: 0.65086, val loss: 0.64830, in 0.000s 1 tree, 30 leaves, max depth = 11, train loss: 0.63258, val loss: 0.62882, in 0.016s 1 tree, 31 leaves, max depth = 12, train loss: 0.61611, val loss: 0.61125, in 0.000s 1 tree, 30 leaves, max depth = 11, train loss: 0.60089, val loss: 0.59495, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.58698, val loss: 0.57999, in 0.000s 1 tree, 31 leaves, max depth = 11, train loss: 0.57425, val loss: 0.56625, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.56259, val loss: 0.55362, in 0.000s 1 tree, 55 leaves, max depth = 10, train loss: 0.55122, val loss: 0.54241, in 0.016s 1 tree, 28 leaves, max depth = 10, train loss: 0.54094, val loss: 0.53123, in 0.000s 1 tree, 54 leaves, max depth = 12, train loss: 0.53084, val loss: 0.52128, in 0.016s 1 tree, 29 leaves, max depth = 10, train loss: 0.52175, val loss: 0.51134, in 0.000s 1 tree, 55 leaves, max depth = 11, train loss: 0.51272, val loss: 0.50248, in 0.016s 1 tree, 30 leaves, max depth = 10, train loss: 0.50465, val loss: 0.49360, in 0.000s 1 tree, 30 leaves, max depth = 10, train loss: 0.49723, val loss: 0.48540, in 0.016s 1 tree, 55 leaves, max depth = 12, train loss: 0.48931, val loss: 0.47763, in 0.000s 1 tree, 29 leaves, max depth = 12, train loss: 0.48274, val loss: 0.47035, in 0.016s 1 tree, 56 leaves, max depth = 13, train loss: 0.47558, val loss: 0.46336, in 0.000s 1 tree, 29 leaves, max depth = 12, train loss: 0.46971, val loss: 0.45683, in 0.016s 1 tree, 56 leaves, max depth = 13, train loss: 0.46321, val loss: 0.45050, in 0.000s 1 tree, 55 leaves, max depth = 14, train loss: 
0.45721, val loss: 0.44468, in 0.016s 1 tree, 55 leaves, max depth = 13, train loss: 0.45166, val loss: 0.43927, in 0.016s 1 tree, 32 leaves, max depth = 12, train loss: 0.44662, val loss: 0.43364, in 0.000s 1 tree, 30 leaves, max depth = 14, train loss: 0.44200, val loss: 0.42845, in 0.016s 1 tree, 56 leaves, max depth = 11, train loss: 0.43701, val loss: 0.42362, in 0.000s 1 tree, 56 leaves, max depth = 11, train loss: 0.43240, val loss: 0.41917, in 0.016s 1 tree, 56 leaves, max depth = 9, train loss: 0.42812, val loss: 0.41505, in 0.016s 1 tree, 32 leaves, max depth = 11, train loss: 0.42410, val loss: 0.41050, in 0.000s 1 tree, 54 leaves, max depth = 12, train loss: 0.42017, val loss: 0.40670, in 0.016s 1 tree, 30 leaves, max depth = 11, train loss: 0.41652, val loss: 0.40254, in 0.000s 1 tree, 29 leaves, max depth = 11, train loss: 0.41317, val loss: 0.39871, in 0.016s 1 tree, 55 leaves, max depth = 15, train loss: 0.40958, val loss: 0.39526, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.40649, val loss: 0.39198, in 0.000s 1 tree, 56 leaves, max depth = 11, train loss: 0.40318, val loss: 0.38882, in 0.000s 1 tree, 3 leaves, max depth = 2, train loss: 0.40035, val loss: 0.38581, in 0.016s 1 tree, 55 leaves, max depth = 11, train loss: 0.39729, val loss: 0.38290, in 0.016s 1 tree, 32 leaves, max depth = 11, train loss: 0.39439, val loss: 0.37957, in 0.000s 1 tree, 56 leaves, max depth = 11, train loss: 0.39156, val loss: 0.37686, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.38898, val loss: 0.37412, in 0.000s 1 tree, 2 leaves, max depth = 1, train loss: 0.38669, val loss: 0.37169, in 0.000s 1 tree, 2 leaves, max depth = 1, train loss: 0.38458, val loss: 0.36944, in 0.016s 1 tree, 54 leaves, max depth = 12, train loss: 0.38194, val loss: 0.36693, in 0.000s 1 tree, 31 leaves, max depth = 11, train loss: 0.37938, val loss: 0.36400, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.37742, val loss: 0.36193, in 0.000s 1 tree, 54 leaves, 
max depth = 12, train loss: 0.37496, val loss: 0.35961, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.37262, val loss: 0.35694, in 0.016s 1 tree, 54 leaves, max depth = 12, train loss: 0.37034, val loss: 0.35480, in 0.000s 1 tree, 30 leaves, max depth = 10, train loss: 0.36820, val loss: 0.35234, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.36638, val loss: 0.35040, in 0.000s 1 tree, 2 leaves, max depth = 1, train loss: 0.36470, val loss: 0.34861, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.36274, val loss: 0.34636, in 0.000s 1 tree, 54 leaves, max depth = 12, train loss: 0.36058, val loss: 0.34435, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.35900, val loss: 0.34268, in 0.000s 1 tree, 30 leaves, max depth = 10, train loss: 0.35721, val loss: 0.34062, in 0.016s 1 tree, 56 leaves, max depth = 14, train loss: 0.35517, val loss: 0.33874, in 0.000s 1 tree, 2 leaves, max depth = 1, train loss: 0.35370, val loss: 0.33717, in 0.016s 1 tree, 30 leaves, max depth = 10, train loss: 0.35205, val loss: 0.33527, in 0.000s 1 tree, 56 leaves, max depth = 14, train loss: 0.35013, val loss: 0.33352, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.34875, val loss: 0.33204, in 0.000s 1 tree, 41 leaves, max depth = 13, train loss: 0.34717, val loss: 0.33037, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.34588, val loss: 0.32900, in 0.000s 1 tree, 30 leaves, max depth = 10, train loss: 0.34436, val loss: 0.32731, in 0.016s 1 tree, 56 leaves, max depth = 14, train loss: 0.34252, val loss: 0.32564, in 0.000s 1 tree, 39 leaves, max depth = 12, train loss: 0.34108, val loss: 0.32414, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.33986, val loss: 0.32284, in 0.000s 1 tree, 39 leaves, max depth = 12, train loss: 0.33853, val loss: 0.32147, in 0.016s 1 tree, 57 leaves, max depth = 13, train loss: 0.33677, val loss: 0.31988, in 0.000s 1 tree, 3 leaves, max depth = 2, train loss: 0.33561, val loss: 0.31862, in 
0.016s 1 tree, 55 leaves, max depth = 11, train loss: 0.33397, val loss: 0.31716, in 0.000s 1 tree, 39 leaves, max depth = 14, train loss: 0.33270, val loss: 0.31588, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.33160, val loss: 0.31468, in 0.016s Fit 71 trees in 0.892 s, (2391 total leaves) Time spent computing histograms: 0.307s Time spent finding best splits: 0.048s Time spent applying splits: 0.045s Time spent predicting: 0.031s Trial 57, Fold 4: Log loss = 0.33093443406721534, Average precision = 0.9528355563995237, ROC-AUC = 0.9508373451027798, Elapsed Time = 0.8867230000014388 seconds Trial 57, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 57, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.189 s 0.016 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 29 leaves, max depth = 8, train loss: 0.67074, val loss: 0.66927, in 0.000s 1 tree, 29 leaves, max depth = 8, train loss: 0.65042, val loss: 0.64756, in 0.016s 1 tree, 29 leaves, max depth = 8, train loss: 0.63195, val loss: 0.62776, in 0.016s 1 tree, 30 leaves, max depth = 12, train loss: 0.61530, val loss: 0.60976, in 0.000s 1 tree, 30 leaves, max depth = 8, train loss: 0.59992, val loss: 0.59317, in 0.016s 1 tree, 30 leaves, max depth = 9, train loss: 0.58586, val loss: 0.57796, in 0.000s 1 tree, 30 leaves, max depth = 9, train loss: 0.57299, val loss: 0.56398, in 0.016s 1 tree, 30 leaves, max depth = 11, train loss: 0.56133, val loss: 0.55119, in 0.000s 1 tree, 31 leaves, max depth = 9, train loss: 0.55050, val loss: 0.53934, in 0.016s 1 tree, 51 leaves, max depth = 14, train loss: 0.53975, val loss: 0.52903, in 0.000s 1 tree, 32 leaves, max depth = 12, train loss: 0.53028, val loss: 0.51863, in 0.016s 1 tree, 52 leaves, max depth = 13, train loss: 0.52069, val loss: 0.50948, in 0.016s 1 tree, 52 leaves, max depth = 13, train loss: 0.51189, val loss: 0.50110, in 
0.000s 1 tree, 31 leaves, max depth = 11, train loss: 0.50376, val loss: 0.49211, in 0.016s 1 tree, 30 leaves, max depth = 11, train loss: 0.49627, val loss: 0.48372, in 0.000s 1 tree, 54 leaves, max depth = 12, train loss: 0.48853, val loss: 0.47639, in 0.016s 1 tree, 30 leaves, max depth = 11, train loss: 0.48185, val loss: 0.46893, in 0.016s 1 tree, 53 leaves, max depth = 14, train loss: 0.47487, val loss: 0.46235, in 0.000s 1 tree, 31 leaves, max depth = 10, train loss: 0.46889, val loss: 0.45564, in 0.016s 1 tree, 52 leaves, max depth = 13, train loss: 0.46256, val loss: 0.44971, in 0.000s 1 tree, 54 leaves, max depth = 13, train loss: 0.45672, val loss: 0.44424, in 0.016s 1 tree, 53 leaves, max depth = 13, train loss: 0.45132, val loss: 0.43922, in 0.016s 1 tree, 30 leaves, max depth = 10, train loss: 0.44619, val loss: 0.43344, in 0.000s 1 tree, 30 leaves, max depth = 13, train loss: 0.44148, val loss: 0.42809, in 0.016s 1 tree, 53 leaves, max depth = 10, train loss: 0.43662, val loss: 0.42362, in 0.000s 1 tree, 53 leaves, max depth = 10, train loss: 0.43214, val loss: 0.41948, in 0.016s 1 tree, 30 leaves, max depth = 9, train loss: 0.42794, val loss: 0.41469, in 0.000s 1 tree, 53 leaves, max depth = 13, train loss: 0.42383, val loss: 0.41096, in 0.000s 1 tree, 31 leaves, max depth = 9, train loss: 0.42002, val loss: 0.40659, in 0.016s 1 tree, 54 leaves, max depth = 12, train loss: 0.41625, val loss: 0.40317, in 0.000s 1 tree, 31 leaves, max depth = 13, train loss: 0.41281, val loss: 0.39919, in 0.016s 1 tree, 54 leaves, max depth = 12, train loss: 0.40934, val loss: 0.39608, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.40623, val loss: 0.39308, in 0.000s 1 tree, 53 leaves, max depth = 12, train loss: 0.40304, val loss: 0.39024, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.40019, val loss: 0.38748, in 0.000s 1 tree, 31 leaves, max depth = 10, train loss: 0.39715, val loss: 0.38394, in 0.016s 1 tree, 54 leaves, max depth = 12, train 
loss: 0.39421, val loss: 0.38134, in 0.000s 1 tree, 30 leaves, max depth = 9, train loss: 0.39144, val loss: 0.37811, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.38887, val loss: 0.37563, in 0.000s 1 tree, 54 leaves, max depth = 12, train loss: 0.38615, val loss: 0.37327, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.38388, val loss: 0.37101, in 0.000s 1 tree, 53 leaves, max depth = 13, train loss: 0.38136, val loss: 0.36883, in 0.016s 1 tree, 30 leaves, max depth = 11, train loss: 0.37889, val loss: 0.36590, in 0.000s 1 tree, 2 leaves, max depth = 1, train loss: 0.37680, val loss: 0.36382, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.37487, val loss: 0.36191, in 0.000s 1 tree, 52 leaves, max depth = 12, train loss: 0.37251, val loss: 0.35990, in 0.016s 1 tree, 30 leaves, max depth = 10, train loss: 0.37027, val loss: 0.35724, in 0.000s 1 tree, 52 leaves, max depth = 12, train loss: 0.36808, val loss: 0.35539, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.36628, val loss: 0.35360, in 0.000s 1 tree, 30 leaves, max depth = 10, train loss: 0.36422, val loss: 0.35116, in 0.016s 1 tree, 52 leaves, max depth = 12, train loss: 0.36218, val loss: 0.34945, in 0.000s 1 tree, 2 leaves, max depth = 1, train loss: 0.36050, val loss: 0.34778, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.35855, val loss: 0.34547, in 0.000s 1 tree, 54 leaves, max depth = 13, train loss: 0.35663, val loss: 0.34388, in 0.016s 1 tree, 30 leaves, max depth = 11, train loss: 0.35489, val loss: 0.34179, in 0.000s 1 tree, 2 leaves, max depth = 1, train loss: 0.35333, val loss: 0.34023, in 0.016s 1 tree, 38 leaves, max depth = 11, train loss: 0.35164, val loss: 0.33839, in 0.000s 1 tree, 2 leaves, max depth = 1, train loss: 0.35019, val loss: 0.33694, in 0.016s 1 tree, 54 leaves, max depth = 13, train loss: 0.34834, val loss: 0.33545, in 0.000s 1 tree, 2 leaves, max depth = 1, train loss: 0.34698, val loss: 0.33410, in 0.016s 1 tree, 30 leaves, 
max depth = 10, train loss: 0.34541, val loss: 0.33222, in 0.000s 1 tree, 2 leaves, max depth = 1, train loss: 0.34415, val loss: 0.33097, in 0.016s 1 tree, 38 leaves, max depth = 11, train loss: 0.34264, val loss: 0.32934, in 0.000s 1 tree, 52 leaves, max depth = 13, train loss: 0.34087, val loss: 0.32789, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.33967, val loss: 0.32671, in 0.000s 1 tree, 39 leaves, max depth = 11, train loss: 0.33826, val loss: 0.32518, in 0.016s 1 tree, 54 leaves, max depth = 13, train loss: 0.33659, val loss: 0.32385, in 0.000s 1 tree, 30 leaves, max depth = 12, train loss: 0.33530, val loss: 0.32228, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.33416, val loss: 0.32115, in 0.000s 1 tree, 38 leaves, max depth = 11, train loss: 0.33291, val loss: 0.31979, in 0.016s 1 tree, 53 leaves, max depth = 13, train loss: 0.33131, val loss: 0.31853, in 0.000s Fit 71 trees in 0.939 s, (2355 total leaves) Time spent computing histograms: 0.314s Time spent finding best splits: 0.050s Time spent applying splits: 0.047s Time spent predicting: 0.000s Trial 57, Fold 5: Log loss = 0.3377299161193195, Average precision = 0.9483662589185281, ROC-AUC = 0.9455798450476561, Elapsed Time = 0.9485564999995404 seconds
Optimization Progress: 58%|#####8 | 58/100 [11:25<08:37, 12.31s/it]
Trial 58, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 58, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Binning 0.040 GB of training data: 0.127 s
Binning 0.004 GB of validation data: 0.000 s
Fitting gradient boosted rounds: (per-round tree logs omitted)
Fit 21 trees in 0.642 s, (690 total leaves)
Time spent computing histograms: 0.136s
Time spent finding best splits: 0.024s
Time spent applying splits: 0.015s
Time spent predicting: 0.000s
Trial 58, Fold 1: Log loss = 0.4518877100166728, Average precision = 0.9206472773505876, ROC-AUC = 0.9320100443067922, Elapsed Time = 0.6472635999998602 seconds

Trial 58, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 58, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Binning 0.040 GB of training data: 0.158 s
Binning 0.004 GB of validation data: 0.000 s
Fitting gradient boosted rounds: (per-round tree logs omitted)
Fit 21 trees in 0.704 s, (629 total leaves)
Time spent computing histograms: 0.159s
Time spent finding best splits: 0.027s
Time spent applying splits: 0.015s
Time spent predicting: 0.000s
Trial 58, Fold 2: Log loss = 0.448487289569763, Average precision = 0.9484778573320208, ROC-AUC = 0.9477508094985616, Elapsed Time = 0.7175649000000703 seconds

Trial 58, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 58, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Binning 0.040 GB of training data: 0.158 s
Binning 0.004 GB of validation data: 0.000 s
Fitting gradient boosted rounds: (per-round tree logs omitted)
Fit 21 trees in 0.721 s, (671 total leaves)
Time spent computing histograms: 0.151s
Time spent finding best splits: 0.026s
Time spent applying splits: 0.016s
Time spent predicting: 0.000s
Trial 58, Fold 3: Log loss = 0.4462454177332421, Average precision = 0.9515587752719641, ROC-AUC = 0.949697781129913, Elapsed Time = 0.7262883999992482 seconds

Trial 58, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 58, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
Binning 0.040 GB of training data: 0.173 s
Binning 0.004 GB of validation data: 0.000 s
Fitting gradient boosted rounds: (per-round tree logs omitted; output truncated here)
leaves, max depth = 10, train loss: 0.59219, val loss: 0.58797, in 0.016s 1 tree, 29 leaves, max depth = 10, train loss: 0.57936, val loss: 0.57465, in 0.016s 1 tree, 30 leaves, max depth = 8, train loss: 0.56655, val loss: 0.56107, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.55430, val loss: 0.54834, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.54301, val loss: 0.53645, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.53205, val loss: 0.52505, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.52230, val loss: 0.51488, in 0.016s 1 tree, 32 leaves, max depth = 9, train loss: 0.51311, val loss: 0.50525, in 0.016s 1 tree, 32 leaves, max depth = 10, train loss: 0.50377, val loss: 0.49551, in 0.016s 1 tree, 33 leaves, max depth = 10, train loss: 0.49551, val loss: 0.48688, in 0.016s 1 tree, 34 leaves, max depth = 10, train loss: 0.48707, val loss: 0.47803, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.47654, val loss: 0.46733, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.46883, val loss: 0.45935, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.46168, val loss: 0.45166, in 0.016s 1 tree, 33 leaves, max depth = 8, train loss: 0.45487, val loss: 0.44449, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.44828, val loss: 0.43758, in 0.016s Fit 21 trees in 0.705 s, (662 total leaves) Time spent computing histograms: 0.144s Time spent finding best splits: 0.024s Time spent applying splits: 0.015s Time spent predicting: 0.000s Trial 58, Fold 4: Log loss = 0.44771803535883925, Average precision = 0.9527280687118671, ROC-AUC = 0.9486615711829772, Elapsed Time = 0.7152146000007633 seconds Trial 58, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 58, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.190 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 22 
leaves, max depth = 7, train loss: 0.67363, val loss: 0.67280, in 0.031s 1 tree, 24 leaves, max depth = 6, train loss: 0.65499, val loss: 0.65339, in 0.016s 1 tree, 29 leaves, max depth = 10, train loss: 0.63814, val loss: 0.63576, in 0.031s 1 tree, 22 leaves, max depth = 6, train loss: 0.62188, val loss: 0.61872, in 0.016s 1 tree, 30 leaves, max depth = 10, train loss: 0.60623, val loss: 0.60247, in 0.031s 1 tree, 32 leaves, max depth = 12, train loss: 0.59149, val loss: 0.58715, in 0.016s 1 tree, 22 leaves, max depth = 8, train loss: 0.57873, val loss: 0.57364, in 0.031s 1 tree, 32 leaves, max depth = 12, train loss: 0.56566, val loss: 0.56005, in 0.016s 1 tree, 32 leaves, max depth = 12, train loss: 0.55335, val loss: 0.54726, in 0.031s 1 tree, 26 leaves, max depth = 8, train loss: 0.54199, val loss: 0.53531, in 0.016s 1 tree, 32 leaves, max depth = 12, train loss: 0.53095, val loss: 0.52384, in 0.031s 1 tree, 30 leaves, max depth = 11, train loss: 0.52113, val loss: 0.51360, in 0.016s 1 tree, 31 leaves, max depth = 12, train loss: 0.51185, val loss: 0.50393, in 0.016s 1 tree, 32 leaves, max depth = 11, train loss: 0.50247, val loss: 0.49420, in 0.031s 1 tree, 30 leaves, max depth = 11, train loss: 0.49415, val loss: 0.48554, in 0.021s 1 tree, 31 leaves, max depth = 11, train loss: 0.48563, val loss: 0.47669, in 0.027s 1 tree, 32 leaves, max depth = 11, train loss: 0.47755, val loss: 0.46825, in 0.031s 1 tree, 32 leaves, max depth = 11, train loss: 0.46982, val loss: 0.46021, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.45987, val loss: 0.45024, in 0.016s 1 tree, 33 leaves, max depth = 7, train loss: 0.45299, val loss: 0.44307, in 0.031s 1 tree, 35 leaves, max depth = 11, train loss: 0.44388, val loss: 0.43396, in 0.016s Fit 21 trees in 0.878 s, (624 total leaves) Time spent computing histograms: 0.198s Time spent finding best splits: 0.035s Time spent applying splits: 0.021s Time spent predicting: 0.000s Trial 58, Fold 5: Log loss = 
0.44844346568526616, Average precision = 0.9489391251785544, ROC-AUC = 0.9459536629322036, Elapsed Time = 0.8952907000002597 seconds
Optimization Progress: 59%|#####8 | 59/100 [11:35<07:59, 11.70s/it]
Trial 59 (5-fold CV; per-round fitting log condensed to fold summaries):
Trial 59, Fold 1: Train size = 20663 (0 = 10533, 1 = 10130, 0/1 = 1.03978); Validation size = 5175 (0 = 2592, 1 = 2583, 0/1 = 1.00348). Fit 20 trees in 0.392 s (692 leaves). Log loss = 0.46407, Average precision = 0.91763, ROC-AUC = 0.92722, Elapsed Time = 0.395 s
Trial 59, Fold 2: Train size = 20701 (0 = 10471, 1 = 10230, 0/1 = 1.02356); Validation size = 5137 (0 = 2654, 1 = 2483, 0/1 = 1.06887). Fit 20 trees in 0.471 s (731 leaves). Log loss = 0.46483, Average precision = 0.91291, ROC-AUC = 0.92815, Elapsed Time = 0.467 s
Trial 59, Fold 3: Train size = 20682 (0 = 10517, 1 = 10165, 0/1 = 1.03463); Validation size = 5156 (0 = 2608, 1 = 2548, 0/1 = 1.02355). Fit 20 trees in 0.454 s (703 leaves). Log loss = 0.46181, Average precision = 0.91790, ROC-AUC = 0.93141, Elapsed Time = 0.461 s
Trial 59, Fold 4: Train size = 20656 (0 = 10479, 1 = 10177, 0/1 = 1.02967); Validation size = 5182 (0 = 2646, 1 = 2536, 0/1 = 1.04338). Fit 20 trees in 0.455 s (701 leaves). Log loss = 0.46466, Average precision = 0.91685, ROC-AUC = 0.92898, Elapsed Time = 0.467 s
Trial 59, Fold 5: Train size = 20650 (0 = 10500, 1 = 10150, 0/1 = 1.03448); Validation size = 5188 (0 = 2625, 1 = 2563, 0/1 = 1.02419). Fit 20 trees in 0.471 s (693 leaves). Log loss = 0.46908, Average precision = 0.91394, ROC-AUC = 0.92555, Elapsed Time = 0.471 s
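Each fold's first log line reports the train and validation class counts and the 0/1 imbalance ratio. A small helper — the function name is hypothetical, sketched assuming integer 0/1 labels — reproduces those lines:

```python
from collections import Counter

def fold_balance_line(y, trial, fold, split="Train"):
    """Hypothetical formatter mirroring log lines such as
    'Trial 58, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130,
    0/1 = 1.03978...'."""
    counts = Counter(y)
    ratio = counts[0] / counts[1]
    return (f"Trial {trial}, Fold {fold}: {split} size = {len(y)} "
            f"where 0 = {counts[0]}, 1 = {counts[1]}, 0/1 = {ratio}")

# Using the Fold 1 training counts logged for Trial 58/59/60/61 above:
y_demo = [0] * 10533 + [1] * 10130
line = fold_balance_line(y_demo, trial=58, fold=1)
```

Ratios near 1.0 across every fold confirm the stratified splitter is keeping the classes balanced after whatever resampling preceded this cell.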
Optimization Progress: 60%|###### | 60/100 [11:45<07:18, 10.97s/it]
Trial 60 (5-fold CV; per-round fitting log condensed to fold summaries):
Trial 60, Fold 1: Train size = 20663 (0 = 10533, 1 = 10130, 0/1 = 1.03978); Validation size = 5175 (0 = 2592, 1 = 2583, 0/1 = 1.00348). Fit 27 trees in 0.673 s (561 leaves). Log loss = 0.33843, Average precision = 0.95622, ROC-AUC = 0.95044, Elapsed Time = 0.691 s
Trial 60, Fold 2: Train size = 20701 (0 = 10471, 1 = 10230, 0/1 = 1.02356); Validation size = 5137 (0 = 2654, 1 = 2483, 0/1 = 1.06887). Fit 27 trees in 0.736 s (551 leaves). Log loss = 0.33685, Average precision = 0.95109, ROC-AUC = 0.94977, Elapsed Time = 0.748 s
Trial 60, Fold 3: Train size = 20682 (0 = 10517, 1 = 10165, 0/1 = 1.03463); Validation size = 5156 (0 = 2608, 1 = 2548, 0/1 = 1.02355). Fit 27 trees in 0.737 s (570 leaves). Log loss = 0.33307, Average precision = 0.95415, ROC-AUC = 0.95136, Elapsed Time = 0.743 s
Trial 60, Fold 4: Train size = 20656 (0 = 10479, 1 = 10177, 0/1 = 1.02967); Validation size = 5182 (0 = 2646, 1 = 2536, 0/1 = 1.04338). Fit 27 trees in 0.767 s (580 leaves). Log loss = 0.33539, Average precision = 0.95640, ROC-AUC = 0.95183, Elapsed Time = 0.771 s
Trial 60, Fold 5: Train size = 20650 (0 = 10500, 1 = 10150, 0/1 = 1.03448); Validation size = 5188 (0 = 2625, 1 = 2563, 0/1 = 1.02419). Fit 27 trees in 0.736 s (521 leaves). Log loss = 0.34086, Average precision = 0.95072, ROC-AUC = 0.94668, Elapsed Time = 0.747 s
Optimization Progress: 61%|######1 | 61/100 [11:55<06:58, 10.73s/it]
Trial 61, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371 Trial 61, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913 0.141 s 0.040 GB of training data: 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 75 leaves, max depth = 16, train loss: 0.67118, val loss: 0.67058, in 0.016s 1 tree, 75 leaves, max depth = 17, train loss: 0.65078, val loss: 0.64955, in 0.016s 1 tree, 75 leaves, max depth = 17, train loss: 0.63224, val loss: 0.63040, in 0.000s 1 tree, 75 leaves, max depth = 15, train loss: 0.61552, val loss: 0.61325, in 0.016s 1 tree, 75 leaves, max depth = 19, train loss: 0.60027, val loss: 0.59751, in 0.016s 1 tree, 75 leaves, max depth = 19, train loss: 0.58634, val loss: 0.58311, in 0.000s 1 tree, 75 leaves, max depth = 18, train loss: 0.57368, val loss: 0.57001, in 0.016s 1 tree, 75 leaves, max depth = 19, train loss: 0.56208, val loss: 0.55807, in 0.016s 1 tree, 75 leaves, max depth = 16, train loss: 0.55132, val loss: 0.54691, in 0.000s 1 tree, 75 leaves, max depth = 12, train loss: 0.54037, val loss: 0.53671, in 0.016s 1 tree, 75 leaves, max depth = 12, train loss: 0.53034, val loss: 0.52738, in 0.000s 1 tree, 75 leaves, max depth = 15, train loss: 0.52116, val loss: 0.51789, in 0.016s 1 tree, 75 leaves, max depth = 18, train loss: 0.51282, val loss: 0.50919, in 0.016s 1 tree, 75 leaves, max depth = 17, train loss: 0.50486, val loss: 0.50081, in 0.000s 1 tree, 75 leaves, max depth = 18, train loss: 0.49779, val loss: 0.49341, in 0.016s 1 tree, 75 leaves, max depth = 12, train loss: 0.48944, val loss: 0.48573, in 0.016s 1 tree, 75 leaves, max depth = 15, train loss: 0.48306, val loss: 0.47911, in 0.000s 1 tree, 75 leaves, max depth = 15, train loss: 0.47719, val loss: 0.47300, in 0.016s 1 tree, 75 leaves, max depth = 14, train loss: 0.47176, val loss: 0.46729, in 0.016s 1 tree, 75 leaves, max depth = 14, train loss: 0.46658, val loss: 0.46176, in 0.000s 1 tree, 75 
leaves, max depth = 14, train loss: 0.46180, val loss: 0.45665, in 0.016s 1 tree, 75 leaves, max depth = 10, train loss: 0.45667, val loss: 0.45208, in 0.016s 1 tree, 75 leaves, max depth = 12, train loss: 0.44994, val loss: 0.44598, in 0.000s 1 tree, 75 leaves, max depth = 14, train loss: 0.44580, val loss: 0.44152, in 0.016s 1 tree, 75 leaves, max depth = 17, train loss: 0.44213, val loss: 0.43768, in 0.016s 1 tree, 75 leaves, max depth = 12, train loss: 0.43605, val loss: 0.43221, in 0.016s 1 tree, 75 leaves, max depth = 12, train loss: 0.43043, val loss: 0.42717, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.42712, val loss: 0.42351, in 0.016s 1 tree, 75 leaves, max depth = 12, train loss: 0.42195, val loss: 0.41892, in 0.016s 1 tree, 75 leaves, max depth = 12, train loss: 0.41718, val loss: 0.41468, in 0.000s 1 tree, 75 leaves, max depth = 14, train loss: 0.41414, val loss: 0.41147, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.41116, val loss: 0.40817, in 0.000s 1 tree, 75 leaves, max depth = 13, train loss: 0.40833, val loss: 0.40509, in 0.016s 1 tree, 75 leaves, max depth = 12, train loss: 0.40397, val loss: 0.40129, in 0.016s 1 tree, 75 leaves, max depth = 11, train loss: 0.40136, val loss: 0.39857, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.39868, val loss: 0.39559, in 0.016s 1 tree, 75 leaves, max depth = 14, train loss: 0.39640, val loss: 0.39320, in 0.016s 1 tree, 75 leaves, max depth = 12, train loss: 0.39241, val loss: 0.38978, in 0.000s 1 tree, 75 leaves, max depth = 14, train loss: 0.39023, val loss: 0.38755, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.38779, val loss: 0.38483, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.38554, val loss: 0.38232, in 0.016s 1 tree, 75 leaves, max depth = 13, train loss: 0.38186, val loss: 0.37921, in 0.016s 1 tree, 75 leaves, max depth = 20, train loss: 0.37987, val loss: 0.37744, in 0.000s 1 tree, 75 leaves, max depth = 13, train loss: 0.37659, val loss: 
0.37469, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.37450, val loss: 0.37235, in 0.000s 1 tree, 75 leaves, max depth = 13, train loss: 0.37145, val loss: 0.36982, in 0.016s 1 tree, 75 leaves, max depth = 12, train loss: 0.36863, val loss: 0.36750, in 0.016s 1 tree, 75 leaves, max depth = 13, train loss: 0.36680, val loss: 0.36553, in 0.016s 1 tree, 75 leaves, max depth = 12, train loss: 0.36419, val loss: 0.36340, in 0.000s 1 tree, 75 leaves, max depth = 12, train loss: 0.36177, val loss: 0.36145, in 0.016s 1 tree, 75 leaves, max depth = 14, train loss: 0.36007, val loss: 0.35975, in 0.016s 1 tree, 75 leaves, max depth = 12, train loss: 0.35784, val loss: 0.35796, in 0.016s 1 tree, 75 leaves, max depth = 13, train loss: 0.35578, val loss: 0.35631, in 0.016s 1 tree, 75 leaves, max depth = 12, train loss: 0.35423, val loss: 0.35461, in 0.000s 1 tree, 75 leaves, max depth = 12, train loss: 0.35231, val loss: 0.35311, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.35034, val loss: 0.35088, in 0.016s 1 tree, 75 leaves, max depth = 13, train loss: 0.34857, val loss: 0.34952, in 0.000s 1 tree, 75 leaves, max depth = 15, train loss: 0.34712, val loss: 0.34809, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.34529, val loss: 0.34601, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.34360, val loss: 0.34408, in 0.016s 1 tree, 75 leaves, max depth = 13, train loss: 0.34223, val loss: 0.34255, in 0.000s 1 tree, 75 leaves, max depth = 12, train loss: 0.34053, val loss: 0.34127, in 0.016s 1 tree, 75 leaves, max depth = 12, train loss: 0.33897, val loss: 0.34012, in 0.016s 1 tree, 75 leaves, max depth = 15, train loss: 0.33757, val loss: 0.33848, in 0.016s 1 tree, 75 leaves, max depth = 12, train loss: 0.33611, val loss: 0.33742, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.33450, val loss: 0.33558, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.33301, val loss: 0.33386, in 0.000s 1 tree, 5 leaves, max depth = 3, 
train loss: 0.33164, val loss: 0.33227, in 0.016s 1 tree, 75 leaves, max depth = 14, train loss: 0.33041, val loss: 0.33091, in 0.000s 1 tree, 75 leaves, max depth = 14, train loss: 0.32927, val loss: 0.32980, in 0.016s 1 tree, 75 leaves, max depth = 13, train loss: 0.32784, val loss: 0.32880, in 0.016s 1 tree, 75 leaves, max depth = 16, train loss: 0.32676, val loss: 0.32766, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.32545, val loss: 0.32614, in 0.000s 1 tree, 75 leaves, max depth = 14, train loss: 0.32444, val loss: 0.32516, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.32323, val loss: 0.32374, in 0.000s 1 tree, 75 leaves, max depth = 16, train loss: 0.32224, val loss: 0.32268, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.32111, val loss: 0.32134, in 0.000s 1 tree, 75 leaves, max depth = 15, train loss: 0.32018, val loss: 0.32053, in 0.016s 1 tree, 75 leaves, max depth = 17, train loss: 0.31921, val loss: 0.31945, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.31815, val loss: 0.31818, in 0.016s 1 tree, 50 leaves, max depth = 11, train loss: 0.31731, val loss: 0.31739, in 0.000s 1 tree, 75 leaves, max depth = 14, train loss: 0.31585, val loss: 0.31637, in 0.016s 1 tree, 75 leaves, max depth = 13, train loss: 0.31474, val loss: 0.31580, in 0.016s 1 tree, 75 leaves, max depth = 14, train loss: 0.31346, val loss: 0.31494, in 0.016s 1 tree, 75 leaves, max depth = 14, train loss: 0.31228, val loss: 0.31416, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.31123, val loss: 0.31291, in 0.000s 1 tree, 75 leaves, max depth = 15, train loss: 0.31013, val loss: 0.31219, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.30914, val loss: 0.31101, in 0.000s 1 tree, 75 leaves, max depth = 18, train loss: 0.30821, val loss: 0.30995, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.30728, val loss: 0.30884, in 0.000s 1 tree, 75 leaves, max depth = 18, train loss: 0.30642, val loss: 0.30786, in 0.016s 1 tree, 
75 leaves, max depth = 14, train loss: 0.30535, val loss: 0.30719, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.30447, val loss: 0.30613, in 0.000s 1 tree, 75 leaves, max depth = 18, train loss: 0.30365, val loss: 0.30521, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.30283, val loss: 0.30420, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.30207, val loss: 0.30327, in 0.016s 1 tree, 75 leaves, max depth = 16, train loss: 0.30134, val loss: 0.30262, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.30062, val loss: 0.30174, in 0.000s 1 tree, 75 leaves, max depth = 14, train loss: 0.29954, val loss: 0.30108, in 0.016s 1 tree, 75 leaves, max depth = 18, train loss: 0.29878, val loss: 0.30025, in 0.000s Fit 100 trees in 1.251 s, (5865 total leaves) Time spent computing histograms: 0.465s Time spent finding best splits: 0.097s Time spent applying splits: 0.098s Time spent predicting: 0.000s Trial 61, Fold 1: Log loss = 0.3085742850973289, Average precision = 0.9500699490593552, ROC-AUC = 0.9481077755552687, Elapsed Time = 1.2683758000002854 seconds Trial 61, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 61, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986 Binning 0.040 GB of training data: 0.157 s 0.016 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 75 leaves, max depth = 20, train loss: 0.67109, val loss: 0.67018, in 0.016s 1 tree, 75 leaves, max depth = 17, train loss: 0.65104, val loss: 0.64923, in 0.000s 1 tree, 75 leaves, max depth = 16, train loss: 0.63257, val loss: 0.62980, in 0.016s 1 tree, 75 leaves, max depth = 14, train loss: 0.61599, val loss: 0.61250, in 0.016s 1 tree, 75 leaves, max depth = 16, train loss: 0.60083, val loss: 0.59659, in 0.000s 1 tree, 75 leaves, max depth = 11, train loss: 0.58688, val loss: 0.58184, in 0.016s 1 tree, 75 leaves, max depth = 17, train loss: 0.57420, val loss: 0.56855, in 0.016s 1 tree, 
75 leaves, max depth = 23, train loss: 0.56258, val loss: 0.55628, in 0.016s 1 tree, 75 leaves, max depth = 11, train loss: 0.55182, val loss: 0.54484, in 0.000s 1 tree, 75 leaves, max depth = 16, train loss: 0.54094, val loss: 0.53435, in 0.016s 1 tree, 75 leaves, max depth = 16, train loss: 0.53097, val loss: 0.52476, in 0.016s 1 tree, 75 leaves, max depth = 15, train loss: 0.52180, val loss: 0.51505, in 0.016s 1 tree, 75 leaves, max depth = 11, train loss: 0.51338, val loss: 0.50603, in 0.000s 1 tree, 75 leaves, max depth = 13, train loss: 0.50546, val loss: 0.49756, in 0.016s 1 tree, 75 leaves, max depth = 14, train loss: 0.49839, val loss: 0.49008, in 0.016s 1 tree, 75 leaves, max depth = 14, train loss: 0.49009, val loss: 0.48213, in 0.000s 1 tree, 75 leaves, max depth = 14, train loss: 0.48375, val loss: 0.47540, in 0.016s 1 tree, 75 leaves, max depth = 14, train loss: 0.47789, val loss: 0.46913, in 0.016s 1 tree, 75 leaves, max depth = 20, train loss: 0.47253, val loss: 0.46339, in 0.000s 1 tree, 75 leaves, max depth = 13, train loss: 0.46739, val loss: 0.45783, in 0.016s 1 tree, 75 leaves, max depth = 13, train loss: 0.46265, val loss: 0.45270, in 0.016s 1 tree, 75 leaves, max depth = 13, train loss: 0.45829, val loss: 0.44795, in 0.000s 1 tree, 75 leaves, max depth = 15, train loss: 0.45131, val loss: 0.44134, in 0.016s 1 tree, 75 leaves, max depth = 13, train loss: 0.44743, val loss: 0.43711, in 0.016s 1 tree, 75 leaves, max depth = 13, train loss: 0.44307, val loss: 0.43329, in 0.016s 1 tree, 75 leaves, max depth = 14, train loss: 0.43702, val loss: 0.42759, in 0.000s 1 tree, 75 leaves, max depth = 14, train loss: 0.43144, val loss: 0.42233, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.42819, val loss: 0.41896, in 0.016s 1 tree, 75 leaves, max depth = 15, train loss: 0.42307, val loss: 0.41414, in 0.000s 1 tree, 75 leaves, max depth = 15, train loss: 0.41833, val loss: 0.40968, in 0.016s 1 tree, 75 leaves, max depth = 14, train loss: 
0.41527, val loss: 0.40642, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.41236, val loss: 0.40339, in 0.000s 1 tree, 75 leaves, max depth = 13, train loss: 0.40944, val loss: 0.40035, in 0.016s 1 tree, 75 leaves, max depth = 15, train loss: 0.40515, val loss: 0.39634, in 0.000s 1 tree, 75 leaves, max depth = 12, train loss: 0.40266, val loss: 0.39371, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.40004, val loss: 0.39098, in 0.016s 1 tree, 75 leaves, max depth = 14, train loss: 0.39773, val loss: 0.38854, in 0.000s 1 tree, 75 leaves, max depth = 16, train loss: 0.39378, val loss: 0.38487, in 0.016s 1 tree, 75 leaves, max depth = 13, train loss: 0.39159, val loss: 0.38258, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.38921, val loss: 0.38010, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.38701, val loss: 0.37780, in 0.016s 1 tree, 75 leaves, max depth = 17, train loss: 0.38336, val loss: 0.37445, in 0.000s 1 tree, 75 leaves, max depth = 11, train loss: 0.38086, val loss: 0.37235, in 0.016s 1 tree, 75 leaves, max depth = 17, train loss: 0.37764, val loss: 0.36942, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.37561, val loss: 0.36729, in 0.000s 1 tree, 75 leaves, max depth = 17, train loss: 0.37263, val loss: 0.36457, in 0.016s 1 tree, 75 leaves, max depth = 18, train loss: 0.36988, val loss: 0.36206, in 0.016s 1 tree, 75 leaves, max depth = 14, train loss: 0.36806, val loss: 0.36007, in 0.016s 1 tree, 75 leaves, max depth = 18, train loss: 0.36551, val loss: 0.35776, in 0.000s 1 tree, 75 leaves, max depth = 17, train loss: 0.36316, val loss: 0.35565, in 0.016s 1 tree, 75 leaves, max depth = 14, train loss: 0.36147, val loss: 0.35389, in 0.016s 1 tree, 75 leaves, max depth = 16, train loss: 0.35929, val loss: 0.35196, in 0.016s 1 tree, 75 leaves, max depth = 17, train loss: 0.35729, val loss: 0.35017, in 0.000s 1 tree, 75 leaves, max depth = 15, train loss: 0.35574, val loss: 0.34846, in 0.016s 1 tree, 75 
leaves, max depth = 18, train loss: 0.35388, val loss: 0.34681, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.35196, val loss: 0.34480, in 0.000s 1 tree, 75 leaves, max depth = 15, train loss: 0.35023, val loss: 0.34328, in 0.016s 1 tree, 75 leaves, max depth = 14, train loss: 0.34880, val loss: 0.34177, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.34702, val loss: 0.33989, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.34538, val loss: 0.33815, in 0.016s 1 tree, 75 leaves, max depth = 16, train loss: 0.34403, val loss: 0.33668, in 0.000s 1 tree, 75 leaves, max depth = 16, train loss: 0.34237, val loss: 0.33524, in 0.016s 1 tree, 75 leaves, max depth = 16, train loss: 0.34084, val loss: 0.33392, in 0.016s 1 tree, 75 leaves, max depth = 13, train loss: 0.33945, val loss: 0.33248, in 0.000s 1 tree, 75 leaves, max depth = 15, train loss: 0.33803, val loss: 0.33126, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.33646, val loss: 0.32960, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.33501, val loss: 0.32806, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.33367, val loss: 0.32663, in 0.000s 1 tree, 75 leaves, max depth = 14, train loss: 0.33239, val loss: 0.32535, in 0.016s 1 tree, 75 leaves, max depth = 16, train loss: 0.33102, val loss: 0.32419, in 0.016s 1 tree, 75 leaves, max depth = 14, train loss: 0.32975, val loss: 0.32288, in 0.016s 1 tree, 75 leaves, max depth = 13, train loss: 0.32864, val loss: 0.32174, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.32737, val loss: 0.32038, in 0.016s 1 tree, 75 leaves, max depth = 13, train loss: 0.32634, val loss: 0.31938, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.32515, val loss: 0.31811, in 0.000s 1 tree, 75 leaves, max depth = 14, train loss: 0.32413, val loss: 0.31703, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.32302, val loss: 0.31584, in 0.000s 1 tree, 75 leaves, max depth = 15, train loss: 0.32209, val loss: 
0.31494, in 0.016s 1 tree, 75 leaves, max depth = 15, train loss: 0.32112, val loss: 0.31397, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.32007, val loss: 0.31286, in 0.000s 1 tree, 75 leaves, max depth = 15, train loss: 0.31923, val loss: 0.31198, in 0.016s 1 tree, 75 leaves, max depth = 15, train loss: 0.31780, val loss: 0.31076, in 0.016s 1 tree, 75 leaves, max depth = 19, train loss: 0.31655, val loss: 0.31002, in 0.016s 1 tree, 75 leaves, max depth = 15, train loss: 0.31529, val loss: 0.30896, in 0.000s 1 tree, 75 leaves, max depth = 15, train loss: 0.31414, val loss: 0.30800, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.31311, val loss: 0.30689, in 0.016s 1 tree, 75 leaves, max depth = 15, train loss: 0.31203, val loss: 0.30600, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.31106, val loss: 0.30495, in 0.000s 1 tree, 75 leaves, max depth = 15, train loss: 0.31014, val loss: 0.30403, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.30923, val loss: 0.30305, in 0.016s 1 tree, 75 leaves, max depth = 15, train loss: 0.30838, val loss: 0.30221, in 0.000s 1 tree, 75 leaves, max depth = 15, train loss: 0.30733, val loss: 0.30133, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.30647, val loss: 0.30040, in 0.016s 1 tree, 75 leaves, max depth = 13, train loss: 0.30568, val loss: 0.29969, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.30487, val loss: 0.29882, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.30412, val loss: 0.29800, in 0.016s 1 tree, 75 leaves, max depth = 15, train loss: 0.30334, val loss: 0.29723, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.30263, val loss: 0.29646, in 0.000s 1 tree, 75 leaves, max depth = 15, train loss: 0.30157, val loss: 0.29558, in 0.016s 1 tree, 75 leaves, max depth = 13, train loss: 0.30085, val loss: 0.29495, in 0.016s Fit 100 trees in 1.392 s, (5890 total leaves) Time spent computing histograms: 0.499s Time spent finding best splits: 0.104s 
Time spent applying splits: 0.104s Time spent predicting: 0.000s Trial 61, Fold 2: Log loss = 0.3076580276331425, Average precision = 0.9456636992142133, ROC-AUC = 0.9476590020883531, Elapsed Time = 1.3889223000005586 seconds Trial 61, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 61, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 Binning 0.040 GB of training data: 0.173 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 75 leaves, max depth = 14, train loss: 0.67132, val loss: 0.67060, in 0.016s 1 tree, 75 leaves, max depth = 14, train loss: 0.65113, val loss: 0.64982, in 0.016s 1 tree, 75 leaves, max depth = 14, train loss: 0.63278, val loss: 0.63089, in 0.000s 1 tree, 75 leaves, max depth = 13, train loss: 0.61623, val loss: 0.61368, in 0.016s 1 tree, 75 leaves, max depth = 13, train loss: 0.60113, val loss: 0.59809, in 0.016s 1 tree, 75 leaves, max depth = 13, train loss: 0.58734, val loss: 0.58383, in 0.016s 1 tree, 75 leaves, max depth = 14, train loss: 0.57475, val loss: 0.57075, in 0.000s 1 tree, 75 leaves, max depth = 18, train loss: 0.56323, val loss: 0.55877, in 0.016s 1 tree, 75 leaves, max depth = 14, train loss: 0.55255, val loss: 0.54756, in 0.016s 1 tree, 75 leaves, max depth = 12, train loss: 0.54143, val loss: 0.53723, in 0.016s 1 tree, 75 leaves, max depth = 12, train loss: 0.53124, val loss: 0.52780, in 0.000s 1 tree, 75 leaves, max depth = 14, train loss: 0.52217, val loss: 0.51820, in 0.016s 1 tree, 75 leaves, max depth = 18, train loss: 0.51389, val loss: 0.50953, in 0.016s 1 tree, 75 leaves, max depth = 14, train loss: 0.50604, val loss: 0.50136, in 0.016s 1 tree, 75 leaves, max depth = 14, train loss: 0.49897, val loss: 0.49384, in 0.000s 1 tree, 75 leaves, max depth = 12, train loss: 0.49049, val loss: 0.48602, in 0.016s 1 tree, 75 leaves, max depth = 14, train loss: 0.48419, val loss: 0.47944, in 0.016s 1 tree, 75 leaves, max depth = 14, 
train loss: 0.47840, val loss: 0.47338, in 0.016s 1 tree, 75 leaves, max depth = 15, train loss: 0.47303, val loss: 0.46773, in 0.016s 1 tree, 75 leaves, max depth = 14, train loss: 0.46794, val loss: 0.46241, in 0.000s 1 tree, 75 leaves, max depth = 14, train loss: 0.46325, val loss: 0.45749, in 0.000s 1 tree, 75 leaves, max depth = 10, train loss: 0.45810, val loss: 0.45291, in 0.016s 1 tree, 75 leaves, max depth = 14, train loss: 0.45124, val loss: 0.44664, in 0.016s 1 tree, 75 leaves, max depth = 14, train loss: 0.44719, val loss: 0.44236, in 0.016s 1 tree, 75 leaves, max depth = 14, train loss: 0.44354, val loss: 0.43852, in 0.000s 1 tree, 75 leaves, max depth = 13, train loss: 0.43734, val loss: 0.43289, in 0.016s 1 tree, 75 leaves, max depth = 13, train loss: 0.43162, val loss: 0.42770, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.42827, val loss: 0.42461, in 0.000s 1 tree, 75 leaves, max depth = 13, train loss: 0.42300, val loss: 0.41985, in 0.016s 1 tree, 75 leaves, max depth = 12, train loss: 0.41813, val loss: 0.41547, in 0.016s 1 tree, 75 leaves, max depth = 15, train loss: 0.41516, val loss: 0.41228, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.41216, val loss: 0.40950, in 0.000s 1 tree, 75 leaves, max depth = 12, train loss: 0.40935, val loss: 0.40643, in 0.016s 1 tree, 75 leaves, max depth = 13, train loss: 0.40492, val loss: 0.40249, in 0.016s 1 tree, 75 leaves, max depth = 13, train loss: 0.40240, val loss: 0.39969, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.39970, val loss: 0.39718, in 0.000s 1 tree, 75 leaves, max depth = 14, train loss: 0.39740, val loss: 0.39473, in 0.016s 1 tree, 75 leaves, max depth = 12, train loss: 0.39334, val loss: 0.39117, in 0.016s 1 tree, 75 leaves, max depth = 12, train loss: 0.39124, val loss: 0.38885, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.38879, val loss: 0.38658, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.38653, val loss: 0.38448, in 0.000s 1 
tree, 75 leaves, max depth = 12, train loss: 0.38277, val loss: 0.38119, in 0.016s 1 tree, 75 leaves, max depth = 22, train loss: 0.38070, val loss: 0.37961, in 0.016s 1 tree, 75 leaves, max depth = 13, train loss: 0.37734, val loss: 0.37670, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.37524, val loss: 0.37475, in 0.016s 1 tree, 75 leaves, max depth = 13, train loss: 0.37213, val loss: 0.37209, in 0.000s 1 tree, 75 leaves, max depth = 12, train loss: 0.36925, val loss: 0.36965, in 0.016s 1 tree, 75 leaves, max depth = 13, train loss: 0.36749, val loss: 0.36768, in 0.016s 1 tree, 75 leaves, max depth = 12, train loss: 0.36482, val loss: 0.36544, in 0.016s 1 tree, 75 leaves, max depth = 13, train loss: 0.36236, val loss: 0.36337, in 0.016s 1 tree, 75 leaves, max depth = 13, train loss: 0.36075, val loss: 0.36157, in 0.016s 1 tree, 75 leaves, max depth = 12, train loss: 0.35847, val loss: 0.35969, in 0.000s 1 tree, 75 leaves, max depth = 13, train loss: 0.35637, val loss: 0.35796, in 0.016s 1 tree, 75 leaves, max depth = 13, train loss: 0.35485, val loss: 0.35632, in 0.016s 1 tree, 75 leaves, max depth = 13, train loss: 0.35290, val loss: 0.35473, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.35092, val loss: 0.35288, in 0.000s 1 tree, 75 leaves, max depth = 13, train loss: 0.34909, val loss: 0.35141, in 0.016s 1 tree, 75 leaves, max depth = 15, train loss: 0.34773, val loss: 0.34988, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.34589, val loss: 0.34816, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.34419, val loss: 0.34657, in 0.016s 1 tree, 75 leaves, max depth = 14, train loss: 0.34289, val loss: 0.34505, in 0.016s 1 tree, 75 leaves, max depth = 13, train loss: 0.34115, val loss: 0.34367, in 0.000s 1 tree, 75 leaves, max depth = 13, train loss: 0.33954, val loss: 0.34241, in 0.016s 1 tree, 75 leaves, max depth = 14, train loss: 0.33822, val loss: 0.34077, in 0.016s 1 tree, 75 leaves, max depth = 12, train loss: 
0.33672, val loss: 0.33960, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.33510, val loss: 0.33808, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.33360, val loss: 0.33668, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.33221, val loss: 0.33539, in 0.016s 1 tree, 75 leaves, max depth = 13, train loss: 0.33077, val loss: 0.33429, in 0.016s 1 tree, 75 leaves, max depth = 14, train loss: 0.32951, val loss: 0.33273, in 0.010s 1 tree, 75 leaves, max depth = 16, train loss: 0.32841, val loss: 0.33151, in 0.006s 1 tree, 75 leaves, max depth = 20, train loss: 0.32737, val loss: 0.33024, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.32605, val loss: 0.32900, in 0.000s 1 tree, 75 leaves, max depth = 12, train loss: 0.32508, val loss: 0.32798, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.32385, val loss: 0.32683, in 0.016s 1 tree, 75 leaves, max depth = 12, train loss: 0.32290, val loss: 0.32575, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.32176, val loss: 0.32468, in 0.016s 1 tree, 75 leaves, max depth = 17, train loss: 0.32089, val loss: 0.32373, in 0.016s 1 tree, 75 leaves, max depth = 19, train loss: 0.31966, val loss: 0.32304, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.31857, val loss: 0.32203, in 0.016s 1 tree, 75 leaves, max depth = 12, train loss: 0.31773, val loss: 0.32108, in 0.016s 1 tree, 75 leaves, max depth = 13, train loss: 0.31629, val loss: 0.32000, in 0.016s 1 tree, 75 leaves, max depth = 11, train loss: 0.31522, val loss: 0.31940, in 0.000s 1 tree, 75 leaves, max depth = 15, train loss: 0.31395, val loss: 0.31845, in 0.016s 1 tree, 75 leaves, max depth = 11, train loss: 0.31279, val loss: 0.31760, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.31172, val loss: 0.31660, in 0.016s 1 tree, 75 leaves, max depth = 13, train loss: 0.31062, val loss: 0.31582, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.30962, val loss: 0.31487, in 0.000s 1 tree, 75 leaves, max 
depth = 16, train loss: 0.30869, val loss: 0.31374, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.30775, val loss: 0.31285, in 0.016s 1 tree, 75 leaves, max depth = 16, train loss: 0.30690, val loss: 0.31181, in 0.000s 1 tree, 75 leaves, max depth = 15, train loss: 0.30582, val loss: 0.31099, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.30492, val loss: 0.31014, in 0.016s 1 tree, 75 leaves, max depth = 18, train loss: 0.30414, val loss: 0.30935, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.30331, val loss: 0.30857, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.30253, val loss: 0.30784, in 0.016s 1 tree, 75 leaves, max depth = 16, train loss: 0.30174, val loss: 0.30687, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.30101, val loss: 0.30619, in 0.000s 1 tree, 75 leaves, max depth = 17, train loss: 0.29992, val loss: 0.30539, in 0.016s 1 tree, 75 leaves, max depth = 18, train loss: 0.29921, val loss: 0.30469, in 0.016s Fit 100 trees in 1.486 s, (5890 total leaves) Time spent computing histograms: 0.540s Time spent finding best splits: 0.113s Time spent applying splits: 0.115s Time spent predicting: 0.016s Trial 61, Fold 3: Log loss = 0.30098675456426244, Average precision = 0.9532472784465278, ROC-AUC = 0.9525602300854272, Elapsed Time = 1.483257500000036 seconds Trial 61, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 61, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 Binning 0.040 GB of training data: 0.158 s 0.016 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 75 leaves, max depth = 13, train loss: 0.67122, val loss: 0.67007, in 0.016s 1 tree, 75 leaves, max depth = 19, train loss: 0.65103, val loss: 0.64867, in 0.000s 1 tree, 75 leaves, max depth = 19, train loss: 0.63267, val loss: 0.62915, in 0.016s 1 tree, 75 leaves, max depth = 15, train loss: 0.61605, val loss: 0.61152, in 0.016s 1 tree, 75 leaves, max 
depth = 19, train loss: 0.60098, val loss: 0.59540, in 0.016s 1 tree, 75 leaves, max depth = 14, train loss: 0.58710, val loss: 0.58060, in 0.000s 1 tree, 75 leaves, max depth = 14, train loss: 0.57450, val loss: 0.56710, in 0.016s 1 tree, 75 leaves, max depth = 12, train loss: 0.56297, val loss: 0.55477, in 0.016s 1 tree, 75 leaves, max depth = 14, train loss: 0.55226, val loss: 0.54324, in 0.016s 1 tree, 75 leaves, max depth = 12, train loss: 0.54103, val loss: 0.53232, in 0.000s 1 tree, 75 leaves, max depth = 12, train loss: 0.53073, val loss: 0.52232, in 0.016s 1 tree, 75 leaves, max depth = 15, train loss: 0.52163, val loss: 0.51247, in 0.016s 1 tree, 75 leaves, max depth = 14, train loss: 0.51327, val loss: 0.50340, in 0.016s 1 tree, 75 leaves, max depth = 18, train loss: 0.50546, val loss: 0.49482, in 0.000s 1 tree, 75 leaves, max depth = 14, train loss: 0.49844, val loss: 0.48711, in 0.016s 1 tree, 75 leaves, max depth = 12, train loss: 0.48987, val loss: 0.47884, in 0.016s 1 tree, 75 leaves, max depth = 15, train loss: 0.48358, val loss: 0.47192, in 0.016s 1 tree, 75 leaves, max depth = 15, train loss: 0.47778, val loss: 0.46552, in 0.016s 1 tree, 75 leaves, max depth = 15, train loss: 0.47240, val loss: 0.45953, in 0.000s 1 tree, 75 leaves, max depth = 16, train loss: 0.46734, val loss: 0.45385, in 0.016s 1 tree, 75 leaves, max depth = 16, train loss: 0.46268, val loss: 0.44859, in 0.016s 1 tree, 75 leaves, max depth = 16, train loss: 0.45839, val loss: 0.44372, in 0.000s 1 tree, 75 leaves, max depth = 12, train loss: 0.45122, val loss: 0.43688, in 0.000s 1 tree, 75 leaves, max depth = 16, train loss: 0.44741, val loss: 0.43253, in 0.016s 1 tree, 75 leaves, max depth = 19, train loss: 0.44301, val loss: 0.42834, in 0.016s 1 tree, 75 leaves, max depth = 12, train loss: 0.43676, val loss: 0.42240, in 0.016s 1 tree, 75 leaves, max depth = 12, train loss: 0.43099, val loss: 0.41692, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.42775, val loss: 
[Per-round boosting logs truncated; each round reports trees grown, leaves, max depth, and train/val loss.]
Fit 100 trees in 1.440 s, (5866 total leaves)
Trial 61, Fold 4: Log loss = 0.30024239566538646, Average precision = 0.9538280617924989, ROC-AUC = 0.9534089310452537, Elapsed Time = 1.4443295000000944 seconds
Trial 61, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 61, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[Per-round boosting logs truncated.]
Fit 100 trees in 1.423 s, (5890 total leaves)
Trial 61, Fold 5: Log loss = 0.3061034647699614, Average precision = 0.9509200647523018, ROC-AUC = 0.949326496107612, Elapsed Time = 1.433862500000032 seconds
Optimization Progress: 62%|######2 | 62/100 [12:09<07:22, 11.65s/it]
Trial 62, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 62, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[Per-round boosting logs truncated.]
Fit 64 trees in 1.315 s, (3062 total leaves)
Trial 62, Fold 1: Log loss = 0.22921671941905178, Average precision = 0.9687370333075268, ROC-AUC = 0.9640845234510547, Elapsed Time = 1.327083500000299 seconds
Trial 62, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 62, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[Per-round boosting logs truncated.]
Fit 64 trees in 1.392 s, (3022 total leaves)
Trial 62, Fold 2: Log loss = 0.22243597504614823, Average precision = 0.96841896352419, ROC-AUC = 0.9656550754626563, Elapsed Time = 1.4002233000010165 seconds
Trial 62, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 62, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[Per-round boosting logs truncated.]
Fit 64 trees in 1.470 s, (3165 total leaves)
Trial 62, Fold 3: Log loss = 0.22279210569114516, Average precision = 0.9677415351549978, ROC-AUC = 0.9644402623012396, Elapsed Time = 1.4759725999992952 seconds
Trial 62, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 62, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[Per-round boosting logs truncated mid-output; the Fold 4 summary is cut off here.]
0.20943, in 0.016s Fit 64 trees in 1.486 s, (3143 total leaves) Time spent computing histograms: 0.422s Time spent finding best splits: 0.091s Time spent applying splits: 0.069s Time spent predicting: 0.000s Trial 62, Fold 4: Log loss = 0.2271394765584005, Average precision = 0.96831361157216, ROC-AUC = 0.963736480396575, Elapsed Time = 1.4979230000008101 seconds Trial 62, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 62, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.158 s 0.016 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 50 leaves, max depth = 12, train loss: 0.65172, val loss: 0.65012, in 0.016s 1 tree, 50 leaves, max depth = 11, train loss: 0.61637, val loss: 0.61345, in 0.016s 1 tree, 52 leaves, max depth = 12, train loss: 0.58523, val loss: 0.58104, in 0.016s 1 tree, 54 leaves, max depth = 11, train loss: 0.55911, val loss: 0.55370, in 0.031s 1 tree, 53 leaves, max depth = 11, train loss: 0.53479, val loss: 0.52842, in 0.000s 1 tree, 52 leaves, max depth = 11, train loss: 0.51435, val loss: 0.50701, in 0.016s 1 tree, 54 leaves, max depth = 11, train loss: 0.49472, val loss: 0.48653, in 0.016s 1 tree, 54 leaves, max depth = 11, train loss: 0.47730, val loss: 0.46839, in 0.031s 1 tree, 52 leaves, max depth = 11, train loss: 0.46163, val loss: 0.45199, in 0.016s 1 tree, 54 leaves, max depth = 11, train loss: 0.44762, val loss: 0.43744, in 0.016s 1 tree, 52 leaves, max depth = 11, train loss: 0.43498, val loss: 0.42418, in 0.016s 1 tree, 52 leaves, max depth = 12, train loss: 0.41734, val loss: 0.40664, in 0.016s 1 tree, 62 leaves, max depth = 10, train loss: 0.40698, val loss: 0.39594, in 0.016s 1 tree, 51 leaves, max depth = 12, train loss: 0.39215, val loss: 0.38130, in 0.016s 1 tree, 53 leaves, max depth = 11, train loss: 0.38334, val loss: 0.37232, in 0.031s 1 tree, 52 leaves, max depth = 13, train loss: 0.37081, val 
loss: 0.35994, in 0.016s 1 tree, 54 leaves, max depth = 13, train loss: 0.36389, val loss: 0.35285, in 0.016s 1 tree, 28 leaves, max depth = 10, train loss: 0.35368, val loss: 0.34251, in 0.016s 1 tree, 52 leaves, max depth = 11, train loss: 0.34611, val loss: 0.33496, in 0.016s 1 tree, 30 leaves, max depth = 10, train loss: 0.33747, val loss: 0.32620, in 0.016s 1 tree, 52 leaves, max depth = 12, train loss: 0.33090, val loss: 0.31964, in 0.016s 1 tree, 45 leaves, max depth = 15, train loss: 0.32330, val loss: 0.31291, in 0.016s 1 tree, 60 leaves, max depth = 13, train loss: 0.31745, val loss: 0.30698, in 0.031s 1 tree, 28 leaves, max depth = 10, train loss: 0.31090, val loss: 0.30036, in 0.000s 1 tree, 46 leaves, max depth = 14, train loss: 0.30486, val loss: 0.29509, in 0.016s 1 tree, 29 leaves, max depth = 10, train loss: 0.29946, val loss: 0.28960, in 0.031s 1 tree, 46 leaves, max depth = 12, train loss: 0.29444, val loss: 0.28519, in 0.016s 1 tree, 28 leaves, max depth = 10, train loss: 0.28997, val loss: 0.28046, in 0.016s 1 tree, 52 leaves, max depth = 12, train loss: 0.28560, val loss: 0.27622, in 0.016s 1 tree, 29 leaves, max depth = 10, train loss: 0.28177, val loss: 0.27233, in 0.016s 1 tree, 60 leaves, max depth = 11, train loss: 0.27774, val loss: 0.26834, in 0.016s 1 tree, 46 leaves, max depth = 14, train loss: 0.27407, val loss: 0.26521, in 0.016s 1 tree, 45 leaves, max depth = 14, train loss: 0.27078, val loss: 0.26241, in 0.031s 1 tree, 29 leaves, max depth = 10, train loss: 0.26786, val loss: 0.25935, in 0.000s 1 tree, 54 leaves, max depth = 12, train loss: 0.26456, val loss: 0.25627, in 0.031s 1 tree, 45 leaves, max depth = 13, train loss: 0.26187, val loss: 0.25411, in 0.016s 1 tree, 29 leaves, max depth = 14, train loss: 0.25950, val loss: 0.25162, in 0.016s 1 tree, 43 leaves, max depth = 14, train loss: 0.25715, val loss: 0.24962, in 0.016s 1 tree, 34 leaves, max depth = 11, train loss: 0.25496, val loss: 0.24732, in 0.016s 1 tree, 63 leaves, 
max depth = 11, train loss: 0.25218, val loss: 0.24460, in 0.016s 1 tree, 44 leaves, max depth = 14, train loss: 0.25026, val loss: 0.24304, in 0.016s 1 tree, 28 leaves, max depth = 14, train loss: 0.24852, val loss: 0.24124, in 0.016s 1 tree, 49 leaves, max depth = 10, train loss: 0.24571, val loss: 0.23870, in 0.016s 1 tree, 51 leaves, max depth = 13, train loss: 0.24351, val loss: 0.23664, in 0.016s 1 tree, 34 leaves, max depth = 11, train loss: 0.24165, val loss: 0.23505, in 0.016s 1 tree, 61 leaves, max depth = 12, train loss: 0.24013, val loss: 0.23399, in 0.016s 1 tree, 35 leaves, max depth = 14, train loss: 0.23873, val loss: 0.23258, in 0.016s 1 tree, 59 leaves, max depth = 12, train loss: 0.23653, val loss: 0.23026, in 0.016s 1 tree, 49 leaves, max depth = 12, train loss: 0.23452, val loss: 0.22850, in 0.016s 1 tree, 44 leaves, max depth = 13, train loss: 0.23326, val loss: 0.22752, in 0.016s 1 tree, 60 leaves, max depth = 13, train loss: 0.23203, val loss: 0.22672, in 0.031s 1 tree, 34 leaves, max depth = 14, train loss: 0.23089, val loss: 0.22553, in 0.016s 1 tree, 58 leaves, max depth = 12, train loss: 0.22909, val loss: 0.22362, in 0.016s 1 tree, 48 leaves, max depth = 12, train loss: 0.22747, val loss: 0.22223, in 0.016s 1 tree, 61 leaves, max depth = 14, train loss: 0.22646, val loss: 0.22165, in 0.016s 1 tree, 56 leaves, max depth = 14, train loss: 0.22491, val loss: 0.22001, in 0.016s 1 tree, 34 leaves, max depth = 11, train loss: 0.22368, val loss: 0.21900, in 0.032s 1 tree, 55 leaves, max depth = 24, train loss: 0.22255, val loss: 0.21805, in 0.016s 1 tree, 57 leaves, max depth = 15, train loss: 0.22118, val loss: 0.21663, in 0.016s 1 tree, 61 leaves, max depth = 14, train loss: 0.22033, val loss: 0.21618, in 0.031s 1 tree, 34 leaves, max depth = 12, train loss: 0.21934, val loss: 0.21546, in 0.016s 1 tree, 54 leaves, max depth = 22, train loss: 0.21841, val loss: 0.21470, in 0.016s 1 tree, 62 leaves, max depth = 13, train loss: 0.21758, val 
loss: 0.21410, in 0.016s 1 tree, 59 leaves, max depth = 13, train loss: 0.21634, val loss: 0.21271, in 0.016s Fit 64 trees in 1.470 s, (3061 total leaves) Time spent computing histograms: 0.421s Time spent finding best splits: 0.090s Time spent applying splits: 0.067s Time spent predicting: 0.000s Trial 62, Fold 5: Log loss = 0.23154348837633418, Average precision = 0.9660934449123427, ROC-AUC = 0.962099102614124, Elapsed Time = 1.485750599998937 seconds
Optimization Progress: 63%|######3 | 63/100 [12:23<07:36, 12.33s/it]
Trial 63, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 63, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[Verbose fitting log omitted: per-tree train/val losses for 35 trees per fold (595 total leaves, ~0.8 s per fold), plus binning and histogram timing breakdowns.]
Trial 63, Fold 1: Log loss = 0.3543605450878688, Average precision = 0.9539285503411343, ROC-AUC = 0.9486214320366307, Elapsed Time = 0.7798387000002549 seconds
Trial 63, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 63, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Trial 63, Fold 2: Log loss = 0.35372807073401374, Average precision = 0.9503790214942958, ROC-AUC = 0.9492831586362245, Elapsed Time = 0.8227070000011736 seconds
Trial 63, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 63, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Trial 63, Fold 3: Log loss = 0.3476500742964973, Average precision = 0.9537392723874637, ROC-AUC = 0.9517430217131685, Elapsed Time = 0.8657454000003781 seconds
Trial 63, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 63, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
Trial 63, Fold 4: Log loss = 0.35249242395515334, Average precision = 0.955026329538689, ROC-AUC = 0.9499984799387684, Elapsed Time = 0.8395970000001398 seconds
Trial 63, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 63, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
Trial 63, Fold 5: Log loss = 0.3572451824505063, Average precision = 0.9509240981302716, ROC-AUC = 0.9475750515578842, Elapsed Time = 0.846569699999236 seconds
Optimization Progress: 64%|######4 | 64/100 [12:33<07:06, 11.84s/it]
Trial 64, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 64, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[Verbose fitting log omitted: per-tree train/val losses for 64 trees (2189 total leaves, fit in 1.283 s), plus binning and histogram timing breakdowns.]
Trial 64, Fold 1: Log loss = 0.28649361238491966, Average precision = 0.961671769787424,
ROC-AUC = 0.9561228479899437, Elapsed Time = 1.288647299999866 seconds Trial 64, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 64, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986 Binning 0.040 GB of training data: 0.174 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 35 leaves, max depth = 10, train loss: 0.67341, val loss: 0.67298, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.65468, val loss: 0.65387, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.63711, val loss: 0.63596, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.62073, val loss: 0.61924, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.60531, val loss: 0.60350, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.59066, val loss: 0.58871, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.57762, val loss: 0.57553, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.56480, val loss: 0.56253, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.55322, val loss: 0.55084, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.54179, val loss: 0.53918, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.53074, val loss: 0.52797, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.52097, val loss: 0.51811, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.51106, val loss: 0.50798, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.50169, val loss: 0.49854, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.49340, val loss: 0.49018, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.48489, val loss: 0.48157, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.47681, val loss: 0.47340, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.46919, val loss: 0.46567, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.46217, val loss: 0.45851, in 0.016s 1 tree, 35 leaves, max 
depth = 10, train loss: 0.45575, val loss: 0.45204, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.44919, val loss: 0.44544, in 0.031s 1 tree, 35 leaves, max depth = 10, train loss: 0.44298, val loss: 0.43912, in 0.000s 1 tree, 35 leaves, max depth = 11, train loss: 0.43703, val loss: 0.43302, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.42842, val loss: 0.42456, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.42031, val loss: 0.41659, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.41556, val loss: 0.41188, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.41053, val loss: 0.40672, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.40603, val loss: 0.40223, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.39892, val loss: 0.39529, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.39448, val loss: 0.39084, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.39055, val loss: 0.38692, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.38415, val loss: 0.38066, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.37810, val loss: 0.37476, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.37430, val loss: 0.37097, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.36870, val loss: 0.36549, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.36518, val loss: 0.36189, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.35994, val loss: 0.35677, in 0.031s 1 tree, 29 leaves, max depth = 11, train loss: 0.35528, val loss: 0.35220, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.35211, val loss: 0.34907, in 0.016s 1 tree, 35 leaves, max depth = 13, train loss: 0.34855, val loss: 0.34553, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.34429, val loss: 0.34163, in 0.016s 1 tree, 29 leaves, max depth = 11, train loss: 0.34023, val loss: 0.33766, in 0.016s 1 tree, 29 leaves, max depth = 11, train loss: 0.33641, val loss: 
0.33394, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.33266, val loss: 0.33054, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.32911, val loss: 0.32727, in 0.016s 1 tree, 29 leaves, max depth = 11, train loss: 0.32575, val loss: 0.32398, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.32272, val loss: 0.32099, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.31960, val loss: 0.31810, in 0.016s 1 tree, 29 leaves, max depth = 11, train loss: 0.31660, val loss: 0.31517, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.31372, val loss: 0.31237, in 0.000s 1 tree, 35 leaves, max depth = 10, train loss: 0.31086, val loss: 0.30979, in 0.031s 1 tree, 29 leaves, max depth = 11, train loss: 0.30817, val loss: 0.30715, in 0.000s 1 tree, 35 leaves, max depth = 11, train loss: 0.30559, val loss: 0.30461, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.30301, val loss: 0.30229, in 0.016s 1 tree, 29 leaves, max depth = 11, train loss: 0.30060, val loss: 0.29995, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.29813, val loss: 0.29757, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.29581, val loss: 0.29549, in 0.016s 1 tree, 28 leaves, max depth = 12, train loss: 0.29365, val loss: 0.29336, in 0.016s 1 tree, 35 leaves, max depth = 14, train loss: 0.29144, val loss: 0.29123, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.28925, val loss: 0.28913, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.28719, val loss: 0.28729, in 0.016s 1 tree, 29 leaves, max depth = 11, train loss: 0.28528, val loss: 0.28543, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.28338, val loss: 0.28375, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.28139, val loss: 0.28185, in 0.000s Fit 64 trees in 1.347 s, (2185 total leaves) Time spent computing histograms: 0.400s Time spent finding best splits: 0.067s Time spent applying splits: 0.046s Time spent predicting: 0.000s Trial 64, 
Fold 2: Log loss = 0.2846823637848209, Average precision = 0.9607167809910063, ROC-AUC = 0.9577754958283016, Elapsed Time = 1.3587115000009362 seconds Trial 64, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 64, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 Binning 0.040 GB of training data: 0.158 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 35 leaves, max depth = 8, train loss: 0.67361, val loss: 0.67350, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.65501, val loss: 0.65486, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.63763, val loss: 0.63745, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.62137, val loss: 0.62113, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.60612, val loss: 0.60583, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.59222, val loss: 0.59199, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.57843, val loss: 0.57823, in 0.016s 1 tree, 35 leaves, max depth = 7, train loss: 0.56626, val loss: 0.56615, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.55459, val loss: 0.55459, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.54324, val loss: 0.54321, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.53223, val loss: 0.53227, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.52243, val loss: 0.52253, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.51255, val loss: 0.51271, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.50320, val loss: 0.50338, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.49488, val loss: 0.49513, in 0.031s 1 tree, 35 leaves, max depth = 9, train loss: 0.48658, val loss: 0.48687, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.47859, val loss: 0.47889, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.47113, val loss: 0.47148, in 0.016s 1 tree, 35 leaves, max depth = 12, 
train loss: 0.46394, val loss: 0.46430, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.45420, val loss: 0.45523, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.44758, val loss: 0.44872, in 0.031s 1 tree, 35 leaves, max depth = 7, train loss: 0.44141, val loss: 0.44267, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.43546, val loss: 0.43675, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.42976, val loss: 0.43112, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.42162, val loss: 0.42371, in 0.016s 1 tree, 35 leaves, max depth = 7, train loss: 0.41653, val loss: 0.41874, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.40892, val loss: 0.41179, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.40425, val loss: 0.40724, in 0.031s 1 tree, 35 leaves, max depth = 12, train loss: 0.39723, val loss: 0.40086, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.39282, val loss: 0.39652, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.38881, val loss: 0.39264, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.38253, val loss: 0.38692, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.37654, val loss: 0.38153, in 0.031s 1 tree, 35 leaves, max depth = 12, train loss: 0.37305, val loss: 0.37814, in 0.016s 1 tree, 35 leaves, max depth = 7, train loss: 0.36950, val loss: 0.37462, in 0.016s 1 tree, 29 leaves, max depth = 10, train loss: 0.36435, val loss: 0.36994, in 0.016s 1 tree, 29 leaves, max depth = 10, train loss: 0.35950, val loss: 0.36553, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.35573, val loss: 0.36167, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.35112, val loss: 0.35783, in 0.016s 1 tree, 29 leaves, max depth = 13, train loss: 0.34684, val loss: 0.35392, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.34340, val loss: 0.35033, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.33928, val loss: 0.34693, in 
0.016s 1 tree, 28 leaves, max depth = 10, train loss: 0.33545, val loss: 0.34344, in 0.016s 1 tree, 35 leaves, max depth = 13, train loss: 0.33230, val loss: 0.34019, in 0.031s 1 tree, 35 leaves, max depth = 11, train loss: 0.32860, val loss: 0.33717, in 0.016s 1 tree, 28 leaves, max depth = 10, train loss: 0.32518, val loss: 0.33409, in 0.016s 1 tree, 35 leaves, max depth = 15, train loss: 0.32229, val loss: 0.33111, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.31903, val loss: 0.32846, in 0.016s 1 tree, 29 leaves, max depth = 10, train loss: 0.31597, val loss: 0.32568, in 0.000s 1 tree, 35 leaves, max depth = 15, train loss: 0.31331, val loss: 0.32288, in 0.031s 1 tree, 35 leaves, max depth = 11, train loss: 0.31034, val loss: 0.32045, in 0.016s 1 tree, 29 leaves, max depth = 10, train loss: 0.30762, val loss: 0.31803, in 0.016s 1 tree, 35 leaves, max depth = 13, train loss: 0.30517, val loss: 0.31549, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.30249, val loss: 0.31333, in 0.016s 1 tree, 29 leaves, max depth = 11, train loss: 0.30005, val loss: 0.31112, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.29758, val loss: 0.30914, in 0.016s 1 tree, 29 leaves, max depth = 11, train loss: 0.29532, val loss: 0.30713, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.29314, val loss: 0.30490, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.29092, val loss: 0.30313, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.28879, val loss: 0.30147, in 0.016s 1 tree, 29 leaves, max depth = 11, train loss: 0.28684, val loss: 0.29974, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.28480, val loss: 0.29760, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.28288, val loss: 0.29610, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.28091, val loss: 0.29396, in 0.016s Fit 64 trees in 1.455 s, (2178 total leaves) Time spent computing histograms: 0.449s Time spent finding best splits: 0.078s 
Time spent applying splits: 0.050s Time spent predicting: 0.000s Trial 64, Fold 3: Log loss = 0.28106817183611726, Average precision = 0.962650523022376, ROC-AUC = 0.9594392721104487, Elapsed Time = 1.4647454000005382 seconds Trial 64, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 64, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 Binning 0.040 GB of training data: 0.157 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 35 leaves, max depth = 8, train loss: 0.67371, val loss: 0.67286, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.65519, val loss: 0.65356, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.63791, val loss: 0.63547, in 0.016s 1 tree, 35 leaves, max depth = 7, train loss: 0.62194, val loss: 0.61865, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.60657, val loss: 0.60256, in 0.024s 1 tree, 35 leaves, max depth = 9, train loss: 0.59283, val loss: 0.58818, in 0.008s 1 tree, 35 leaves, max depth = 8, train loss: 0.57919, val loss: 0.57384, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.56703, val loss: 0.56108, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.55477, val loss: 0.54821, in 0.016s 1 tree, 35 leaves, max depth = 7, train loss: 0.54347, val loss: 0.53618, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.53250, val loss: 0.52470, in 0.031s 1 tree, 35 leaves, max depth = 9, train loss: 0.52236, val loss: 0.51411, in 0.016s 1 tree, 35 leaves, max depth = 7, train loss: 0.51274, val loss: 0.50389, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.50352, val loss: 0.49421, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.49501, val loss: 0.48519, in 0.000s 1 tree, 35 leaves, max depth = 9, train loss: 0.48663, val loss: 0.47632, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.47934, val loss: 0.46856, in 0.031s 1 tree, 35 leaves, max depth = 10, train loss: 
0.47182, val loss: 0.46059, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.46462, val loss: 0.45294, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.45500, val loss: 0.44317, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.44866, val loss: 0.43636, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.44246, val loss: 0.42976, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.43658, val loss: 0.42350, in 0.031s 1 tree, 35 leaves, max depth = 9, train loss: 0.43099, val loss: 0.41756, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.42282, val loss: 0.40928, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.41514, val loss: 0.40148, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.41018, val loss: 0.39619, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.40546, val loss: 0.39115, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.40090, val loss: 0.38635, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.39415, val loss: 0.37953, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.39022, val loss: 0.37528, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.38642, val loss: 0.37128, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.38031, val loss: 0.36511, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.37457, val loss: 0.35932, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.37109, val loss: 0.35562, in 0.031s 1 tree, 35 leaves, max depth = 12, train loss: 0.36576, val loss: 0.35022, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.36254, val loss: 0.34679, in 0.016s 1 tree, 30 leaves, max depth = 11, train loss: 0.35785, val loss: 0.34195, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.35343, val loss: 0.33732, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.34988, val loss: 0.33374, in 0.016s 1 tree, 29 leaves, max depth = 11, train loss: 0.34577, val loss: 0.32950, in 0.016s 1 tree, 35 
leaves, max depth = 10, train loss: 0.34181, val loss: 0.32578, in 0.016s 1 tree, 29 leaves, max depth = 11, train loss: 0.33808, val loss: 0.32192, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.33489, val loss: 0.31869, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.33135, val loss: 0.31538, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.32839, val loss: 0.31242, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.32511, val loss: 0.30929, in 0.016s 1 tree, 30 leaves, max depth = 12, train loss: 0.32201, val loss: 0.30603, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.31927, val loss: 0.30326, in 0.031s 1 tree, 35 leaves, max depth = 11, train loss: 0.31632, val loss: 0.30052, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.31373, val loss: 0.29792, in 0.031s 1 tree, 30 leaves, max depth = 12, train loss: 0.31096, val loss: 0.29505, in 0.062s 1 tree, 35 leaves, max depth = 11, train loss: 0.30832, val loss: 0.29253, in 0.031s 1 tree, 30 leaves, max depth = 12, train loss: 0.30578, val loss: 0.28990, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.30334, val loss: 0.28764, in 0.016s 1 tree, 31 leaves, max depth = 12, train loss: 0.30102, val loss: 0.28516, in 0.031s 1 tree, 35 leaves, max depth = 9, train loss: 0.29868, val loss: 0.28268, in 0.016s 1 tree, 35 leaves, max depth = 13, train loss: 0.29651, val loss: 0.28049, in 0.031s 1 tree, 35 leaves, max depth = 11, train loss: 0.29436, val loss: 0.27846, in 0.016s 1 tree, 29 leaves, max depth = 11, train loss: 0.29230, val loss: 0.27632, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.29032, val loss: 0.27449, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.28843, val loss: 0.27247, in 0.031s 1 tree, 35 leaves, max depth = 9, train loss: 0.28643, val loss: 0.27037, in 0.016s 1 tree, 35 leaves, max depth = 13, train loss: 0.28456, val loss: 0.26849, in 0.016s Fit 64 trees in 1.549 s, (2190 total leaves) Time spent 
computing histograms: 0.481s Time spent finding best splits: 0.091s Time spent applying splits: 0.075s Time spent predicting: 0.000s Trial 64, Fold 4: Log loss = 0.2846748268270564, Average precision = 0.961753461255784, ROC-AUC = 0.9574160806979644, Elapsed Time = 1.5648371000006591 seconds Trial 64, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 64, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.156 s 0.016 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 35 leaves, max depth = 9, train loss: 0.67336, val loss: 0.67251, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.65449, val loss: 0.65294, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.63686, val loss: 0.63464, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.62050, val loss: 0.61768, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.60502, val loss: 0.60161, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.59048, val loss: 0.58651, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.57735, val loss: 0.57275, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.56455, val loss: 0.55937, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.55294, val loss: 0.54722, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.54156, val loss: 0.53519, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.53058, val loss: 0.52378, in 0.031s 1 tree, 35 leaves, max depth = 11, train loss: 0.52078, val loss: 0.51351, in 0.000s 1 tree, 35 leaves, max depth = 11, train loss: 0.51087, val loss: 0.50317, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.50159, val loss: 0.49354, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.49327, val loss: 0.48483, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.48484, val loss: 0.47605, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 
0.47680, val loss: 0.46769, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.46921, val loss: 0.45980, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.46224, val loss: 0.45240, in 0.031s 1 tree, 35 leaves, max depth = 11, train loss: 0.45581, val loss: 0.44566, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.44929, val loss: 0.43896, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.44306, val loss: 0.43249, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.43715, val loss: 0.42632, in 0.016s 1 tree, 35 leaves, max depth = 13, train loss: 0.42850, val loss: 0.41774, in 0.016s 1 tree, 35 leaves, max depth = 13, train loss: 0.42036, val loss: 0.40963, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.41563, val loss: 0.40467, in 0.016s 1 tree, 35 leaves, max depth = 7, train loss: 0.41076, val loss: 0.39963, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.40592, val loss: 0.39467, in 0.031s 1 tree, 35 leaves, max depth = 11, train loss: 0.39874, val loss: 0.38757, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.39432, val loss: 0.38297, in 0.016s 1 tree, 35 leaves, max depth = 9, train loss: 0.39036, val loss: 0.37890, in 0.016s 1 tree, 35 leaves, max depth = 14, train loss: 0.38391, val loss: 0.37252, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.37781, val loss: 0.36650, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.37432, val loss: 0.36300, in 0.016s 1 tree, 35 leaves, max depth = 8, train loss: 0.37078, val loss: 0.35933, in 0.016s 1 tree, 28 leaves, max depth = 10, train loss: 0.36546, val loss: 0.35393, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.36019, val loss: 0.34877, in 0.016s 1 tree, 29 leaves, max depth = 10, train loss: 0.35545, val loss: 0.34395, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.35233, val loss: 0.34085, in 0.016s 1 tree, 30 leaves, max depth = 10, train loss: 0.34795, val loss: 0.33641, in 0.016s 1 tree, 35 
leaves, max depth = 12, train loss: 0.34444, val loss: 0.33296, in 0.031s 1 tree, 35 leaves, max depth = 13, train loss: 0.34027, val loss: 0.32921, in 0.016s 1 tree, 29 leaves, max depth = 10, train loss: 0.33641, val loss: 0.32521, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.33318, val loss: 0.32201, in 0.016s 1 tree, 35 leaves, max depth = 13, train loss: 0.32945, val loss: 0.31868, in 0.016s 1 tree, 30 leaves, max depth = 10, train loss: 0.32598, val loss: 0.31515, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.32300, val loss: 0.31225, in 0.016s 1 tree, 35 leaves, max depth = 13, train loss: 0.31972, val loss: 0.30933, in 0.016s 1 tree, 35 leaves, max depth = 13, train loss: 0.31655, val loss: 0.30652, in 0.016s 1 tree, 30 leaves, max depth = 13, train loss: 0.31356, val loss: 0.30349, in 0.016s 1 tree, 35 leaves, max depth = 13, train loss: 0.31089, val loss: 0.30084, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.30804, val loss: 0.29834, in 0.016s 1 tree, 35 leaves, max depth = 13, train loss: 0.30546, val loss: 0.29573, in 0.016s 1 tree, 30 leaves, max depth = 10, train loss: 0.30283, val loss: 0.29306, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.30026, val loss: 0.29082, in 0.016s 1 tree, 29 leaves, max depth = 13, train loss: 0.29787, val loss: 0.28839, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.29551, val loss: 0.28635, in 0.016s 1 tree, 30 leaves, max depth = 13, train loss: 0.29332, val loss: 0.28411, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.29109, val loss: 0.28195, in 0.031s 1 tree, 35 leaves, max depth = 12, train loss: 0.28942, val loss: 0.28039, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.28734, val loss: 0.27855, in 0.016s 1 tree, 28 leaves, max depth = 12, train loss: 0.28539, val loss: 0.27650, in 0.016s 1 tree, 35 leaves, max depth = 12, train loss: 0.28347, val loss: 0.27481, in 0.016s 1 tree, 29 leaves, max depth = 12, train loss: 0.28168, 
val loss: 0.27291, in 0.016s Fit 64 trees in 1.422 s, (2177 total leaves) Time spent computing histograms: 0.448s Time spent finding best splits: 0.075s Time spent applying splits: 0.049s Time spent predicting: 0.000s Trial 64, Fold 5: Log loss = 0.29058164076700016, Average precision = 0.9577833376287552, ROC-AUC = 0.9529413819370902, Elapsed Time = 1.4403724000003422 seconds
Optimization Progress: 65%|######5 | 65/100 [12:47<07:13, 12.38s/it]
Trial 65: cross-validation results (verbose per-round fitting output abridged):

Fold 1: Train 20663 (0 = 10533, 1 = 10130, 0/1 = 1.0398) | Validation 5175 (0 = 2592, 1 = 2583, 0/1 = 1.0035) | Fit 76 trees in 1.954 s (6956 total leaves) | Log loss = 0.30004, Average precision = 0.96175, ROC-AUC = 0.95464, Elapsed = 1.963 s
Fold 2: Train 20701 (0 = 10471, 1 = 10230, 0/1 = 1.0236) | Validation 5137 (0 = 2654, 1 = 2483, 0/1 = 1.0689) | Fitting gradient boosted rounds (abridged): ... 1 tree, 115 leaves, max depth = 13, train loss: 0.41692, val loss:
0.41331, in 0.016s 1 tree, 115 leaves, max depth = 11, train loss: 0.41201, val loss: 0.40849, in 0.031s 1 tree, 115 leaves, max depth = 14, train loss: 0.40791, val loss: 0.40454, in 0.016s 1 tree, 115 leaves, max depth = 16, train loss: 0.40409, val loss: 0.40080, in 0.031s 1 tree, 23 leaves, max depth = 9, train loss: 0.39816, val loss: 0.39500, in 0.016s 1 tree, 20 leaves, max depth = 7, train loss: 0.39222, val loss: 0.38920, in 0.016s 1 tree, 115 leaves, max depth = 20, train loss: 0.38828, val loss: 0.38551, in 0.016s 1 tree, 115 leaves, max depth = 16, train loss: 0.38479, val loss: 0.38204, in 0.031s 1 tree, 115 leaves, max depth = 16, train loss: 0.38104, val loss: 0.37849, in 0.031s 1 tree, 8 leaves, max depth = 4, train loss: 0.37631, val loss: 0.37381, in 0.000s 1 tree, 10 leaves, max depth = 5, train loss: 0.37181, val loss: 0.36936, in 0.016s 1 tree, 115 leaves, max depth = 18, train loss: 0.36942, val loss: 0.36701, in 0.031s 1 tree, 17 leaves, max depth = 6, train loss: 0.36535, val loss: 0.36303, in 0.016s 1 tree, 115 leaves, max depth = 16, train loss: 0.36301, val loss: 0.36074, in 0.031s 1 tree, 39 leaves, max depth = 10, train loss: 0.35886, val loss: 0.35699, in 0.000s 1 tree, 115 leaves, max depth = 15, train loss: 0.35488, val loss: 0.35320, in 0.016s 1 tree, 115 leaves, max depth = 20, train loss: 0.35108, val loss: 0.34946, in 0.016s 1 tree, 115 leaves, max depth = 17, train loss: 0.34741, val loss: 0.34594, in 0.031s 1 tree, 115 leaves, max depth = 13, train loss: 0.34497, val loss: 0.34350, in 0.031s 1 tree, 115 leaves, max depth = 28, train loss: 0.34319, val loss: 0.34181, in 0.031s 1 tree, 115 leaves, max depth = 15, train loss: 0.34008, val loss: 0.33890, in 0.016s 1 tree, 115 leaves, max depth = 19, train loss: 0.33787, val loss: 0.33679, in 0.031s 1 tree, 13 leaves, max depth = 5, train loss: 0.33444, val loss: 0.33343, in 0.016s 1 tree, 115 leaves, max depth = 18, train loss: 0.33182, val loss: 0.33094, in 0.016s 1 tree, 115 
leaves, max depth = 16, train loss: 0.32892, val loss: 0.32820, in 0.031s 1 tree, 38 leaves, max depth = 15, train loss: 0.32730, val loss: 0.32664, in 0.016s 1 tree, 115 leaves, max depth = 17, train loss: 0.32476, val loss: 0.32422, in 0.031s 1 tree, 115 leaves, max depth = 20, train loss: 0.32288, val loss: 0.32241, in 0.031s 1 tree, 33 leaves, max depth = 8, train loss: 0.31900, val loss: 0.31882, in 0.016s 1 tree, 14 leaves, max depth = 5, train loss: 0.31549, val loss: 0.31553, in 0.016s 1 tree, 115 leaves, max depth = 21, train loss: 0.31430, val loss: 0.31435, in 0.016s 1 tree, 115 leaves, max depth = 17, train loss: 0.31255, val loss: 0.31266, in 0.031s 1 tree, 115 leaves, max depth = 15, train loss: 0.31004, val loss: 0.31036, in 0.016s 1 tree, 44 leaves, max depth = 11, train loss: 0.30876, val loss: 0.30911, in 0.016s 1 tree, 115 leaves, max depth = 15, train loss: 0.30668, val loss: 0.30726, in 0.016s 1 tree, 115 leaves, max depth = 22, train loss: 0.30471, val loss: 0.30549, in 0.031s 1 tree, 115 leaves, max depth = 16, train loss: 0.30320, val loss: 0.30422, in 0.031s 1 tree, 115 leaves, max depth = 14, train loss: 0.30182, val loss: 0.30299, in 0.016s 1 tree, 115 leaves, max depth = 21, train loss: 0.30042, val loss: 0.30185, in 0.047s 1 tree, 115 leaves, max depth = 18, train loss: 0.29909, val loss: 0.30073, in 0.031s 1 tree, 43 leaves, max depth = 13, train loss: 0.29584, val loss: 0.29779, in 0.016s 1 tree, 14 leaves, max depth = 6, train loss: 0.29373, val loss: 0.29584, in 0.016s 1 tree, 93 leaves, max depth = 15, train loss: 0.29251, val loss: 0.29477, in 0.031s 1 tree, 115 leaves, max depth = 18, train loss: 0.29055, val loss: 0.29304, in 0.031s 1 tree, 115 leaves, max depth = 15, train loss: 0.28844, val loss: 0.29103, in 0.031s 1 tree, 72 leaves, max depth = 18, train loss: 0.28595, val loss: 0.28876, in 0.016s 1 tree, 48 leaves, max depth = 12, train loss: 0.28407, val loss: 0.28700, in 0.031s Fit 76 trees in 2.157 s, (6852 total leaves) 
Time spent computing histograms: 0.655s Time spent finding best splits: 0.155s Time spent applying splits: 0.142s Time spent predicting: 0.000s Trial 65, Fold 2: Log loss = 0.2915758691707463, Average precision = 0.9600215453753159, ROC-AUC = 0.9564426191546374, Elapsed Time = 2.174914799999897 seconds Trial 65, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 65, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 Binning 0.040 GB of training data: 0.158 s 0.016 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 115 leaves, max depth = 17, train loss: 0.67692, val loss: 0.67678, in 0.016s 1 tree, 94 leaves, max depth = 12, train loss: 0.66093, val loss: 0.66069, in 0.031s 1 tree, 115 leaves, max depth = 20, train loss: 0.64646, val loss: 0.64617, in 0.031s 1 tree, 115 leaves, max depth = 16, train loss: 0.63116, val loss: 0.63095, in 0.016s 1 tree, 115 leaves, max depth = 13, train loss: 0.61823, val loss: 0.61797, in 0.031s 1 tree, 115 leaves, max depth = 19, train loss: 0.60477, val loss: 0.60445, in 0.031s 1 tree, 24 leaves, max depth = 9, train loss: 0.59355, val loss: 0.59302, in 0.016s 1 tree, 115 leaves, max depth = 15, train loss: 0.58153, val loss: 0.58117, in 0.031s 1 tree, 98 leaves, max depth = 13, train loss: 0.57012, val loss: 0.57005, in 0.016s 1 tree, 115 leaves, max depth = 16, train loss: 0.55933, val loss: 0.55945, in 0.031s 1 tree, 115 leaves, max depth = 19, train loss: 0.55026, val loss: 0.55033, in 0.016s 1 tree, 115 leaves, max depth = 22, train loss: 0.54040, val loss: 0.54063, in 0.031s 1 tree, 104 leaves, max depth = 14, train loss: 0.53122, val loss: 0.53153, in 0.031s 1 tree, 115 leaves, max depth = 15, train loss: 0.52175, val loss: 0.52198, in 0.016s 1 tree, 29 leaves, max depth = 11, train loss: 0.51487, val loss: 0.51475, in 0.016s 1 tree, 115 leaves, max depth = 20, train loss: 0.50686, val loss: 0.50690, in 0.031s 1 tree, 115 leaves, max depth 
= 13, train loss: 0.49910, val loss: 0.49911, in 0.031s 1 tree, 115 leaves, max depth = 22, train loss: 0.49177, val loss: 0.49192, in 0.016s 1 tree, 115 leaves, max depth = 16, train loss: 0.48464, val loss: 0.48472, in 0.031s 1 tree, 20 leaves, max depth = 8, train loss: 0.47537, val loss: 0.47608, in 0.016s 1 tree, 115 leaves, max depth = 19, train loss: 0.46817, val loss: 0.46896, in 0.031s 1 tree, 115 leaves, max depth = 18, train loss: 0.46215, val loss: 0.46311, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.45388, val loss: 0.45540, in 0.016s 1 tree, 115 leaves, max depth = 20, train loss: 0.44750, val loss: 0.44911, in 0.031s 1 tree, 115 leaves, max depth = 18, train loss: 0.44111, val loss: 0.44285, in 0.016s 1 tree, 62 leaves, max depth = 17, train loss: 0.43617, val loss: 0.43780, in 0.031s 1 tree, 115 leaves, max depth = 16, train loss: 0.43013, val loss: 0.43160, in 0.016s 1 tree, 9 leaves, max depth = 5, train loss: 0.42574, val loss: 0.42711, in 0.016s 1 tree, 115 leaves, max depth = 19, train loss: 0.42021, val loss: 0.42167, in 0.031s 1 tree, 20 leaves, max depth = 9, train loss: 0.41348, val loss: 0.41550, in 0.016s 1 tree, 115 leaves, max depth = 23, train loss: 0.40917, val loss: 0.41138, in 0.031s 1 tree, 115 leaves, max depth = 15, train loss: 0.40524, val loss: 0.40755, in 0.031s 1 tree, 115 leaves, max depth = 18, train loss: 0.40107, val loss: 0.40341, in 0.016s 1 tree, 115 leaves, max depth = 19, train loss: 0.39649, val loss: 0.39889, in 0.031s 1 tree, 115 leaves, max depth = 18, train loss: 0.39173, val loss: 0.39406, in 0.031s 1 tree, 9 leaves, max depth = 4, train loss: 0.38630, val loss: 0.38895, in 0.016s 1 tree, 12 leaves, max depth = 7, train loss: 0.38108, val loss: 0.38420, in 0.016s 1 tree, 8 leaves, max depth = 4, train loss: 0.37639, val loss: 0.37970, in 0.000s 1 tree, 115 leaves, max depth = 16, train loss: 0.37357, val loss: 0.37692, in 0.031s 1 tree, 115 leaves, max depth = 17, train loss: 0.36944, val loss: 
0.37270, in 0.031s 1 tree, 115 leaves, max depth = 16, train loss: 0.36522, val loss: 0.36839, in 0.031s 1 tree, 15 leaves, max depth = 5, train loss: 0.36091, val loss: 0.36449, in 0.016s 1 tree, 115 leaves, max depth = 16, train loss: 0.35772, val loss: 0.36139, in 0.031s 1 tree, 115 leaves, max depth = 20, train loss: 0.35563, val loss: 0.35948, in 0.031s 1 tree, 115 leaves, max depth = 13, train loss: 0.35305, val loss: 0.35703, in 0.016s 1 tree, 115 leaves, max depth = 17, train loss: 0.35105, val loss: 0.35521, in 0.031s 1 tree, 51 leaves, max depth = 12, train loss: 0.34614, val loss: 0.35104, in 0.016s 1 tree, 115 leaves, max depth = 14, train loss: 0.34375, val loss: 0.34896, in 0.031s 1 tree, 9 leaves, max depth = 4, train loss: 0.34035, val loss: 0.34576, in 0.016s 1 tree, 115 leaves, max depth = 12, train loss: 0.33811, val loss: 0.34385, in 0.031s 1 tree, 115 leaves, max depth = 15, train loss: 0.33609, val loss: 0.34204, in 0.016s 1 tree, 49 leaves, max depth = 12, train loss: 0.33317, val loss: 0.33938, in 0.031s 1 tree, 115 leaves, max depth = 24, train loss: 0.33123, val loss: 0.33764, in 0.016s 1 tree, 115 leaves, max depth = 15, train loss: 0.32828, val loss: 0.33469, in 0.031s 1 tree, 115 leaves, max depth = 23, train loss: 0.32690, val loss: 0.33343, in 0.016s 1 tree, 115 leaves, max depth = 17, train loss: 0.32501, val loss: 0.33152, in 0.031s 1 tree, 115 leaves, max depth = 16, train loss: 0.32243, val loss: 0.32881, in 0.016s 1 tree, 56 leaves, max depth = 11, train loss: 0.32085, val loss: 0.32720, in 0.031s 1 tree, 115 leaves, max depth = 16, train loss: 0.31956, val loss: 0.32597, in 0.031s 1 tree, 9 leaves, max depth = 4, train loss: 0.31638, val loss: 0.32318, in 0.016s 1 tree, 115 leaves, max depth = 22, train loss: 0.31384, val loss: 0.32059, in 0.031s 1 tree, 115 leaves, max depth = 15, train loss: 0.31154, val loss: 0.31833, in 0.016s 1 tree, 89 leaves, max depth = 15, train loss: 0.31009, val loss: 0.31705, in 0.031s 1 tree, 14 
leaves, max depth = 6, train loss: 0.30725, val loss: 0.31474, in 0.016s 1 tree, 115 leaves, max depth = 16, train loss: 0.30550, val loss: 0.31328, in 0.016s 1 tree, 21 leaves, max depth = 7, train loss: 0.30286, val loss: 0.31112, in 0.016s 1 tree, 115 leaves, max depth = 17, train loss: 0.30095, val loss: 0.30913, in 0.031s 1 tree, 17 leaves, max depth = 9, train loss: 0.29840, val loss: 0.30703, in 0.016s 1 tree, 115 leaves, max depth = 17, train loss: 0.29698, val loss: 0.30573, in 0.031s 1 tree, 51 leaves, max depth = 11, train loss: 0.29378, val loss: 0.30311, in 0.016s 1 tree, 115 leaves, max depth = 14, train loss: 0.29221, val loss: 0.30159, in 0.031s 1 tree, 115 leaves, max depth = 12, train loss: 0.29104, val loss: 0.30044, in 0.031s 1 tree, 115 leaves, max depth = 19, train loss: 0.28962, val loss: 0.29914, in 0.016s 1 tree, 115 leaves, max depth = 20, train loss: 0.28821, val loss: 0.29791, in 0.047s 1 tree, 79 leaves, max depth = 16, train loss: 0.28519, val loss: 0.29536, in 0.016s 1 tree, 22 leaves, max depth = 8, train loss: 0.28289, val loss: 0.29348, in 0.016s Fit 76 trees in 2.174 s, (6738 total leaves) Time spent computing histograms: 0.666s Time spent finding best splits: 0.156s Time spent applying splits: 0.144s Time spent predicting: 0.000s Trial 65, Fold 3: Log loss = 0.2863868354714503, Average precision = 0.9625752619952467, ROC-AUC = 0.957905905991467, Elapsed Time = 2.191066100000171 seconds Trial 65, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 65, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 Binning 0.040 GB of training data: 0.174 s 0.016 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 115 leaves, max depth = 16, train loss: 0.67700, val loss: 0.67664, in 0.031s 1 tree, 103 leaves, max depth = 15, train loss: 0.66083, val loss: 0.66018, in 0.016s 1 tree, 115 leaves, max depth = 18, train loss: 0.64673, val loss: 0.64585, in 0.031s 1 
tree, 115 leaves, max depth = 18, train loss: 0.63219, val loss: 0.63065, in 0.016s 1 tree, 115 leaves, max depth = 17, train loss: 0.61877, val loss: 0.61671, in 0.031s 1 tree, 84 leaves, max depth = 18, train loss: 0.60697, val loss: 0.60409, in 0.016s 1 tree, 99 leaves, max depth = 15, train loss: 0.59452, val loss: 0.59094, in 0.031s 1 tree, 115 leaves, max depth = 13, train loss: 0.58223, val loss: 0.57821, in 0.016s 1 tree, 115 leaves, max depth = 16, train loss: 0.57096, val loss: 0.56675, in 0.031s 1 tree, 115 leaves, max depth = 15, train loss: 0.56004, val loss: 0.55525, in 0.031s 1 tree, 115 leaves, max depth = 20, train loss: 0.55078, val loss: 0.54576, in 0.016s 1 tree, 115 leaves, max depth = 13, train loss: 0.54119, val loss: 0.53597, in 0.031s 1 tree, 115 leaves, max depth = 17, train loss: 0.53304, val loss: 0.52757, in 0.031s 1 tree, 115 leaves, max depth = 15, train loss: 0.52468, val loss: 0.51881, in 0.016s 1 tree, 115 leaves, max depth = 14, train loss: 0.51619, val loss: 0.50992, in 0.031s 1 tree, 46 leaves, max depth = 9, train loss: 0.50827, val loss: 0.50141, in 0.016s 1 tree, 115 leaves, max depth = 16, train loss: 0.50160, val loss: 0.49462, in 0.016s 1 tree, 115 leaves, max depth = 16, train loss: 0.49462, val loss: 0.48719, in 0.031s 1 tree, 115 leaves, max depth = 15, train loss: 0.48746, val loss: 0.47977, in 0.031s 1 tree, 70 leaves, max depth = 15, train loss: 0.48087, val loss: 0.47305, in 0.016s 1 tree, 115 leaves, max depth = 17, train loss: 0.47341, val loss: 0.46540, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.46797, val loss: 0.45952, in 0.031s 1 tree, 115 leaves, max depth = 19, train loss: 0.46283, val loss: 0.45429, in 0.031s 1 tree, 10 leaves, max depth = 6, train loss: 0.45438, val loss: 0.44607, in 0.016s 1 tree, 15 leaves, max depth = 6, train loss: 0.44681, val loss: 0.43794, in 0.000s 1 tree, 14 leaves, max depth = 7, train loss: 0.43938, val loss: 0.43028, in 0.016s 1 tree, 115 leaves, max depth = 26, 
train loss: 0.43502, val loss: 0.42585, in 0.031s 1 tree, 115 leaves, max depth = 17, train loss: 0.42983, val loss: 0.42075, in 0.031s 1 tree, 115 leaves, max depth = 18, train loss: 0.42444, val loss: 0.41508, in 0.016s 1 tree, 115 leaves, max depth = 13, train loss: 0.42018, val loss: 0.41074, in 0.031s 1 tree, 115 leaves, max depth = 18, train loss: 0.41557, val loss: 0.40585, in 0.031s 1 tree, 115 leaves, max depth = 18, train loss: 0.41016, val loss: 0.40034, in 0.016s 1 tree, 45 leaves, max depth = 12, train loss: 0.40644, val loss: 0.39643, in 0.016s 1 tree, 115 leaves, max depth = 15, train loss: 0.40225, val loss: 0.39202, in 0.031s 1 tree, 67 leaves, max depth = 11, train loss: 0.39893, val loss: 0.38866, in 0.016s 1 tree, 115 leaves, max depth = 15, train loss: 0.39485, val loss: 0.38439, in 0.031s 1 tree, 115 leaves, max depth = 18, train loss: 0.39095, val loss: 0.38026, in 0.016s 1 tree, 115 leaves, max depth = 13, train loss: 0.38728, val loss: 0.37639, in 0.031s 1 tree, 15 leaves, max depth = 6, train loss: 0.38183, val loss: 0.37067, in 0.016s 1 tree, 115 leaves, max depth = 17, train loss: 0.37765, val loss: 0.36655, in 0.031s 1 tree, 115 leaves, max depth = 19, train loss: 0.37419, val loss: 0.36296, in 0.031s 1 tree, 11 leaves, max depth = 4, train loss: 0.36939, val loss: 0.35793, in 0.016s 1 tree, 115 leaves, max depth = 21, train loss: 0.36660, val loss: 0.35495, in 0.016s 1 tree, 21 leaves, max depth = 9, train loss: 0.36349, val loss: 0.35164, in 0.016s 1 tree, 115 leaves, max depth = 14, train loss: 0.36091, val loss: 0.34899, in 0.031s 1 tree, 115 leaves, max depth = 22, train loss: 0.35894, val loss: 0.34715, in 0.031s 1 tree, 115 leaves, max depth = 17, train loss: 0.35552, val loss: 0.34399, in 0.016s 1 tree, 97 leaves, max depth = 12, train loss: 0.35315, val loss: 0.34177, in 0.016s 1 tree, 115 leaves, max depth = 15, train loss: 0.35045, val loss: 0.33920, in 0.031s 1 tree, 115 leaves, max depth = 19, train loss: 0.34781, val loss: 
0.33645, in 0.031s 1 tree, 115 leaves, max depth = 15, train loss: 0.34567, val loss: 0.33415, in 0.031s 1 tree, 115 leaves, max depth = 16, train loss: 0.34272, val loss: 0.33122, in 0.016s 1 tree, 97 leaves, max depth = 13, train loss: 0.34072, val loss: 0.32911, in 0.016s 1 tree, 75 leaves, max depth = 13, train loss: 0.33863, val loss: 0.32712, in 0.016s 1 tree, 41 leaves, max depth = 13, train loss: 0.33700, val loss: 0.32544, in 0.031s 1 tree, 115 leaves, max depth = 24, train loss: 0.33485, val loss: 0.32324, in 0.031s 1 tree, 115 leaves, max depth = 16, train loss: 0.33272, val loss: 0.32107, in 0.016s 1 tree, 115 leaves, max depth = 15, train loss: 0.33104, val loss: 0.31945, in 0.031s 1 tree, 115 leaves, max depth = 21, train loss: 0.32875, val loss: 0.31721, in 0.031s 1 tree, 115 leaves, max depth = 14, train loss: 0.32665, val loss: 0.31512, in 0.031s 1 tree, 115 leaves, max depth = 22, train loss: 0.32518, val loss: 0.31360, in 0.016s 1 tree, 19 leaves, max depth = 7, train loss: 0.32164, val loss: 0.31029, in 0.016s 1 tree, 111 leaves, max depth = 16, train loss: 0.32009, val loss: 0.30874, in 0.031s 1 tree, 38 leaves, max depth = 12, train loss: 0.31660, val loss: 0.30546, in 0.016s 1 tree, 115 leaves, max depth = 14, train loss: 0.31458, val loss: 0.30344, in 0.031s 1 tree, 115 leaves, max depth = 20, train loss: 0.31270, val loss: 0.30158, in 0.016s 1 tree, 115 leaves, max depth = 17, train loss: 0.31076, val loss: 0.29983, in 0.031s 1 tree, 115 leaves, max depth = 16, train loss: 0.30958, val loss: 0.29861, in 0.016s 1 tree, 79 leaves, max depth = 15, train loss: 0.30837, val loss: 0.29728, in 0.031s 1 tree, 13 leaves, max depth = 8, train loss: 0.30562, val loss: 0.29465, in 0.016s 1 tree, 115 leaves, max depth = 15, train loss: 0.30433, val loss: 0.29347, in 0.031s 1 tree, 13 leaves, max depth = 7, train loss: 0.30135, val loss: 0.29047, in 0.016s 1 tree, 111 leaves, max depth = 16, train loss: 0.30018, val loss: 0.28929, in 0.031s 1 tree, 115 
leaves, max depth = 17, train loss: 0.29827, val loss: 0.28750, in 0.016s 1 tree, 40 leaves, max depth = 15, train loss: 0.29544, val loss: 0.28480, in 0.031s 1 tree, 115 leaves, max depth = 17, train loss: 0.29345, val loss: 0.28287, in 0.016s Fit 76 trees in 2.206 s, (7119 total leaves) Time spent computing histograms: 0.695s Time spent finding best splits: 0.161s Time spent applying splits: 0.148s Time spent predicting: 0.031s Trial 65, Fold 4: Log loss = 0.30077750899317207, Average precision = 0.9625300286328604, ROC-AUC = 0.9559317111001429, Elapsed Time = 2.2293044999987615 seconds Trial 65, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 65, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0.174 s 0.040 GB of training data: 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 115 leaves, max depth = 16, train loss: 0.67682, val loss: 0.67611, in 0.031s 1 tree, 108 leaves, max depth = 13, train loss: 0.66060, val loss: 0.65933, in 0.016s 1 tree, 115 leaves, max depth = 16, train loss: 0.64577, val loss: 0.64381, in 0.031s 1 tree, 115 leaves, max depth = 18, train loss: 0.63060, val loss: 0.62798, in 0.031s 1 tree, 115 leaves, max depth = 14, train loss: 0.61703, val loss: 0.61361, in 0.016s 1 tree, 88 leaves, max depth = 14, train loss: 0.60372, val loss: 0.59954, in 0.031s 1 tree, 115 leaves, max depth = 18, train loss: 0.59027, val loss: 0.58544, in 0.031s 1 tree, 57 leaves, max depth = 10, train loss: 0.57904, val loss: 0.57387, in 0.016s 1 tree, 115 leaves, max depth = 13, train loss: 0.56793, val loss: 0.56215, in 0.031s 1 tree, 115 leaves, max depth = 15, train loss: 0.55718, val loss: 0.55096, in 0.031s 1 tree, 115 leaves, max depth = 14, train loss: 0.54702, val loss: 0.54074, in 0.016s 1 tree, 115 leaves, max depth = 17, train loss: 0.53633, val loss: 0.52978, in 0.031s 1 tree, 115 leaves, max depth = 16, train loss: 0.52723, val loss: 0.52053, in 0.031s 
1 tree, 115 leaves, max depth = 14, train loss: 0.51887, val loss: 0.51178, in 0.016s 1 tree, 115 leaves, max depth = 17, train loss: 0.51078, val loss: 0.50331, in 0.031s 1 tree, 115 leaves, max depth = 17, train loss: 0.50347, val loss: 0.49572, in 0.016s 1 tree, 115 leaves, max depth = 15, train loss: 0.49610, val loss: 0.48797, in 0.031s 1 tree, 115 leaves, max depth = 19, train loss: 0.48971, val loss: 0.48121, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.48336, val loss: 0.47455, in 0.000s 1 tree, 10 leaves, max depth = 5, train loss: 0.47426, val loss: 0.46515, in 0.016s 1 tree, 115 leaves, max depth = 15, train loss: 0.46799, val loss: 0.45860, in 0.016s 1 tree, 115 leaves, max depth = 18, train loss: 0.46252, val loss: 0.45289, in 0.031s 1 tree, 49 leaves, max depth = 11, train loss: 0.45732, val loss: 0.44725, in 0.016s 1 tree, 115 leaves, max depth = 16, train loss: 0.45184, val loss: 0.44139, in 0.031s 1 tree, 115 leaves, max depth = 26, train loss: 0.44538, val loss: 0.43482, in 0.016s 1 tree, 115 leaves, max depth = 13, train loss: 0.43897, val loss: 0.42840, in 0.031s 1 tree, 115 leaves, max depth = 15, train loss: 0.43426, val loss: 0.42334, in 0.016s 1 tree, 115 leaves, max depth = 23, train loss: 0.43002, val loss: 0.41897, in 0.031s 1 tree, 22 leaves, max depth = 10, train loss: 0.42279, val loss: 0.41166, in 0.016s 1 tree, 17 leaves, max depth = 8, train loss: 0.41607, val loss: 0.40482, in 0.016s 1 tree, 115 leaves, max depth = 16, train loss: 0.41054, val loss: 0.39948, in 0.031s 1 tree, 63 leaves, max depth = 10, train loss: 0.40649, val loss: 0.39534, in 0.016s 1 tree, 115 leaves, max depth = 13, train loss: 0.40264, val loss: 0.39150, in 0.031s 1 tree, 115 leaves, max depth = 14, train loss: 0.39843, val loss: 0.38709, in 0.031s 1 tree, 115 leaves, max depth = 23, train loss: 0.39550, val loss: 0.38397, in 0.016s 1 tree, 115 leaves, max depth = 18, train loss: 0.39263, val loss: 0.38096, in 0.031s 1 tree, 115 leaves, max depth 
= 26, train loss: 0.38998, val loss: 0.37803, in 0.031s 1 tree, 115 leaves, max depth = 20, train loss: 0.38744, val loss: 0.37524, in 0.031s 1 tree, 115 leaves, max depth = 14, train loss: 0.38431, val loss: 0.37205, in 0.016s 1 tree, 9 leaves, max depth = 4, train loss: 0.37922, val loss: 0.36689, in 0.031s 1 tree, 12 leaves, max depth = 6, train loss: 0.37382, val loss: 0.36187, in 0.000s 1 tree, 14 leaves, max depth = 5, train loss: 0.36901, val loss: 0.35703, in 0.016s 1 tree, 115 leaves, max depth = 18, train loss: 0.36562, val loss: 0.35361, in 0.031s 1 tree, 115 leaves, max depth = 13, train loss: 0.36298, val loss: 0.35095, in 0.031s 1 tree, 115 leaves, max depth = 15, train loss: 0.35916, val loss: 0.34717, in 0.016s 1 tree, 115 leaves, max depth = 17, train loss: 0.35554, val loss: 0.34361, in 0.031s 1 tree, 115 leaves, max depth = 15, train loss: 0.35208, val loss: 0.34023, in 0.031s 1 tree, 115 leaves, max depth = 16, train loss: 0.34926, val loss: 0.33739, in 0.016s 1 tree, 103 leaves, max depth = 14, train loss: 0.34697, val loss: 0.33514, in 0.031s 1 tree, 115 leaves, max depth = 14, train loss: 0.34479, val loss: 0.33297, in 0.016s 1 tree, 115 leaves, max depth = 15, train loss: 0.34157, val loss: 0.32988, in 0.031s 1 tree, 29 leaves, max depth = 11, train loss: 0.33985, val loss: 0.32815, in 0.016s 1 tree, 115 leaves, max depth = 14, train loss: 0.33773, val loss: 0.32598, in 0.031s 1 tree, 115 leaves, max depth = 13, train loss: 0.33565, val loss: 0.32420, in 0.031s 1 tree, 115 leaves, max depth = 18, train loss: 0.33276, val loss: 0.32150, in 0.016s 1 tree, 115 leaves, max depth = 12, train loss: 0.33095, val loss: 0.31982, in 0.016s 1 tree, 29 leaves, max depth = 11, train loss: 0.32963, val loss: 0.31831, in 0.016s 1 tree, 105 leaves, max depth = 14, train loss: 0.32812, val loss: 0.31691, in 0.016s 1 tree, 115 leaves, max depth = 16, train loss: 0.32550, val loss: 0.31447, in 0.016s 1 tree, 115 leaves, max depth = 14, train loss: 0.32386, val 
loss: 0.31280, in 0.031s 1 tree, 71 leaves, max depth = 16, train loss: 0.32004, val loss: 0.30925, in 0.016s 1 tree, 26 leaves, max depth = 12, train loss: 0.31646, val loss: 0.30579, in 0.016s 1 tree, 12 leaves, max depth = 5, train loss: 0.31285, val loss: 0.30255, in 0.016s 1 tree, 115 leaves, max depth = 21, train loss: 0.31177, val loss: 0.30152, in 0.031s 1 tree, 115 leaves, max depth = 19, train loss: 0.31032, val loss: 0.30007, in 0.031s 1 tree, 115 leaves, max depth = 15, train loss: 0.30868, val loss: 0.29841, in 0.016s 1 tree, 115 leaves, max depth = 19, train loss: 0.30738, val loss: 0.29712, in 0.031s 1 tree, 13 leaves, max depth = 6, train loss: 0.30621, val loss: 0.29591, in 0.016s 1 tree, 14 leaves, max depth = 5, train loss: 0.30287, val loss: 0.29292, in 0.016s 1 tree, 115 leaves, max depth = 14, train loss: 0.30115, val loss: 0.29134, in 0.016s 1 tree, 115 leaves, max depth = 17, train loss: 0.29902, val loss: 0.28937, in 0.031s 1 tree, 95 leaves, max depth = 15, train loss: 0.29592, val loss: 0.28654, in 0.031s 1 tree, 115 leaves, max depth = 22, train loss: 0.29479, val loss: 0.28540, in 0.031s 1 tree, 93 leaves, max depth = 15, train loss: 0.29332, val loss: 0.28408, in 0.016s 1 tree, 115 leaves, max depth = 18, train loss: 0.29197, val loss: 0.28295, in 0.016s 1 tree, 16 leaves, max depth = 9, train loss: 0.29111, val loss: 0.28216, in 0.016s Fit 76 trees in 2.142 s, (7070 total leaves) Time spent computing histograms: 0.664s Time spent finding best splits: 0.156s Time spent applying splits: 0.144s Time spent predicting: 0.016s Trial 65, Fold 5: Log loss = 0.3062555495030684, Average precision = 0.9591346466491077, ROC-AUC = 0.9539096668710402, Elapsed Time = 2.1642406999999366 seconds
Optimization Progress: 66% (66/100 trials) [13:04 elapsed < 07:52 remaining, 13.90 s/it]
Trial 66, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0398; Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0035
[... verbose per-boosting-round output elided for brevity ...]
Trial 66, Fold 1: Fit 85 trees (5840 total leaves) in 1.393 s; Log loss = 0.28078, Average precision = 0.95222, ROC-AUC = 0.95247, Elapsed Time = 1.41 s
Trial 66, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0236; Validation size = 5137 where 0 =
2654, 1 = 2483, 0/1 = 1.0688683044703986 Binning 0.040 GB of training data: 0.173 s 0.016 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 65 leaves, max depth = 15, train loss: 0.65638, val loss: 0.65475, in 0.016s 1 tree, 68 leaves, max depth = 11, train loss: 0.62534, val loss: 0.62225, in 0.016s 1 tree, 67 leaves, max depth = 14, train loss: 0.59839, val loss: 0.59387, in 0.016s 1 tree, 67 leaves, max depth = 14, train loss: 0.57553, val loss: 0.56976, in 0.000s 1 tree, 67 leaves, max depth = 14, train loss: 0.55565, val loss: 0.54871, in 0.016s 1 tree, 107 leaves, max depth = 15, train loss: 0.53740, val loss: 0.53119, in 0.016s 1 tree, 67 leaves, max depth = 20, train loss: 0.52158, val loss: 0.51443, in 0.016s 1 tree, 107 leaves, max depth = 15, train loss: 0.50660, val loss: 0.50012, in 0.016s 1 tree, 68 leaves, max depth = 12, train loss: 0.49338, val loss: 0.48599, in 0.016s 1 tree, 107 leaves, max depth = 16, train loss: 0.48096, val loss: 0.47419, in 0.016s 1 tree, 68 leaves, max depth = 14, train loss: 0.47029, val loss: 0.46280, in 0.016s 1 tree, 107 leaves, max depth = 15, train loss: 0.45981, val loss: 0.45289, in 0.016s 1 tree, 68 leaves, max depth = 17, train loss: 0.45088, val loss: 0.44338, in 0.000s 1 tree, 68 leaves, max depth = 15, train loss: 0.44292, val loss: 0.43475, in 0.016s 1 tree, 68 leaves, max depth = 18, train loss: 0.43616, val loss: 0.42749, in 0.016s 1 tree, 107 leaves, max depth = 14, train loss: 0.42762, val loss: 0.41952, in 0.000s 1 tree, 4 leaves, max depth = 3, train loss: 0.42217, val loss: 0.41385, in 0.016s 1 tree, 106 leaves, max depth = 14, train loss: 0.41477, val loss: 0.40700, in 0.016s 1 tree, 68 leaves, max depth = 14, train loss: 0.40943, val loss: 0.40129, in 0.016s 1 tree, 107 leaves, max depth = 14, train loss: 0.40301, val loss: 0.39541, in 0.016s 1 tree, 107 leaves, max depth = 14, train loss: 0.39738, val loss: 0.39024, in 0.000s 1 tree, 106 leaves, max depth = 14, train loss: 
0.39249, val loss: 0.38583, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.38798, val loss: 0.38112, in 0.016s 1 tree, 67 leaves, max depth = 12, train loss: 0.38358, val loss: 0.37632, in 0.000s 1 tree, 4 leaves, max depth = 3, train loss: 0.37970, val loss: 0.37226, in 0.016s 1 tree, 107 leaves, max depth = 15, train loss: 0.37539, val loss: 0.36839, in 0.016s 1 tree, 107 leaves, max depth = 15, train loss: 0.37166, val loss: 0.36510, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.36824, val loss: 0.36151, in 0.016s 1 tree, 67 leaves, max depth = 11, train loss: 0.36446, val loss: 0.35744, in 0.016s 1 tree, 107 leaves, max depth = 17, train loss: 0.36117, val loss: 0.35458, in 0.000s 1 tree, 67 leaves, max depth = 14, train loss: 0.35795, val loss: 0.35107, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.35495, val loss: 0.34791, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.35233, val loss: 0.34513, in 0.000s 1 tree, 67 leaves, max depth = 15, train loss: 0.34955, val loss: 0.34213, in 0.016s 1 tree, 107 leaves, max depth = 18, train loss: 0.34648, val loss: 0.33949, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.34413, val loss: 0.33700, in 0.000s 1 tree, 86 leaves, max depth = 16, train loss: 0.34166, val loss: 0.33427, in 0.016s 1 tree, 107 leaves, max depth = 18, train loss: 0.33894, val loss: 0.33197, in 0.016s 1 tree, 67 leaves, max depth = 14, train loss: 0.33661, val loss: 0.32957, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.33448, val loss: 0.32730, in 0.016s 1 tree, 83 leaves, max depth = 15, train loss: 0.33225, val loss: 0.32487, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.33036, val loss: 0.32284, in 0.000s 1 tree, 83 leaves, max depth = 15, train loss: 0.32843, val loss: 0.32074, in 0.016s 1 tree, 107 leaves, max depth = 17, train loss: 0.32585, val loss: 0.31858, in 0.029s 1 tree, 4 leaves, max depth = 3, train loss: 0.32412, val loss: 0.31673, in 0.007s 1 tree, 107 
leaves, max depth = 18, train loss: 0.32184, val loss: 0.31484, in 0.019s 1 tree, 67 leaves, max depth = 13, train loss: 0.32009, val loss: 0.31302, in 0.014s 1 tree, 106 leaves, max depth = 17, train loss: 0.31808, val loss: 0.31138, in 0.014s 1 tree, 4 leaves, max depth = 3, train loss: 0.31646, val loss: 0.30964, in 0.000s 1 tree, 83 leaves, max depth = 16, train loss: 0.31474, val loss: 0.30775, in 0.016s 1 tree, 106 leaves, max depth = 16, train loss: 0.31286, val loss: 0.30621, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.31137, val loss: 0.30461, in 0.016s 1 tree, 86 leaves, max depth = 16, train loss: 0.30985, val loss: 0.30298, in 0.000s 1 tree, 107 leaves, max depth = 16, train loss: 0.30814, val loss: 0.30158, in 0.016s 1 tree, 67 leaves, max depth = 14, train loss: 0.30675, val loss: 0.30017, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.30537, val loss: 0.29869, in 0.016s 1 tree, 83 leaves, max depth = 16, train loss: 0.30406, val loss: 0.29725, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.30283, val loss: 0.29591, in 0.000s 1 tree, 107 leaves, max depth = 17, train loss: 0.30122, val loss: 0.29464, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.30010, val loss: 0.29341, in 0.016s 1 tree, 84 leaves, max depth = 14, train loss: 0.29813, val loss: 0.29203, in 0.016s 1 tree, 83 leaves, max depth = 16, train loss: 0.29687, val loss: 0.29064, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.29578, val loss: 0.28960, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.29473, val loss: 0.28845, in 0.000s 1 tree, 106 leaves, max depth = 17, train loss: 0.29333, val loss: 0.28733, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.29238, val loss: 0.28628, in 0.016s 1 tree, 83 leaves, max depth = 16, train loss: 0.29129, val loss: 0.28509, in 0.016s 1 tree, 131 leaves, max depth = 21, train loss: 0.28970, val loss: 0.28436, in 0.016s 1 tree, 65 leaves, max depth = 18, train loss: 0.28867, val 
loss: 0.28342, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.28777, val loss: 0.28242, in 0.000s 1 tree, 106 leaves, max depth = 17, train loss: 0.28655, val loss: 0.28151, in 0.016s 1 tree, 140 leaves, max depth = 18, train loss: 0.28494, val loss: 0.28081, in 0.031s 1 tree, 4 leaves, max depth = 3, train loss: 0.28409, val loss: 0.27987, in 0.000s 1 tree, 83 leaves, max depth = 17, train loss: 0.28304, val loss: 0.27882, in 0.016s 1 tree, 84 leaves, max depth = 14, train loss: 0.28152, val loss: 0.27780, in 0.016s 1 tree, 86 leaves, max depth = 21, train loss: 0.28056, val loss: 0.27673, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.27976, val loss: 0.27584, in 0.000s 1 tree, 140 leaves, max depth = 17, train loss: 0.27823, val loss: 0.27522, in 0.031s 1 tree, 131 leaves, max depth = 21, train loss: 0.27700, val loss: 0.27475, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.27626, val loss: 0.27392, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.27544, val loss: 0.27315, in 0.000s 1 tree, 140 leaves, max depth = 18, train loss: 0.27406, val loss: 0.27262, in 0.031s 1 tree, 86 leaves, max depth = 21, train loss: 0.27322, val loss: 0.27167, in 0.016s 1 tree, 140 leaves, max depth = 18, train loss: 0.27199, val loss: 0.27124, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.27128, val loss: 0.27043, in 0.016s Fit 85 trees in 1.445 s, (5776 total leaves) Time spent computing histograms: 0.489s Time spent finding best splits: 0.110s Time spent applying splits: 0.127s Time spent predicting: 0.000s Trial 66, Fold 2: Log loss = 0.2800543717383729, Average precision = 0.9508008089697524, ROC-AUC = 0.9531672190791882, Elapsed Time = 1.45176309999988 seconds Trial 66, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 66, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 Binning 0.040 GB of training data: 0.173 s 0.000 s 0.004 GB of validation data: Fitting 
gradient boosted rounds: 1 tree, 65 leaves, max depth = 13, train loss: 0.65601, val loss: 0.65491, in 0.016s 1 tree, 65 leaves, max depth = 13, train loss: 0.62471, val loss: 0.62261, in 0.016s 1 tree, 65 leaves, max depth = 13, train loss: 0.59805, val loss: 0.59502, in 0.016s 1 tree, 67 leaves, max depth = 16, train loss: 0.57547, val loss: 0.57137, in 0.000s 1 tree, 66 leaves, max depth = 13, train loss: 0.55579, val loss: 0.55095, in 0.016s 1 tree, 106 leaves, max depth = 13, train loss: 0.53741, val loss: 0.53400, in 0.016s 1 tree, 66 leaves, max depth = 17, train loss: 0.52169, val loss: 0.51748, in 0.016s 1 tree, 106 leaves, max depth = 13, train loss: 0.50658, val loss: 0.50363, in 0.016s 1 tree, 66 leaves, max depth = 17, train loss: 0.49376, val loss: 0.49006, in 0.016s 1 tree, 106 leaves, max depth = 13, train loss: 0.48118, val loss: 0.47860, in 0.016s 1 tree, 66 leaves, max depth = 13, train loss: 0.47049, val loss: 0.46743, in 0.016s 1 tree, 106 leaves, max depth = 14, train loss: 0.45989, val loss: 0.45783, in 0.016s 1 tree, 67 leaves, max depth = 13, train loss: 0.45098, val loss: 0.44844, in 0.000s 1 tree, 67 leaves, max depth = 13, train loss: 0.44313, val loss: 0.44007, in 0.016s 1 tree, 67 leaves, max depth = 13, train loss: 0.43639, val loss: 0.43281, in 0.016s 1 tree, 106 leaves, max depth = 12, train loss: 0.42773, val loss: 0.42502, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.42206, val loss: 0.41978, in 0.000s 1 tree, 106 leaves, max depth = 12, train loss: 0.41456, val loss: 0.41311, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.40969, val loss: 0.40860, in 0.016s 1 tree, 106 leaves, max depth = 12, train loss: 0.40315, val loss: 0.40285, in 0.000s 1 tree, 106 leaves, max depth = 12, train loss: 0.39742, val loss: 0.39788, in 0.016s 1 tree, 106 leaves, max depth = 12, train loss: 0.39241, val loss: 0.39358, in 0.016s 1 tree, 67 leaves, max depth = 13, train loss: 0.38754, val loss: 0.38822, in 0.000s 1 tree, 67 
leaves, max depth = 14, train loss: 0.38330, val loss: 0.38357, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.37922, val loss: 0.37978, in 0.016s 1 tree, 106 leaves, max depth = 15, train loss: 0.37483, val loss: 0.37605, in 0.016s 1 tree, 106 leaves, max depth = 15, train loss: 0.37099, val loss: 0.37285, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.36740, val loss: 0.36951, in 0.000s 1 tree, 67 leaves, max depth = 14, train loss: 0.36374, val loss: 0.36541, in 0.016s 1 tree, 106 leaves, max depth = 17, train loss: 0.36033, val loss: 0.36262, in 0.016s 1 tree, 66 leaves, max depth = 13, train loss: 0.35724, val loss: 0.35919, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.35410, val loss: 0.35626, in 0.000s 1 tree, 4 leaves, max depth = 3, train loss: 0.35135, val loss: 0.35370, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.34895, val loss: 0.35144, in 0.000s 1 tree, 67 leaves, max depth = 13, train loss: 0.34628, val loss: 0.34854, in 0.016s 1 tree, 105 leaves, max depth = 17, train loss: 0.34308, val loss: 0.34598, in 0.016s 1 tree, 85 leaves, max depth = 15, train loss: 0.34066, val loss: 0.34293, in 0.016s 1 tree, 106 leaves, max depth = 17, train loss: 0.33783, val loss: 0.34070, in 0.016s 1 tree, 67 leaves, max depth = 12, train loss: 0.33553, val loss: 0.33832, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.33330, val loss: 0.33623, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.33133, val loss: 0.33438, in 0.000s 1 tree, 83 leaves, max depth = 14, train loss: 0.32915, val loss: 0.33179, in 0.016s 1 tree, 83 leaves, max depth = 14, train loss: 0.32729, val loss: 0.32954, in 0.016s 1 tree, 103 leaves, max depth = 14, train loss: 0.32460, val loss: 0.32744, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.32278, val loss: 0.32572, in 0.016s 1 tree, 105 leaves, max depth = 14, train loss: 0.32039, val loss: 0.32389, in 0.016s 1 tree, 67 leaves, max depth = 13, train loss: 0.31874, val 
loss: 0.32211, in 0.000s 1 tree, 106 leaves, max depth = 15, train loss: 0.31662, val loss: 0.32052, in 0.031s 1 tree, 4 leaves, max depth = 3, train loss: 0.31492, val loss: 0.31893, in 0.000s 1 tree, 83 leaves, max depth = 15, train loss: 0.31326, val loss: 0.31689, in 0.016s 1 tree, 106 leaves, max depth = 15, train loss: 0.31133, val loss: 0.31547, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.30977, val loss: 0.31399, in 0.016s 1 tree, 67 leaves, max depth = 13, train loss: 0.30838, val loss: 0.31250, in 0.016s 1 tree, 106 leaves, max depth = 15, train loss: 0.30662, val loss: 0.31123, in 0.016s 1 tree, 83 leaves, max depth = 15, train loss: 0.30519, val loss: 0.30943, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.30374, val loss: 0.30805, in 0.000s 1 tree, 106 leaves, max depth = 13, train loss: 0.30213, val loss: 0.30692, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.30083, val loss: 0.30569, in 0.016s 1 tree, 83 leaves, max depth = 15, train loss: 0.29952, val loss: 0.30404, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.29835, val loss: 0.30292, in 0.000s 1 tree, 129 leaves, max depth = 19, train loss: 0.29654, val loss: 0.30210, in 0.016s 1 tree, 67 leaves, max depth = 14, train loss: 0.29534, val loss: 0.30085, in 0.016s 1 tree, 105 leaves, max depth = 15, train loss: 0.29392, val loss: 0.29988, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.29281, val loss: 0.29883, in 0.016s 1 tree, 83 leaves, max depth = 17, train loss: 0.29169, val loss: 0.29740, in 0.000s 1 tree, 4 leaves, max depth = 3, train loss: 0.29069, val loss: 0.29644, in 0.016s 1 tree, 106 leaves, max depth = 15, train loss: 0.28934, val loss: 0.29554, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.28843, val loss: 0.29467, in 0.016s 1 tree, 64 leaves, max depth = 16, train loss: 0.28739, val loss: 0.29361, in 0.016s 1 tree, 106 leaves, max depth = 15, train loss: 0.28617, val loss: 0.29282, in 0.016s 1 tree, 83 leaves, max 
depth = 17, train loss: 0.28518, val loss: 0.29153, in 0.000s 1 tree, 130 leaves, max depth = 20, train loss: 0.28374, val loss: 0.29104, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.28285, val loss: 0.29020, in 0.016s 1 tree, 40 leaves, max depth = 10, train loss: 0.28192, val loss: 0.28939, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.28112, val loss: 0.28862, in 0.000s 1 tree, 83 leaves, max depth = 13, train loss: 0.27949, val loss: 0.28740, in 0.016s 1 tree, 85 leaves, max depth = 15, train loss: 0.27847, val loss: 0.28630, in 0.016s 1 tree, 140 leaves, max depth = 15, train loss: 0.27708, val loss: 0.28525, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.27632, val loss: 0.28451, in 0.016s 1 tree, 140 leaves, max depth = 17, train loss: 0.27491, val loss: 0.28359, in 0.016s 1 tree, 85 leaves, max depth = 15, train loss: 0.27401, val loss: 0.28259, in 0.016s 1 tree, 140 leaves, max depth = 15, train loss: 0.27277, val loss: 0.28166, in 0.016s 1 tree, 131 leaves, max depth = 18, train loss: 0.27160, val loss: 0.28135, in 0.031s 1 tree, 83 leaves, max depth = 21, train loss: 0.27085, val loss: 0.28032, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.27010, val loss: 0.27961, in 0.016s Fit 85 trees in 1.454 s, (5765 total leaves) Time spent computing histograms: 0.497s Time spent finding best splits: 0.108s Time spent applying splits: 0.125s Time spent predicting: 0.016s Trial 66, Fold 3: Log loss = 0.2744742454659397, Average precision = 0.9557721692213328, ROC-AUC = 0.9553627860417409, Elapsed Time = 1.4555007999988447 seconds Trial 66, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 66, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 Binning 0.040 GB of training data: 0.173 s 0.016 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 67 leaves, max depth = 13, train loss: 0.65633, val loss: 0.65431, in 0.016s 1 tree, 67 leaves, 
max depth = 13, train loss: 0.62527, val loss: 0.62136, in 0.000s 1 tree, 67 leaves, max depth = 15, train loss: 0.59856, val loss: 0.59275, in 0.016s 1 tree, 67 leaves, max depth = 13, train loss: 0.57590, val loss: 0.56845, in 0.016s 1 tree, 67 leaves, max depth = 15, train loss: 0.55621, val loss: 0.54711, in 0.000s 1 tree, 101 leaves, max depth = 14, train loss: 0.53808, val loss: 0.52926, in 0.016s 1 tree, 67 leaves, max depth = 15, train loss: 0.52227, val loss: 0.51210, in 0.016s 1 tree, 104 leaves, max depth = 15, train loss: 0.50739, val loss: 0.49748, in 0.016s 1 tree, 67 leaves, max depth = 14, train loss: 0.49449, val loss: 0.48334, in 0.016s 1 tree, 104 leaves, max depth = 15, train loss: 0.48211, val loss: 0.47120, in 0.000s 1 tree, 67 leaves, max depth = 14, train loss: 0.47129, val loss: 0.45916, in 0.016s 1 tree, 104 leaves, max depth = 15, train loss: 0.46088, val loss: 0.44898, in 0.031s 1 tree, 67 leaves, max depth = 13, train loss: 0.45197, val loss: 0.43904, in 0.000s 1 tree, 67 leaves, max depth = 14, train loss: 0.44410, val loss: 0.43012, in 0.016s 1 tree, 67 leaves, max depth = 13, train loss: 0.43733, val loss: 0.42244, in 0.016s 1 tree, 104 leaves, max depth = 14, train loss: 0.42888, val loss: 0.41423, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.42340, val loss: 0.40837, in 0.000s 1 tree, 104 leaves, max depth = 14, train loss: 0.41607, val loss: 0.40125, in 0.016s 1 tree, 68 leaves, max depth = 16, train loss: 0.41067, val loss: 0.39502, in 0.016s 1 tree, 104 leaves, max depth = 13, train loss: 0.40432, val loss: 0.38885, in 0.016s 1 tree, 104 leaves, max depth = 13, train loss: 0.39876, val loss: 0.38346, in 0.000s 1 tree, 104 leaves, max depth = 13, train loss: 0.39390, val loss: 0.37877, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.38935, val loss: 0.37387, in 0.016s 1 tree, 67 leaves, max depth = 13, train loss: 0.38507, val loss: 0.36895, in 0.000s 1 tree, 4 leaves, max depth = 3, train loss: 0.38117, val 
loss: 0.36474, in 0.016s 1 tree, 104 leaves, max depth = 14, train loss: 0.37690, val loss: 0.36070, in 0.016s 1 tree, 104 leaves, max depth = 14, train loss: 0.37317, val loss: 0.35713, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.36973, val loss: 0.35340, in 0.000s 1 tree, 67 leaves, max depth = 12, train loss: 0.36603, val loss: 0.34919, in 0.016s 1 tree, 104 leaves, max depth = 15, train loss: 0.36274, val loss: 0.34606, in 0.016s 1 tree, 67 leaves, max depth = 14, train loss: 0.35961, val loss: 0.34245, in 0.000s 1 tree, 4 leaves, max depth = 3, train loss: 0.35659, val loss: 0.33918, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.35395, val loss: 0.33632, in 0.000s 1 tree, 67 leaves, max depth = 14, train loss: 0.35126, val loss: 0.33324, in 0.016s 1 tree, 104 leaves, max depth = 14, train loss: 0.34821, val loss: 0.33045, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.34585, val loss: 0.32788, in 0.000s 1 tree, 86 leaves, max depth = 17, train loss: 0.34338, val loss: 0.32537, in 0.016s 1 tree, 104 leaves, max depth = 14, train loss: 0.34062, val loss: 0.32287, in 0.016s 1 tree, 67 leaves, max depth = 13, train loss: 0.33836, val loss: 0.32036, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.33622, val loss: 0.31803, in 0.016s 1 tree, 85 leaves, max depth = 15, train loss: 0.33404, val loss: 0.31583, in 0.000s 1 tree, 4 leaves, max depth = 3, train loss: 0.33214, val loss: 0.31374, in 0.016s 1 tree, 85 leaves, max depth = 15, train loss: 0.33025, val loss: 0.31185, in 0.016s 1 tree, 104 leaves, max depth = 14, train loss: 0.32763, val loss: 0.30951, in 0.000s 1 tree, 4 leaves, max depth = 3, train loss: 0.32590, val loss: 0.30762, in 0.016s 1 tree, 104 leaves, max depth = 14, train loss: 0.32355, val loss: 0.30553, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.32190, val loss: 0.30399, in 0.000s 1 tree, 104 leaves, max depth = 14, train loss: 0.31988, val loss: 0.30222, in 0.016s 1 tree, 4 leaves, max 
depth = 3, train loss: 0.31825, val loss: 0.30042, in 0.000s 1 tree, 85 leaves, max depth = 18, train loss: 0.31654, val loss: 0.29874, in 0.016s 1 tree, 64 leaves, max depth = 12, train loss: 0.31507, val loss: 0.29701, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.31360, val loss: 0.29541, in 0.000s 1 tree, 104 leaves, max depth = 16, train loss: 0.31167, val loss: 0.29378, in 0.016s 1 tree, 86 leaves, max depth = 15, train loss: 0.31021, val loss: 0.29233, in 0.016s 1 tree, 104 leaves, max depth = 15, train loss: 0.30851, val loss: 0.29084, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.30712, val loss: 0.28930, in 0.000s 1 tree, 85 leaves, max depth = 18, train loss: 0.30580, val loss: 0.28806, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.30456, val loss: 0.28669, in 0.000s 1 tree, 104 leaves, max depth = 15, train loss: 0.30295, val loss: 0.28536, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.30182, val loss: 0.28411, in 0.016s 1 tree, 67 leaves, max depth = 13, train loss: 0.30057, val loss: 0.28289, in 0.000s 1 tree, 83 leaves, max depth = 16, train loss: 0.29856, val loss: 0.28111, in 0.016s [63/85] 1 tree, 85 leaves, max depth = 18, train loss: 0.29740, val loss: 0.28004, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.29635, val loss: 0.27889, in 0.000s 1 tree, 103 leaves, max depth = 15, train loss: 0.29495, val loss: 0.27771, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.29400, val loss: 0.27664, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.29294, val loss: 0.27581, in 0.000s 1 tree, 140 leaves, max depth = 19, train loss: 0.29124, val loss: 0.27474, in 0.016s 1 tree, 85 leaves, max depth = 18, train loss: 0.29019, val loss: 0.27380, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.28928, val loss: 0.27279, in 0.016s 1 tree, 83 leaves, max depth = 13, train loss: 0.28759, val loss: 0.27131, in 0.000s 1 tree, 140 leaves, max depth = 18, train loss: 0.28592, val loss: 
0.27038, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.28509, val loss: 0.26946, in 0.016s 1 tree, 86 leaves, max depth = 17, train loss: 0.28409, val loss: 0.26859, in 0.000s 1 tree, 129 leaves, max depth = 23, train loss: 0.28256, val loss: 0.26746, in 0.016s 1 tree, 64 leaves, max depth = 12, train loss: 0.28172, val loss: 0.26646, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.28093, val loss: 0.26559, in 0.000s 1 tree, 140 leaves, max depth = 20, train loss: 0.27946, val loss: 0.26469, in 0.016s 1 tree, 85 leaves, max depth = 17, train loss: 0.27862, val loss: 0.26397, in 0.016s 1 tree, 140 leaves, max depth = 18, train loss: 0.27731, val loss: 0.26317, in 0.016s 1 tree, 85 leaves, max depth = 17, train loss: 0.27655, val loss: 0.26251, in 0.016s 1 tree, 129 leaves, max depth = 22, train loss: 0.27523, val loss: 0.26151, in 0.016s 1 tree, 140 leaves, max depth = 18, train loss: 0.27410, val loss: 0.26084, in 0.016s 1 tree, 140 leaves, max depth = 20, train loss: 0.27280, val loss: 0.26014, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.27198, val loss: 0.25924, in 0.000s Fit 85 trees in 1.299 s, (5883 total leaves) Time spent computing histograms: 0.435s Time spent finding best splits: 0.093s Time spent applying splits: 0.107s Time spent predicting: 0.016s Trial 66, Fold 4: Log loss = 0.2763670443208256, Average precision = 0.9559006771424874, ROC-AUC = 0.9546079165981148, Elapsed Time = 1.2988741000008304 seconds Trial 66, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 66, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.158 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 64 leaves, max depth = 14, train loss: 0.65604, val loss: 0.65344, in 0.000s 1 tree, 67 leaves, max depth = 14, train loss: 0.62471, val loss: 0.61997, in 0.016s 1 tree, 67 leaves, max depth = 14, train loss: 0.59802, val 
loss: 0.59130, in 0.016s 1 tree, 66 leaves, max depth = 15, train loss: 0.57510, val loss: 0.56637, in 0.000s 1 tree, 67 leaves, max depth = 14, train loss: 0.55541, val loss: 0.54499, in 0.016s 1 tree, 105 leaves, max depth = 19, train loss: 0.53731, val loss: 0.52774, in 0.016s 1 tree, 68 leaves, max depth = 13, train loss: 0.52132, val loss: 0.51023, in 0.000s 1 tree, 105 leaves, max depth = 19, train loss: 0.50645, val loss: 0.49620, in 0.016s 1 tree, 67 leaves, max depth = 14, train loss: 0.49308, val loss: 0.48144, in 0.016s 1 tree, 105 leaves, max depth = 22, train loss: 0.48073, val loss: 0.46989, in 0.016s 1 tree, 68 leaves, max depth = 13, train loss: 0.46990, val loss: 0.45785, in 0.016s 1 tree, 105 leaves, max depth = 20, train loss: 0.45949, val loss: 0.44824, in 0.016s 1 tree, 68 leaves, max depth = 13, train loss: 0.45047, val loss: 0.43812, in 0.000s 1 tree, 67 leaves, max depth = 16, train loss: 0.44241, val loss: 0.42900, in 0.016s 1 tree, 67 leaves, max depth = 16, train loss: 0.43540, val loss: 0.42099, in 0.016s 1 tree, 105 leaves, max depth = 18, train loss: 0.42689, val loss: 0.41333, in 0.000s 1 tree, 67 leaves, max depth = 13, train loss: 0.42117, val loss: 0.40674, in 0.016s 1 tree, 105 leaves, max depth = 18, train loss: 0.41382, val loss: 0.40021, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.40852, val loss: 0.39513, in 0.016s 1 tree, 105 leaves, max depth = 18, train loss: 0.40214, val loss: 0.38953, in 0.000s 1 tree, 105 leaves, max depth = 18, train loss: 0.39658, val loss: 0.38469, in 0.016s 1 tree, 105 leaves, max depth = 18, train loss: 0.39173, val loss: 0.38051, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.38719, val loss: 0.37617, in 0.000s 1 tree, 67 leaves, max depth = 12, train loss: 0.38276, val loss: 0.37099, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.37885, val loss: 0.36726, in 0.000s 1 tree, 105 leaves, max depth = 18, train loss: 0.37460, val loss: 0.36370, in 0.016s 1 tree, 105 
leaves, max depth = 19, train loss: 0.37089, val loss: 0.36062, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.36745, val loss: 0.35734, in 0.000s 1 tree, 67 leaves, max depth = 12, train loss: 0.36360, val loss: 0.35295, in 0.016s 1 tree, 67 leaves, max depth = 14, train loss: 0.36035, val loss: 0.34910, in 0.016s 1 tree, 67 leaves, max depth = 16, train loss: 0.35744, val loss: 0.34567, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.35447, val loss: 0.34284, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.35188, val loss: 0.34037, in 0.000s 1 tree, 105 leaves, max depth = 19, train loss: 0.34842, val loss: 0.33763, in 0.016s 1 tree, 105 leaves, max depth = 19, train loss: 0.34543, val loss: 0.33528, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.34307, val loss: 0.33303, in 0.000s 1 tree, 86 leaves, max depth = 13, train loss: 0.34050, val loss: 0.33033, in 0.016s 1 tree, 67 leaves, max depth = 14, train loss: 0.33823, val loss: 0.32759, in 0.000s 1 tree, 83 leaves, max depth = 15, train loss: 0.33619, val loss: 0.32532, in 0.016s 1 tree, 104 leaves, max depth = 18, train loss: 0.33341, val loss: 0.32319, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.33126, val loss: 0.32113, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.32937, val loss: 0.31933, in 0.016s 1 tree, 84 leaves, max depth = 14, train loss: 0.32726, val loss: 0.31710, in 0.000s 1 tree, 105 leaves, max depth = 18, train loss: 0.32470, val loss: 0.31517, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.32298, val loss: 0.31353, in 0.016s 1 tree, 104 leaves, max depth = 18, train loss: 0.32071, val loss: 0.31185, in 0.016s 1 tree, 67 leaves, max depth = 15, train loss: 0.31894, val loss: 0.30969, in 0.000s 1 tree, 86 leaves, max depth = 18, train loss: 0.31725, val loss: 0.30792, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.31566, val loss: 0.30640, in 0.016s 1 tree, 105 leaves, max depth = 17, train loss: 0.31357, val loss: 
[... per-round boosting log truncated ...]
Fit 85 trees in 1.252 s, (5901 total leaves)
Trial 66, Fold 5: Log loss = 0.2834021484954921, Average precision = 0.9528156004014571, ROC-AUC = 0.9512490199357152, Elapsed Time = 1.2532468999997946 seconds
Optimization Progress: 67%|######7 | 67/100 [13:19<07:45, 14.10s/it]
Trial 67, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 67, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[... per-round boosting log truncated ...]
Fit 36 trees in 1.033 s, (4093 total leaves)
Trial 67, Fold 1: Log loss = 0.2964510256821385, Average precision = 0.9613911346311143, ROC-AUC = 0.9561273288548583, Elapsed Time = 1.038752000000386 seconds
Trial 67, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 67, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[... per-round boosting log truncated ...]
Fit 36 trees in 1.158 s, (4182 total leaves)
Trial 67, Fold 2: Log loss = 0.29018273364600816, Average precision = 0.9635144684162199, ROC-AUC = 0.9609087841026591, Elapsed Time = 1.16494640000019 seconds
Trial 67, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 67, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[... per-round boosting log truncated ...]
Fit 36 trees in 1.094 s, (3926 total leaves)
Trial 67, Fold 3: Log loss = 0.2912220947073553, Average precision = 0.9607902861811848, ROC-AUC = 0.9576986882530265, Elapsed Time = 1.1027567000001 seconds
Trial 67, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 67, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[... per-round boosting log truncated ...]
Fit 36 trees in 1.174 s, (4074 total leaves)
Trial 67, Fold 4: Log loss = 0.2932094147428679, Average precision = 0.9616622060698489, ROC-AUC = 0.9566776587957301, Elapsed Time = 1.1851515000016661 seconds
Trial 67, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 67, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[... per-round boosting log truncated ...]
Fit 36 trees in 1.096 s, (3866 total leaves)
Trial 67, Fold 5: Log loss = 0.29708795787730236, Average precision = 0.9595966341554678, ROC-AUC = 0.9554665477583932, Elapsed Time = 1.1053009999995993 seconds
Optimization Progress: 68%|######8 | 68/100 [13:32<07:17, 13.68s/it]
Trial 68, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 68, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[... per-round boosting log truncated ...]
Fit 71 trees in 0.957 s, (2372 total leaves)
Trial 68, Fold 1: Log loss = 0.4511051676766809, Average precision = 0.9082653287694907, ROC-AUC = 0.915245410996879, Elapsed Time = 0.974603800001205 seconds
Trial 68, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 68, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[... per-round boosting log truncated ...]
Fit 71 trees in 0.993 s, (2415 total leaves)
Trial 68, Fold 2: Log loss = 0.4508887269438503, Average precision = 0.9074085799523012, ROC-AUC = 0.9224698560611556, Elapsed Time = 1.001351099999738 seconds
Trial 68, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 68, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[... per-round boosting log truncated ...]
0.65894, val loss: 0.65768, in 0.008s 1 tree, 30 leaves, max depth = 10, train loss: 0.65272, val loss: 0.65125, in 0.008s 1 tree, 31 leaves, max depth = 14, train loss: 0.64672, val loss: 0.64500, in 0.009s 1 tree, 31 leaves, max depth = 13, train loss: 0.64089, val loss: 0.63895, in 0.008s 1 tree, 30 leaves, max depth = 8, train loss: 0.63521, val loss: 0.63305, in 0.008s 1 tree, 30 leaves, max depth = 8, train loss: 0.62962, val loss: 0.62726, in 0.009s 1 tree, 30 leaves, max depth = 10, train loss: 0.62426, val loss: 0.62172, in 0.007s 1 tree, 31 leaves, max depth = 13, train loss: 0.61909, val loss: 0.61633, in 0.009s 1 tree, 31 leaves, max depth = 14, train loss: 0.61407, val loss: 0.61108, in 0.009s 1 tree, 30 leaves, max depth = 10, train loss: 0.60920, val loss: 0.60604, in 0.007s 1 tree, 28 leaves, max depth = 11, train loss: 0.60446, val loss: 0.60110, in 0.008s 1 tree, 31 leaves, max depth = 14, train loss: 0.59986, val loss: 0.59628, in 0.009s 1 tree, 30 leaves, max depth = 9, train loss: 0.59535, val loss: 0.59157, in 0.010s 1 tree, 29 leaves, max depth = 10, train loss: 0.59097, val loss: 0.58703, in 0.009s 1 tree, 6 leaves, max depth = 4, train loss: 0.58668, val loss: 0.58256, in 0.007s 1 tree, 30 leaves, max depth = 8, train loss: 0.58245, val loss: 0.57818, in 0.007s 1 tree, 30 leaves, max depth = 8, train loss: 0.57833, val loss: 0.57392, in 0.008s 1 tree, 30 leaves, max depth = 8, train loss: 0.57433, val loss: 0.56977, in 0.008s 1 tree, 30 leaves, max depth = 8, train loss: 0.57045, val loss: 0.56574, in 0.009s 1 tree, 30 leaves, max depth = 8, train loss: 0.56666, val loss: 0.56181, in 0.007s 1 tree, 29 leaves, max depth = 10, train loss: 0.56307, val loss: 0.55807, in 0.008s 1 tree, 52 leaves, max depth = 11, train loss: 0.55934, val loss: 0.55458, in 0.012s 1 tree, 29 leaves, max depth = 10, train loss: 0.55589, val loss: 0.55098, in 0.008s 1 tree, 29 leaves, max depth = 10, train loss: 0.55255, val loss: 0.54748, in 0.008s 1 tree, 28 
leaves, max depth = 12, train loss: 0.54931, val loss: 0.54407, in 0.008s 1 tree, 51 leaves, max depth = 12, train loss: 0.54580, val loss: 0.54078, in 0.012s 1 tree, 29 leaves, max depth = 10, train loss: 0.54268, val loss: 0.53750, in 0.007s 1 tree, 29 leaves, max depth = 10, train loss: 0.53962, val loss: 0.53431, in 0.009s 1 tree, 30 leaves, max depth = 8, train loss: 0.53656, val loss: 0.53113, in 0.007s 1 tree, 51 leaves, max depth = 12, train loss: 0.53324, val loss: 0.52802, in 0.010s 1 tree, 30 leaves, max depth = 8, train loss: 0.53030, val loss: 0.52496, in 0.008s 1 tree, 30 leaves, max depth = 8, train loss: 0.52750, val loss: 0.52201, in 0.007s 1 tree, 31 leaves, max depth = 9, train loss: 0.52477, val loss: 0.51913, in 0.008s 1 tree, 51 leaves, max depth = 13, train loss: 0.52161, val loss: 0.51618, in 0.009s 1 tree, 30 leaves, max depth = 8, train loss: 0.51900, val loss: 0.51343, in 0.009s 1 tree, 30 leaves, max depth = 10, train loss: 0.51646, val loss: 0.51076, in 0.010s 1 tree, 31 leaves, max depth = 9, train loss: 0.51397, val loss: 0.50813, in 0.010s 1 tree, 30 leaves, max depth = 8, train loss: 0.51150, val loss: 0.50554, in 0.011s 1 tree, 31 leaves, max depth = 9, train loss: 0.50914, val loss: 0.50305, in 0.009s 1 tree, 53 leaves, max depth = 12, train loss: 0.50618, val loss: 0.50029, in 0.011s 1 tree, 28 leaves, max depth = 14, train loss: 0.50395, val loss: 0.49791, in 0.009s 1 tree, 31 leaves, max depth = 9, train loss: 0.50174, val loss: 0.49557, in 0.009s [47/71] 1 tree, 53 leaves, max depth = 12, train loss: 0.49890, val loss: 0.49292, in 0.012s 1 tree, 30 leaves, max depth = 8, train loss: 0.49672, val loss: 0.49064, in 0.009s 1 tree, 30 leaves, max depth = 8, train loss: 0.49460, val loss: 0.48842, in 0.009s 1 tree, 52 leaves, max depth = 11, train loss: 0.49187, val loss: 0.48588, in 0.010s 1 tree, 30 leaves, max depth = 10, train loss: 0.48989, val loss: 0.48379, in 0.010s 1 tree, 52 leaves, max depth = 11, train loss: 0.48724, 
val loss: 0.48133, in 0.012s 1 tree, 52 leaves, max depth = 11, train loss: 0.48467, val loss: 0.47893, in 0.010s 1 tree, 29 leaves, max depth = 13, train loss: 0.48280, val loss: 0.47694, in 0.009s 1 tree, 52 leaves, max depth = 11, train loss: 0.48030, val loss: 0.47463, in 0.012s 1 tree, 52 leaves, max depth = 11, train loss: 0.47787, val loss: 0.47238, in 0.010s 1 tree, 6 leaves, max depth = 4, train loss: 0.47604, val loss: 0.47044, in 0.007s 1 tree, 29 leaves, max depth = 10, train loss: 0.47429, val loss: 0.46858, in 0.009s 1 tree, 52 leaves, max depth = 11, train loss: 0.47195, val loss: 0.46642, in 0.010s 1 tree, 31 leaves, max depth = 13, train loss: 0.47028, val loss: 0.46464, in 0.009s 1 tree, 31 leaves, max depth = 9, train loss: 0.46864, val loss: 0.46287, in 0.009s 1 tree, 52 leaves, max depth = 10, train loss: 0.46638, val loss: 0.46079, in 0.013s 1 tree, 29 leaves, max depth = 10, train loss: 0.46479, val loss: 0.45910, in 0.009s 1 tree, 31 leaves, max depth = 9, train loss: 0.46325, val loss: 0.45744, in 0.009s 1 tree, 52 leaves, max depth = 13, train loss: 0.46107, val loss: 0.45543, in 0.012s 1 tree, 31 leaves, max depth = 9, train loss: 0.45958, val loss: 0.45382, in 0.009s 1 tree, 29 leaves, max depth = 9, train loss: 0.45807, val loss: 0.45219, in 0.009s 1 tree, 31 leaves, max depth = 9, train loss: 0.45666, val loss: 0.45067, in 0.010s 1 tree, 51 leaves, max depth = 11, train loss: 0.45456, val loss: 0.44873, in 0.011s 1 tree, 52 leaves, max depth = 13, train loss: 0.45252, val loss: 0.44685, in 0.010s 1 tree, 30 leaves, max depth = 9, train loss: 0.45112, val loss: 0.44537, in 0.009s Fit 71 trees in 0.947 s, (2430 total leaves) Time spent computing histograms: 0.345s Time spent finding best splits: 0.045s Time spent applying splits: 0.055s Time spent predicting: 0.006s Trial 68, Fold 3: Log loss = 0.4462839684151748, Average precision = 0.9177517585201269, ROC-AUC = 0.9285210913648141, Elapsed Time = 0.9534422999986418 seconds Trial 68, 
Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 68, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 Binning 0.040 GB of training data: 0.176 s 0.006 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 31 leaves, max depth = 11, train loss: 0.68579, val loss: 0.68541, in 0.009s 1 tree, 31 leaves, max depth = 11, train loss: 0.67875, val loss: 0.67795, in 0.009s 1 tree, 29 leaves, max depth = 10, train loss: 0.67191, val loss: 0.67074, in 0.009s 1 tree, 30 leaves, max depth = 10, train loss: 0.66531, val loss: 0.66379, in 0.009s 1 tree, 29 leaves, max depth = 11, train loss: 0.65891, val loss: 0.65701, in 0.010s 1 tree, 31 leaves, max depth = 11, train loss: 0.65268, val loss: 0.65040, in 0.010s 1 tree, 30 leaves, max depth = 11, train loss: 0.64666, val loss: 0.64402, in 0.009s 1 tree, 30 leaves, max depth = 10, train loss: 0.64082, val loss: 0.63784, in 0.009s 1 tree, 29 leaves, max depth = 11, train loss: 0.63512, val loss: 0.63179, in 0.010s 1 tree, 30 leaves, max depth = 10, train loss: 0.62952, val loss: 0.62582, in 0.008s 1 tree, 31 leaves, max depth = 11, train loss: 0.62416, val loss: 0.62010, in 0.009s 1 tree, 29 leaves, max depth = 11, train loss: 0.61897, val loss: 0.61457, in 0.007s 1 tree, 29 leaves, max depth = 11, train loss: 0.61390, val loss: 0.60916, in 0.009s 1 tree, 29 leaves, max depth = 10, train loss: 0.60902, val loss: 0.60394, in 0.008s 1 tree, 30 leaves, max depth = 12, train loss: 0.60423, val loss: 0.59882, in 0.011s 1 tree, 30 leaves, max depth = 11, train loss: 0.59961, val loss: 0.59387, in 0.010s 1 tree, 29 leaves, max depth = 10, train loss: 0.59508, val loss: 0.58904, in 0.010s 1 tree, 29 leaves, max depth = 10, train loss: 0.59068, val loss: 0.58433, in 0.009s 1 tree, 6 leaves, max depth = 4, train loss: 0.58638, val loss: 0.57974, in 0.008s 1 tree, 29 leaves, max depth = 12, train loss: 0.58216, val loss: 0.57520, in 0.010s 1 tree, 29 
leaves, max depth = 12, train loss: 0.57805, val loss: 0.57078, in 0.009s 1 tree, 29 leaves, max depth = 12, train loss: 0.57406, val loss: 0.56648, in 0.008s 1 tree, 29 leaves, max depth = 12, train loss: 0.57018, val loss: 0.56230, in 0.009s 1 tree, 29 leaves, max depth = 12, train loss: 0.56641, val loss: 0.55822, in 0.010s 1 tree, 29 leaves, max depth = 10, train loss: 0.56280, val loss: 0.55433, in 0.011s 1 tree, 50 leaves, max depth = 12, train loss: 0.55909, val loss: 0.55066, in 0.010s 1 tree, 30 leaves, max depth = 11, train loss: 0.55563, val loss: 0.54690, in 0.009s 1 tree, 30 leaves, max depth = 11, train loss: 0.55226, val loss: 0.54327, in 0.011s 1 tree, 30 leaves, max depth = 12, train loss: 0.54899, val loss: 0.53972, in 0.009s 1 tree, 50 leaves, max depth = 11, train loss: 0.54550, val loss: 0.53627, in 0.013s 1 tree, 31 leaves, max depth = 11, train loss: 0.54235, val loss: 0.53287, in 0.010s 1 tree, 31 leaves, max depth = 11, train loss: 0.53928, val loss: 0.52952, in 0.009s 1 tree, 29 leaves, max depth = 12, train loss: 0.53624, val loss: 0.52620, in 0.010s 1 tree, 51 leaves, max depth = 11, train loss: 0.53293, val loss: 0.52293, in 0.011s 1 tree, 29 leaves, max depth = 9, train loss: 0.53000, val loss: 0.51974, in 0.010s 1 tree, 29 leaves, max depth = 10, train loss: 0.52720, val loss: 0.51669, in 0.010s 1 tree, 29 leaves, max depth = 10, train loss: 0.52447, val loss: 0.51372, in 0.010s 1 tree, 51 leaves, max depth = 12, train loss: 0.52134, val loss: 0.51062, in 0.012s 1 tree, 29 leaves, max depth = 10, train loss: 0.51872, val loss: 0.50776, in 0.010s 1 tree, 30 leaves, max depth = 11, train loss: 0.51617, val loss: 0.50495, in 0.009s 1 tree, 29 leaves, max depth = 10, train loss: 0.51368, val loss: 0.50223, in 0.010s 1 tree, 29 leaves, max depth = 9, train loss: 0.51122, val loss: 0.49952, in 0.010s 1 tree, 29 leaves, max depth = 10, train loss: 0.50887, val loss: 0.49693, in 0.142s 1 tree, 52 leaves, max depth = 12, train loss: 0.50592, 
val loss: 0.49404, in 0.015s 1 tree, 31 leaves, max depth = 12, train loss: 0.50366, val loss: 0.49157, in 0.040s 1 tree, 29 leaves, max depth = 10, train loss: 0.50146, val loss: 0.48914, in 0.011s 1 tree, 52 leaves, max depth = 13, train loss: 0.49864, val loss: 0.48637, in 0.012s 1 tree, 30 leaves, max depth = 9, train loss: 0.49647, val loss: 0.48397, in 0.012s 1 tree, 30 leaves, max depth = 9, train loss: 0.49436, val loss: 0.48162, in 0.012s 1 tree, 52 leaves, max depth = 13, train loss: 0.49165, val loss: 0.47896, in 0.013s 1 tree, 30 leaves, max depth = 11, train loss: 0.48966, val loss: 0.47675, in 0.011s 1 tree, 52 leaves, max depth = 13, train loss: 0.48703, val loss: 0.47417, in 0.012s 1 tree, 52 leaves, max depth = 13, train loss: 0.48447, val loss: 0.47166, in 0.012s 1 tree, 29 leaves, max depth = 11, train loss: 0.48250, val loss: 0.46949, in 0.011s 1 tree, 52 leaves, max depth = 13, train loss: 0.48003, val loss: 0.46706, in 0.013s 1 tree, 52 leaves, max depth = 13, train loss: 0.47761, val loss: 0.46470, in 0.015s 1 tree, 6 leaves, max depth = 4, train loss: 0.47576, val loss: 0.46266, in 0.008s 1 tree, 31 leaves, max depth = 10, train loss: 0.47400, val loss: 0.46069, in 0.010s 1 tree, 53 leaves, max depth = 11, train loss: 0.47168, val loss: 0.45842, in 0.013s 1 tree, 30 leaves, max depth = 11, train loss: 0.46992, val loss: 0.45646, in 0.011s 1 tree, 30 leaves, max depth = 12, train loss: 0.46827, val loss: 0.45461, in 0.011s 1 tree, 52 leaves, max depth = 10, train loss: 0.46603, val loss: 0.45243, in 0.013s 1 tree, 31 leaves, max depth = 10, train loss: 0.46444, val loss: 0.45063, in 0.011s 1 tree, 30 leaves, max depth = 9, train loss: 0.46282, val loss: 0.44883, in 0.010s 1 tree, 52 leaves, max depth = 12, train loss: 0.46066, val loss: 0.44673, in 0.014s 1 tree, 30 leaves, max depth = 12, train loss: 0.45917, val loss: 0.44505, in 0.011s 1 tree, 30 leaves, max depth = 11, train loss: 0.45765, val loss: 0.44334, in 0.010s 1 tree, 30 leaves, 
max depth = 9, train loss: 0.45617, val loss: 0.44169, in 0.010s 1 tree, 51 leaves, max depth = 13, train loss: 0.45409, val loss: 0.43967, in 0.014s 1 tree, 52 leaves, max depth = 13, train loss: 0.45207, val loss: 0.43770, in 0.012s 1 tree, 30 leaves, max depth = 9, train loss: 0.45068, val loss: 0.43611, in 0.012s Fit 71 trees in 1.244 s, (2413 total leaves) Time spent computing histograms: 0.453s Time spent finding best splits: 0.147s Time spent applying splits: 0.067s Time spent predicting: 0.007s Trial 68, Fold 4: Log loss = 0.44999522667030717, Average precision = 0.9131729718295871, ROC-AUC = 0.9238388222446356, Elapsed Time = 1.253656900000351 seconds Trial 68, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 68, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.288 s 0.009 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 30 leaves, max depth = 10, train loss: 0.68566, val loss: 0.68519, in 0.014s 1 tree, 30 leaves, max depth = 11, train loss: 0.67855, val loss: 0.67757, in 0.020s 1 tree, 28 leaves, max depth = 10, train loss: 0.67162, val loss: 0.67019, in 0.015s 1 tree, 29 leaves, max depth = 11, train loss: 0.66493, val loss: 0.66307, in 0.013s 1 tree, 29 leaves, max depth = 13, train loss: 0.65846, val loss: 0.65615, in 0.016s 1 tree, 30 leaves, max depth = 11, train loss: 0.65216, val loss: 0.64938, in 0.014s 1 tree, 30 leaves, max depth = 10, train loss: 0.64607, val loss: 0.64282, in 0.011s 1 tree, 29 leaves, max depth = 13, train loss: 0.64016, val loss: 0.63648, in 0.010s 1 tree, 28 leaves, max depth = 10, train loss: 0.63439, val loss: 0.63030, in 0.013s 1 tree, 29 leaves, max depth = 8, train loss: 0.62874, val loss: 0.62423, in 0.012s 1 tree, 29 leaves, max depth = 11, train loss: 0.62332, val loss: 0.61836, in 0.013s 1 tree, 29 leaves, max depth = 13, train loss: 0.61807, val loss: 0.61271, in 0.010s 1 tree, 30 
leaves, max depth = 10, train loss: 0.61297, val loss: 0.60718, in 0.013s 1 tree, 29 leaves, max depth = 11, train loss: 0.60804, val loss: 0.60183, in 0.015s 1 tree, 30 leaves, max depth = 11, train loss: 0.60322, val loss: 0.59665, in 0.013s 1 tree, 29 leaves, max depth = 11, train loss: 0.59854, val loss: 0.59160, in 0.044s 1 tree, 27 leaves, max depth = 10, train loss: 0.59397, val loss: 0.58665, in 0.033s 1 tree, 29 leaves, max depth = 11, train loss: 0.58954, val loss: 0.58181, in 0.021s 1 tree, 6 leaves, max depth = 4, train loss: 0.58520, val loss: 0.57711, in 0.011s 1 tree, 30 leaves, max depth = 8, train loss: 0.58094, val loss: 0.57249, in 0.012s 1 tree, 30 leaves, max depth = 8, train loss: 0.57680, val loss: 0.56798, in 0.013s 1 tree, 30 leaves, max depth = 8, train loss: 0.57278, val loss: 0.56360, in 0.013s 1 tree, 30 leaves, max depth = 8, train loss: 0.56887, val loss: 0.55934, in 0.010s 1 tree, 30 leaves, max depth = 8, train loss: 0.56507, val loss: 0.55518, in 0.017s 1 tree, 29 leaves, max depth = 11, train loss: 0.56142, val loss: 0.55118, in 0.015s 1 tree, 27 leaves, max depth = 9, train loss: 0.55787, val loss: 0.54728, in 0.014s 1 tree, 29 leaves, max depth = 11, train loss: 0.55443, val loss: 0.54348, in 0.012s 1 tree, 30 leaves, max depth = 11, train loss: 0.55107, val loss: 0.53981, in 0.011s 1 tree, 30 leaves, max depth = 12, train loss: 0.54781, val loss: 0.53623, in 0.013s 1 tree, 49 leaves, max depth = 14, train loss: 0.54429, val loss: 0.53288, in 0.013s 1 tree, 30 leaves, max depth = 11, train loss: 0.54116, val loss: 0.52941, in 0.013s 1 tree, 29 leaves, max depth = 11, train loss: 0.53811, val loss: 0.52602, in 0.012s 1 tree, 30 leaves, max depth = 8, train loss: 0.53508, val loss: 0.52268, in 0.012s 1 tree, 49 leaves, max depth = 13, train loss: 0.53175, val loss: 0.51952, in 0.015s 1 tree, 30 leaves, max depth = 8, train loss: 0.52884, val loss: 0.51630, in 0.013s 1 tree, 27 leaves, max depth = 9, train loss: 0.52605, val loss: 
0.51321, in 0.013s 1 tree, 30 leaves, max depth = 11, train loss: 0.52334, val loss: 0.51021, in 0.012s 1 tree, 49 leaves, max depth = 14, train loss: 0.52019, val loss: 0.50722, in 0.018s 1 tree, 30 leaves, max depth = 11, train loss: 0.51758, val loss: 0.50434, in 0.015s 1 tree, 30 leaves, max depth = 11, train loss: 0.51505, val loss: 0.50152, in 0.014s 1 tree, 27 leaves, max depth = 9, train loss: 0.51257, val loss: 0.49877, in 0.011s 1 tree, 30 leaves, max depth = 8, train loss: 0.51011, val loss: 0.49603, in 0.014s 1 tree, 30 leaves, max depth = 11, train loss: 0.50777, val loss: 0.49342, in 0.011s 1 tree, 49 leaves, max depth = 13, train loss: 0.50480, val loss: 0.49063, in 0.017s 1 tree, 30 leaves, max depth = 11, train loss: 0.50255, val loss: 0.48812, in 0.014s 1 tree, 27 leaves, max depth = 9, train loss: 0.50035, val loss: 0.48566, in 0.012s 1 tree, 50 leaves, max depth = 13, train loss: 0.49750, val loss: 0.48298, in 0.018s 1 tree, 30 leaves, max depth = 8, train loss: 0.49534, val loss: 0.48055, in 0.013s 1 tree, 30 leaves, max depth = 8, train loss: 0.49324, val loss: 0.47819, in 0.010s 1 tree, 51 leaves, max depth = 13, train loss: 0.49050, val loss: 0.47562, in 0.015s 1 tree, 31 leaves, max depth = 10, train loss: 0.48852, val loss: 0.47339, in 0.011s 1 tree, 52 leaves, max depth = 12, train loss: 0.48587, val loss: 0.47090, in 0.016s 1 tree, 50 leaves, max depth = 13, train loss: 0.48329, val loss: 0.46849, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.48140, val loss: 0.46637, in 0.015s 1 tree, 52 leaves, max depth = 12, train loss: 0.47890, val loss: 0.46403, in 0.018s 1 tree, 52 leaves, max depth = 12, train loss: 0.47647, val loss: 0.46176, in 0.016s 1 tree, 6 leaves, max depth = 4, train loss: 0.47463, val loss: 0.45969, in 0.010s 1 tree, 30 leaves, max depth = 10, train loss: 0.47288, val loss: 0.45772, in 0.014s 1 tree, 51 leaves, max depth = 12, train loss: 0.47053, val loss: 0.45552, in 0.016s 1 tree, 29 leaves, max depth = 
10, train loss: 0.46885, val loss: 0.45362, in 0.012s 1 tree, 29 leaves, max depth = 10, train loss: 0.46721, val loss: 0.45176, in 0.012s 1 tree, 53 leaves, max depth = 12, train loss: 0.46494, val loss: 0.44966, in 0.016s 1 tree, 30 leaves, max depth = 10, train loss: 0.46336, val loss: 0.44786, in 0.012s 1 tree, 29 leaves, max depth = 9, train loss: 0.46181, val loss: 0.44611, in 0.012s 1 tree, 53 leaves, max depth = 12, train loss: 0.45962, val loss: 0.44408, in 0.017s 1 tree, 29 leaves, max depth = 10, train loss: 0.45814, val loss: 0.44239, in 0.013s 1 tree, 30 leaves, max depth = 12, train loss: 0.45669, val loss: 0.44075, in 0.013s 1 tree, 29 leaves, max depth = 12, train loss: 0.45528, val loss: 0.43914, in 0.012s 1 tree, 52 leaves, max depth = 12, train loss: 0.45318, val loss: 0.43721, in 0.015s 1 tree, 51 leaves, max depth = 12, train loss: 0.45112, val loss: 0.43532, in 0.017s 1 tree, 30 leaves, max depth = 9, train loss: 0.44973, val loss: 0.43373, in 0.012s Fit 71 trees in 1.562 s, (2361 total leaves) Time spent computing histograms: 0.499s Time spent finding best splits: 0.097s Time spent applying splits: 0.103s Time spent predicting: 0.010s Trial 68, Fold 5: Log loss = 0.45655774610222777, Average precision = 0.9056527493890548, ROC-AUC = 0.9152489456180443, Elapsed Time = 1.5716627000001608 seconds
Optimization Progress: 69%|######9 | 69/100 [13:45<06:56, 13.42s/it]
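The `Optimization Progress: 69%|...| 69/100` line above is a tqdm progress bar driven once per trial. A minimal sketch of that pattern with a plain loop and a placeholder objective (the real notebook evaluates a cross-validated model per trial; `objective` and `n_trials` here are illustrative only):

```python
from tqdm import tqdm

def objective(trial_number):
    """Placeholder objective; the real notebook trains and scores a model."""
    return 1.0 / (trial_number + 1)  # pretend loss that improves with trials

n_trials = 100
best = float("inf")
# desc reproduces the "Optimization Progress" label seen in the log output.
with tqdm(total=n_trials, desc="Optimization Progress") as pbar:
    for i in range(n_trials):
        value = objective(i)
        best = min(best, value)          # track best value so far
        pbar.set_postfix(best=f"{best:.4f}")
        pbar.update(1)                   # one tick per completed trial
```

With Optuna, the same effect is usually achieved by advancing the bar from a callback passed to `study.optimize`, so the bar ticks once per finished trial rather than per fold.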
Trial 69, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 69, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[per-round fitting log omitted] Fit 64 trees in 2.035 s, (5082 total leaves)
Trial 69, Fold 1: Log loss = 0.21489149959789555, Average precision = 0.9714608550870261, ROC-AUC = 0.9659503556014396, Elapsed Time = 2.049052999998821 seconds
Trial 69, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 69, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[per-round fitting log omitted] Fit 64 trees in 2.289 s, (5159 total leaves)
Trial 69, Fold 2: Log loss = 0.2075985488626582, Average precision = 0.9717443660603746, ROC-AUC = 0.9680514764907778, Elapsed Time = 2.299709499999153 seconds
Trial 69, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 69, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Fitting gradient boosted rounds: [per-round log omitted] 1 tree, 81 leaves, max depth = 15, train loss: 0.30416, val loss: 0.31588, 
in 0.025s 1 tree, 81 leaves, max depth = 16, train loss: 0.29680, val loss: 0.30944, in 0.029s 1 tree, 81 leaves, max depth = 17, train loss: 0.29015, val loss: 0.30255, in 0.028s 1 tree, 81 leaves, max depth = 20, train loss: 0.28313, val loss: 0.29674, in 0.030s 1 tree, 81 leaves, max depth = 15, train loss: 0.27703, val loss: 0.29162, in 0.026s 1 tree, 81 leaves, max depth = 13, train loss: 0.27141, val loss: 0.28524, in 0.030s 1 tree, 81 leaves, max depth = 16, train loss: 0.26667, val loss: 0.28114, in 0.017s 1 tree, 81 leaves, max depth = 18, train loss: 0.26201, val loss: 0.27745, in 0.039s 1 tree, 81 leaves, max depth = 16, train loss: 0.25798, val loss: 0.27417, in 0.031s 1 tree, 81 leaves, max depth = 14, train loss: 0.25373, val loss: 0.26940, in 0.030s 1 tree, 81 leaves, max depth = 16, train loss: 0.25023, val loss: 0.26653, in 0.032s 1 tree, 81 leaves, max depth = 16, train loss: 0.24674, val loss: 0.26283, in 0.020s 1 tree, 81 leaves, max depth = 15, train loss: 0.24304, val loss: 0.25892, in 0.029s 1 tree, 81 leaves, max depth = 19, train loss: 0.23963, val loss: 0.25649, in 0.040s 1 tree, 81 leaves, max depth = 16, train loss: 0.23667, val loss: 0.25422, in 0.038s 1 tree, 81 leaves, max depth = 20, train loss: 0.23400, val loss: 0.25249, in 0.027s 1 tree, 81 leaves, max depth = 18, train loss: 0.23133, val loss: 0.24988, in 0.034s 1 tree, 81 leaves, max depth = 17, train loss: 0.22905, val loss: 0.24810, in 0.028s 1 tree, 81 leaves, max depth = 18, train loss: 0.22699, val loss: 0.24627, in 0.016s 1 tree, 71 leaves, max depth = 15, train loss: 0.22465, val loss: 0.24366, in 0.028s 1 tree, 81 leaves, max depth = 21, train loss: 0.22287, val loss: 0.24230, in 0.023s 1 tree, 81 leaves, max depth = 19, train loss: 0.22131, val loss: 0.24100, in 0.031s 1 tree, 81 leaves, max depth = 13, train loss: 0.21964, val loss: 0.23973, in 0.027s 1 tree, 81 leaves, max depth = 15, train loss: 0.21708, val loss: 0.23710, in 0.027s 1 tree, 81 leaves, max depth = 18, 
train loss: 0.21526, val loss: 0.23609, in 0.034s 1 tree, 81 leaves, max depth = 13, train loss: 0.21387, val loss: 0.23514, in 0.036s 1 tree, 81 leaves, max depth = 15, train loss: 0.21127, val loss: 0.23212, in 0.033s 1 tree, 81 leaves, max depth = 18, train loss: 0.20922, val loss: 0.23101, in 0.024s 1 tree, 81 leaves, max depth = 18, train loss: 0.20691, val loss: 0.22871, in 0.030s 1 tree, 81 leaves, max depth = 16, train loss: 0.20583, val loss: 0.22773, in 0.031s 1 tree, 81 leaves, max depth = 13, train loss: 0.20476, val loss: 0.22706, in 0.028s 1 tree, 81 leaves, max depth = 14, train loss: 0.20317, val loss: 0.22557, in 0.033s 1 tree, 81 leaves, max depth = 22, train loss: 0.20185, val loss: 0.22498, in 0.027s 1 tree, 81 leaves, max depth = 11, train loss: 0.20030, val loss: 0.22357, in 0.034s 1 tree, 81 leaves, max depth = 17, train loss: 0.19919, val loss: 0.22303, in 0.032s 1 tree, 81 leaves, max depth = 15, train loss: 0.19814, val loss: 0.22239, in 0.027s 1 tree, 35 leaves, max depth = 8, train loss: 0.19735, val loss: 0.22186, in 0.024s 1 tree, 81 leaves, max depth = 15, train loss: 0.19656, val loss: 0.22135, in 0.028s 1 tree, 81 leaves, max depth = 12, train loss: 0.19521, val loss: 0.22019, in 0.147s 1 tree, 81 leaves, max depth = 15, train loss: 0.19336, val loss: 0.21974, in 0.063s 1 tree, 81 leaves, max depth = 14, train loss: 0.19216, val loss: 0.21892, in 0.031s 1 tree, 81 leaves, max depth = 19, train loss: 0.19054, val loss: 0.21887, in 0.047s 1 tree, 81 leaves, max depth = 23, train loss: 0.18960, val loss: 0.21862, in 0.031s 1 tree, 81 leaves, max depth = 18, train loss: 0.18879, val loss: 0.21838, in 0.031s 1 tree, 81 leaves, max depth = 14, train loss: 0.18776, val loss: 0.21756, in 0.031s 1 tree, 53 leaves, max depth = 9, train loss: 0.18709, val loss: 0.21695, in 0.047s 1 tree, 81 leaves, max depth = 12, train loss: 0.18606, val loss: 0.21668, in 0.031s Fit 64 trees in 2.523 s, (5100 total leaves) Time spent computing histograms: 
0.699s Time spent finding best splits: 0.223s Time spent applying splits: 0.220s Time spent predicting: 0.021s Trial 69, Fold 3: Log loss = 0.20688421374573782, Average precision = 0.9722179513099881, ROC-AUC = 0.9679817744700523, Elapsed Time = 2.536510100000669 seconds Trial 69, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 69, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 Binning 0.040 GB of training data: 0.176 s 0.004 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 81 leaves, max depth = 12, train loss: 0.64013, val loss: 0.63816, in 0.021s 1 tree, 81 leaves, max depth = 13, train loss: 0.59548, val loss: 0.59200, in 0.024s 1 tree, 81 leaves, max depth = 15, train loss: 0.55797, val loss: 0.55333, in 0.023s 1 tree, 81 leaves, max depth = 11, train loss: 0.52682, val loss: 0.52062, in 0.020s 1 tree, 81 leaves, max depth = 11, train loss: 0.49932, val loss: 0.49204, in 0.019s 1 tree, 81 leaves, max depth = 13, train loss: 0.47554, val loss: 0.46745, in 0.024s 1 tree, 81 leaves, max depth = 11, train loss: 0.45514, val loss: 0.44620, in 0.023s 1 tree, 81 leaves, max depth = 15, train loss: 0.42986, val loss: 0.42069, in 0.022s 1 tree, 81 leaves, max depth = 14, train loss: 0.40805, val loss: 0.39863, in 0.023s 1 tree, 81 leaves, max depth = 17, train loss: 0.39410, val loss: 0.38447, in 0.022s 1 tree, 81 leaves, max depth = 11, train loss: 0.38141, val loss: 0.37133, in 0.024s 1 tree, 81 leaves, max depth = 12, train loss: 0.37016, val loss: 0.35979, in 0.020s 1 tree, 81 leaves, max depth = 13, train loss: 0.35523, val loss: 0.34493, in 0.021s 1 tree, 81 leaves, max depth = 13, train loss: 0.34655, val loss: 0.33580, in 0.025s 1 tree, 81 leaves, max depth = 16, train loss: 0.33509, val loss: 0.32404, in 0.022s 1 tree, 81 leaves, max depth = 16, train loss: 0.32458, val loss: 0.31422, in 0.024s 1 tree, 81 leaves, max depth = 14, train loss: 0.31574, val loss: 
0.30489, in 0.024s 1 tree, 81 leaves, max depth = 18, train loss: 0.30767, val loss: 0.29685, in 0.021s 1 tree, 81 leaves, max depth = 12, train loss: 0.30169, val loss: 0.29087, in 0.028s 1 tree, 81 leaves, max depth = 14, train loss: 0.29413, val loss: 0.28403, in 0.020s 1 tree, 81 leaves, max depth = 14, train loss: 0.28754, val loss: 0.27801, in 0.020s 1 tree, 81 leaves, max depth = 15, train loss: 0.28177, val loss: 0.27206, in 0.025s 1 tree, 81 leaves, max depth = 18, train loss: 0.27651, val loss: 0.26732, in 0.026s 1 tree, 81 leaves, max depth = 15, train loss: 0.27194, val loss: 0.26260, in 0.020s 1 tree, 81 leaves, max depth = 12, train loss: 0.26621, val loss: 0.25690, in 0.025s 1 tree, 81 leaves, max depth = 16, train loss: 0.26250, val loss: 0.25309, in 0.027s 1 tree, 81 leaves, max depth = 12, train loss: 0.25767, val loss: 0.24828, in 0.020s 1 tree, 81 leaves, max depth = 15, train loss: 0.25414, val loss: 0.24517, in 0.020s 1 tree, 81 leaves, max depth = 20, train loss: 0.25106, val loss: 0.24234, in 0.024s 1 tree, 81 leaves, max depth = 14, train loss: 0.24816, val loss: 0.23982, in 0.024s 1 tree, 81 leaves, max depth = 14, train loss: 0.24521, val loss: 0.23772, in 0.020s 1 tree, 81 leaves, max depth = 14, train loss: 0.24270, val loss: 0.23559, in 0.021s 1 tree, 81 leaves, max depth = 15, train loss: 0.23787, val loss: 0.23138, in 0.026s 1 tree, 81 leaves, max depth = 16, train loss: 0.23379, val loss: 0.22801, in 0.022s 1 tree, 81 leaves, max depth = 18, train loss: 0.23169, val loss: 0.22607, in 0.022s 1 tree, 81 leaves, max depth = 12, train loss: 0.22965, val loss: 0.22473, in 0.023s 1 tree, 81 leaves, max depth = 21, train loss: 0.22772, val loss: 0.22313, in 0.020s 1 tree, 81 leaves, max depth = 15, train loss: 0.22468, val loss: 0.22049, in 0.020s 1 tree, 81 leaves, max depth = 14, train loss: 0.22302, val loss: 0.21942, in 0.020s 1 tree, 81 leaves, max depth = 22, train loss: 0.22104, val loss: 0.21805, in 0.021s 1 tree, 81 leaves, max 
depth = 15, train loss: 0.21846, val loss: 0.21560, in 0.021s 1 tree, 81 leaves, max depth = 14, train loss: 0.21666, val loss: 0.21421, in 0.019s 1 tree, 81 leaves, max depth = 13, train loss: 0.21530, val loss: 0.21346, in 0.020s 1 tree, 81 leaves, max depth = 14, train loss: 0.21313, val loss: 0.21137, in 0.021s 1 tree, 81 leaves, max depth = 24, train loss: 0.21162, val loss: 0.21031, in 0.026s 1 tree, 81 leaves, max depth = 20, train loss: 0.21024, val loss: 0.20949, in 0.025s 1 tree, 81 leaves, max depth = 13, train loss: 0.20834, val loss: 0.20770, in 0.022s 1 tree, 81 leaves, max depth = 13, train loss: 0.20694, val loss: 0.20664, in 0.022s 1 tree, 81 leaves, max depth = 13, train loss: 0.20569, val loss: 0.20631, in 0.019s 1 tree, 81 leaves, max depth = 21, train loss: 0.20445, val loss: 0.20555, in 0.023s 1 tree, 81 leaves, max depth = 15, train loss: 0.20286, val loss: 0.20421, in 0.020s 1 tree, 81 leaves, max depth = 16, train loss: 0.20194, val loss: 0.20386, in 0.022s 1 tree, 81 leaves, max depth = 19, train loss: 0.20099, val loss: 0.20321, in 0.025s 1 tree, 81 leaves, max depth = 15, train loss: 0.19999, val loss: 0.20291, in 0.024s 1 tree, 81 leaves, max depth = 13, train loss: 0.19860, val loss: 0.20164, in 0.020s 1 tree, 81 leaves, max depth = 14, train loss: 0.19761, val loss: 0.20094, in 0.028s 1 tree, 53 leaves, max depth = 8, train loss: 0.19671, val loss: 0.20033, in 0.019s 1 tree, 81 leaves, max depth = 13, train loss: 0.19599, val loss: 0.20006, in 0.019s 1 tree, 81 leaves, max depth = 13, train loss: 0.19485, val loss: 0.19904, in 0.020s 1 tree, 81 leaves, max depth = 19, train loss: 0.19356, val loss: 0.19818, in 0.026s 1 tree, 81 leaves, max depth = 14, train loss: 0.19205, val loss: 0.19753, in 0.024s 1 tree, 81 leaves, max depth = 14, train loss: 0.19033, val loss: 0.19751, in 0.026s 1 tree, 81 leaves, max depth = 20, train loss: 0.18934, val loss: 0.19672, in 0.023s 1 tree, 81 leaves, max depth = 24, train loss: 0.18852, val loss: 
0.19627, in 0.022s Fit 64 trees in 1.816 s, (5156 total leaves) Time spent computing histograms: 0.491s Time spent finding best splits: 0.147s Time spent applying splits: 0.110s Time spent predicting: 0.009s Trial 69, Fold 4: Log loss = 0.21187220792729713, Average precision = 0.9714045250282974, ROC-AUC = 0.9661044973544972, Elapsed Time = 1.8256971999999223 seconds Trial 69, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 69, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.160 s 0.006 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 81 leaves, max depth = 12, train loss: 0.63928, val loss: 0.63710, in 0.022s 1 tree, 81 leaves, max depth = 15, train loss: 0.59393, val loss: 0.59046, in 0.023s 1 tree, 81 leaves, max depth = 16, train loss: 0.55585, val loss: 0.55107, in 0.023s 1 tree, 81 leaves, max depth = 11, train loss: 0.52421, val loss: 0.51861, in 0.023s 1 tree, 81 leaves, max depth = 16, train loss: 0.49657, val loss: 0.49014, in 0.025s 1 tree, 81 leaves, max depth = 11, train loss: 0.47331, val loss: 0.46640, in 0.024s 1 tree, 81 leaves, max depth = 15, train loss: 0.45361, val loss: 0.44604, in 0.024s 1 tree, 81 leaves, max depth = 15, train loss: 0.43550, val loss: 0.42753, in 0.012s 1 tree, 81 leaves, max depth = 19, train loss: 0.41168, val loss: 0.40405, in 0.034s 1 tree, 81 leaves, max depth = 13, train loss: 0.39753, val loss: 0.38956, in 0.027s 1 tree, 81 leaves, max depth = 16, train loss: 0.37903, val loss: 0.37142, in 0.023s 1 tree, 81 leaves, max depth = 14, train loss: 0.36771, val loss: 0.36019, in 0.027s 1 tree, 81 leaves, max depth = 17, train loss: 0.35250, val loss: 0.34534, in 0.026s 1 tree, 81 leaves, max depth = 12, train loss: 0.34387, val loss: 0.33703, in 0.027s 1 tree, 81 leaves, max depth = 16, train loss: 0.33226, val loss: 0.32538, in 0.020s 1 tree, 81 leaves, max depth = 17, train loss: 0.32308, val 
loss: 0.31651, in 0.020s 1 tree, 81 leaves, max depth = 17, train loss: 0.31296, val loss: 0.30761, in 0.023s 1 tree, 81 leaves, max depth = 18, train loss: 0.30456, val loss: 0.29907, in 0.021s 1 tree, 81 leaves, max depth = 15, train loss: 0.29730, val loss: 0.29187, in 0.024s 1 tree, 81 leaves, max depth = 13, train loss: 0.28990, val loss: 0.28469, in 0.026s 1 tree, 81 leaves, max depth = 13, train loss: 0.28360, val loss: 0.27901, in 0.021s 1 tree, 81 leaves, max depth = 16, train loss: 0.27702, val loss: 0.27318, in 0.012s 1 tree, 81 leaves, max depth = 15, train loss: 0.27144, val loss: 0.26784, in 0.035s 1 tree, 81 leaves, max depth = 18, train loss: 0.26602, val loss: 0.26314, in 0.023s 1 tree, 81 leaves, max depth = 13, train loss: 0.26151, val loss: 0.25900, in 0.022s 1 tree, 81 leaves, max depth = 15, train loss: 0.25723, val loss: 0.25492, in 0.022s 1 tree, 81 leaves, max depth = 13, train loss: 0.25302, val loss: 0.25158, in 0.011s 1 tree, 81 leaves, max depth = 21, train loss: 0.24793, val loss: 0.24690, in 0.040s 1 tree, 81 leaves, max depth = 17, train loss: 0.24435, val loss: 0.24390, in 0.025s 1 tree, 81 leaves, max depth = 14, train loss: 0.24120, val loss: 0.24083, in 0.025s 1 tree, 81 leaves, max depth = 17, train loss: 0.23831, val loss: 0.23855, in 0.022s 1 tree, 81 leaves, max depth = 18, train loss: 0.23579, val loss: 0.23619, in 0.023s 1 tree, 81 leaves, max depth = 13, train loss: 0.23344, val loss: 0.23381, in 0.020s 1 tree, 81 leaves, max depth = 11, train loss: 0.23032, val loss: 0.23083, in 0.026s 1 tree, 81 leaves, max depth = 15, train loss: 0.22802, val loss: 0.22963, in 0.026s 1 tree, 81 leaves, max depth = 18, train loss: 0.22587, val loss: 0.22783, in 0.028s 1 tree, 81 leaves, max depth = 15, train loss: 0.22397, val loss: 0.22687, in 0.023s 1 tree, 81 leaves, max depth = 15, train loss: 0.22150, val loss: 0.22485, in 0.024s 1 tree, 81 leaves, max depth = 13, train loss: 0.21941, val loss: 0.22347, in 0.024s 1 tree, 81 leaves, 
max depth = 19, train loss: 0.21783, val loss: 0.22193, in 0.025s 1 tree, 81 leaves, max depth = 22, train loss: 0.21644, val loss: 0.22093, in 0.022s 1 tree, 81 leaves, max depth = 15, train loss: 0.21492, val loss: 0.22032, in 0.022s 1 tree, 81 leaves, max depth = 13, train loss: 0.21273, val loss: 0.21836, in 0.031s 1 tree, 81 leaves, max depth = 13, train loss: 0.21047, val loss: 0.21671, in 0.021s 1 tree, 81 leaves, max depth = 26, train loss: 0.20886, val loss: 0.21569, in 0.027s 1 tree, 81 leaves, max depth = 15, train loss: 0.20679, val loss: 0.21467, in 0.023s 1 tree, 81 leaves, max depth = 19, train loss: 0.20570, val loss: 0.21368, in 0.027s 1 tree, 81 leaves, max depth = 17, train loss: 0.20363, val loss: 0.21201, in 0.026s 1 tree, 81 leaves, max depth = 13, train loss: 0.20257, val loss: 0.21179, in 0.027s 1 tree, 81 leaves, max depth = 13, train loss: 0.20149, val loss: 0.21103, in 0.024s 1 tree, 81 leaves, max depth = 15, train loss: 0.20053, val loss: 0.21078, in 0.022s 1 tree, 81 leaves, max depth = 18, train loss: 0.19876, val loss: 0.20915, in 0.027s 1 tree, 81 leaves, max depth = 12, train loss: 0.19783, val loss: 0.20852, in 0.023s 1 tree, 81 leaves, max depth = 18, train loss: 0.19679, val loss: 0.20808, in 0.022s 1 tree, 81 leaves, max depth = 16, train loss: 0.19527, val loss: 0.20672, in 0.023s 1 tree, 81 leaves, max depth = 20, train loss: 0.19429, val loss: 0.20622, in 0.026s 1 tree, 81 leaves, max depth = 16, train loss: 0.19327, val loss: 0.20577, in 0.024s 1 tree, 81 leaves, max depth = 20, train loss: 0.19191, val loss: 0.20489, in 0.028s 1 tree, 81 leaves, max depth = 16, train loss: 0.19065, val loss: 0.20367, in 0.025s 1 tree, 81 leaves, max depth = 13, train loss: 0.18912, val loss: 0.20332, in 0.024s 1 tree, 81 leaves, max depth = 20, train loss: 0.18829, val loss: 0.20310, in 0.023s 1 tree, 81 leaves, max depth = 15, train loss: 0.18746, val loss: 0.20285, in 0.023s 1 tree, 81 leaves, max depth = 17, train loss: 0.18682, val 
loss: 0.20251, in 0.027s 1 tree, 81 leaves, max depth = 14, train loss: 0.18526, val loss: 0.20241, in 0.029s Fit 64 trees in 1.926 s, (5184 total leaves) Time spent computing histograms: 0.525s Time spent finding best splits: 0.164s Time spent applying splits: 0.124s Time spent predicting: 0.023s Trial 69, Fold 5: Log loss = 0.21520133387638787, Average precision = 0.970336942697105, ROC-AUC = 0.9663292644408524, Elapsed Time = 1.9375400000008085 seconds
Optimization Progress: 70%|####### | 70/100 [14:04<07:36, 15.20s/it]
[Per-round training output condensed; fold summaries follow.]
Trial 70, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0398; Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0035. Fit 49 trees in 0.799 s (1922 total leaves). Log loss = 0.41127, Average precision = 0.94673, ROC-AUC = 0.94229, Elapsed Time = 0.805 seconds
Trial 70, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0236; Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0689. Fit 49 trees in 0.820 s (1908 total leaves). Log loss = 0.41125, Average precision = 0.94385, ROC-AUC = 0.94366, Elapsed Time = 0.827 seconds
Trial 70, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.0346; Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235. Fit 49 trees in 0.842 s (1918 total leaves). Log loss = 0.40763, Average precision = 0.94853, ROC-AUC = 0.94685, Elapsed Time = 0.856 seconds
Trial 70, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0297; Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0434. Fitting gradient boosted rounds: 1 tree, 33 leaves, max depth = 11, train loss: 0.54295, val loss: 0.53302,
in 0.009s 1 tree, 48 leaves, max depth = 10, train loss: 0.53609, val loss: 0.52630, in 0.014s 1 tree, 34 leaves, max depth = 11, train loss: 0.53002, val loss: 0.51973, in 0.011s 1 tree, 48 leaves, max depth = 10, train loss: 0.52365, val loss: 0.51350, in 0.014s 1 tree, 34 leaves, max depth = 15, train loss: 0.51805, val loss: 0.50737, in 0.010s 1 tree, 48 leaves, max depth = 10, train loss: 0.51212, val loss: 0.50156, in 0.014s 1 tree, 48 leaves, max depth = 10, train loss: 0.50651, val loss: 0.49607, in 0.013s 1 tree, 34 leaves, max depth = 11, train loss: 0.50144, val loss: 0.49057, in 0.011s 1 tree, 48 leaves, max depth = 11, train loss: 0.49621, val loss: 0.48545, in 0.015s 1 tree, 34 leaves, max depth = 13, train loss: 0.49147, val loss: 0.48021, in 0.000s 1 tree, 48 leaves, max depth = 11, train loss: 0.48658, val loss: 0.47544, in 0.018s 1 tree, 48 leaves, max depth = 11, train loss: 0.48194, val loss: 0.47091, in 0.016s 1 tree, 34 leaves, max depth = 11, train loss: 0.47766, val loss: 0.46621, in 0.016s 1 tree, 48 leaves, max depth = 11, train loss: 0.47332, val loss: 0.46198, in 0.000s 1 tree, 35 leaves, max depth = 11, train loss: 0.46934, val loss: 0.45760, in 0.020s 1 tree, 48 leaves, max depth = 11, train loss: 0.46528, val loss: 0.45364, in 0.017s 1 tree, 32 leaves, max depth = 14, train loss: 0.46154, val loss: 0.44948, in 0.015s 1 tree, 48 leaves, max depth = 11, train loss: 0.45773, val loss: 0.44577, in 0.013s 1 tree, 48 leaves, max depth = 11, train loss: 0.45411, val loss: 0.44225, in 0.006s 1 tree, 34 leaves, max depth = 14, train loss: 0.45068, val loss: 0.43842, in 0.016s 1 tree, 48 leaves, max depth = 10, train loss: 0.44727, val loss: 0.43511, in 0.019s 1 tree, 34 leaves, max depth = 14, train loss: 0.44406, val loss: 0.43152, in 0.014s 1 tree, 48 leaves, max depth = 10, train loss: 0.44085, val loss: 0.42841, in 0.013s 1 tree, 32 leaves, max depth = 13, train loss: 0.43786, val loss: 0.42504, in 0.008s 1 tree, 48 leaves, max depth = 9, 
train loss: 0.43483, val loss: 0.42210, in 0.019s 1 tree, 35 leaves, max depth = 14, train loss: 0.43202, val loss: 0.41892, in 0.010s 1 tree, 33 leaves, max depth = 12, train loss: 0.42933, val loss: 0.41587, in 0.013s 1 tree, 48 leaves, max depth = 11, train loss: 0.42649, val loss: 0.41313, in 0.014s 1 tree, 32 leaves, max depth = 13, train loss: 0.42400, val loss: 0.41030, in 0.000s 1 tree, 48 leaves, max depth = 10, train loss: 0.42132, val loss: 0.40771, in 0.020s 1 tree, 48 leaves, max depth = 11, train loss: 0.41877, val loss: 0.40522, in 0.021s 1 tree, 48 leaves, max depth = 10, train loss: 0.41633, val loss: 0.40287, in 0.018s 1 tree, 31 leaves, max depth = 10, train loss: 0.41407, val loss: 0.40030, in 0.015s 1 tree, 48 leaves, max depth = 9, train loss: 0.41177, val loss: 0.39807, in 0.009s 1 tree, 3 leaves, max depth = 2, train loss: 0.40968, val loss: 0.39584, in 0.016s Fit 49 trees in 0.948 s, (1906 total leaves) Time spent computing histograms: 0.290s Time spent finding best splits: 0.062s Time spent applying splits: 0.055s Time spent predicting: 0.003s Trial 70, Fold 4: Log loss = 0.4089359936187368, Average precision = 0.9485610339411467, ROC-AUC = 0.9454596069062046, Elapsed Time = 0.9537225000003673 seconds Trial 70, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 70, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0.200 s 0.040 GB of training data: 0.019 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 34 leaves, max depth = 10, train loss: 0.67829, val loss: 0.67728, in 0.010s 1 tree, 35 leaves, max depth = 12, train loss: 0.66450, val loss: 0.66250, in 0.012s 1 tree, 33 leaves, max depth = 11, train loss: 0.65137, val loss: 0.64845, in 0.007s 1 tree, 33 leaves, max depth = 11, train loss: 0.63903, val loss: 0.63522, in 0.022s 1 tree, 33 leaves, max depth = 11, train loss: 0.62742, val loss: 0.62276, in 0.013s 1 tree, 36 leaves, max depth = 12, train 
loss: 0.61664, val loss: 0.61110, in 0.011s 1 tree, 33 leaves, max depth = 11, train loss: 0.60633, val loss: 0.59999, in 0.012s 1 tree, 33 leaves, max depth = 11, train loss: 0.59662, val loss: 0.58949, in 0.010s 1 tree, 35 leaves, max depth = 12, train loss: 0.58758, val loss: 0.57966, in 0.014s 1 tree, 33 leaves, max depth = 11, train loss: 0.57892, val loss: 0.57027, in 0.005s 1 tree, 33 leaves, max depth = 11, train loss: 0.57075, val loss: 0.56137, in 0.019s 1 tree, 33 leaves, max depth = 11, train loss: 0.56303, val loss: 0.55295, in 0.000s 1 tree, 35 leaves, max depth = 12, train loss: 0.55582, val loss: 0.54504, in 0.023s 1 tree, 48 leaves, max depth = 12, train loss: 0.54855, val loss: 0.53805, in 0.014s 1 tree, 33 leaves, max depth = 11, train loss: 0.54181, val loss: 0.53067, in 0.013s 1 tree, 48 leaves, max depth = 12, train loss: 0.53507, val loss: 0.52421, in 0.008s 1 tree, 34 leaves, max depth = 10, train loss: 0.52893, val loss: 0.51749, in 0.019s 1 tree, 48 leaves, max depth = 12, train loss: 0.52266, val loss: 0.51151, in 0.015s 1 tree, 34 leaves, max depth = 10, train loss: 0.51698, val loss: 0.50528, in 0.013s 1 tree, 48 leaves, max depth = 12, train loss: 0.51115, val loss: 0.49973, in 0.015s 1 tree, 34 leaves, max depth = 12, train loss: 0.50590, val loss: 0.49394, in 0.010s 1 tree, 48 leaves, max depth = 12, train loss: 0.50045, val loss: 0.48877, in 0.008s 1 tree, 48 leaves, max depth = 12, train loss: 0.49529, val loss: 0.48389, in 0.019s 1 tree, 35 leaves, max depth = 10, train loss: 0.49051, val loss: 0.47859, in 0.017s 1 tree, 48 leaves, max depth = 12, train loss: 0.48569, val loss: 0.47403, in 0.014s 1 tree, 35 leaves, max depth = 10, train loss: 0.48124, val loss: 0.46910, in 0.014s 1 tree, 48 leaves, max depth = 13, train loss: 0.47673, val loss: 0.46486, in 0.014s 1 tree, 36 leaves, max depth = 10, train loss: 0.47260, val loss: 0.46021, in 0.004s 1 tree, 48 leaves, max depth = 12, train loss: 0.46836, val loss: 0.45625, in 0.021s 
1 tree, 36 leaves, max depth = 10, train loss: 0.46451, val loss: 0.45193, in 0.013s 1 tree, 33 leaves, max depth = 10, train loss: 0.46079, val loss: 0.44774, in 0.013s 1 tree, 48 leaves, max depth = 12, train loss: 0.45686, val loss: 0.44408, in 0.013s 1 tree, 48 leaves, max depth = 12, train loss: 0.45312, val loss: 0.44062, in 0.003s 1 tree, 48 leaves, max depth = 12, train loss: 0.44957, val loss: 0.43733, in 0.016s 1 tree, 34 leaves, max depth = 9, train loss: 0.44627, val loss: 0.43360, in 0.011s 1 tree, 48 leaves, max depth = 12, train loss: 0.44292, val loss: 0.43050, in 0.015s 1 tree, 34 leaves, max depth = 12, train loss: 0.43985, val loss: 0.42702, in 0.012s 1 tree, 48 leaves, max depth = 13, train loss: 0.43670, val loss: 0.42411, in 0.012s 1 tree, 33 leaves, max depth = 9, train loss: 0.43381, val loss: 0.42083, in 0.013s 1 tree, 48 leaves, max depth = 12, train loss: 0.43083, val loss: 0.41810, in 0.016s 1 tree, 34 leaves, max depth = 9, train loss: 0.42813, val loss: 0.41500, in 0.014s 1 tree, 48 leaves, max depth = 12, train loss: 0.42531, val loss: 0.41244, in 0.007s 1 tree, 33 leaves, max depth = 9, train loss: 0.42277, val loss: 0.40953, in 0.023s 1 tree, 48 leaves, max depth = 13, train loss: 0.42011, val loss: 0.40711, in 0.015s 1 tree, 48 leaves, max depth = 12, train loss: 0.41758, val loss: 0.40481, in 0.015s 1 tree, 33 leaves, max depth = 9, train loss: 0.41522, val loss: 0.40210, in 0.012s 1 tree, 48 leaves, max depth = 13, train loss: 0.41282, val loss: 0.39994, in 0.016s 1 tree, 48 leaves, max depth = 12, train loss: 0.41054, val loss: 0.39788, in 0.001s 1 tree, 3 leaves, max depth = 2, train loss: 0.40843, val loss: 0.39584, in 0.016s Fit 49 trees in 1.039 s, (1913 total leaves) Time spent computing histograms: 0.307s Time spent finding best splits: 0.070s Time spent applying splits: 0.058s Time spent predicting: 0.004s Trial 70, Fold 5: Log loss = 0.41464192296993685, Average precision = 0.9462843445442741, ROC-AUC = 0.94220047191721, 
Elapsed Time = 1.0569494000010309 seconds
Optimization Progress: 71%|#######1 | 71/100 [14:16<06:52, 14.21s/it]
Trial 71, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371 Trial 71, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913 0.186 s 0.040 GB of training data: 0.006 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 41 leaves, max depth = 14, train loss: 0.68366, val loss: 0.68339, in 0.011s 1 tree, 41 leaves, max depth = 14, train loss: 0.67456, val loss: 0.67401, in 0.012s 1 tree, 41 leaves, max depth = 14, train loss: 0.66581, val loss: 0.66499, in 0.012s 1 tree, 41 leaves, max depth = 14, train loss: 0.65740, val loss: 0.65632, in 0.011s 1 tree, 41 leaves, max depth = 14, train loss: 0.64931, val loss: 0.64798, in 0.011s 1 tree, 41 leaves, max depth = 14, train loss: 0.64153, val loss: 0.63996, in 0.011s 1 tree, 41 leaves, max depth = 17, train loss: 0.63405, val loss: 0.63223, in 0.005s 1 tree, 41 leaves, max depth = 14, train loss: 0.62685, val loss: 0.62479, in 0.015s 1 tree, 41 leaves, max depth = 13, train loss: 0.61991, val loss: 0.61762, in 0.011s 1 tree, 41 leaves, max depth = 13, train loss: 0.61324, val loss: 0.61071, in 0.011s 1 tree, 41 leaves, max depth = 13, train loss: 0.60680, val loss: 0.60405, in 0.010s 1 tree, 41 leaves, max depth = 13, train loss: 0.60060, val loss: 0.59763, in 0.010s 1 tree, 41 leaves, max depth = 17, train loss: 0.59463, val loss: 0.59144, in 0.010s 1 tree, 41 leaves, max depth = 17, train loss: 0.58887, val loss: 0.58546, in 0.012s 1 tree, 41 leaves, max depth = 17, train loss: 0.58332, val loss: 0.57970, in 0.002s 1 tree, 41 leaves, max depth = 17, train loss: 0.57796, val loss: 0.57414, in 0.020s 1 tree, 41 leaves, max depth = 16, train loss: 0.57279, val loss: 0.56876, in 0.012s 1 tree, 41 leaves, max depth = 13, train loss: 0.56781, val loss: 0.56358, in 0.011s 1 tree, 41 leaves, max depth = 17, train loss: 0.56300, val loss: 0.55858, in 0.015s 1 tree, 41 leaves, max depth = 17, train loss: 0.55836, val loss: 0.55374, in 0.010s 1 tree, 41 
leaves, max depth = 8, train loss: 0.55361, val loss: 0.54929, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.54920, val loss: 0.54469, in 0.015s 1 tree, 41 leaves, max depth = 8, train loss: 0.54468, val loss: 0.54046, in 0.013s 1 tree, 41 leaves, max depth = 18, train loss: 0.54050, val loss: 0.53609, in 0.011s 1 tree, 41 leaves, max depth = 8, train loss: 0.53619, val loss: 0.53206, in 0.009s 1 tree, 41 leaves, max depth = 18, train loss: 0.53221, val loss: 0.52790, in 0.010s 1 tree, 41 leaves, max depth = 10, train loss: 0.52811, val loss: 0.52406, in 0.010s 1 tree, 41 leaves, max depth = 13, train loss: 0.52432, val loss: 0.52010, in 0.012s 1 tree, 41 leaves, max depth = 10, train loss: 0.52040, val loss: 0.51643, in 0.012s 1 tree, 41 leaves, max depth = 9, train loss: 0.51661, val loss: 0.51290, in 0.009s 1 tree, 41 leaves, max depth = 15, train loss: 0.51306, val loss: 0.50917, in 0.011s 1 tree, 41 leaves, max depth = 9, train loss: 0.50944, val loss: 0.50581, in 0.011s 1 tree, 41 leaves, max depth = 15, train loss: 0.50605, val loss: 0.50225, in 0.009s 1 tree, 41 leaves, max depth = 9, train loss: 0.50260, val loss: 0.49903, in 0.012s 1 tree, 41 leaves, max depth = 15, train loss: 0.49937, val loss: 0.49563, in 0.010s 1 tree, 41 leaves, max depth = 9, train loss: 0.49606, val loss: 0.49256, in 0.025s 1 tree, 41 leaves, max depth = 15, train loss: 0.49298, val loss: 0.48931, in 0.042s 1 tree, 41 leaves, max depth = 9, train loss: 0.48981, val loss: 0.48639, in 0.016s 1 tree, 41 leaves, max depth = 9, train loss: 0.48675, val loss: 0.48354, in 0.016s 1 tree, 41 leaves, max depth = 13, train loss: 0.48387, val loss: 0.48049, in 0.016s 1 tree, 41 leaves, max depth = 9, train loss: 0.48093, val loss: 0.47778, in 0.016s 1 tree, 41 leaves, max depth = 13, train loss: 0.47818, val loss: 0.47487, in 0.016s 1 tree, 41 leaves, max depth = 9, train loss: 0.47536, val loss: 0.47226, in 0.016s 1 tree, 41 leaves, max depth = 13, train loss: 0.47274, val loss: 
0.46948, in 0.016s 1 tree, 41 leaves, max depth = 9, train loss: 0.47003, val loss: 0.46699, in 0.000s 1 tree, 41 leaves, max depth = 13, train loss: 0.46752, val loss: 0.46432, in 0.016s 1 tree, 41 leaves, max depth = 9, train loss: 0.46492, val loss: 0.46194, in 0.016s 1 tree, 41 leaves, max depth = 9, train loss: 0.46241, val loss: 0.45963, in 0.016s 1 tree, 41 leaves, max depth = 13, train loss: 0.46003, val loss: 0.45710, in 0.031s 1 tree, 41 leaves, max depth = 10, train loss: 0.45761, val loss: 0.45490, in 0.016s 1 tree, 41 leaves, max depth = 13, train loss: 0.45534, val loss: 0.45247, in 0.016s 1 tree, 41 leaves, max depth = 9, train loss: 0.45302, val loss: 0.45035, in 0.016s 1 tree, 41 leaves, max depth = 13, train loss: 0.45085, val loss: 0.44806, in 0.020s 1 tree, 41 leaves, max depth = 10, train loss: 0.44862, val loss: 0.44605, in 0.013s 1 tree, 41 leaves, max depth = 14, train loss: 0.44652, val loss: 0.44383, in 0.013s 1 tree, 41 leaves, max depth = 10, train loss: 0.44437, val loss: 0.44189, in 0.003s 1 tree, 41 leaves, max depth = 10, train loss: 0.44229, val loss: 0.44001, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.44031, val loss: 0.43788, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.43831, val loss: 0.43608, in 0.016s 1 tree, 41 leaves, max depth = 15, train loss: 0.43639, val loss: 0.43406, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.43446, val loss: 0.43233, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.43263, val loss: 0.43036, in 0.000s 1 tree, 41 leaves, max depth = 10, train loss: 0.43078, val loss: 0.42869, in 0.000s 1 tree, 41 leaves, max depth = 11, train loss: 0.42902, val loss: 0.42680, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.42724, val loss: 0.42518, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.42551, val loss: 0.42364, in 0.016s 1 tree, 41 leaves, max depth = 15, train loss: 0.42382, val loss: 0.42185, in 0.016s 1 tree, 41 leaves, max depth 
= 10, train loss: 0.42216, val loss: 0.42035, in 0.043s 1 tree, 41 leaves, max depth = 11, train loss: 0.42054, val loss: 0.41861, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.41893, val loss: 0.41718, in 0.031s 1 tree, 41 leaves, max depth = 10, train loss: 0.41739, val loss: 0.41551, in 0.000s 1 tree, 41 leaves, max depth = 12, train loss: 0.41584, val loss: 0.41411, in 0.016s 1 tree, 41 leaves, max depth = 15, train loss: 0.41434, val loss: 0.41253, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.41284, val loss: 0.41120, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.41145, val loss: 0.40966, in 0.000s 1 tree, 41 leaves, max depth = 12, train loss: 0.41001, val loss: 0.40838, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.40867, val loss: 0.40689, in 0.016s 1 tree, 41 leaves, max depth = 16, train loss: 0.40726, val loss: 0.40540, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.40597, val loss: 0.40396, in 0.000s 1 tree, 41 leaves, max depth = 12, train loss: 0.40458, val loss: 0.40275, in 0.016s Fit 80 trees in 1.451 s, (3172 total leaves) Time spent computing histograms: 0.504s Time spent finding best splits: 0.151s Time spent applying splits: 0.113s Time spent predicting: 0.002s Trial 71, Fold 1: Log loss = 0.4071347963646409, Average precision = 0.9452412166113221, ROC-AUC = 0.9425571788235519, Elapsed Time = 1.457269799999267 seconds Trial 71, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 71, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986 Binning 0.040 GB of training data: 0.204 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 41 leaves, max depth = 9, train loss: 0.68378, val loss: 0.68335, in 0.016s 1 tree, 41 leaves, max depth = 9, train loss: 0.67479, val loss: 0.67395, in 0.016s 1 tree, 41 leaves, max depth = 9, train loss: 0.66615, val loss: 0.66490, in 0.000s 1 tree, 41 leaves, max depth = 
10, train loss: 0.65784, val loss: 0.65620, in 0.016s 1 tree, 41 leaves, max depth = 9, train loss: 0.64986, val loss: 0.64783, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.64218, val loss: 0.63978, in 0.016s 1 tree, 41 leaves, max depth = 9, train loss: 0.63479, val loss: 0.63202, in 0.000s 1 tree, 41 leaves, max depth = 9, train loss: 0.62768, val loss: 0.62455, in 0.016s 1 tree, 41 leaves, max depth = 9, train loss: 0.62083, val loss: 0.61736, in 0.016s 1 tree, 41 leaves, max depth = 9, train loss: 0.61424, val loss: 0.61042, in 0.016s 1 tree, 41 leaves, max depth = 9, train loss: 0.60789, val loss: 0.60373, in 0.000s 1 tree, 41 leaves, max depth = 9, train loss: 0.60178, val loss: 0.59729, in 0.016s 1 tree, 41 leaves, max depth = 9, train loss: 0.59588, val loss: 0.59107, in 0.016s 1 tree, 41 leaves, max depth = 9, train loss: 0.59020, val loss: 0.58507, in 0.016s 1 tree, 41 leaves, max depth = 9, train loss: 0.58472, val loss: 0.57928, in 0.000s 1 tree, 41 leaves, max depth = 10, train loss: 0.57944, val loss: 0.57369, in 0.016s 1 tree, 41 leaves, max depth = 9, train loss: 0.57435, val loss: 0.56830, in 0.016s 1 tree, 41 leaves, max depth = 9, train loss: 0.56943, val loss: 0.56309, in 0.016s 1 tree, 41 leaves, max depth = 9, train loss: 0.56469, val loss: 0.55807, in 0.000s 1 tree, 41 leaves, max depth = 9, train loss: 0.56012, val loss: 0.55321, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.55543, val loss: 0.54868, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.55109, val loss: 0.54406, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.54663, val loss: 0.53973, in 0.000s 1 tree, 41 leaves, max depth = 13, train loss: 0.54248, val loss: 0.53535, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.53822, val loss: 0.53124, in 0.016s 1 tree, 41 leaves, max depth = 13, train loss: 0.53428, val loss: 0.52706, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.53022, val loss: 0.52315, in 0.016s 1 
tree, 41 leaves, max depth = 12, train loss: 0.52646, val loss: 0.51917, in 0.000s 1 tree, 41 leaves, max depth = 11, train loss: 0.52259, val loss: 0.51544, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.51901, val loss: 0.51165, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.51531, val loss: 0.50808, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.51174, val loss: 0.50465, in 0.000s 1 tree, 41 leaves, max depth = 12, train loss: 0.50838, val loss: 0.50108, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.50496, val loss: 0.49780, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.50175, val loss: 0.49439, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.49848, val loss: 0.49126, in 0.000s 1 tree, 41 leaves, max depth = 12, train loss: 0.49542, val loss: 0.48800, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.49229, val loss: 0.48500, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.48937, val loss: 0.48189, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.48636, val loss: 0.47902, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.48346, val loss: 0.47624, in 0.000s 1 tree, 41 leaves, max depth = 12, train loss: 0.48070, val loss: 0.47330, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.47792, val loss: 0.47064, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.47528, val loss: 0.46782, in 0.000s 1 tree, 41 leaves, max depth = 11, train loss: 0.47261, val loss: 0.46527, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.47009, val loss: 0.46258, in 0.016s 1 tree, 41 leaves, max depth = 13, train loss: 0.46752, val loss: 0.46014, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.46511, val loss: 0.45756, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.46265, val loss: 0.45521, in 0.000s 1 tree, 41 leaves, max depth = 12, train loss: 0.46027, val loss: 0.45294, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 
0.45798, val loss: 0.45050, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.45569, val loss: 0.44831, in 0.016s 1 tree, 41 leaves, max depth = 15, train loss: 0.45351, val loss: 0.44596, in 0.000s 1 tree, 41 leaves, max depth = 11, train loss: 0.45130, val loss: 0.44387, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.44920, val loss: 0.44162, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.44708, val loss: 0.43961, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.44507, val loss: 0.43745, in 0.000s 1 tree, 41 leaves, max depth = 10, train loss: 0.44303, val loss: 0.43551, in 0.000s 1 tree, 41 leaves, max depth = 12, train loss: 0.44110, val loss: 0.43344, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.43914, val loss: 0.43158, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.43724, val loss: 0.42978, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.43540, val loss: 0.42781, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.43357, val loss: 0.42609, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.43180, val loss: 0.42419, in 0.000s 1 tree, 41 leaves, max depth = 12, train loss: 0.43003, val loss: 0.42252, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.42834, val loss: 0.42070, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.42664, val loss: 0.41910, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.42501, val loss: 0.41735, in 0.000s 1 tree, 41 leaves, max depth = 12, train loss: 0.42337, val loss: 0.41580, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.42177, val loss: 0.41431, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.42022, val loss: 0.41263, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.41868, val loss: 0.41120, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.41719, val loss: 0.40959, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.41571, val loss: 0.40820, in 0.000s 1 
tree, 41 leaves, max depth = 11, train loss: 0.41427, val loss: 0.40665, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.41292, val loss: 0.40525, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.41149, val loss: 0.40393, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.41019, val loss: 0.40258, in 0.000s 1 tree, 41 leaves, max depth = 10, train loss: 0.40881, val loss: 0.40129, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.40745, val loss: 0.39983, in 0.016s Fit 80 trees in 1.360 s, (3208 total leaves) Time spent computing histograms: 0.462s Time spent finding best splits: 0.115s Time spent applying splits: 0.082s Time spent predicting: 0.016s Trial 71, Fold 2: Log loss = 0.4097584221749652, Average precision = 0.9431351239372479, ROC-AUC = 0.9423233982034884, Elapsed Time = 1.3551755999997113 seconds Trial 71, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 71, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 Binning 0.040 GB of training data: 0.173 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 41 leaves, max depth = 13, train loss: 0.68386, val loss: 0.68359, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.67498, val loss: 0.67438, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.66631, val loss: 0.66543, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.65797, val loss: 0.65681, in 0.000s 1 tree, 41 leaves, max depth = 10, train loss: 0.64995, val loss: 0.64852, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.64236, val loss: 0.64064, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.63494, val loss: 0.63296, in 0.000s 1 tree, 41 leaves, max depth = 10, train loss: 0.62779, val loss: 0.62556, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.62091, val loss: 0.61844, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.61428, val loss: 0.61157, in 0.000s 1 
tree, 41 leaves, max depth = 13, train loss: 0.60800, val loss: 0.60502, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.60184, val loss: 0.59864, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.59591, val loss: 0.59248, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.59019, val loss: 0.58655, in 0.000s 1 tree, 41 leaves, max depth = 10, train loss: 0.58468, val loss: 0.58082, in 0.016s 1 tree, 41 leaves, max depth = 13, train loss: 0.57945, val loss: 0.57535, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.57431, val loss: 0.57001, in 0.000s 1 tree, 41 leaves, max depth = 10, train loss: 0.56936, val loss: 0.56486, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.56458, val loss: 0.55988, in 0.016s 1 tree, 41 leaves, max depth = 13, train loss: 0.56004, val loss: 0.55512, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.55532, val loss: 0.55073, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.55094, val loss: 0.54616, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.54644, val loss: 0.54198, in 0.016s 1 tree, 41 leaves, max depth = 13, train loss: 0.54234, val loss: 0.53767, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.53806, val loss: 0.53368, in 0.016s 1 tree, 41 leaves, max depth = 13, train loss: 0.53417, val loss: 0.52958, in 0.000s 1 tree, 41 leaves, max depth = 12, train loss: 0.53008, val loss: 0.52579, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.52614, val loss: 0.52212, in 0.016s 1 tree, 41 leaves, max depth = 13, train loss: 0.52249, val loss: 0.51827, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.51873, val loss: 0.51478, in 0.016s 1 tree, 41 leaves, max depth = 13, train loss: 0.51523, val loss: 0.51113, in 0.000s 1 tree, 41 leaves, max depth = 12, train loss: 0.51163, val loss: 0.50780, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.50829, val loss: 0.50432, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 
0.50485, val loss: 0.50114, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.50163, val loss: 0.49774, in 0.000s 1 tree, 41 leaves, max depth = 12, train loss: 0.49833, val loss: 0.49470, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.49533, val loss: 0.49151, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.49217, val loss: 0.48860, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.48912, val loss: 0.48580, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.48629, val loss: 0.48278, in 0.000s 1 tree, 41 leaves, max depth = 13, train loss: 0.48337, val loss: 0.48010, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.48063, val loss: 0.47723, in 0.016s 1 tree, 41 leaves, max depth = 13, train loss: 0.47783, val loss: 0.47467, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.47522, val loss: 0.47193, in 0.031s 1 tree, 41 leaves, max depth = 13, train loss: 0.47253, val loss: 0.46948, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.47006, val loss: 0.46683, in 0.000s 1 tree, 41 leaves, max depth = 12, train loss: 0.46747, val loss: 0.46449, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.46497, val loss: 0.46222, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.46261, val loss: 0.45972, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.46021, val loss: 0.45754, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.45797, val loss: 0.45514, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.45566, val loss: 0.45304, in 0.000s 1 tree, 41 leaves, max depth = 13, train loss: 0.45352, val loss: 0.45073, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.45130, val loss: 0.44873, in 0.016s 1 tree, 41 leaves, max depth = 13, train loss: 0.44921, val loss: 0.44653, in 0.016s 1 tree, 41 leaves, max depth = 13, train loss: 0.44708, val loss: 0.44462, in 0.000s 1 tree, 41 leaves, max depth = 13, train loss: 0.44500, val loss: 0.44275, in 0.016s 1 
[... per-tree verbose training log truncated ...]
Fit 80 trees in 1.283 s, (3172 total leaves)
Trial 71, Fold 3: Log loss = 0.4058957659864714, Average precision = 0.9473637724315495, ROC-AUC = 0.9462533317361866, Elapsed Time = 1.2974756000003254 seconds
Trial 71, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 71, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[... per-tree verbose training log truncated ...]
Fit 80 trees in 1.456 s, (3172 total leaves)
Trial 71, Fold 4: Log loss = 0.40726675348936964, Average precision = 0.9476603377576511, ROC-AUC = 0.9445157531992818, Elapsed Time = 1.45156870000028 seconds
Trial 71, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 71, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[... per-tree verbose training log truncated ...]
Fit 80 trees in 1.252 s, (3208 total leaves)
Trial 71, Fold 5: Log loss = 0.41181279095167456, Average precision = 0.9466656914645775, ROC-AUC = 0.9421432473106294, Elapsed Time = 1.260043800000858 seconds
Optimization Progress: 72%|#######2 | 72/100 [14:31<06:44, 14.46s/it]
Trial 72, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 72, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[... per-tree verbose training log truncated ...]
Fit 72 trees in 1.518 s, (2004 total leaves)
Trial 72, Fold 1: Log loss = 0.29415414771195725, Average precision = 0.961117044484153, ROC-AUC = 0.955106737189506, Elapsed Time = 1.520083399998839 seconds
Trial 72, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 72, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[... per-tree verbose training log truncated ...]
Fit 72 trees in 1.565 s, (2024 total leaves)
Trial 72, Fold 2: Log loss = 0.2910875895115242, Average precision = 0.9600590605027294, ROC-AUC = 0.9561806266030257, Elapsed Time = 1.5716917000008834 seconds
Trial 72, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 72, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[... per-tree verbose training log truncated ...]
Fit 72 trees in 1.438 s, (1999 total leaves)
Trial 72, Fold 3: Log loss = 0.29083142577126364, Average precision = 0.9623127350153663, ROC-AUC = 0.9575925963825833, Elapsed Time = 1.439354000000094 seconds Trial 72, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 72, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 Binning 0.040 GB of training data: 0.158 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 31 leaves, max depth = 12, train loss: 0.67709, val loss: 0.67659, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.65937, val loss: 0.65814, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.64409, val loss: 0.64209, in 0.028s 1 tree, 31 leaves, max depth = 11, train loss: 0.62851, val loss: 0.62581, in 0.004s 1 tree, 31 leaves, max depth = 7, train loss: 0.61465, val loss: 0.61120, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.60179, val loss: 0.59760, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.58943, val loss: 0.58453, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.57794, val loss: 0.57272, in 0.016s 1 tree, 29 leaves, max depth = 9, train loss: 0.56433, val loss: 0.55877, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.55333, val loss: 0.54733, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.54345, val loss: 0.53691, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.53302, val loss: 0.52598, in 0.016s 1 tree, 7 leaves, max depth = 3, train loss: 0.52488, val loss: 0.51730, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.51605, val loss: 0.50820, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.50749, val loss: 0.49922, in 0.000s 1 tree, 31 leaves, max depth = 10, train loss: 0.49906, val loss: 0.49025, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.49157, val loss: 0.48232, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.48484, val loss: 0.47526, in 0.016s 1 tree, 31 leaves, max 
depth = 10, train loss: 0.47795, val loss: 0.46807, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.47195, val loss: 0.46160, in 0.016s 1 tree, 31 leaves, max depth = 13, train loss: 0.46522, val loss: 0.45448, in 0.016s 1 tree, 7 leaves, max depth = 4, train loss: 0.45704, val loss: 0.44617, in 0.016s 1 tree, 31 leaves, max depth = 12, train loss: 0.45148, val loss: 0.44022, in 0.016s 1 tree, 31 leaves, max depth = 14, train loss: 0.44646, val loss: 0.43496, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.44141, val loss: 0.42975, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.43599, val loss: 0.42389, in 0.016s 1 tree, 31 leaves, max depth = 7, train loss: 0.43032, val loss: 0.41792, in 0.016s 1 tree, 17 leaves, max depth = 8, train loss: 0.42332, val loss: 0.41068, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.41902, val loss: 0.40591, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.41363, val loss: 0.40045, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.40920, val loss: 0.39581, in 0.016s 1 tree, 31 leaves, max depth = 7, train loss: 0.40423, val loss: 0.39076, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.40008, val loss: 0.38631, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.39611, val loss: 0.38200, in 0.016s 1 tree, 10 leaves, max depth = 4, train loss: 0.39013, val loss: 0.37592, in 0.016s 1 tree, 21 leaves, max depth = 12, train loss: 0.38560, val loss: 0.37141, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.38133, val loss: 0.36717, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.37775, val loss: 0.36328, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.37464, val loss: 0.36004, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.37136, val loss: 0.35652, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.36838, val loss: 0.35326, in 0.016s 1 tree, 27 leaves, max depth = 10, train loss: 0.36351, val loss: 0.34834, in 
0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.36105, val loss: 0.34581, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.35835, val loss: 0.34290, in 0.031s 1 tree, 12 leaves, max depth = 5, train loss: 0.35445, val loss: 0.33864, in 0.000s 1 tree, 8 leaves, max depth = 4, train loss: 0.35028, val loss: 0.33441, in 0.016s 1 tree, 31 leaves, max depth = 14, train loss: 0.34754, val loss: 0.33147, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.34515, val loss: 0.32891, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.34211, val loss: 0.32588, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.33955, val loss: 0.32326, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.33768, val loss: 0.32141, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.33495, val loss: 0.31868, in 0.016s 1 tree, 14 leaves, max depth = 8, train loss: 0.33159, val loss: 0.31529, in 0.016s 1 tree, 8 leaves, max depth = 4, train loss: 0.32815, val loss: 0.31178, in 0.016s 1 tree, 31 leaves, max depth = 12, train loss: 0.32629, val loss: 0.30975, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.32417, val loss: 0.30756, in 0.016s 1 tree, 12 leaves, max depth = 5, train loss: 0.32101, val loss: 0.30431, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.31858, val loss: 0.30186, in 0.016s 1 tree, 19 leaves, max depth = 6, train loss: 0.31529, val loss: 0.29868, in 0.016s 1 tree, 31 leaves, max depth = 13, train loss: 0.31359, val loss: 0.29691, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.31037, val loss: 0.29386, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.30728, val loss: 0.29096, in 0.000s 1 tree, 31 leaves, max depth = 9, train loss: 0.30571, val loss: 0.28932, in 0.031s 1 tree, 31 leaves, max depth = 10, train loss: 0.30427, val loss: 0.28771, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.30231, val loss: 0.28576, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 
0.30096, val loss: 0.28442, in 0.016s 1 tree, 11 leaves, max depth = 6, train loss: 0.29857, val loss: 0.28195, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.29675, val loss: 0.28018, in 0.016s 1 tree, 15 leaves, max depth = 6, train loss: 0.29412, val loss: 0.27754, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.29292, val loss: 0.27634, in 0.031s 1 tree, 31 leaves, max depth = 8, train loss: 0.29158, val loss: 0.27498, in 0.016s 1 tree, 23 leaves, max depth = 8, train loss: 0.28908, val loss: 0.27262, in 0.016s Fit 72 trees in 1.471 s, (1976 total leaves) Time spent computing histograms: 0.477s Time spent finding best splits: 0.058s Time spent applying splits: 0.043s Time spent predicting: 0.000s Trial 72, Fold 4: Log loss = 0.29002303537082896, Average precision = 0.961783253739625, ROC-AUC = 0.9564205151040437, Elapsed Time = 1.4727536999998847 seconds Trial 72, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 72, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.157 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 31 leaves, max depth = 9, train loss: 0.67688, val loss: 0.67615, in 0.000s 1 tree, 31 leaves, max depth = 10, train loss: 0.65903, val loss: 0.65764, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.64330, val loss: 0.64111, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.62756, val loss: 0.62463, in 0.031s 1 tree, 31 leaves, max depth = 9, train loss: 0.61339, val loss: 0.60985, in 0.016s 1 tree, 31 leaves, max depth = 12, train loss: 0.60040, val loss: 0.59614, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.58793, val loss: 0.58305, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.57600, val loss: 0.57038, in 0.000s 1 tree, 12 leaves, max depth = 8, train loss: 0.56271, val loss: 0.55670, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 
0.55226, val loss: 0.54563, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.54236, val loss: 0.53534, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.53167, val loss: 0.52422, in 0.016s 1 tree, 7 leaves, max depth = 3, train loss: 0.52345, val loss: 0.51559, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.51453, val loss: 0.50638, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.50615, val loss: 0.49729, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.49755, val loss: 0.48826, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.49016, val loss: 0.48036, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.48341, val loss: 0.47333, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.47628, val loss: 0.46573, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.46975, val loss: 0.45870, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.46301, val loss: 0.45159, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.45717, val loss: 0.44546, in 0.016s 1 tree, 31 leaves, max depth = 13, train loss: 0.45215, val loss: 0.44003, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.44590, val loss: 0.43360, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.44010, val loss: 0.42758, in 0.016s 1 tree, 31 leaves, max depth = 7, train loss: 0.43443, val loss: 0.42177, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.42927, val loss: 0.41637, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.42440, val loss: 0.41135, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.41972, val loss: 0.40648, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.41568, val loss: 0.40221, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.41051, val loss: 0.39697, in 0.016s 1 tree, 14 leaves, max depth = 5, train loss: 0.40393, val loss: 0.39034, in 0.016s 1 tree, 17 leaves, max depth = 6, train loss: 0.39755, val loss: 0.38400, in 0.016s 1 tree, 31 leaves, max 
depth = 10, train loss: 0.39356, val loss: 0.37991, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.38906, val loss: 0.37536, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.38553, val loss: 0.37194, in 0.016s 1 tree, 31 leaves, max depth = 7, train loss: 0.38139, val loss: 0.36773, in 0.016s 1 tree, 31 leaves, max depth = 13, train loss: 0.37885, val loss: 0.36498, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.37322, val loss: 0.35978, in 0.016s 1 tree, 16 leaves, max depth = 6, train loss: 0.36834, val loss: 0.35483, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.36464, val loss: 0.35110, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.36168, val loss: 0.34809, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.35860, val loss: 0.34504, in 0.016s 1 tree, 16 leaves, max depth = 7, train loss: 0.35401, val loss: 0.34044, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.35067, val loss: 0.33723, in 0.016s 1 tree, 21 leaves, max depth = 8, train loss: 0.34625, val loss: 0.33289, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.34357, val loss: 0.33028, in 0.000s 1 tree, 31 leaves, max depth = 11, train loss: 0.34144, val loss: 0.32820, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.33947, val loss: 0.32617, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.33723, val loss: 0.32400, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.33527, val loss: 0.32201, in 0.031s 1 tree, 31 leaves, max depth = 11, train loss: 0.33313, val loss: 0.31990, in 0.016s 1 tree, 31 leaves, max depth = 7, train loss: 0.33157, val loss: 0.31812, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.33008, val loss: 0.31666, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.32637, val loss: 0.31306, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.32485, val loss: 0.31142, in 0.016s 1 tree, 21 leaves, max depth = 7, train loss: 0.32156, val loss: 0.30811, in 
0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.31908, val loss: 0.30572, in 0.016s 1 tree, 11 leaves, max depth = 6, train loss: 0.31592, val loss: 0.30259, in 0.016s 1 tree, 11 leaves, max depth = 5, train loss: 0.31304, val loss: 0.29981, in 0.000s 1 tree, 31 leaves, max depth = 9, train loss: 0.31167, val loss: 0.29847, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.30940, val loss: 0.29626, in 0.031s 1 tree, 31 leaves, max depth = 8, train loss: 0.30764, val loss: 0.29474, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.30617, val loss: 0.29335, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.30450, val loss: 0.29182, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.30324, val loss: 0.29062, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.30046, val loss: 0.28809, in 0.016s 1 tree, 16 leaves, max depth = 6, train loss: 0.29801, val loss: 0.28582, in 0.016s 1 tree, 20 leaves, max depth = 6, train loss: 0.29633, val loss: 0.28421, in 0.000s 1 tree, 31 leaves, max depth = 10, train loss: 0.29520, val loss: 0.28311, in 0.016s 1 tree, 31 leaves, max depth = 7, train loss: 0.29394, val loss: 0.28197, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.29297, val loss: 0.28096, in 0.016s Fit 72 trees in 1.455 s, (2002 total leaves) Time spent computing histograms: 0.480s Time spent finding best splits: 0.057s Time spent applying splits: 0.042s Time spent predicting: 0.000s Trial 72, Fold 5: Log loss = 0.3028119526249657, Average precision = 0.9591369312082461, ROC-AUC = 0.9534296490347993, Elapsed Time = 1.4715311999989353 seconds
Optimization Progress: 73%|#######3 | 73/100 [14:46<06:34, 14.60s/it]
[Per-boosting-round training log condensed; fold-level summaries retained.]
Trial 73, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 73, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Trial 73, Fold 1: Fit 59 trees in 1.314 s (4358 total leaves)
Trial 73, Fold 1: Log loss = 0.3897518471591272, Average precision = 0.9534496063506044, ROC-AUC = 0.9484036620017876, Elapsed Time = 1.3255395 seconds
Trial 73, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 73, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Trial 73, Fold 2: Fit 59 trees in 1.548 s (4366 total leaves)
Trial 73, Fold 2: Log loss = 0.39035312161219826, Average precision = 0.9510917369187883, ROC-AUC = 0.9489883885629515, Elapsed Time = 1.5692023 seconds
Trial 73, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 73, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Trial 73, Fold 3: Fit 59 trees in 1.924 s (4403 total leaves)
Trial 73, Fold 3: Log loss = 0.3888056946075886, Average precision = 0.9533842900230023, ROC-AUC = 0.9508368767516446, Elapsed Time = 1.9265943 seconds
Trial 73, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 73, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
Trial 73, Fold 4: Fitting gradient boosted rounds [per-round log elided]
train loss: 0.39840, val loss: 0.38350, in 0.016s 1 tree, 57 leaves, max depth = 12, train loss: 0.39483, val loss: 0.37990, in 0.031s 1 tree, 56 leaves, max depth = 13, train loss: 0.39140, val loss: 0.37643, in 0.016s 1 tree, 79 leaves, max depth = 11, train loss: 0.38917, val loss: 0.37406, in 0.031s Fit 59 trees in 1.813 s, (4181 total leaves) Time spent computing histograms: 0.544s Time spent finding best splits: 0.102s Time spent applying splits: 0.103s Time spent predicting: 0.000s Trial 73, Fold 4: Log loss = 0.38931738228668655, Average precision = 0.9534384907358634, ROC-AUC = 0.9483126277149486, Elapsed Time = 1.8212827000006655 seconds Trial 73, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 73, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.159 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 54 leaves, max depth = 10, train loss: 0.68275, val loss: 0.68220, in 0.016s 1 tree, 78 leaves, max depth = 12, train loss: 0.67212, val loss: 0.67113, in 0.031s 1 tree, 82 leaves, max depth = 11, train loss: 0.66268, val loss: 0.66117, in 0.016s 1 tree, 76 leaves, max depth = 12, train loss: 0.65277, val loss: 0.65085, in 0.016s 1 tree, 77 leaves, max depth = 14, train loss: 0.64361, val loss: 0.64126, in 0.016s 1 tree, 79 leaves, max depth = 12, train loss: 0.63436, val loss: 0.63162, in 0.016s 1 tree, 76 leaves, max depth = 12, train loss: 0.62582, val loss: 0.62266, in 0.031s 1 tree, 79 leaves, max depth = 12, train loss: 0.61717, val loss: 0.61365, in 0.016s 1 tree, 79 leaves, max depth = 12, train loss: 0.60930, val loss: 0.60541, in 0.031s 1 tree, 86 leaves, max depth = 12, train loss: 0.60181, val loss: 0.59755, in 0.016s 1 tree, 78 leaves, max depth = 12, train loss: 0.59400, val loss: 0.58943, in 0.016s 1 tree, 75 leaves, max depth = 15, train loss: 0.58665, val loss: 0.58168, in 0.016s 1 tree, 79 leaves, max depth 
= 13, train loss: 0.57974, val loss: 0.57444, in 0.016s 1 tree, 59 leaves, max depth = 10, train loss: 0.57282, val loss: 0.56718, in 0.031s 1 tree, 79 leaves, max depth = 13, train loss: 0.56587, val loss: 0.55994, in 0.016s 1 tree, 79 leaves, max depth = 13, train loss: 0.55959, val loss: 0.55336, in 0.016s 1 tree, 86 leaves, max depth = 11, train loss: 0.55318, val loss: 0.54671, in 0.016s 1 tree, 85 leaves, max depth = 12, train loss: 0.54695, val loss: 0.54022, in 0.016s 1 tree, 57 leaves, max depth = 12, train loss: 0.54096, val loss: 0.53393, in 0.016s 1 tree, 85 leaves, max depth = 12, train loss: 0.53510, val loss: 0.52782, in 0.016s 1 tree, 79 leaves, max depth = 13, train loss: 0.52970, val loss: 0.52215, in 0.016s 1 tree, 78 leaves, max depth = 13, train loss: 0.52411, val loss: 0.51634, in 0.031s 1 tree, 83 leaves, max depth = 10, train loss: 0.51877, val loss: 0.51080, in 0.016s 1 tree, 82 leaves, max depth = 10, train loss: 0.51391, val loss: 0.50582, in 0.016s 1 tree, 78 leaves, max depth = 13, train loss: 0.50876, val loss: 0.50046, in 0.016s 1 tree, 55 leaves, max depth = 10, train loss: 0.50392, val loss: 0.49539, in 0.016s 1 tree, 78 leaves, max depth = 13, train loss: 0.49903, val loss: 0.49029, in 0.031s 1 tree, 78 leaves, max depth = 13, train loss: 0.49429, val loss: 0.48534, in 0.016s 1 tree, 87 leaves, max depth = 12, train loss: 0.49009, val loss: 0.48093, in 0.031s 1 tree, 70 leaves, max depth = 13, train loss: 0.48590, val loss: 0.47661, in 0.016s 1 tree, 76 leaves, max depth = 13, train loss: 0.48174, val loss: 0.47224, in 0.031s 1 tree, 82 leaves, max depth = 10, train loss: 0.47761, val loss: 0.46793, in 0.016s 1 tree, 58 leaves, max depth = 12, train loss: 0.47360, val loss: 0.46367, in 0.016s 1 tree, 84 leaves, max depth = 10, train loss: 0.46992, val loss: 0.45991, in 0.031s 1 tree, 77 leaves, max depth = 12, train loss: 0.46597, val loss: 0.45579, in 0.016s 1 tree, 77 leaves, max depth = 12, train loss: 0.46213, val loss: 
0.45179, in 0.016s 1 tree, 76 leaves, max depth = 12, train loss: 0.45842, val loss: 0.44793, in 0.016s 1 tree, 56 leaves, max depth = 12, train loss: 0.45494, val loss: 0.44423, in 0.016s 1 tree, 57 leaves, max depth = 12, train loss: 0.45155, val loss: 0.44063, in 0.016s 1 tree, 83 leaves, max depth = 12, train loss: 0.44821, val loss: 0.43713, in 0.031s 1 tree, 77 leaves, max depth = 12, train loss: 0.44500, val loss: 0.43377, in 0.016s 1 tree, 77 leaves, max depth = 12, train loss: 0.44172, val loss: 0.43034, in 0.031s 1 tree, 83 leaves, max depth = 10, train loss: 0.43866, val loss: 0.42716, in 0.016s 1 tree, 78 leaves, max depth = 12, train loss: 0.43555, val loss: 0.42392, in 0.016s 1 tree, 58 leaves, max depth = 12, train loss: 0.43288, val loss: 0.42107, in 0.031s 1 tree, 75 leaves, max depth = 12, train loss: 0.43012, val loss: 0.41827, in 0.016s 1 tree, 57 leaves, max depth = 12, train loss: 0.42561, val loss: 0.41380, in 0.031s 1 tree, 80 leaves, max depth = 12, train loss: 0.42277, val loss: 0.41084, in 0.016s 1 tree, 76 leaves, max depth = 13, train loss: 0.42006, val loss: 0.40799, in 0.016s 1 tree, 60 leaves, max depth = 12, train loss: 0.41750, val loss: 0.40526, in 0.031s 1 tree, 57 leaves, max depth = 12, train loss: 0.41328, val loss: 0.40107, in 0.016s 1 tree, 60 leaves, max depth = 12, train loss: 0.41083, val loss: 0.39845, in 0.031s 1 tree, 85 leaves, max depth = 10, train loss: 0.40844, val loss: 0.39597, in 0.016s 1 tree, 79 leaves, max depth = 13, train loss: 0.40620, val loss: 0.39369, in 0.016s 1 tree, 82 leaves, max depth = 12, train loss: 0.40400, val loss: 0.39138, in 0.031s 1 tree, 59 leaves, max depth = 12, train loss: 0.40180, val loss: 0.38902, in 0.016s 1 tree, 57 leaves, max depth = 12, train loss: 0.39793, val loss: 0.38519, in 0.016s 1 tree, 57 leaves, max depth = 12, train loss: 0.39420, val loss: 0.38151, in 0.031s 1 tree, 79 leaves, max depth = 12, train loss: 0.39198, val loss: 0.37922, in 0.016s Fit 59 trees in 1.612 s, 
(4358 total leaves) Time spent computing histograms: 0.474s Time spent finding best splits: 0.085s Time spent applying splits: 0.089s Time spent predicting: 0.000s Trial 73, Fold 5: Log loss = 0.39799173642544305, Average precision = 0.9482297461164175, ROC-AUC = 0.9457495122902849, Elapsed Time = 1.61448879999989 seconds
Optimization Progress: 74%|#######4 | 74/100 [15:01<06:24, 14.80s/it]
Trial 74, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 74, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Binning 0.040 GB of training data: 0.157 s
Binning 0.004 GB of validation data: 0.000 s
[Trial 74, Fold 1: verbose per-tree fitting log omitted]
Fit 88 trees in 1.767 s, (2728 total leaves)
Trial 74, Fold 1: Log loss = 0.25379233630043185, Average precision = 0.9658598545976502, ROC-AUC = 0.9608860820751064, Elapsed Time = 1.7715054000000237 seconds
Trial 74, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 74, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Binning 0.040 GB of training data: 0.189 s
Binning 0.004 GB of validation data: 0.000 s
[Trial 74, Fold 2: verbose per-tree fitting log omitted]
Fit 88 trees in 1.924 s, (2728 total leaves)
Trial 74, Fold 2: Log loss = 0.25000810147832486, Average precision = 0.9661793912567854, ROC-AUC = 0.9633919848640689, Elapsed Time = 1.929179999999178 seconds
Trial 74, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 74, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Binning 0.040 GB of training data: 0.158 s
Binning 0.004 GB of validation data: 0.000 s
[Trial 74, Fold 3: verbose per-tree fitting log omitted]
Fit 88 trees in 1.923 s, (2728
total leaves) Time spent computing histograms: 0.631s Time spent finding best splits: 0.118s Time spent applying splits: 0.071s Time spent predicting: 0.000s Trial 74, Fold 3: Log loss = 0.24984248592210537, Average precision = 0.9660236678991468, ROC-AUC = 0.9624181211536054, Elapsed Time = 1.928146400001424 seconds Trial 74, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 74, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 Binning 0.040 GB of training data: 0.158 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 31 leaves, max depth = 8, train loss: 0.67358, val loss: 0.67269, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.65502, val loss: 0.65336, in 0.031s 1 tree, 31 leaves, max depth = 10, train loss: 0.63839, val loss: 0.63595, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.62249, val loss: 0.61927, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.60716, val loss: 0.60328, in 0.016s 1 tree, 31 leaves, max depth = 6, train loss: 0.59294, val loss: 0.58834, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.57999, val loss: 0.57478, in 0.016s 1 tree, 31 leaves, max depth = 6, train loss: 0.56735, val loss: 0.56147, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.55506, val loss: 0.54864, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.54374, val loss: 0.53667, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.53273, val loss: 0.52513, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.52299, val loss: 0.51489, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.51305, val loss: 0.50451, in 0.031s 1 tree, 31 leaves, max depth = 9, train loss: 0.50375, val loss: 0.49478, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.49549, val loss: 0.48607, in 0.016s 1 tree, 31 leaves, max depth = 6, train loss: 0.48724, val loss: 0.47734, in 0.016s 1 tree, 31 leaves, max depth = 10, 
train loss: 0.47923, val loss: 0.46890, in 0.016s 1 tree, 31 leaves, max depth = 7, train loss: 0.47174, val loss: 0.46096, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.46455, val loss: 0.45340, in 0.016s 1 tree, 31 leaves, max depth = 7, train loss: 0.45807, val loss: 0.44661, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.45199, val loss: 0.44016, in 0.031s 1 tree, 31 leaves, max depth = 8, train loss: 0.44568, val loss: 0.43350, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.43966, val loss: 0.42719, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.43100, val loss: 0.41841, in 0.016s 1 tree, 31 leaves, max depth = 7, train loss: 0.42566, val loss: 0.41265, in 0.016s 1 tree, 31 leaves, max depth = 12, train loss: 0.41772, val loss: 0.40460, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.41268, val loss: 0.39926, in 0.031s 1 tree, 31 leaves, max depth = 10, train loss: 0.40552, val loss: 0.39195, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.40101, val loss: 0.38719, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.39424, val loss: 0.38033, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.39030, val loss: 0.37620, in 0.031s 1 tree, 31 leaves, max depth = 7, train loss: 0.38619, val loss: 0.37182, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.38008, val loss: 0.36564, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.37426, val loss: 0.35973, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.37058, val loss: 0.35589, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.36550, val loss: 0.35065, in 0.031s 1 tree, 31 leaves, max depth = 11, train loss: 0.36070, val loss: 0.34558, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.35689, val loss: 0.34176, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.35252, val loss: 0.33715, in 0.031s 1 tree, 31 leaves, max depth = 11, train loss: 0.34896, val loss: 0.33359, in 0.016s 1 
tree, 31 leaves, max depth = 9, train loss: 0.34471, val loss: 0.32952, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.34075, val loss: 0.32540, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.33746, val loss: 0.32208, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.33366, val loss: 0.31843, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.33005, val loss: 0.31504, in 0.031s 1 tree, 31 leaves, max depth = 7, train loss: 0.32768, val loss: 0.31246, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.32478, val loss: 0.30953, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.32151, val loss: 0.30611, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.31832, val loss: 0.30312, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.31534, val loss: 0.30001, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.31241, val loss: 0.29728, in 0.016s 1 tree, 31 leaves, max depth = 13, train loss: 0.30971, val loss: 0.29439, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.30710, val loss: 0.29169, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.30447, val loss: 0.28925, in 0.031s 1 tree, 31 leaves, max depth = 11, train loss: 0.30202, val loss: 0.28668, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.29961, val loss: 0.28445, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.29729, val loss: 0.28214, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.29514, val loss: 0.27985, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.29291, val loss: 0.27753, in 0.031s 1 tree, 31 leaves, max depth = 11, train loss: 0.29086, val loss: 0.27538, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.28876, val loss: 0.27320, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.28672, val loss: 0.27130, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.28479, val loss: 0.26955, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.28300, 
val loss: 0.26762, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.28121, val loss: 0.26597, in 0.016s 1 tree, 31 leaves, max depth = 12, train loss: 0.27958, val loss: 0.26426, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.27773, val loss: 0.26234, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.27597, val loss: 0.26059, in 0.031s 1 tree, 31 leaves, max depth = 9, train loss: 0.27468, val loss: 0.25921, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.27312, val loss: 0.25780, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.27165, val loss: 0.25646, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.27001, val loss: 0.25472, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.26863, val loss: 0.25326, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.26728, val loss: 0.25205, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.26601, val loss: 0.25068, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.26477, val loss: 0.24955, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.26328, val loss: 0.24793, in 0.031s 1 tree, 31 leaves, max depth = 10, train loss: 0.26213, val loss: 0.24669, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.26103, val loss: 0.24552, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.25962, val loss: 0.24398, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.25833, val loss: 0.24270, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.25724, val loss: 0.24170, in 0.031s 1 tree, 31 leaves, max depth = 15, train loss: 0.25622, val loss: 0.24067, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.25525, val loss: 0.23977, in 0.016s 1 tree, 31 leaves, max depth = 15, train loss: 0.25430, val loss: 0.23881, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.25335, val loss: 0.23798, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.25249, val loss: 0.23704, in 0.016s 1 tree, 31 leaves, max depth 
= 10, train loss: 0.25164, val loss: 0.23640, in 0.016s Fit 88 trees in 1.924 s, (2728 total leaves) Time spent computing histograms: 0.620s Time spent finding best splits: 0.116s Time spent applying splits: 0.072s Time spent predicting: 0.000s Trial 74, Fold 4: Log loss = 0.25293808784018346, Average precision = 0.9665306166848716, ROC-AUC = 0.9624423419911252, Elapsed Time = 1.941979100000026 seconds Trial 74, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 74, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.158 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 31 leaves, max depth = 8, train loss: 0.67313, val loss: 0.67229, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.65420, val loss: 0.65272, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.63646, val loss: 0.63431, in 0.016s 1 tree, 31 leaves, max depth = 12, train loss: 0.61988, val loss: 0.61715, in 0.016s 1 tree, 31 leaves, max depth = 12, train loss: 0.60432, val loss: 0.60101, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.58974, val loss: 0.58590, in 0.031s 1 tree, 31 leaves, max depth = 12, train loss: 0.57671, val loss: 0.57228, in 0.031s 1 tree, 31 leaves, max depth = 6, train loss: 0.56408, val loss: 0.55913, in 0.047s 1 tree, 31 leaves, max depth = 12, train loss: 0.55185, val loss: 0.54647, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.54054, val loss: 0.53457, in 0.031s 1 tree, 31 leaves, max depth = 11, train loss: 0.52958, val loss: 0.52320, in 0.016s 1 tree, 31 leaves, max depth = 12, train loss: 0.51975, val loss: 0.51295, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.51048, val loss: 0.50328, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.50171, val loss: 0.49433, in 0.031s 1 tree, 31 leaves, max depth = 10, train loss: 0.49336, val loss: 0.48563, in 0.016s 1 tree, 31 leaves, max 
depth = 6, train loss: 0.48508, val loss: 0.47700, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.47696, val loss: 0.46855, in 0.031s 1 tree, 31 leaves, max depth = 12, train loss: 0.46926, val loss: 0.46053, in 0.016s 1 tree, 31 leaves, max depth = 12, train loss: 0.45928, val loss: 0.45052, in 0.016s 1 tree, 31 leaves, max depth = 7, train loss: 0.45252, val loss: 0.44346, in 0.031s 1 tree, 31 leaves, max depth = 11, train loss: 0.44338, val loss: 0.43430, in 0.016s 1 tree, 31 leaves, max depth = 7, train loss: 0.43721, val loss: 0.42788, in 0.016s 1 tree, 31 leaves, max depth = 12, train loss: 0.43116, val loss: 0.42166, in 0.016s 1 tree, 31 leaves, max depth = 7, train loss: 0.42556, val loss: 0.41586, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.42007, val loss: 0.41022, in 0.031s 1 tree, 31 leaves, max depth = 10, train loss: 0.41486, val loss: 0.40489, in 0.016s 1 tree, 31 leaves, max depth = 7, train loss: 0.40997, val loss: 0.39979, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.40257, val loss: 0.39246, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.39798, val loss: 0.38776, in 0.016s 1 tree, 31 leaves, max depth = 7, train loss: 0.39395, val loss: 0.38369, in 0.016s 1 tree, 31 leaves, max depth = 12, train loss: 0.38728, val loss: 0.37711, in 0.031s 1 tree, 31 leaves, max depth = 9, train loss: 0.38094, val loss: 0.37081, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.37703, val loss: 0.36680, in 0.016s 1 tree, 31 leaves, max depth = 12, train loss: 0.37145, val loss: 0.36118, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.36590, val loss: 0.35569, in 0.031s 1 tree, 31 leaves, max depth = 10, train loss: 0.36233, val loss: 0.35206, in 0.016s 1 tree, 31 leaves, max depth = 12, train loss: 0.35746, val loss: 0.34716, in 0.016s 1 tree, 31 leaves, max depth = 13, train loss: 0.35360, val loss: 0.34330, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.34917, val loss: 0.33878, in 
0.016s 1 tree, 31 leaves, max depth = 13, train loss: 0.34560, val loss: 0.33520, in 0.031s 1 tree, 31 leaves, max depth = 11, train loss: 0.34141, val loss: 0.33095, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.33804, val loss: 0.32763, in 0.016s 1 tree, 31 leaves, max depth = 13, train loss: 0.33484, val loss: 0.32447, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.33088, val loss: 0.32089, in 0.016s 1 tree, 31 leaves, max depth = 13, train loss: 0.32722, val loss: 0.31717, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.32364, val loss: 0.31386, in 0.021s 1 tree, 31 leaves, max depth = 14, train loss: 0.32075, val loss: 0.31101, in 0.026s 1 tree, 31 leaves, max depth = 10, train loss: 0.31739, val loss: 0.30799, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.31423, val loss: 0.30478, in 0.016s 1 tree, 31 leaves, max depth = 7, train loss: 0.31202, val loss: 0.30265, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.30900, val loss: 0.29995, in 0.016s 1 tree, 31 leaves, max depth = 12, train loss: 0.30619, val loss: 0.29709, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.30342, val loss: 0.29463, in 0.016s 1 tree, 31 leaves, max depth = 12, train loss: 0.30084, val loss: 0.29200, in 0.031s 1 tree, 31 leaves, max depth = 8, train loss: 0.29829, val loss: 0.28975, in 0.016s 1 tree, 31 leaves, max depth = 12, train loss: 0.29594, val loss: 0.28733, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.29361, val loss: 0.28508, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.29135, val loss: 0.28280, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.28911, val loss: 0.28075, in 0.031s 1 tree, 31 leaves, max depth = 8, train loss: 0.28697, val loss: 0.27863, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.28493, val loss: 0.27655, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.28300, val loss: 0.27468, in 0.016s 1 tree, 31 leaves, max depth = 8, train 
loss: 0.28100, val loss: 0.27287, in 0.016s 1 tree, 31 leaves, max depth = 13, train loss: 0.27917, val loss: 0.27099, in 0.031s 1 tree, 31 leaves, max depth = 11, train loss: 0.27772, val loss: 0.26963, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.27591, val loss: 0.26806, in 0.016s 1 tree, 31 leaves, max depth = 13, train loss: 0.27426, val loss: 0.26630, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.27254, val loss: 0.26463, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.27081, val loss: 0.26294, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.26914, val loss: 0.26128, in 0.031s 1 tree, 31 leaves, max depth = 8, train loss: 0.26755, val loss: 0.25986, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.26607, val loss: 0.25843, in 0.016s 1 tree, 31 leaves, max depth = 14, train loss: 0.26464, val loss: 0.25697, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.26319, val loss: 0.25570, in 0.016s 1 tree, 31 leaves, max depth = 14, train loss: 0.26188, val loss: 0.25436, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.26040, val loss: 0.25290, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.25900, val loss: 0.25150, in 0.031s 1 tree, 31 leaves, max depth = 9, train loss: 0.25771, val loss: 0.25036, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.25602, val loss: 0.24875, in 0.016s 1 tree, 31 leaves, max depth = 12, train loss: 0.25484, val loss: 0.24756, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.25366, val loss: 0.24653, in 0.016s 1 tree, 31 leaves, max depth = 16, train loss: 0.25255, val loss: 0.24542, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.25142, val loss: 0.24442, in 0.016s 1 tree, 31 leaves, max depth = 13, train loss: 0.25037, val loss: 0.24338, in 0.016s 1 tree, 31 leaves, max depth = 14, train loss: 0.24941, val loss: 0.24240, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.24840, val loss: 0.24150, in 0.016s 1 tree, 31 
leaves, max depth = 12, train loss: 0.24748, val loss: 0.24063, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.24654, val loss: 0.23985, in 0.016s Fit 88 trees in 2.002 s, (2728 total leaves) Time spent computing histograms: 0.666s Time spent finding best splits: 0.127s Time spent applying splits: 0.079s Time spent predicting: 0.031s Trial 74, Fold 5: Log loss = 0.25782377744800283, Average precision = 0.963778447082425, ROC-AUC = 0.9593320327740928, Elapsed Time = 2.021543599999859 seconds
Optimization Progress: 75%|#######5 | 75/100 [15:18<06:27, 15.51s/it]
[Verbose per-round boosting log condensed; per-fold summaries retained.]
Trial 75, Fold 1: Train size = 20663 (0 = 10533, 1 = 10130, 0/1 = 1.0398); Validation size = 5175 (0 = 2592, 1 = 2583, 0/1 = 1.0035). Fit 45 trees in 1.283 s (1877 total leaves). Log loss = 0.32210, Average precision = 0.95843, ROC-AUC = 0.95273, Elapsed Time = 1.29 s
Trial 75, Fold 2: Train size = 20701 (0 = 10471, 1 = 10230, 0/1 = 1.0236); Validation size = 5137 (0 = 2654, 1 = 2483, 0/1 = 1.0689). Fit 45 trees in 1.236 s (1739 total leaves). Log loss = 0.32248, Average precision = 0.95621, ROC-AUC = 0.95359, Elapsed Time = 1.26 s
Trial 75, Fold 3: Train size = 20682 (0 = 10517, 1 = 10165, 0/1 = 1.0346); Validation size = 5156 (0 = 2608, 1 = 2548, 0/1 = 1.0235). Fit 45 trees in 1.267 s (1891 total leaves). Log loss = 0.31753, Average precision = 0.95910, ROC-AUC = 0.95554, Elapsed Time = 1.26 s
Trial 75, Fold 4: Train size = 20656 (0 = 10479, 1 = 10177, 0/1 = 1.0297); Validation size = 5182 (0 = 2646, 1 = 2536, 0/1 = 1.0434). [per-round boosting log continues]
= 11, train loss: 0.39605, val loss: 0.38338, in 0.016s 1 tree, 51 leaves, max depth = 11, train loss: 0.38905, val loss: 0.37632, in 0.031s 1 tree, 51 leaves, max depth = 11, train loss: 0.38247, val loss: 0.36968, in 0.016s 1 tree, 57 leaves, max depth = 13, train loss: 0.37626, val loss: 0.36341, in 0.016s 1 tree, 29 leaves, max depth = 7, train loss: 0.37240, val loss: 0.35922, in 0.031s 1 tree, 50 leaves, max depth = 11, train loss: 0.36881, val loss: 0.35557, in 0.016s 1 tree, 53 leaves, max depth = 8, train loss: 0.36513, val loss: 0.35163, in 0.016s 1 tree, 28 leaves, max depth = 12, train loss: 0.35993, val loss: 0.34623, in 0.000s 1 tree, 33 leaves, max depth = 11, train loss: 0.35505, val loss: 0.34119, in 0.016s 1 tree, 42 leaves, max depth = 13, train loss: 0.35112, val loss: 0.33725, in 0.031s 1 tree, 65 leaves, max depth = 14, train loss: 0.34635, val loss: 0.33285, in 0.016s 1 tree, 29 leaves, max depth = 12, train loss: 0.34205, val loss: 0.32836, in 0.016s 1 tree, 42 leaves, max depth = 12, train loss: 0.33843, val loss: 0.32471, in 0.016s 1 tree, 35 leaves, max depth = 13, train loss: 0.33451, val loss: 0.32055, in 0.016s 1 tree, 63 leaves, max depth = 14, train loss: 0.33042, val loss: 0.31680, in 0.031s 1 tree, 48 leaves, max depth = 12, train loss: 0.32693, val loss: 0.31321, in 0.016s 1 tree, 40 leaves, max depth = 12, train loss: 0.32379, val loss: 0.31002, in 0.031s 1 tree, 50 leaves, max depth = 12, train loss: 0.32061, val loss: 0.30675, in 0.016s Fit 45 trees in 1.235 s, (1829 total leaves) Time spent computing histograms: 0.357s Time spent finding best splits: 0.066s Time spent applying splits: 0.045s Time spent predicting: 0.000s Trial 75, Fold 4: Log loss = 0.32137275120798536, Average precision = 0.9595918254177039, ROC-AUC = 0.9550634133779696, Elapsed Time = 1.250694600001225 seconds Trial 75, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 75, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 
2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.173 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 24 leaves, max depth = 6, train loss: 0.67198, val loss: 0.67111, in 0.016s 1 tree, 32 leaves, max depth = 9, train loss: 0.65189, val loss: 0.65033, in 0.016s 1 tree, 33 leaves, max depth = 9, train loss: 0.63318, val loss: 0.63101, in 0.031s 1 tree, 31 leaves, max depth = 7, train loss: 0.61603, val loss: 0.61320, in 0.016s 1 tree, 32 leaves, max depth = 10, train loss: 0.59974, val loss: 0.59635, in 0.016s 1 tree, 34 leaves, max depth = 10, train loss: 0.58526, val loss: 0.58123, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.57084, val loss: 0.56627, in 0.016s 1 tree, 34 leaves, max depth = 11, train loss: 0.55811, val loss: 0.55296, in 0.016s 1 tree, 38 leaves, max depth = 13, train loss: 0.54535, val loss: 0.53974, in 0.000s 1 tree, 28 leaves, max depth = 6, train loss: 0.53368, val loss: 0.52742, in 0.016s 1 tree, 39 leaves, max depth = 13, train loss: 0.52238, val loss: 0.51571, in 0.031s 1 tree, 39 leaves, max depth = 13, train loss: 0.51176, val loss: 0.50473, in 0.016s 1 tree, 34 leaves, max depth = 7, train loss: 0.50191, val loss: 0.49446, in 0.016s 1 tree, 39 leaves, max depth = 13, train loss: 0.49242, val loss: 0.48461, in 0.016s 1 tree, 39 leaves, max depth = 8, train loss: 0.48368, val loss: 0.47551, in 0.016s 1 tree, 38 leaves, max depth = 13, train loss: 0.47519, val loss: 0.46673, in 0.016s 1 tree, 44 leaves, max depth = 9, train loss: 0.46769, val loss: 0.45902, in 0.016s 1 tree, 39 leaves, max depth = 13, train loss: 0.46004, val loss: 0.45116, in 0.016s 1 tree, 41 leaves, max depth = 13, train loss: 0.45277, val loss: 0.44365, in 0.016s 1 tree, 40 leaves, max depth = 12, train loss: 0.44591, val loss: 0.43656, in 0.016s 1 tree, 37 leaves, max depth = 7, train loss: 0.43958, val loss: 0.42997, in 0.031s 1 tree, 39 leaves, max depth = 13, train loss: 0.43341, val loss: 0.42367, in 
0.016s 1 tree, 38 leaves, max depth = 13, train loss: 0.42749, val loss: 0.41766, in 0.016s 1 tree, 34 leaves, max depth = 13, train loss: 0.41856, val loss: 0.40882, in 0.016s 1 tree, 40 leaves, max depth = 8, train loss: 0.41037, val loss: 0.40068, in 0.016s 1 tree, 38 leaves, max depth = 12, train loss: 0.40518, val loss: 0.39539, in 0.016s 1 tree, 37 leaves, max depth = 13, train loss: 0.39750, val loss: 0.38779, in 0.031s 1 tree, 47 leaves, max depth = 13, train loss: 0.39028, val loss: 0.38066, in 0.016s 1 tree, 44 leaves, max depth = 14, train loss: 0.38349, val loss: 0.37397, in 0.016s 1 tree, 39 leaves, max depth = 13, train loss: 0.37935, val loss: 0.36974, in 0.016s 1 tree, 43 leaves, max depth = 13, train loss: 0.37513, val loss: 0.36546, in 0.016s 1 tree, 47 leaves, max depth = 10, train loss: 0.37127, val loss: 0.36156, in 0.016s 1 tree, 57 leaves, max depth = 12, train loss: 0.36545, val loss: 0.35582, in 0.031s 1 tree, 49 leaves, max depth = 8, train loss: 0.36189, val loss: 0.35224, in 0.016s 1 tree, 30 leaves, max depth = 11, train loss: 0.35665, val loss: 0.34691, in 0.016s 1 tree, 30 leaves, max depth = 11, train loss: 0.35172, val loss: 0.34191, in 0.016s 1 tree, 44 leaves, max depth = 14, train loss: 0.34782, val loss: 0.33804, in 0.016s 1 tree, 30 leaves, max depth = 12, train loss: 0.34334, val loss: 0.33346, in 0.016s 1 tree, 42 leaves, max depth = 14, train loss: 0.33966, val loss: 0.32987, in 0.016s 1 tree, 72 leaves, max depth = 15, train loss: 0.33530, val loss: 0.32588, in 0.016s 1 tree, 30 leaves, max depth = 14, train loss: 0.33133, val loss: 0.32180, in 0.016s 1 tree, 42 leaves, max depth = 14, train loss: 0.32796, val loss: 0.31851, in 0.031s 1 tree, 48 leaves, max depth = 14, train loss: 0.32464, val loss: 0.31509, in 0.016s 1 tree, 72 leaves, max depth = 14, train loss: 0.32081, val loss: 0.31166, in 0.016s 1 tree, 52 leaves, max depth = 15, train loss: 0.31830, val loss: 0.30920, in 0.031s Fit 45 trees in 1.174 s, (1794 total 
leaves) Time spent computing histograms: 0.325s Time spent finding best splits: 0.061s Time spent applying splits: 0.041s Time spent predicting: 0.000s Trial 75, Fold 5: Log loss = 0.32681020784542797, Average precision = 0.956019732699627, ROC-AUC = 0.9517103840365644, Elapsed Time = 1.1787164000015764 seconds
Optimization Progress: 76%|#######6 | 76/100 [15:31<05:55, 14.81s/it]
Trial 76, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 76, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Trial 76, Fold 1: Log loss = 0.45140575375698305, Average precision = 0.9059768387995848, ROC-AUC = 0.9121660859465738, Elapsed Time = 0.7139700999996421 seconds (fit 70 trees in 0.704 s, 1102 total leaves)
Trial 76, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 76, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Trial 76, Fold 2: Log loss = 0.4535027898029579, Average precision = 0.9029478982459576, ROC-AUC = 0.9163226291457116, Elapsed Time = 0.7876867999984825 seconds (fit 70 trees in 0.783 s, 1091 total leaves)
Trial 76, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 76, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Trial 76, Fold 3: Log loss = 0.44810020855286986, Average precision = 0.912123486339555, ROC-AUC = 0.9214749358332289, Elapsed Time = 0.8128985000003013 seconds (fit 70 trees in 0.815 s, 1102 total leaves)
Trial 76, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 76, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
Fit 70 trees in 0.799 s, (1102 total leaves)
Trial 76, Fold 4: Log loss 
= 0.45288470105915773, Average precision = 0.9060052309612978, ROC-AUC = 0.9150917640101958, Elapsed Time = 0.8077116000004025 seconds Trial 76, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 76, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.173 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 17 leaves, max depth = 8, train loss: 0.68551, val loss: 0.68497, in 0.000s 1 tree, 6 leaves, max depth = 4, train loss: 0.68135, val loss: 0.68038, in 0.000s 1 tree, 17 leaves, max depth = 9, train loss: 0.67399, val loss: 0.67253, in 0.016s 1 tree, 6 leaves, max depth = 4, train loss: 0.66686, val loss: 0.66493, in 0.000s 1 tree, 6 leaves, max depth = 4, train loss: 0.65997, val loss: 0.65758, in 0.016s 1 tree, 17 leaves, max depth = 9, train loss: 0.65331, val loss: 0.65043, in 0.000s 1 tree, 6 leaves, max depth = 4, train loss: 0.64684, val loss: 0.64352, in 0.016s 1 tree, 17 leaves, max depth = 11, train loss: 0.64060, val loss: 0.63680, in 0.000s 1 tree, 17 leaves, max depth = 8, train loss: 0.63693, val loss: 0.63286, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.63092, val loss: 0.62638, in 0.000s 1 tree, 17 leaves, max depth = 9, train loss: 0.62516, val loss: 0.62017, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.61959, val loss: 0.61417, in 0.000s 1 tree, 17 leaves, max depth = 9, train loss: 0.61416, val loss: 0.60834, in 0.016s 1 tree, 17 leaves, max depth = 6, train loss: 0.60895, val loss: 0.60267, in 0.000s 1 tree, 17 leaves, max depth = 11, train loss: 0.60384, val loss: 0.59713, in 0.016s 1 tree, 17 leaves, max depth = 8, train loss: 0.59889, val loss: 0.59174, in 0.000s 1 tree, 17 leaves, max depth = 8, train loss: 0.59436, val loss: 0.58683, in 0.016s 1 tree, 17 leaves, max depth = 9, train loss: 0.58968, val loss: 0.58174, in 0.000s 1 tree, 6 leaves, max depth = 4, train loss: 0.58512, val 
loss: 0.57681, in 0.000s 1 tree, 17 leaves, max depth = 7, train loss: 0.58065, val loss: 0.57194, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.57631, val loss: 0.56720, in 0.000s 1 tree, 17 leaves, max depth = 7, train loss: 0.57209, val loss: 0.56260, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.56826, val loss: 0.55842, in 0.000s 1 tree, 17 leaves, max depth = 7, train loss: 0.56427, val loss: 0.55405, in 0.016s 1 tree, 17 leaves, max depth = 9, train loss: 0.56046, val loss: 0.54987, in 0.000s 1 tree, 17 leaves, max depth = 9, train loss: 0.55674, val loss: 0.54579, in 0.016s 1 tree, 17 leaves, max depth = 9, train loss: 0.55314, val loss: 0.54183, in 0.000s 1 tree, 17 leaves, max depth = 11, train loss: 0.54964, val loss: 0.53800, in 0.016s 1 tree, 17 leaves, max depth = 9, train loss: 0.54624, val loss: 0.53425, in 0.000s 1 tree, 17 leaves, max depth = 7, train loss: 0.54261, val loss: 0.53077, in 0.016s 1 tree, 17 leaves, max depth = 11, train loss: 0.53935, val loss: 0.52719, in 0.000s 1 tree, 6 leaves, max depth = 4, train loss: 0.53616, val loss: 0.52369, in 0.016s 1 tree, 17 leaves, max depth = 9, train loss: 0.53308, val loss: 0.52027, in 0.000s 1 tree, 6 leaves, max depth = 4, train loss: 0.53006, val loss: 0.51694, in 0.000s 1 tree, 17 leaves, max depth = 7, train loss: 0.52709, val loss: 0.51365, in 0.016s 1 tree, 17 leaves, max depth = 11, train loss: 0.52426, val loss: 0.51050, in 0.000s 1 tree, 17 leaves, max depth = 11, train loss: 0.52151, val loss: 0.50745, in 0.016s 1 tree, 17 leaves, max depth = 8, train loss: 0.51819, val loss: 0.50430, in 0.000s 1 tree, 17 leaves, max depth = 9, train loss: 0.51552, val loss: 0.50136, in 0.016s 1 tree, 17 leaves, max depth = 10, train loss: 0.51296, val loss: 0.49851, in 0.000s 1 tree, 17 leaves, max depth = 9, train loss: 0.51044, val loss: 0.49572, in 0.000s 1 tree, 17 leaves, max depth = 6, train loss: 0.50801, val loss: 0.49298, in 0.000s 1 tree, 17 leaves, max depth = 9, train 
loss: 0.50565, val loss: 0.49033, in 0.000s 1 tree, 17 leaves, max depth = 8, train loss: 0.50255, val loss: 0.48739, in 0.016s 1 tree, 6 leaves, max depth = 4, train loss: 0.50025, val loss: 0.48483, in 0.000s 1 tree, 17 leaves, max depth = 8, train loss: 0.49803, val loss: 0.48235, in 0.016s 1 tree, 17 leaves, max depth = 6, train loss: 0.49505, val loss: 0.47954, in 0.000s 1 tree, 17 leaves, max depth = 7, train loss: 0.49339, val loss: 0.47796, in 0.016s 1 tree, 17 leaves, max depth = 8, train loss: 0.49129, val loss: 0.47558, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.48845, val loss: 0.47291, in 0.000s 1 tree, 17 leaves, max depth = 9, train loss: 0.48644, val loss: 0.47063, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.48369, val loss: 0.46805, in 0.000s 1 tree, 17 leaves, max depth = 10, train loss: 0.48176, val loss: 0.46587, in 0.016s 1 tree, 17 leaves, max depth = 9, train loss: 0.47988, val loss: 0.46374, in 0.000s 1 tree, 17 leaves, max depth = 9, train loss: 0.47802, val loss: 0.46166, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.47539, val loss: 0.45922, in 0.000s 1 tree, 6 leaves, max depth = 4, train loss: 0.47360, val loss: 0.45719, in 0.000s 1 tree, 17 leaves, max depth = 8, train loss: 0.47187, val loss: 0.45524, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.46934, val loss: 0.45289, in 0.000s 1 tree, 17 leaves, max depth = 11, train loss: 0.46770, val loss: 0.45100, in 0.016s 1 tree, 17 leaves, max depth = 8, train loss: 0.46608, val loss: 0.44916, in 0.000s 1 tree, 17 leaves, max depth = 10, train loss: 0.46450, val loss: 0.44737, in 0.016s 1 tree, 17 leaves, max depth = 10, train loss: 0.46296, val loss: 0.44563, in 0.000s 1 tree, 17 leaves, max depth = 7, train loss: 0.46142, val loss: 0.44387, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.45998, val loss: 0.44220, in 0.000s 1 tree, 17 leaves, max depth = 6, train loss: 0.45832, val loss: 0.44076, in 0.016s 1 tree, 17 leaves, 
max depth = 7, train loss: 0.45694, val loss: 0.43914, in 0.000s 1 tree, 17 leaves, max depth = 8, train loss: 0.45558, val loss: 0.43758, in 0.016s 1 tree, 17 leaves, max depth = 8, train loss: 0.45325, val loss: 0.43543, in 0.000s 1 tree, 17 leaves, max depth = 6, train loss: 0.45190, val loss: 0.43387, in 0.016s Fit 70 trees in 0.845 s, (1091 total leaves) Time spent computing histograms: 0.309s Time spent finding best splits: 0.026s Time spent applying splits: 0.026s Time spent predicting: 0.000s Trial 76, Fold 5: Log loss = 0.45875741501839024, Average precision = 0.9023385615786321, ROC-AUC = 0.910490608104342, Elapsed Time = 0.8457624999991822 seconds
Optimization Progress: 77%|#######7 | 77/100 [15:42<05:12, 13.58s/it]
Trial 77, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371 Trial 77, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913 0.143 s 0.040 GB of training data: 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 215 leaves, max depth = 15, train loss: 0.67801, val loss: 0.67814, in 0.031s 1 tree, 239 leaves, max depth = 15, train loss: 0.66343, val loss: 0.66381, in 0.031s 1 tree, 239 leaves, max depth = 19, train loss: 0.65008, val loss: 0.65061, in 0.016s 1 tree, 239 leaves, max depth = 21, train loss: 0.63755, val loss: 0.63810, in 0.031s 1 tree, 239 leaves, max depth = 16, train loss: 0.62487, val loss: 0.62569, in 0.031s 1 tree, 239 leaves, max depth = 16, train loss: 0.61277, val loss: 0.61378, in 0.016s 1 tree, 239 leaves, max depth = 16, train loss: 0.60123, val loss: 0.60250, in 0.031s 1 tree, 239 leaves, max depth = 16, train loss: 0.59020, val loss: 0.59172, in 0.031s 1 tree, 239 leaves, max depth = 16, train loss: 0.57966, val loss: 0.58142, in 0.031s 1 tree, 239 leaves, max depth = 16, train loss: 0.56959, val loss: 0.57159, in 0.016s 1 tree, 239 leaves, max depth = 16, train loss: 0.55995, val loss: 0.56216, in 0.031s 1 tree, 239 leaves, max depth = 15, train loss: 0.55080, val loss: 0.55313, in 0.031s 1 tree, 239 leaves, max depth = 20, train loss: 0.54175, val loss: 0.54411, in 0.031s 1 tree, 239 leaves, max depth = 16, train loss: 0.53367, val loss: 0.53603, in 0.031s 1 tree, 239 leaves, max depth = 16, train loss: 0.52583, val loss: 0.52832, in 0.031s 1 tree, 239 leaves, max depth = 16, train loss: 0.51793, val loss: 0.52064, in 0.016s 1 tree, 239 leaves, max depth = 21, train loss: 0.51083, val loss: 0.51353, in 0.031s 1 tree, 239 leaves, max depth = 16, train loss: 0.50352, val loss: 0.50642, in 0.031s 1 tree, 239 leaves, max depth = 16, train loss: 0.49649, val loss: 0.49962, in 0.031s 1 tree, 239 leaves, max depth = 15, train loss: 0.48982, val loss: 0.49308, in 
0.031s 1 tree, 239 leaves, max depth = 15, train loss: 0.48341, val loss: 0.48677, in 0.031s 1 tree, 239 leaves, max depth = 23, train loss: 0.47743, val loss: 0.48093, in 0.031s 1 tree, 239 leaves, max depth = 15, train loss: 0.47138, val loss: 0.47510, in 0.016s 1 tree, 239 leaves, max depth = 18, train loss: 0.46547, val loss: 0.46914, in 0.016s 1 tree, 197 leaves, max depth = 19, train loss: 0.45788, val loss: 0.46181, in 0.031s 1 tree, 239 leaves, max depth = 23, train loss: 0.45269, val loss: 0.45677, in 0.031s 1 tree, 239 leaves, max depth = 19, train loss: 0.44748, val loss: 0.45154, in 0.031s 1 tree, 239 leaves, max depth = 19, train loss: 0.44224, val loss: 0.44640, in 0.031s 1 tree, 239 leaves, max depth = 15, train loss: 0.43731, val loss: 0.44168, in 0.031s 1 tree, 239 leaves, max depth = 16, train loss: 0.43254, val loss: 0.43712, in 0.031s 1 tree, 239 leaves, max depth = 15, train loss: 0.42795, val loss: 0.43272, in 0.031s 1 tree, 197 leaves, max depth = 15, train loss: 0.42147, val loss: 0.42648, in 0.031s 1 tree, 239 leaves, max depth = 17, train loss: 0.41716, val loss: 0.42240, in 0.031s 1 tree, 239 leaves, max depth = 14, train loss: 0.41300, val loss: 0.41844, in 0.031s 1 tree, 239 leaves, max depth = 14, train loss: 0.40898, val loss: 0.41464, in 0.031s 1 tree, 197 leaves, max depth = 15, train loss: 0.40306, val loss: 0.40893, in 0.016s 1 tree, 239 leaves, max depth = 18, train loss: 0.39914, val loss: 0.40512, in 0.031s 1 tree, 201 leaves, max depth = 15, train loss: 0.39360, val loss: 0.39982, in 0.031s 1 tree, 196 leaves, max depth = 15, train loss: 0.38825, val loss: 0.39467, in 0.016s 1 tree, 239 leaves, max depth = 18, train loss: 0.38463, val loss: 0.39116, in 0.031s 1 tree, 195 leaves, max depth = 15, train loss: 0.37959, val loss: 0.38632, in 0.031s 1 tree, 239 leaves, max depth = 18, train loss: 0.37622, val loss: 0.38299, in 0.031s 1 tree, 239 leaves, max depth = 18, train loss: 0.37292, val loss: 0.37981, in 0.031s 1 tree, 231 
leaves, max depth = 16, train loss: 0.36828, val loss: 0.37534, in 0.031s 1 tree, 197 leaves, max depth = 15, train loss: 0.36378, val loss: 0.37104, in 0.016s 1 tree, 239 leaves, max depth = 16, train loss: 0.36085, val loss: 0.36833, in 0.031s 1 tree, 234 leaves, max depth = 15, train loss: 0.35668, val loss: 0.36432, in 0.031s 1 tree, 239 leaves, max depth = 15, train loss: 0.35377, val loss: 0.36160, in 0.031s 1 tree, 102 leaves, max depth = 14, train loss: 0.35003, val loss: 0.35787, in 0.016s 1 tree, 102 leaves, max depth = 14, train loss: 0.34645, val loss: 0.35430, in 0.031s 1 tree, 239 leaves, max depth = 16, train loss: 0.34374, val loss: 0.35181, in 0.031s 1 tree, 98 leaves, max depth = 14, train loss: 0.34034, val loss: 0.34843, in 0.016s 1 tree, 106 leaves, max depth = 14, train loss: 0.33713, val loss: 0.34524, in 0.031s 1 tree, 239 leaves, max depth = 20, train loss: 0.33425, val loss: 0.34246, in 0.094s 1 tree, 148 leaves, max depth = 13, train loss: 0.33110, val loss: 0.33959, in 0.031s 1 tree, 99 leaves, max depth = 14, train loss: 0.32811, val loss: 0.33661, in 0.016s 1 tree, 239 leaves, max depth = 22, train loss: 0.32540, val loss: 0.33403, in 0.047s 1 tree, 149 leaves, max depth = 13, train loss: 0.32249, val loss: 0.33139, in 0.031s 1 tree, 149 leaves, max depth = 13, train loss: 0.31970, val loss: 0.32887, in 0.016s 1 tree, 103 leaves, max depth = 15, train loss: 0.31704, val loss: 0.32621, in 0.031s 1 tree, 239 leaves, max depth = 18, train loss: 0.31454, val loss: 0.32388, in 0.031s 1 tree, 239 leaves, max depth = 20, train loss: 0.31209, val loss: 0.32166, in 0.031s 1 tree, 146 leaves, max depth = 13, train loss: 0.30955, val loss: 0.31937, in 0.016s 1 tree, 101 leaves, max depth = 15, train loss: 0.30713, val loss: 0.31697, in 0.031s 1 tree, 145 leaves, max depth = 13, train loss: 0.30475, val loss: 0.31484, in 0.016s 1 tree, 100 leaves, max depth = 15, train loss: 0.30249, val loss: 0.31260, in 0.016s 1 tree, 239 leaves, max depth = 20, 
train loss: 0.30025, val loss: 0.31055, in 0.031s 1 tree, 146 leaves, max depth = 13, train loss: 0.29804, val loss: 0.30859, in 0.031s 1 tree, 101 leaves, max depth = 15, train loss: 0.29596, val loss: 0.30651, in 0.016s 1 tree, 239 leaves, max depth = 19, train loss: 0.29382, val loss: 0.30451, in 0.031s 1 tree, 100 leaves, max depth = 15, train loss: 0.29185, val loss: 0.30254, in 0.016s 1 tree, 239 leaves, max depth = 19, train loss: 0.28975, val loss: 0.30054, in 0.031s 1 tree, 145 leaves, max depth = 13, train loss: 0.28778, val loss: 0.29880, in 0.016s 1 tree, 239 leaves, max depth = 20, train loss: 0.28584, val loss: 0.29698, in 0.047s 1 tree, 101 leaves, max depth = 16, train loss: 0.28404, val loss: 0.29522, in 0.016s 1 tree, 138 leaves, max depth = 14, train loss: 0.28220, val loss: 0.29360, in 0.016s 1 tree, 139 leaves, max depth = 14, train loss: 0.28043, val loss: 0.29204, in 0.031s 1 tree, 239 leaves, max depth = 19, train loss: 0.27854, val loss: 0.29027, in 0.031s 1 tree, 142 leaves, max depth = 16, train loss: 0.27688, val loss: 0.28881, in 0.016s 1 tree, 96 leaves, max depth = 17, train loss: 0.27527, val loss: 0.28720, in 0.016s 1 tree, 103 leaves, max depth = 16, train loss: 0.27375, val loss: 0.28571, in 0.031s 1 tree, 137 leaves, max depth = 16, train loss: 0.27220, val loss: 0.28436, in 0.016s Fit 82 trees in 2.581 s, (16428 total leaves) Time spent computing histograms: 0.714s Time spent finding best splits: 0.357s Time spent applying splits: 0.265s Time spent predicting: 0.016s Trial 77, Fold 1: Log loss = 0.2906058467959607, Average precision = 0.9604337413844874, ROC-AUC = 0.9551242125626724, Elapsed Time = 2.6012752999995428 seconds Trial 77, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 77, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986 Binning 0.040 GB of training data: 0.158 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 227 
leaves, max depth = 17, train loss: 0.67806, val loss: 0.67807, in 0.016s 1 tree, 239 leaves, max depth = 15, train loss: 0.66349, val loss: 0.66363, in 0.031s 1 tree, 239 leaves, max depth = 20, train loss: 0.65034, val loss: 0.65058, in 0.047s 1 tree, 239 leaves, max depth = 18, train loss: 0.63797, val loss: 0.63838, in 0.031s 1 tree, 239 leaves, max depth = 17, train loss: 0.62525, val loss: 0.62581, in 0.031s 1 tree, 239 leaves, max depth = 17, train loss: 0.61313, val loss: 0.61383, in 0.031s 1 tree, 239 leaves, max depth = 17, train loss: 0.60156, val loss: 0.60240, in 0.031s 1 tree, 239 leaves, max depth = 17, train loss: 0.59052, val loss: 0.59148, in 0.031s 1 tree, 239 leaves, max depth = 16, train loss: 0.57996, val loss: 0.58105, in 0.031s 1 tree, 239 leaves, max depth = 17, train loss: 0.56987, val loss: 0.57111, in 0.031s 1 tree, 239 leaves, max depth = 16, train loss: 0.56001, val loss: 0.56144, in 0.031s 1 tree, 239 leaves, max depth = 17, train loss: 0.55076, val loss: 0.55232, in 0.031s 1 tree, 239 leaves, max depth = 17, train loss: 0.54190, val loss: 0.54357, in 0.031s 1 tree, 239 leaves, max depth = 17, train loss: 0.53340, val loss: 0.53516, in 0.031s 1 tree, 239 leaves, max depth = 17, train loss: 0.52524, val loss: 0.52710, in 0.031s 1 tree, 239 leaves, max depth = 19, train loss: 0.51758, val loss: 0.51954, in 0.047s 1 tree, 239 leaves, max depth = 18, train loss: 0.51022, val loss: 0.51237, in 0.031s 1 tree, 239 leaves, max depth = 18, train loss: 0.50278, val loss: 0.50511, in 0.031s 1 tree, 239 leaves, max depth = 16, train loss: 0.49563, val loss: 0.49816, in 0.047s 1 tree, 239 leaves, max depth = 17, train loss: 0.48876, val loss: 0.49149, in 0.031s 1 tree, 239 leaves, max depth = 17, train loss: 0.48215, val loss: 0.48509, in 0.016s 1 tree, 239 leaves, max depth = 20, train loss: 0.47622, val loss: 0.47933, in 0.031s 1 tree, 239 leaves, max depth = 16, train loss: 0.47008, val loss: 0.47341, in 0.047s 1 tree, 239 leaves, max depth = 
17, train loss: 0.46418, val loss: 0.46771, in 0.047s 1 tree, 239 leaves, max depth = 16, train loss: 0.45849, val loss: 0.46219, in 0.031s 1 tree, 239 leaves, max depth = 18, train loss: 0.45301, val loss: 0.45688, in 0.031s 1 tree, 239 leaves, max depth = 16, train loss: 0.44773, val loss: 0.45172, in 0.031s 1 tree, 239 leaves, max depth = 17, train loss: 0.44265, val loss: 0.44677, in 0.047s 1 tree, 239 leaves, max depth = 18, train loss: 0.43774, val loss: 0.44203, in 0.031s 1 tree, 193 leaves, max depth = 18, train loss: 0.43098, val loss: 0.43546, in 0.031s 1 tree, 239 leaves, max depth = 17, train loss: 0.42637, val loss: 0.43097, in 0.031s 1 tree, 193 leaves, max depth = 18, train loss: 0.42002, val loss: 0.42481, in 0.031s 1 tree, 239 leaves, max depth = 17, train loss: 0.41577, val loss: 0.42080, in 0.031s 1 tree, 239 leaves, max depth = 19, train loss: 0.41170, val loss: 0.41690, in 0.047s 1 tree, 197 leaves, max depth = 17, train loss: 0.40582, val loss: 0.41122, in 0.031s 1 tree, 239 leaves, max depth = 16, train loss: 0.40188, val loss: 0.40751, in 0.031s 1 tree, 194 leaves, max depth = 18, train loss: 0.39636, val loss: 0.40219, in 0.031s 1 tree, 239 leaves, max depth = 17, train loss: 0.39264, val loss: 0.39859, in 0.031s 1 tree, 239 leaves, max depth = 15, train loss: 0.38905, val loss: 0.39519, in 0.031s 1 tree, 197 leaves, max depth = 18, train loss: 0.38392, val loss: 0.39027, in 0.031s 1 tree, 196 leaves, max depth = 17, train loss: 0.37901, val loss: 0.38556, in 0.031s 1 tree, 239 leaves, max depth = 16, train loss: 0.37567, val loss: 0.38251, in 0.047s 1 tree, 200 leaves, max depth = 17, train loss: 0.37103, val loss: 0.37806, in 0.031s 1 tree, 239 leaves, max depth = 21, train loss: 0.36785, val loss: 0.37512, in 0.031s 1 tree, 239 leaves, max depth = 18, train loss: 0.36484, val loss: 0.37233, in 0.031s 1 tree, 202 leaves, max depth = 17, train loss: 0.36051, val loss: 0.36819, in 0.031s 1 tree, 239 leaves, max depth = 18, train loss: 
0.35764, val loss: 0.36547, in 0.031s 1 tree, 202 leaves, max depth = 17, train loss: 0.35356, val loss: 0.36158, in 0.031s 1 tree, 93 leaves, max depth = 15, train loss: 0.34987, val loss: 0.35799, in 0.016s 1 tree, 92 leaves, max depth = 15, train loss: 0.34634, val loss: 0.35454, in 0.016s 1 tree, 95 leaves, max depth = 15, train loss: 0.34296, val loss: 0.35127, in 0.016s 1 tree, 239 leaves, max depth = 23, train loss: 0.33998, val loss: 0.34853, in 0.031s 1 tree, 239 leaves, max depth = 23, train loss: 0.33710, val loss: 0.34591, in 0.047s 1 tree, 98 leaves, max depth = 20, train loss: 0.33396, val loss: 0.34286, in 0.016s 1 tree, 96 leaves, max depth = 15, train loss: 0.33094, val loss: 0.33996, in 0.031s 1 tree, 239 leaves, max depth = 24, train loss: 0.32823, val loss: 0.33750, in 0.031s 1 tree, 142 leaves, max depth = 16, train loss: 0.32527, val loss: 0.33478, in 0.031s 1 tree, 239 leaves, max depth = 22, train loss: 0.32272, val loss: 0.33241, in 0.032s 1 tree, 239 leaves, max depth = 22, train loss: 0.32026, val loss: 0.33015, in 0.047s 1 tree, 142 leaves, max depth = 16, train loss: 0.31749, val loss: 0.32760, in 0.016s 1 tree, 141 leaves, max depth = 16, train loss: 0.31483, val loss: 0.32516, in 0.016s 1 tree, 98 leaves, max depth = 15, train loss: 0.31228, val loss: 0.32271, in 0.016s 1 tree, 143 leaves, max depth = 16, train loss: 0.30978, val loss: 0.32044, in 0.031s 1 tree, 94 leaves, max depth = 17, train loss: 0.30740, val loss: 0.31815, in 0.031s 1 tree, 239 leaves, max depth = 22, train loss: 0.30513, val loss: 0.31610, in 0.031s 1 tree, 93 leaves, max depth = 17, train loss: 0.30287, val loss: 0.31392, in 0.016s 1 tree, 143 leaves, max depth = 15, train loss: 0.30062, val loss: 0.31188, in 0.031s 1 tree, 141 leaves, max depth = 15, train loss: 0.29845, val loss: 0.30990, in 0.016s 1 tree, 95 leaves, max depth = 16, train loss: 0.29639, val loss: 0.30793, in 0.031s 1 tree, 144 leaves, max depth = 15, train loss: 0.29436, val loss: 0.30610, in 
0.016s 1 tree, 96 leaves, max depth = 16, train loss: 0.29242, val loss: 0.30427, in 0.031s 1 tree, 144 leaves, max depth = 15, train loss: 0.29051, val loss: 0.30255, in 0.016s 1 tree, 239 leaves, max depth = 22, train loss: 0.28849, val loss: 0.30080, in 0.047s 1 tree, 239 leaves, max depth = 22, train loss: 0.28647, val loss: 0.29903, in 0.031s 1 tree, 95 leaves, max depth = 15, train loss: 0.28468, val loss: 0.29730, in 0.016s 1 tree, 144 leaves, max depth = 15, train loss: 0.28293, val loss: 0.29573, in 0.031s 1 tree, 97 leaves, max depth = 18, train loss: 0.28128, val loss: 0.29417, in 0.031s 1 tree, 239 leaves, max depth = 21, train loss: 0.27939, val loss: 0.29253, in 0.031s 1 tree, 98 leaves, max depth = 19, train loss: 0.27782, val loss: 0.29104, in 0.016s 1 tree, 239 leaves, max depth = 21, train loss: 0.27600, val loss: 0.28946, in 0.031s 1 tree, 239 leaves, max depth = 22, train loss: 0.27429, val loss: 0.28797, in 0.031s 1 tree, 144 leaves, max depth = 15, train loss: 0.27275, val loss: 0.28661, in 0.016s Fit 82 trees in 2.893 s, (16380 total leaves) Time spent computing histograms: 0.770s Time spent finding best splits: 0.387s Time spent applying splits: 0.297s Time spent predicting: 0.016s Trial 77, Fold 2: Log loss = 0.2863839142277258, Average precision = 0.9617003319210659, ROC-AUC = 0.9584315015048827, Elapsed Time = 2.906086900000446 seconds Trial 77, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 77, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 Binning 0.040 GB of training data: 0.173 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 205 leaves, max depth = 17, train loss: 0.67810, val loss: 0.67818, in 0.031s 1 tree, 239 leaves, max depth = 15, train loss: 0.66363, val loss: 0.66398, in 0.031s 1 tree, 239 leaves, max depth = 20, train loss: 0.64983, val loss: 0.65037, in 0.047s 1 tree, 239 leaves, max depth = 14, train loss: 0.63677, val 
loss: 0.63750, in 0.016s 1 tree, 239 leaves, max depth = 16, train loss: 0.62483, val loss: 0.62579, in 0.031s 1 tree, 239 leaves, max depth = 14, train loss: 0.61269, val loss: 0.61381, in 0.031s 1 tree, 239 leaves, max depth = 14, train loss: 0.60112, val loss: 0.60237, in 0.031s 1 tree, 239 leaves, max depth = 14, train loss: 0.59015, val loss: 0.59158, in 0.031s 1 tree, 239 leaves, max depth = 16, train loss: 0.57957, val loss: 0.58115, in 0.031s 1 tree, 239 leaves, max depth = 16, train loss: 0.56945, val loss: 0.57120, in 0.016s 1 tree, 239 leaves, max depth = 17, train loss: 0.55977, val loss: 0.56168, in 0.031s 1 tree, 239 leaves, max depth = 21, train loss: 0.55094, val loss: 0.55307, in 0.031s 1 tree, 239 leaves, max depth = 16, train loss: 0.54203, val loss: 0.54431, in 0.047s 1 tree, 239 leaves, max depth = 18, train loss: 0.53345, val loss: 0.53595, in 0.031s 1 tree, 239 leaves, max depth = 17, train loss: 0.52524, val loss: 0.52788, in 0.031s 1 tree, 239 leaves, max depth = 16, train loss: 0.51736, val loss: 0.52031, in 0.031s 1 tree, 239 leaves, max depth = 15, train loss: 0.51018, val loss: 0.51340, in 0.016s 1 tree, 239 leaves, max depth = 16, train loss: 0.50288, val loss: 0.50621, in 0.047s 1 tree, 239 leaves, max depth = 16, train loss: 0.49587, val loss: 0.49932, in 0.031s 1 tree, 239 leaves, max depth = 19, train loss: 0.48947, val loss: 0.49322, in 0.031s 1 tree, 239 leaves, max depth = 15, train loss: 0.48301, val loss: 0.48699, in 0.031s 1 tree, 239 leaves, max depth = 16, train loss: 0.47675, val loss: 0.48088, in 0.031s 1 tree, 191 leaves, max depth = 17, train loss: 0.46881, val loss: 0.47342, in 0.016s 1 tree, 239 leaves, max depth = 16, train loss: 0.46295, val loss: 0.46772, in 0.031s 1 tree, 239 leaves, max depth = 16, train loss: 0.45731, val loss: 0.46222, in 0.031s 1 tree, 239 leaves, max depth = 15, train loss: 0.45188, val loss: 0.45693, in 0.031s 1 tree, 239 leaves, max depth = 17, train loss: 0.44665, val loss: 0.45186, in 
0.031s 1 tree, 239 leaves, max depth = 17, train loss: 0.44161, val loss: 0.44695, in 0.031s 1 tree, 239 leaves, max depth = 16, train loss: 0.43674, val loss: 0.44223, in 0.031s 1 tree, 239 leaves, max depth = 16, train loss: 0.43206, val loss: 0.43770, in 0.047s 1 tree, 195 leaves, max depth = 17, train loss: 0.42551, val loss: 0.43161, in 0.031s 1 tree, 239 leaves, max depth = 16, train loss: 0.42111, val loss: 0.42735, in 0.031s 1 tree, 195 leaves, max depth = 17, train loss: 0.41494, val loss: 0.42166, in 0.016s 1 tree, 239 leaves, max depth = 16, train loss: 0.41080, val loss: 0.41772, in 0.047s 1 tree, 239 leaves, max depth = 14, train loss: 0.40681, val loss: 0.41400, in 0.031s 1 tree, 228 leaves, max depth = 16, train loss: 0.40115, val loss: 0.40883, in 0.031s 1 tree, 199 leaves, max depth = 19, train loss: 0.39570, val loss: 0.40379, in 0.031s 1 tree, 239 leaves, max depth = 16, train loss: 0.39201, val loss: 0.40036, in 0.031s 1 tree, 239 leaves, max depth = 21, train loss: 0.38835, val loss: 0.39693, in 0.031s 1 tree, 239 leaves, max depth = 16, train loss: 0.38491, val loss: 0.39369, in 0.031s 1 tree, 203 leaves, max depth = 18, train loss: 0.37992, val loss: 0.38914, in 0.031s 1 tree, 202 leaves, max depth = 19, train loss: 0.37513, val loss: 0.38475, in 0.031s 1 tree, 200 leaves, max depth = 19, train loss: 0.37053, val loss: 0.38055, in 0.031s 1 tree, 239 leaves, max depth = 18, train loss: 0.36736, val loss: 0.37762, in 0.031s 1 tree, 202 leaves, max depth = 21, train loss: 0.36305, val loss: 0.37369, in 0.031s 1 tree, 239 leaves, max depth = 19, train loss: 0.36008, val loss: 0.37087, in 0.047s 1 tree, 210 leaves, max depth = 20, train loss: 0.35598, val loss: 0.36718, in 0.031s 1 tree, 239 leaves, max depth = 18, train loss: 0.35319, val loss: 0.36477, in 0.031s 1 tree, 202 leaves, max depth = 19, train loss: 0.34930, val loss: 0.36125, in 0.031s 1 tree, 91 leaves, max depth = 20, train loss: 0.34581, val loss: 0.35802, in 0.016s 1 tree, 94 
[HistGradientBoosting verbose per-round logs omitted — each round reports one tree with its leaf count, max depth, train/val loss, and fit time; fold summaries retained below]
Trial 77, Fold 3: Fit 82 trees in 2.814 s (16290 total leaves)
Trial 77, Fold 3: Log loss = 0.28523, Average precision = 0.96243, ROC-AUC = 0.95856, Elapsed Time = 2.81 seconds
Trial 77, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0297
Trial 77, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0434
Trial 77, Fold 4: Fit 82 trees in 3.237 s (16435 total leaves)
Trial 77, Fold 4: Log loss = 0.28799, Average precision = 0.96112, ROC-AUC = 0.95593, Elapsed Time = 3.24 seconds
Trial 77, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0345
Trial 77, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0242
Trial 77, Fold 5: Fit 82 trees in 3.362 s (16999 total leaves)
Trial 77, Fold 5: Log loss = 0.29372, Average precision = 0.95905, ROC-AUC = 0.95445, Elapsed Time = 3.38 seconds
Optimization Progress: 78%|#######8 | 78/100 [16:04<05:53, 16.07s/it]
[HistGradientBoosting verbose per-round logs omitted — fold summaries retained below]
Trial 78, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0398
Trial 78, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0035
Trial 78, Fold 1: Fit 72 trees in 0.970 s (2538 total leaves)
Trial 78, Fold 1: Log loss = 0.33564, Average precision = 0.94800, ROC-AUC = 0.94627, Elapsed Time = 0.98 seconds
Trial 78, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0236
Trial 78, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0689
Trial 78, Fold 2: Fit 72 trees in 1.033 s (2557 total leaves)
Trial 78, Fold 2: Log loss = 0.33701, Average precision = 0.94551, ROC-AUC = 0.94655, Elapsed Time = 1.04 seconds
Trial 78, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.0346
Trial 78, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235
Trial 78, Fold 3: Fit 72 trees in 0.923 s (2483 total leaves)
Trial 78, Fold 3: Log loss = 0.33187, Average precision = 0.95068, ROC-AUC = 0.95000, Elapsed Time = 0.94 seconds
Trial 78, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0297
Trial 78, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0434
[Fold 4 per-round log truncated in source]
loss: 0.48059, val loss: 0.46870, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.47464, val loss: 0.46208, in 0.000s 1 tree, 56 leaves, max depth = 13, train loss: 0.46842, val loss: 0.45600, in 0.016s 1 tree, 56 leaves, max depth = 13, train loss: 0.46268, val loss: 0.45038, in 0.000s 1 tree, 30 leaves, max depth = 11, train loss: 0.45747, val loss: 0.44458, in 0.016s 1 tree, 56 leaves, max depth = 12, train loss: 0.45224, val loss: 0.43947, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.44745, val loss: 0.43409, in 0.000s 1 tree, 56 leaves, max depth = 12, train loss: 0.44268, val loss: 0.42944, in 0.031s 1 tree, 56 leaves, max depth = 12, train loss: 0.43827, val loss: 0.42513, in 0.031s 1 tree, 31 leaves, max depth = 9, train loss: 0.43408, val loss: 0.42039, in 0.016s 1 tree, 31 leaves, max depth = 12, train loss: 0.43020, val loss: 0.41601, in 0.016s 1 tree, 56 leaves, max depth = 12, train loss: 0.42620, val loss: 0.41213, in 0.016s 1 tree, 56 leaves, max depth = 12, train loss: 0.42250, val loss: 0.40853, in 0.016s 1 tree, 31 leaves, max depth = 15, train loss: 0.41903, val loss: 0.40460, in 0.000s 1 tree, 56 leaves, max depth = 12, train loss: 0.41563, val loss: 0.40131, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.41252, val loss: 0.39795, in 0.000s 1 tree, 56 leaves, max depth = 12, train loss: 0.40939, val loss: 0.39492, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.40652, val loss: 0.39184, in 0.031s 1 tree, 31 leaves, max depth = 9, train loss: 0.40348, val loss: 0.38835, in 0.000s 1 tree, 56 leaves, max depth = 12, train loss: 0.40060, val loss: 0.38559, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.39782, val loss: 0.38238, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.39522, val loss: 0.37940, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.39266, val loss: 0.37663, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.39029, val loss: 0.37407, in 0.016s 1 tree, 56 
leaves, max depth = 10, train loss: 0.38763, val loss: 0.37154, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.38532, val loss: 0.36886, in 0.016s 1 tree, 56 leaves, max depth = 10, train loss: 0.38286, val loss: 0.36652, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.38068, val loss: 0.36416, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.37866, val loss: 0.36197, in 0.016s 1 tree, 56 leaves, max depth = 10, train loss: 0.37637, val loss: 0.35981, in 0.031s 1 tree, 56 leaves, max depth = 10, train loss: 0.37425, val loss: 0.35781, in 0.031s 1 tree, 5 leaves, max depth = 3, train loss: 0.37236, val loss: 0.35576, in 0.016s 1 tree, 31 leaves, max depth = 8, train loss: 0.37029, val loss: 0.35337, in 0.016s 1 tree, 56 leaves, max depth = 10, train loss: 0.36831, val loss: 0.35152, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.36655, val loss: 0.34961, in 0.000s 1 tree, 31 leaves, max depth = 10, train loss: 0.36459, val loss: 0.34735, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.36278, val loss: 0.34525, in 0.016s 1 tree, 56 leaves, max depth = 11, train loss: 0.36092, val loss: 0.34352, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.35930, val loss: 0.34175, in 0.000s 1 tree, 31 leaves, max depth = 8, train loss: 0.35767, val loss: 0.33985, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.35616, val loss: 0.33821, in 0.016s 1 tree, 56 leaves, max depth = 11, train loss: 0.35439, val loss: 0.33658, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.35298, val loss: 0.33504, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.35141, val loss: 0.33327, in 0.000s 1 tree, 55 leaves, max depth = 11, train loss: 0.34976, val loss: 0.33176, in 0.000s 1 tree, 39 leaves, max depth = 11, train loss: 0.34831, val loss: 0.33017, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.34698, val loss: 0.32871, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.34574, val loss: 0.32736, 
in 0.000s 1 tree, 31 leaves, max depth = 10, train loss: 0.34436, val loss: 0.32581, in 0.016s 1 tree, 55 leaves, max depth = 11, train loss: 0.34278, val loss: 0.32437, in 0.016s 1 tree, 39 leaves, max depth = 10, train loss: 0.34148, val loss: 0.32296, in 0.000s 1 tree, 56 leaves, max depth = 11, train loss: 0.34001, val loss: 0.32163, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.33882, val loss: 0.32033, in 0.016s 1 tree, 39 leaves, max depth = 10, train loss: 0.33760, val loss: 0.31901, in 0.000s 1 tree, 56 leaves, max depth = 11, train loss: 0.33621, val loss: 0.31775, in 0.016s Fit 72 trees in 1.142 s, (2531 total leaves) Time spent computing histograms: 0.409s Time spent finding best splits: 0.094s Time spent applying splits: 0.076s Time spent predicting: 0.016s Trial 78, Fold 4: Log loss = 0.3340373016637059, Average precision = 0.9511456838640654, ROC-AUC = 0.9483110629460336, Elapsed Time = 1.1569469999994908 seconds Trial 78, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 78, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.174 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 31 leaves, max depth = 10, train loss: 0.67150, val loss: 0.67008, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.65194, val loss: 0.64918, in 0.000s 1 tree, 31 leaves, max depth = 10, train loss: 0.63411, val loss: 0.63007, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.61783, val loss: 0.61256, in 0.000s 1 tree, 31 leaves, max depth = 10, train loss: 0.60295, val loss: 0.59650, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.58906, val loss: 0.58146, in 0.000s 1 tree, 31 leaves, max depth = 10, train loss: 0.57654, val loss: 0.56791, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.56481, val loss: 0.55513, in 0.000s 1 tree, 31 leaves, max depth = 10, train loss: 0.55403, val loss: 
0.54333, in 0.000s 1 tree, 58 leaves, max depth = 12, train loss: 0.54368, val loss: 0.53336, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.53411, val loss: 0.52285, in 0.000s 1 tree, 58 leaves, max depth = 13, train loss: 0.52486, val loss: 0.51398, in 0.016s 1 tree, 31 leaves, max depth = 11, train loss: 0.51650, val loss: 0.50477, in 0.016s 1 tree, 31 leaves, max depth = 12, train loss: 0.50863, val loss: 0.49605, in 0.000s 1 tree, 31 leaves, max depth = 10, train loss: 0.50137, val loss: 0.48798, in 0.016s 1 tree, 57 leaves, max depth = 11, train loss: 0.49342, val loss: 0.48044, in 0.000s 1 tree, 31 leaves, max depth = 11, train loss: 0.48707, val loss: 0.47334, in 0.016s 1 tree, 58 leaves, max depth = 14, train loss: 0.47986, val loss: 0.46655, in 0.016s 1 tree, 30 leaves, max depth = 10, train loss: 0.47418, val loss: 0.46016, in 0.000s 1 tree, 58 leaves, max depth = 11, train loss: 0.46763, val loss: 0.45403, in 0.016s 1 tree, 58 leaves, max depth = 12, train loss: 0.46158, val loss: 0.44837, in 0.000s 1 tree, 58 leaves, max depth = 12, train loss: 0.45600, val loss: 0.44316, in 0.016s 1 tree, 58 leaves, max depth = 12, train loss: 0.45084, val loss: 0.43836, in 0.016s 1 tree, 31 leaves, max depth = 12, train loss: 0.44609, val loss: 0.43302, in 0.000s 1 tree, 58 leaves, max depth = 11, train loss: 0.44138, val loss: 0.42866, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.43705, val loss: 0.42375, in 0.016s 1 tree, 58 leaves, max depth = 11, train loss: 0.43274, val loss: 0.41978, in 0.000s 1 tree, 31 leaves, max depth = 10, train loss: 0.42881, val loss: 0.41530, in 0.016s 1 tree, 58 leaves, max depth = 11, train loss: 0.42485, val loss: 0.41168, in 0.016s 1 tree, 58 leaves, max depth = 11, train loss: 0.42118, val loss: 0.40834, in 0.000s 1 tree, 31 leaves, max depth = 10, train loss: 0.41766, val loss: 0.40432, in 0.016s 1 tree, 58 leaves, max depth = 11, train loss: 0.41429, val loss: 0.40127, in 0.000s 1 tree, 5 leaves, max 
depth = 3, train loss: 0.41115, val loss: 0.39825, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.40798, val loss: 0.39459, in 0.000s 1 tree, 58 leaves, max depth = 11, train loss: 0.40488, val loss: 0.39182, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.40203, val loss: 0.38909, in 0.000s 1 tree, 58 leaves, max depth = 11, train loss: 0.39918, val loss: 0.38656, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.39631, val loss: 0.38322, in 0.000s 1 tree, 31 leaves, max depth = 9, train loss: 0.39368, val loss: 0.38015, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.39111, val loss: 0.37769, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.38873, val loss: 0.37542, in 0.016s 1 tree, 58 leaves, max depth = 11, train loss: 0.38608, val loss: 0.37311, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.38371, val loss: 0.37031, in 0.000s 1 tree, 58 leaves, max depth = 11, train loss: 0.38125, val loss: 0.36820, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.37906, val loss: 0.36610, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.37704, val loss: 0.36417, in 0.016s 1 tree, 57 leaves, max depth = 11, train loss: 0.37475, val loss: 0.36222, in 0.000s 1 tree, 31 leaves, max depth = 12, train loss: 0.37254, val loss: 0.35965, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.37067, val loss: 0.35786, in 0.000s 1 tree, 58 leaves, max depth = 11, train loss: 0.36853, val loss: 0.35606, in 0.016s 1 tree, 31 leaves, max depth = 12, train loss: 0.36651, val loss: 0.35370, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.36477, val loss: 0.35204, in 0.016s 1 tree, 58 leaves, max depth = 11, train loss: 0.36277, val loss: 0.35037, in 0.016s 1 tree, 31 leaves, max depth = 12, train loss: 0.36091, val loss: 0.34820, in 0.000s 1 tree, 58 leaves, max depth = 11, train loss: 0.35905, val loss: 0.34666, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.35742, val loss: 0.34511, in 0.000s 1 
tree, 31 leaves, max depth = 9, train loss: 0.35573, val loss: 0.34309, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.35422, val loss: 0.34165, in 0.000s 1 tree, 57 leaves, max depth = 11, train loss: 0.35246, val loss: 0.34022, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.35105, val loss: 0.33887, in 0.000s 1 tree, 38 leaves, max depth = 10, train loss: 0.34945, val loss: 0.33717, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.34793, val loss: 0.33533, in 0.016s 1 tree, 58 leaves, max depth = 12, train loss: 0.34627, val loss: 0.33399, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.34494, val loss: 0.33272, in 0.016s [65/72] 1 tree, 39 leaves, max depth = 10, train loss: 0.34352, val loss: 0.33121, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.34228, val loss: 0.33003, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.34094, val loss: 0.32839, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.33978, val loss: 0.32729, in 0.016s 1 tree, 58 leaves, max depth = 12, train loss: 0.33817, val loss: 0.32601, in 0.000s 1 tree, 39 leaves, max depth = 10, train loss: 0.33689, val loss: 0.32465, in 0.016s 1 tree, 58 leaves, max depth = 11, train loss: 0.33539, val loss: 0.32347, in 0.016s 1 tree, 39 leaves, max depth = 11, train loss: 0.33421, val loss: 0.32220, in 0.000s Fit 72 trees in 0.940 s, (2570 total leaves) Time spent computing histograms: 0.325s Time spent finding best splits: 0.064s Time spent applying splits: 0.051s Time spent predicting: 0.000s Trial 78, Fold 5: Log loss = 0.3390389330580322, Average precision = 0.9488684476965955, ROC-AUC = 0.9455521988740873, Elapsed Time = 0.9489164000005985 seconds
Optimization Progress: 79%|#######9 | 79/100 [16:17<05:15, 15.03s/it]
Trial 79, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371 Trial 79, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913 0.142 s 0.040 GB of training data: 0.016 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 59 leaves, max depth = 13, train loss: 0.68276, val loss: 0.68246, in 0.000s 1 tree, 58 leaves, max depth = 13, train loss: 0.67273, val loss: 0.67213, in 0.016s 1 tree, 59 leaves, max depth = 11, train loss: 0.66312, val loss: 0.66230, in 0.016s 1 tree, 58 leaves, max depth = 12, train loss: 0.65404, val loss: 0.65296, in 0.000s 1 tree, 59 leaves, max depth = 13, train loss: 0.64525, val loss: 0.64390, in 0.016s 1 tree, 59 leaves, max depth = 13, train loss: 0.63684, val loss: 0.63521, in 0.016s 1 tree, 59 leaves, max depth = 15, train loss: 0.62886, val loss: 0.62696, in 0.000s 1 tree, 59 leaves, max depth = 13, train loss: 0.62123, val loss: 0.61907, in 0.016s 1 tree, 59 leaves, max depth = 13, train loss: 0.61382, val loss: 0.61141, in 0.000s 1 tree, 59 leaves, max depth = 12, train loss: 0.60659, val loss: 0.60391, in 0.016s 1 tree, 59 leaves, max depth = 15, train loss: 0.59982, val loss: 0.59685, in 0.000s 1 tree, 59 leaves, max depth = 12, train loss: 0.59336, val loss: 0.59016, in 0.000s 1 tree, 59 leaves, max depth = 13, train loss: 0.58707, val loss: 0.58364, in 0.016s 1 tree, 58 leaves, max depth = 13, train loss: 0.58111, val loss: 0.57743, in 0.000s 1 tree, 59 leaves, max depth = 13, train loss: 0.57530, val loss: 0.57140, in 0.016s 1 tree, 59 leaves, max depth = 13, train loss: 0.56972, val loss: 0.56561, in 0.016s 1 tree, 59 leaves, max depth = 13, train loss: 0.56435, val loss: 0.56007, in 0.000s 1 tree, 59 leaves, max depth = 13, train loss: 0.55918, val loss: 0.55475, in 0.016s 1 tree, 8 leaves, max depth = 6, train loss: 0.55423, val loss: 0.54953, in 0.000s 1 tree, 59 leaves, max depth = 12, train loss: 0.54937, val loss: 0.54445, in 0.016s 1 tree, 59 
leaves, max depth = 12, train loss: 0.54469, val loss: 0.53955, in 0.000s 1 tree, 59 leaves, max depth = 12, train loss: 0.54019, val loss: 0.53484, in 0.016s 1 tree, 97 leaves, max depth = 12, train loss: 0.53516, val loss: 0.53018, in 0.016s 1 tree, 59 leaves, max depth = 12, train loss: 0.53091, val loss: 0.52573, in 0.000s 1 tree, 59 leaves, max depth = 13, train loss: 0.52690, val loss: 0.52159, in 0.016s 1 tree, 97 leaves, max depth = 12, train loss: 0.52219, val loss: 0.51723, in 0.016s 1 tree, 97 leaves, max depth = 12, train loss: 0.51765, val loss: 0.51305, in 0.000s 1 tree, 59 leaves, max depth = 15, train loss: 0.51400, val loss: 0.50921, in 0.016s 1 tree, 97 leaves, max depth = 12, train loss: 0.50968, val loss: 0.50523, in 0.016s 1 tree, 97 leaves, max depth = 12, train loss: 0.50554, val loss: 0.50142, in 0.000s 1 tree, 59 leaves, max depth = 15, train loss: 0.50213, val loss: 0.49785, in 0.016s 1 tree, 59 leaves, max depth = 13, train loss: 0.49882, val loss: 0.49433, in 0.016s 1 tree, 59 leaves, max depth = 12, train loss: 0.49554, val loss: 0.49085, in 0.000s 1 tree, 97 leaves, max depth = 12, train loss: 0.49167, val loss: 0.48730, in 0.016s 1 tree, 59 leaves, max depth = 11, train loss: 0.48856, val loss: 0.48400, in 0.016s 1 tree, 59 leaves, max depth = 15, train loss: 0.48563, val loss: 0.48096, in 0.000s 1 tree, 59 leaves, max depth = 15, train loss: 0.48282, val loss: 0.47803, in 0.016s 1 tree, 96 leaves, max depth = 12, train loss: 0.47919, val loss: 0.47472, in 0.016s 1 tree, 59 leaves, max depth = 15, train loss: 0.47653, val loss: 0.47194, in 0.016s 1 tree, 59 leaves, max depth = 13, train loss: 0.47398, val loss: 0.46919, in 0.000s 1 tree, 59 leaves, max depth = 15, train loss: 0.47151, val loss: 0.46661, in 0.016s 1 tree, 59 leaves, max depth = 11, train loss: 0.46906, val loss: 0.46399, in 0.016s 1 tree, 59 leaves, max depth = 15, train loss: 0.46677, val loss: 0.46159, in 0.000s 1 tree, 97 leaves, max depth = 12, train loss: 0.46340, 
val loss: 0.45853, in 0.016s 1 tree, 58 leaves, max depth = 13, train loss: 0.46121, val loss: 0.45620, in 0.016s 1 tree, 97 leaves, max depth = 12, train loss: 0.45799, val loss: 0.45327, in 0.016s Fit 46 trees in 0.736 s, (3000 total leaves) Time spent computing histograms: 0.204s Time spent finding best splits: 0.062s Time spent applying splits: 0.047s Time spent predicting: 0.000s Trial 79, Fold 1: Log loss = 0.45842813120494297, Average precision = 0.906537808073568, ROC-AUC = 0.9138198238243407, Elapsed Time = 0.7366600000004837 seconds Trial 79, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 79, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986 Binning 0.040 GB of training data: 0.158 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 58 leaves, max depth = 16, train loss: 0.68272, val loss: 0.68227, in 0.000s 1 tree, 59 leaves, max depth = 14, train loss: 0.67275, val loss: 0.67182, in 0.016s 1 tree, 59 leaves, max depth = 12, train loss: 0.66317, val loss: 0.66178, in 0.016s 1 tree, 59 leaves, max depth = 13, train loss: 0.65406, val loss: 0.65226, in 0.000s 1 tree, 59 leaves, max depth = 17, train loss: 0.64534, val loss: 0.64314, in 0.016s 1 tree, 58 leaves, max depth = 11, train loss: 0.63693, val loss: 0.63431, in 0.000s 1 tree, 59 leaves, max depth = 16, train loss: 0.62892, val loss: 0.62593, in 0.016s 1 tree, 59 leaves, max depth = 17, train loss: 0.62126, val loss: 0.61790, in 0.016s 1 tree, 59 leaves, max depth = 12, train loss: 0.61386, val loss: 0.61011, in 0.000s 1 tree, 59 leaves, max depth = 14, train loss: 0.60668, val loss: 0.60253, in 0.000s 1 tree, 59 leaves, max depth = 14, train loss: 0.59990, val loss: 0.59539, in 0.016s 1 tree, 59 leaves, max depth = 18, train loss: 0.59342, val loss: 0.58856, in 0.000s 1 tree, 59 leaves, max depth = 12, train loss: 0.58714, val loss: 0.58194, in 0.016s 1 tree, 58 leaves, max depth = 12, train loss: 
0.58118, val loss: 0.57563, in 0.016s 1 tree, 59 leaves, max depth = 12, train loss: 0.57539, val loss: 0.56951, in 0.000s 1 tree, 59 leaves, max depth = 12, train loss: 0.56983, val loss: 0.56362, in 0.016s 1 tree, 59 leaves, max depth = 13, train loss: 0.56448, val loss: 0.55796, in 0.016s 1 tree, 59 leaves, max depth = 13, train loss: 0.55934, val loss: 0.55252, in 0.000s 1 tree, 8 leaves, max depth = 6, train loss: 0.55439, val loss: 0.54726, in 0.016s 1 tree, 59 leaves, max depth = 13, train loss: 0.54956, val loss: 0.54212, in 0.016s 1 tree, 59 leaves, max depth = 13, train loss: 0.54491, val loss: 0.53717, in 0.000s 1 tree, 59 leaves, max depth = 13, train loss: 0.54045, val loss: 0.53241, in 0.016s 1 tree, 96 leaves, max depth = 17, train loss: 0.53548, val loss: 0.52765, in 0.016s 1 tree, 59 leaves, max depth = 13, train loss: 0.53127, val loss: 0.52316, in 0.000s 1 tree, 59 leaves, max depth = 14, train loss: 0.52728, val loss: 0.51891, in 0.016s 1 tree, 96 leaves, max depth = 15, train loss: 0.52263, val loss: 0.51445, in 0.016s 1 tree, 96 leaves, max depth = 15, train loss: 0.51815, val loss: 0.51018, in 0.016s 1 tree, 59 leaves, max depth = 17, train loss: 0.51446, val loss: 0.50627, in 0.000s 1 tree, 96 leaves, max depth = 18, train loss: 0.51021, val loss: 0.50220, in 0.016s 1 tree, 96 leaves, max depth = 18, train loss: 0.50612, val loss: 0.49830, in 0.016s 1 tree, 59 leaves, max depth = 17, train loss: 0.50269, val loss: 0.49465, in 0.016s 1 tree, 59 leaves, max depth = 13, train loss: 0.49938, val loss: 0.49112, in 0.000s 1 tree, 59 leaves, max depth = 12, train loss: 0.49612, val loss: 0.48762, in 0.016s 1 tree, 96 leaves, max depth = 18, train loss: 0.49230, val loss: 0.48398, in 0.016s 1 tree, 59 leaves, max depth = 12, train loss: 0.48922, val loss: 0.48066, in 0.016s 1 tree, 59 leaves, max depth = 14, train loss: 0.48631, val loss: 0.47754, in 0.000s 1 tree, 59 leaves, max depth = 14, train loss: 0.48351, val loss: 0.47453, in 0.016s 1 tree, 
96 leaves, max depth = 14, train loss: 0.47993, val loss: 0.47113, in 0.016s 1 tree, 59 leaves, max depth = 14, train loss: 0.47728, val loss: 0.46828, in 0.000s 1 tree, 59 leaves, max depth = 13, train loss: 0.47473, val loss: 0.46554, in 0.016s 1 tree, 59 leaves, max depth = 14, train loss: 0.47227, val loss: 0.46289, in 0.016s 1 tree, 59 leaves, max depth = 12, train loss: 0.46984, val loss: 0.46026, in 0.016s 1 tree, 59 leaves, max depth = 14, train loss: 0.46756, val loss: 0.45780, in 0.000s 1 tree, 96 leaves, max depth = 14, train loss: 0.46424, val loss: 0.45465, in 0.016s 1 tree, 59 leaves, max depth = 16, train loss: 0.46208, val loss: 0.45233, in 0.016s 1 tree, 96 leaves, max depth = 14, train loss: 0.45889, val loss: 0.44933, in 0.016s Fit 46 trees in 0.799 s, (2993 total leaves) Time spent computing histograms: 0.224s Time spent finding best splits: 0.067s Time spent applying splits: 0.051s Time spent predicting: 0.000s Trial 79, Fold 2: Log loss = 0.45980606153601994, Average precision = 0.9046649152467294, ROC-AUC = 0.9184656113721006, Elapsed Time = 0.8067893000006734 seconds Trial 79, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 79, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 Binning 0.040 GB of training data: 0.158 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 58 leaves, max depth = 15, train loss: 0.68282, val loss: 0.68247, in 0.016s 1 tree, 58 leaves, max depth = 15, train loss: 0.67289, val loss: 0.67223, in 0.016s 1 tree, 58 leaves, max depth = 12, train loss: 0.66337, val loss: 0.66234, in 0.000s 1 tree, 58 leaves, max depth = 12, train loss: 0.65434, val loss: 0.65300, in 0.016s 1 tree, 58 leaves, max depth = 14, train loss: 0.64570, val loss: 0.64407, in 0.016s 1 tree, 58 leaves, max depth = 15, train loss: 0.63737, val loss: 0.63547, in 0.000s 1 tree, 58 leaves, max depth = 15, train loss: 0.62943, val loss: 0.62725, in 0.016s 1 
tree, 58 leaves, max depth = 15, train loss: 0.62184, val loss: 0.61939, in 0.016s 1 tree, 58 leaves, max depth = 14, train loss: 0.61447, val loss: 0.61172, in 0.000s 1 tree, 58 leaves, max depth = 14, train loss: 0.60731, val loss: 0.60432, in 0.016s 1 tree, 58 leaves, max depth = 15, train loss: 0.60056, val loss: 0.59734, in 0.016s 1 tree, 58 leaves, max depth = 15, train loss: 0.59413, val loss: 0.59067, in 0.016s 1 tree, 58 leaves, max depth = 12, train loss: 0.58796, val loss: 0.58425, in 0.000s 1 tree, 58 leaves, max depth = 11, train loss: 0.58203, val loss: 0.57807, in 0.016s 1 tree, 58 leaves, max depth = 14, train loss: 0.57635, val loss: 0.57215, in 0.016s 1 tree, 58 leaves, max depth = 14, train loss: 0.57080, val loss: 0.56634, in 0.000s 1 tree, 58 leaves, max depth = 10, train loss: 0.56547, val loss: 0.56076, in 0.016s 1 tree, 58 leaves, max depth = 10, train loss: 0.56035, val loss: 0.55539, in 0.016s 1 tree, 8 leaves, max depth = 6, train loss: 0.55540, val loss: 0.55024, in 0.000s 1 tree, 58 leaves, max depth = 14, train loss: 0.55058, val loss: 0.54524, in 0.016s 1 tree, 58 leaves, max depth = 14, train loss: 0.54594, val loss: 0.54043, in 0.016s 1 tree, 58 leaves, max depth = 14, train loss: 0.54148, val loss: 0.53581, in 0.000s 1 tree, 95 leaves, max depth = 13, train loss: 0.53645, val loss: 0.53114, in 0.016s 1 tree, 58 leaves, max depth = 14, train loss: 0.53224, val loss: 0.52676, in 0.016s 1 tree, 58 leaves, max depth = 10, train loss: 0.52827, val loss: 0.52258, in 0.016s 1 tree, 95 leaves, max depth = 14, train loss: 0.52356, val loss: 0.51820, in 0.000s 1 tree, 95 leaves, max depth = 14, train loss: 0.51903, val loss: 0.51400, in 0.016s 1 tree, 58 leaves, max depth = 16, train loss: 0.51539, val loss: 0.51018, in 0.000s 1 tree, 95 leaves, max depth = 15, train loss: 0.51108, val loss: 0.50619, in 0.016s 1 tree, 95 leaves, max depth = 15, train loss: 0.50694, val loss: 0.50236, in 0.016s 1 tree, 58 leaves, max depth = 13, train loss: 
0.50355, val loss: 0.49878, in 0.016s 1 tree, 58 leaves, max depth = 16, train loss: 0.50026, val loss: 0.49534, in 0.016s 1 tree, 58 leaves, max depth = 12, train loss: 0.49700, val loss: 0.49193, in 0.016s 1 tree, 95 leaves, max depth = 14, train loss: 0.49313, val loss: 0.48836, in 0.016s 1 tree, 58 leaves, max depth = 13, train loss: 0.49006, val loss: 0.48513, in 0.000s 1 tree, 58 leaves, max depth = 10, train loss: 0.48717, val loss: 0.48204, in 0.016s 1 tree, 58 leaves, max depth = 10, train loss: 0.48439, val loss: 0.47907, in 0.016s 1 tree, 95 leaves, max depth = 14, train loss: 0.48076, val loss: 0.47573, in 0.016s 1 tree, 58 leaves, max depth = 10, train loss: 0.47813, val loss: 0.47291, in 0.016s 1 tree, 58 leaves, max depth = 16, train loss: 0.47558, val loss: 0.47024, in 0.000s 1 tree, 58 leaves, max depth = 10, train loss: 0.47314, val loss: 0.46762, in 0.016s 1 tree, 58 leaves, max depth = 13, train loss: 0.47072, val loss: 0.46506, in 0.016s 1 tree, 58 leaves, max depth = 16, train loss: 0.46844, val loss: 0.46268, in 0.016s 1 tree, 95 leaves, max depth = 14, train loss: 0.46507, val loss: 0.45958, in 0.016s 1 tree, 58 leaves, max depth = 12, train loss: 0.46292, val loss: 0.45727, in 0.000s 1 tree, 95 leaves, max depth = 14, train loss: 0.45969, val loss: 0.45431, in 0.016s Fit 46 trees in 0.861 s, (2951 total leaves) Time spent computing histograms: 0.237s Time spent finding best splits: 0.075s Time spent applying splits: 0.056s Time spent predicting: 0.031s Trial 79, Fold 3: Log loss = 0.45530085571122997, Average precision = 0.9135555234024215, ROC-AUC = 0.9231098792749757, Elapsed Time = 0.8606184000000212 seconds Trial 79, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 79, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 Binning 0.040 GB of training data: 0.173 s 0.016 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 58 leaves, max depth = 15, train 
[... per-round fitting log truncated; fit 46 trees in 1.033 s (2,982 total leaves) ...]
Trial 79, Fold 4: Log loss = 0.45939396332992505, Average precision = 0.9076412395230304, ROC-AUC = 0.9176424118543316, Elapsed Time = 1.0503874999994878 seconds
Trial 79, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 79, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[... per-round fitting log truncated; fit 46 trees in 1.189 s (2,986 total leaves) ...]
Trial 79, Fold 5: Log loss = 0.46532086876685613, Average precision = 0.9022929032342281, ROC-AUC = 0.9129145532579009, Elapsed Time = 1.2088435999994545 seconds
Optimization Progress: 80%|######## | 80/100 [16:28<04:39, 13.97s/it]
Trial 80, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 80, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[... per-round fitting log truncated; fit 64 trees in 1.533 s (2,240 total leaves) ...]
Trial 80, Fold 1: Log loss = 0.43370266737972424, Average precision = 0.9503427459847125, ROC-AUC = 0.9456591322416752, Elapsed Time = 1.5502667000000656 seconds
Trial 80, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 80, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[... per-round fitting log truncated; fit 64 trees in 1.642 s (2,240 total leaves) ...]
Trial 80, Fold 2: Log loss = 0.43118969439221216, Average precision = 0.9489102449125872, ROC-AUC = 0.947832373933251, Elapsed Time = 1.6524898000006942 seconds
Trial 80, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 80, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[... per-round fitting log truncated; fit 64 trees in 1.472 s (2,240 total leaves) ...]
Trial 80, Fold 3: Log loss = 0.4285578476224689, Average precision = 0.9520868528689832, ROC-AUC = 0.9500523988500544, Elapsed Time = 1.4797304999992775 seconds
Trial 80, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 80, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[... per-round fitting log truncated; fit 64 trees in 1.472 s (2,240 total leaves) ...]
Trial 80, Fold 4: Log loss = 0.43292715360677114, Average precision = 0.9522821749291704, ROC-AUC = 0.9482136001964754, Elapsed Time = 1.4838823999998567 seconds
Trial 80, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 80, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[... per-round fitting log truncated; fit 64 trees in 1.455 s (2,240 total leaves) ...]
Trial 80, Fold 5: Log loss = 0.43518271020566096, Average precision = 0.947309225647094, ROC-AUC = 0.9459746205153932, Elapsed Time = 1.4620661999997537 seconds
Optimization Progress: 81%|########1 | 81/100 [16:44<04:37, 14.61s/it]
Trial 81, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0398; Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0035
[per-round boosting log elided] Fit 11 trees in 0.331 s (85 total leaves)
Trial 81, Fold 1: Log loss = 0.6195, Average precision = 0.8110, ROC-AUC = 0.8611, Elapsed Time = 0.337 seconds
Trial 81, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0236; Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0689
[per-round boosting log elided] Fit 11 trees in 0.329 s (83 total leaves)
Trial 81, Fold 2: Log loss = 0.6194, Average precision = 0.8195, ROC-AUC = 0.8683, Elapsed Time = 0.343 seconds
Trial 81, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.0346; Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235
[per-round boosting log elided] Fit 11 trees in 0.345 s (86 total leaves)
Trial 81, Fold 3: Log loss = 0.6181, Average precision = 0.8237, ROC-AUC = 0.8711, Elapsed Time = 0.352 seconds
Trial 81, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0297; Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0434
[per-round boosting log elided] Fit 11 trees in 0.346 s (76 total leaves)
Trial 81, Fold 4: Log loss = 0.6188, Average precision = 0.8268, ROC-AUC = 0.8702, Elapsed Time = 0.365 seconds
Trial 81, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0345; Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0242
[per-round boosting log elided] Fit 11 trees in 0.344 s (78 total leaves)
Trial 81, Fold 5: Log loss = 0.6219, Average precision = 0.8066, ROC-AUC = 0.8551, Elapsed Time = 0.359 seconds
Optimization Progress: 82%|########2 | 82/100 [16:52<03:48, 12.69s/it]
Trial 82, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0398; Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0035
[per-round boosting log elided: 100 rounds, ~60–150 leaves per tree, val loss 0.685 → 0.343] Fit 100 trees in 2.768 s (11849 total leaves)
Trial 82, Fold 1: Log loss = 0.3472, Average precision = 0.9552, ROC-AUC = 0.9497, Elapsed Time = 2.783 seconds
Trial 82, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0236; Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0689
[per-round boosting log elided: 100 rounds, val loss 0.685 → 0.340] Fit 100 trees in 3.159 s (12361 total leaves)
Trial 82, Fold 2: Log loss = 0.3448, Average precision = 0.9539, ROC-AUC = 0.9513, Elapsed Time = 3.156 seconds
Trial 82, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.0346; Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235
[per-round boosting log elided] ... 1 tree, 112 leaves, max depth = 14, train loss: 0.36878, val loss:
0.37584, in 0.016s 1 tree, 99 leaves, max depth = 13, train loss: 0.36735, val loss: 0.37443, in 0.031s 1 tree, 117 leaves, max depth = 16, train loss: 0.36508, val loss: 0.37239, in 0.016s 1 tree, 113 leaves, max depth = 14, train loss: 0.36287, val loss: 0.37041, in 0.031s 1 tree, 64 leaves, max depth = 13, train loss: 0.36081, val loss: 0.36855, in 0.031s 1 tree, 145 leaves, max depth = 13, train loss: 0.35940, val loss: 0.36723, in 0.031s 1 tree, 63 leaves, max depth = 14, train loss: 0.35741, val loss: 0.36543, in 0.016s 1 tree, 123 leaves, max depth = 19, train loss: 0.35581, val loss: 0.36379, in 0.016s 1 tree, 122 leaves, max depth = 19, train loss: 0.35425, val loss: 0.36219, in 0.031s 1 tree, 123 leaves, max depth = 19, train loss: 0.35271, val loss: 0.36061, in 0.016s 1 tree, 123 leaves, max depth = 19, train loss: 0.35120, val loss: 0.35906, in 0.031s 1 tree, 122 leaves, max depth = 19, train loss: 0.34973, val loss: 0.35755, in 0.016s 1 tree, 64 leaves, max depth = 14, train loss: 0.34786, val loss: 0.35587, in 0.016s 1 tree, 64 leaves, max depth = 14, train loss: 0.34604, val loss: 0.35424, in 0.031s 1 tree, 116 leaves, max depth = 15, train loss: 0.34413, val loss: 0.35257, in 0.016s 1 tree, 123 leaves, max depth = 18, train loss: 0.34271, val loss: 0.35112, in 0.031s 1 tree, 122 leaves, max depth = 18, train loss: 0.34132, val loss: 0.34969, in 0.016s Fit 100 trees in 2.939 s, (11986 total leaves) Time spent computing histograms: 0.927s Time spent finding best splits: 0.250s Time spent applying splits: 0.238s Time spent predicting: 0.000s Trial 82, Fold 3: Log loss = 0.342112439021262, Average precision = 0.9573984711296334, ROC-AUC = 0.9542221103283219, Elapsed Time = 2.9520584999991115 seconds Trial 82, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 82, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 Binning 0.040 GB of training data: 0.158 s 0.016 s 0.004 GB of validation 
data: Fitting gradient boosted rounds: 1 tree, 79 leaves, max depth = 12, train loss: 0.68500, val loss: 0.68470, in 0.031s 1 tree, 113 leaves, max depth = 14, train loss: 0.67691, val loss: 0.67637, in 0.016s 1 tree, 136 leaves, max depth = 12, train loss: 0.66914, val loss: 0.66829, in 0.016s 1 tree, 135 leaves, max depth = 12, train loss: 0.66159, val loss: 0.66041, in 0.031s 1 tree, 125 leaves, max depth = 15, train loss: 0.65443, val loss: 0.65297, in 0.031s 1 tree, 128 leaves, max depth = 14, train loss: 0.64753, val loss: 0.64593, in 0.016s 1 tree, 123 leaves, max depth = 14, train loss: 0.64045, val loss: 0.63862, in 0.031s 1 tree, 99 leaves, max depth = 14, train loss: 0.63372, val loss: 0.63159, in 0.031s 1 tree, 145 leaves, max depth = 14, train loss: 0.62746, val loss: 0.62502, in 0.031s 1 tree, 126 leaves, max depth = 19, train loss: 0.62112, val loss: 0.61850, in 0.016s 1 tree, 126 leaves, max depth = 15, train loss: 0.61471, val loss: 0.61189, in 0.031s 1 tree, 127 leaves, max depth = 14, train loss: 0.60846, val loss: 0.60543, in 0.031s 1 tree, 126 leaves, max depth = 17, train loss: 0.60265, val loss: 0.59934, in 0.016s 1 tree, 122 leaves, max depth = 14, train loss: 0.59705, val loss: 0.59353, in 0.016s 1 tree, 148 leaves, max depth = 14, train loss: 0.59132, val loss: 0.58753, in 0.031s 1 tree, 126 leaves, max depth = 15, train loss: 0.58589, val loss: 0.58187, in 0.031s 1 tree, 125 leaves, max depth = 15, train loss: 0.58034, val loss: 0.57612, in 0.016s 1 tree, 126 leaves, max depth = 14, train loss: 0.57493, val loss: 0.57050, in 0.031s 1 tree, 91 leaves, max depth = 14, train loss: 0.56979, val loss: 0.56511, in 0.016s 1 tree, 145 leaves, max depth = 14, train loss: 0.56497, val loss: 0.56014, in 0.031s 1 tree, 125 leaves, max depth = 15, train loss: 0.55989, val loss: 0.55486, in 0.016s 1 tree, 127 leaves, max depth = 15, train loss: 0.55495, val loss: 0.54973, in 0.031s 1 tree, 146 leaves, max depth = 14, train loss: 0.55021, val loss: 
0.54473, in 0.031s 1 tree, 146 leaves, max depth = 13, train loss: 0.54587, val loss: 0.54014, in 0.031s 1 tree, 127 leaves, max depth = 14, train loss: 0.54125, val loss: 0.53540, in 0.031s 1 tree, 125 leaves, max depth = 15, train loss: 0.53704, val loss: 0.53101, in 0.031s 1 tree, 125 leaves, max depth = 17, train loss: 0.53262, val loss: 0.52647, in 0.016s 1 tree, 124 leaves, max depth = 16, train loss: 0.52847, val loss: 0.52215, in 0.031s 1 tree, 147 leaves, max depth = 13, train loss: 0.52431, val loss: 0.51781, in 0.031s 1 tree, 146 leaves, max depth = 13, train loss: 0.52025, val loss: 0.51351, in 0.031s 1 tree, 125 leaves, max depth = 14, train loss: 0.51622, val loss: 0.50930, in 0.031s 1 tree, 144 leaves, max depth = 13, train loss: 0.51243, val loss: 0.50532, in 0.031s 1 tree, 148 leaves, max depth = 14, train loss: 0.50863, val loss: 0.50136, in 0.016s 1 tree, 124 leaves, max depth = 20, train loss: 0.50496, val loss: 0.49757, in 0.031s 1 tree, 126 leaves, max depth = 15, train loss: 0.50125, val loss: 0.49371, in 0.031s 1 tree, 131 leaves, max depth = 14, train loss: 0.49761, val loss: 0.48995, in 0.016s 1 tree, 149 leaves, max depth = 14, train loss: 0.49412, val loss: 0.48630, in 0.047s 1 tree, 127 leaves, max depth = 17, train loss: 0.49064, val loss: 0.48267, in 0.016s 1 tree, 126 leaves, max depth = 15, train loss: 0.48723, val loss: 0.47911, in 0.031s 1 tree, 148 leaves, max depth = 14, train loss: 0.48388, val loss: 0.47557, in 0.031s 1 tree, 128 leaves, max depth = 17, train loss: 0.48061, val loss: 0.47217, in 0.031s 1 tree, 150 leaves, max depth = 14, train loss: 0.47749, val loss: 0.46888, in 0.031s 1 tree, 120 leaves, max depth = 14, train loss: 0.47323, val loss: 0.46456, in 0.031s 1 tree, 130 leaves, max depth = 14, train loss: 0.47005, val loss: 0.46126, in 0.016s 1 tree, 124 leaves, max depth = 14, train loss: 0.46698, val loss: 0.45804, in 0.031s 1 tree, 125 leaves, max depth = 15, train loss: 0.46403, val loss: 0.45496, in 0.016s 1 
tree, 149 leaves, max depth = 14, train loss: 0.46120, val loss: 0.45197, in 0.031s 1 tree, 131 leaves, max depth = 15, train loss: 0.45826, val loss: 0.44891, in 0.031s 1 tree, 92 leaves, max depth = 14, train loss: 0.45559, val loss: 0.44607, in 0.016s 1 tree, 141 leaves, max depth = 14, train loss: 0.45179, val loss: 0.44217, in 0.031s 1 tree, 124 leaves, max depth = 15, train loss: 0.44905, val loss: 0.43932, in 0.016s 1 tree, 125 leaves, max depth = 17, train loss: 0.44643, val loss: 0.43658, in 0.031s 1 tree, 125 leaves, max depth = 15, train loss: 0.44379, val loss: 0.43385, in 0.031s 1 tree, 131 leaves, max depth = 15, train loss: 0.44117, val loss: 0.43112, in 0.031s 1 tree, 130 leaves, max depth = 15, train loss: 0.43860, val loss: 0.42844, in 0.016s 1 tree, 130 leaves, max depth = 15, train loss: 0.43609, val loss: 0.42581, in 0.062s 1 tree, 114 leaves, max depth = 15, train loss: 0.43257, val loss: 0.42228, in 0.063s 1 tree, 129 leaves, max depth = 15, train loss: 0.43015, val loss: 0.41975, in 0.031s 1 tree, 131 leaves, max depth = 15, train loss: 0.42778, val loss: 0.41727, in 0.047s 1 tree, 112 leaves, max depth = 14, train loss: 0.42442, val loss: 0.41389, in 0.031s 1 tree, 127 leaves, max depth = 15, train loss: 0.42213, val loss: 0.41152, in 0.031s 1 tree, 124 leaves, max depth = 15, train loss: 0.42006, val loss: 0.40943, in 0.031s 1 tree, 146 leaves, max depth = 12, train loss: 0.41794, val loss: 0.40719, in 0.031s 1 tree, 128 leaves, max depth = 14, train loss: 0.41578, val loss: 0.40494, in 0.047s 1 tree, 117 leaves, max depth = 17, train loss: 0.41261, val loss: 0.40174, in 0.031s 1 tree, 149 leaves, max depth = 14, train loss: 0.41058, val loss: 0.39959, in 0.031s 1 tree, 127 leaves, max depth = 17, train loss: 0.40867, val loss: 0.39762, in 0.031s 1 tree, 138 leaves, max depth = 17, train loss: 0.40570, val loss: 0.39459, in 0.031s 1 tree, 148 leaves, max depth = 13, train loss: 0.40377, val loss: 0.39255, in 0.031s 1 tree, 117 leaves, max 
depth = 17, train loss: 0.40083, val loss: 0.38959, in 0.031s 1 tree, 147 leaves, max depth = 13, train loss: 0.39897, val loss: 0.38763, in 0.032s 1 tree, 139 leaves, max depth = 17, train loss: 0.39619, val loss: 0.38480, in 0.031s 1 tree, 119 leaves, max depth = 17, train loss: 0.39341, val loss: 0.38200, in 0.031s 1 tree, 119 leaves, max depth = 17, train loss: 0.39069, val loss: 0.37927, in 0.016s 1 tree, 128 leaves, max depth = 17, train loss: 0.38893, val loss: 0.37747, in 0.031s 1 tree, 145 leaves, max depth = 15, train loss: 0.38736, val loss: 0.37581, in 0.031s 1 tree, 150 leaves, max depth = 13, train loss: 0.38567, val loss: 0.37402, in 0.031s 1 tree, 118 leaves, max depth = 17, train loss: 0.38308, val loss: 0.37142, in 0.031s 1 tree, 129 leaves, max depth = 16, train loss: 0.38139, val loss: 0.36964, in 0.016s 1 tree, 119 leaves, max depth = 17, train loss: 0.37888, val loss: 0.36712, in 0.031s 1 tree, 149 leaves, max depth = 14, train loss: 0.37730, val loss: 0.36545, in 0.031s 1 tree, 117 leaves, max depth = 17, train loss: 0.37487, val loss: 0.36301, in 0.031s 1 tree, 112 leaves, max depth = 14, train loss: 0.37256, val loss: 0.36067, in 0.031s 1 tree, 129 leaves, max depth = 16, train loss: 0.37099, val loss: 0.35904, in 0.031s 1 tree, 128 leaves, max depth = 16, train loss: 0.36945, val loss: 0.35743, in 0.031s 1 tree, 113 leaves, max depth = 17, train loss: 0.36723, val loss: 0.35518, in 0.031s 1 tree, 117 leaves, max depth = 17, train loss: 0.36500, val loss: 0.35294, in 0.031s 1 tree, 123 leaves, max depth = 15, train loss: 0.36353, val loss: 0.35142, in 0.016s 1 tree, 67 leaves, max depth = 16, train loss: 0.36149, val loss: 0.34928, in 0.031s 1 tree, 62 leaves, max depth = 13, train loss: 0.35950, val loss: 0.34723, in 0.031s 1 tree, 124 leaves, max depth = 18, train loss: 0.35787, val loss: 0.34559, in 0.047s 1 tree, 67 leaves, max depth = 16, train loss: 0.35594, val loss: 0.34359, in 0.016s 1 tree, 68 leaves, max depth = 13, train loss: 
0.35405, val loss: 0.34163, in 0.016s 1 tree, 123 leaves, max depth = 17, train loss: 0.35249, val loss: 0.34008, in 0.031s 1 tree, 124 leaves, max depth = 18, train loss: 0.35097, val loss: 0.33855, in 0.031s 1 tree, 123 leaves, max depth = 17, train loss: 0.34947, val loss: 0.33707, in 0.031s 1 tree, 127 leaves, max depth = 17, train loss: 0.34760, val loss: 0.33534, in 0.031s 1 tree, 61 leaves, max depth = 13, train loss: 0.34584, val loss: 0.33352, in 0.016s 1 tree, 127 leaves, max depth = 17, train loss: 0.34404, val loss: 0.33186, in 0.031s 1 tree, 66 leaves, max depth = 16, train loss: 0.34235, val loss: 0.33010, in 0.016s Fit 100 trees in 3.252 s, (12459 total leaves) Time spent computing histograms: 1.017s Time spent finding best splits: 0.295s Time spent applying splits: 0.285s Time spent predicting: 0.000s Trial 82, Fold 4: Log loss = 0.34486513459364354, Average precision = 0.9576524629502526, ROC-AUC = 0.952854257721315, Elapsed Time = 3.2568659000007756 seconds Trial 82, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 82, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.158 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 70 leaves, max depth = 12, train loss: 0.68497, val loss: 0.68470, in 0.031s 1 tree, 127 leaves, max depth = 14, train loss: 0.67712, val loss: 0.67656, in 0.016s 1 tree, 122 leaves, max depth = 15, train loss: 0.66919, val loss: 0.66845, in 0.031s 1 tree, 125 leaves, max depth = 14, train loss: 0.66149, val loss: 0.66050, in 0.031s 1 tree, 123 leaves, max depth = 15, train loss: 0.65400, val loss: 0.65283, in 0.016s 1 tree, 125 leaves, max depth = 15, train loss: 0.64693, val loss: 0.64545, in 0.031s 1 tree, 120 leaves, max depth = 15, train loss: 0.63982, val loss: 0.63818, in 0.031s 1 tree, 98 leaves, max depth = 14, train loss: 0.63294, val loss: 0.63099, in 0.016s 1 tree, 143 leaves, max 
depth = 16, train loss: 0.62649, val loss: 0.62426, in 0.031s 1 tree, 123 leaves, max depth = 14, train loss: 0.62020, val loss: 0.61778, in 0.031s 1 tree, 129 leaves, max depth = 14, train loss: 0.61365, val loss: 0.61100, in 0.031s 1 tree, 129 leaves, max depth = 14, train loss: 0.60727, val loss: 0.60441, in 0.016s 1 tree, 127 leaves, max depth = 17, train loss: 0.60143, val loss: 0.59835, in 0.031s 1 tree, 122 leaves, max depth = 14, train loss: 0.59575, val loss: 0.59248, in 0.031s 1 tree, 131 leaves, max depth = 14, train loss: 0.58981, val loss: 0.58635, in 0.016s 1 tree, 132 leaves, max depth = 14, train loss: 0.58401, val loss: 0.58038, in 0.031s 1 tree, 126 leaves, max depth = 15, train loss: 0.57839, val loss: 0.57458, in 0.016s 1 tree, 125 leaves, max depth = 15, train loss: 0.57289, val loss: 0.56894, in 0.031s 1 tree, 148 leaves, max depth = 13, train loss: 0.56758, val loss: 0.56345, in 0.031s 1 tree, 88 leaves, max depth = 14, train loss: 0.56282, val loss: 0.55855, in 0.031s 1 tree, 149 leaves, max depth = 13, train loss: 0.55775, val loss: 0.55331, in 0.016s 1 tree, 131 leaves, max depth = 14, train loss: 0.55273, val loss: 0.54814, in 0.031s 1 tree, 129 leaves, max depth = 14, train loss: 0.54782, val loss: 0.54307, in 0.031s 1 tree, 130 leaves, max depth = 14, train loss: 0.54302, val loss: 0.53814, in 0.016s 1 tree, 129 leaves, max depth = 14, train loss: 0.53834, val loss: 0.53333, in 0.031s 1 tree, 131 leaves, max depth = 14, train loss: 0.53375, val loss: 0.52859, in 0.031s 1 tree, 130 leaves, max depth = 14, train loss: 0.52928, val loss: 0.52400, in 0.016s 1 tree, 149 leaves, max depth = 13, train loss: 0.52498, val loss: 0.51955, in 0.031s 1 tree, 126 leaves, max depth = 14, train loss: 0.52072, val loss: 0.51518, in 0.016s 1 tree, 127 leaves, max depth = 14, train loss: 0.51655, val loss: 0.51089, in 0.031s 1 tree, 126 leaves, max depth = 14, train loss: 0.51246, val loss: 0.50669, in 0.031s 1 tree, 127 leaves, max depth = 14, train 
loss: 0.50847, val loss: 0.50258, in 0.031s 1 tree, 147 leaves, max depth = 13, train loss: 0.50462, val loss: 0.49862, in 0.031s 1 tree, 128 leaves, max depth = 14, train loss: 0.50080, val loss: 0.49468, in 0.031s 1 tree, 126 leaves, max depth = 14, train loss: 0.49706, val loss: 0.49085, in 0.016s 1 tree, 125 leaves, max depth = 14, train loss: 0.49361, val loss: 0.48730, in 0.031s 1 tree, 149 leaves, max depth = 13, train loss: 0.49007, val loss: 0.48363, in 0.031s 1 tree, 128 leaves, max depth = 14, train loss: 0.48654, val loss: 0.47999, in 0.016s 1 tree, 128 leaves, max depth = 14, train loss: 0.48309, val loss: 0.47642, in 0.031s 1 tree, 148 leaves, max depth = 14, train loss: 0.47977, val loss: 0.47298, in 0.031s 1 tree, 128 leaves, max depth = 14, train loss: 0.47646, val loss: 0.46958, in 0.016s 1 tree, 149 leaves, max depth = 13, train loss: 0.47328, val loss: 0.46630, in 0.031s 1 tree, 125 leaves, max depth = 15, train loss: 0.47029, val loss: 0.46322, in 0.031s 1 tree, 128 leaves, max depth = 14, train loss: 0.46718, val loss: 0.46000, in 0.031s 1 tree, 148 leaves, max depth = 14, train loss: 0.46419, val loss: 0.45690, in 0.032s 1 tree, 125 leaves, max depth = 17, train loss: 0.46120, val loss: 0.45384, in 0.016s 1 tree, 124 leaves, max depth = 17, train loss: 0.45828, val loss: 0.45085, in 0.031s 1 tree, 108 leaves, max depth = 19, train loss: 0.45428, val loss: 0.44688, in 0.031s 1 tree, 130 leaves, max depth = 14, train loss: 0.45145, val loss: 0.44398, in 0.016s 1 tree, 95 leaves, max depth = 13, train loss: 0.44882, val loss: 0.44122, in 0.031s 1 tree, 124 leaves, max depth = 14, train loss: 0.44612, val loss: 0.43845, in 0.031s 1 tree, 125 leaves, max depth = 16, train loss: 0.44361, val loss: 0.43588, in 0.031s 1 tree, 116 leaves, max depth = 14, train loss: 0.43986, val loss: 0.43217, in 0.016s 1 tree, 145 leaves, max depth = 15, train loss: 0.43736, val loss: 0.42960, in 0.047s 1 tree, 132 leaves, max depth = 14, train loss: 0.43484, val 
loss: 0.42702, in 0.016s 1 tree, 123 leaves, max depth = 15, train loss: 0.43238, val loss: 0.42450, in 0.031s 1 tree, 144 leaves, max depth = 13, train loss: 0.43002, val loss: 0.42207, in 0.031s 1 tree, 137 leaves, max depth = 13, train loss: 0.42657, val loss: 0.41865, in 0.016s 1 tree, 125 leaves, max depth = 14, train loss: 0.42423, val loss: 0.41626, in 0.031s 1 tree, 145 leaves, max depth = 13, train loss: 0.42200, val loss: 0.41396, in 0.031s 1 tree, 91 leaves, max depth = 13, train loss: 0.41876, val loss: 0.41069, in 0.031s 1 tree, 127 leaves, max depth = 14, train loss: 0.41652, val loss: 0.40840, in 0.031s 1 tree, 126 leaves, max depth = 16, train loss: 0.41435, val loss: 0.40618, in 0.031s 1 tree, 115 leaves, max depth = 14, train loss: 0.41113, val loss: 0.40301, in 0.031s 1 tree, 125 leaves, max depth = 16, train loss: 0.40904, val loss: 0.40087, in 0.016s 1 tree, 132 leaves, max depth = 19, train loss: 0.40600, val loss: 0.39787, in 0.038s 1 tree, 145 leaves, max depth = 12, train loss: 0.40402, val loss: 0.39585, in 0.025s 1 tree, 115 leaves, max depth = 16, train loss: 0.40101, val loss: 0.39289, in 0.031s 1 tree, 125 leaves, max depth = 14, train loss: 0.39905, val loss: 0.39091, in 0.016s 1 tree, 115 leaves, max depth = 16, train loss: 0.39613, val loss: 0.38804, in 0.031s 1 tree, 124 leaves, max depth = 15, train loss: 0.39424, val loss: 0.38615, in 0.031s 1 tree, 124 leaves, max depth = 14, train loss: 0.39238, val loss: 0.38426, in 0.016s 1 tree, 116 leaves, max depth = 14, train loss: 0.38958, val loss: 0.38153, in 0.031s 1 tree, 136 leaves, max depth = 14, train loss: 0.38690, val loss: 0.37890, in 0.031s 1 tree, 144 leaves, max depth = 12, train loss: 0.38517, val loss: 0.37715, in 0.031s 1 tree, 124 leaves, max depth = 14, train loss: 0.38342, val loss: 0.37536, in 0.016s 1 tree, 113 leaves, max depth = 20, train loss: 0.38081, val loss: 0.37281, in 0.047s 1 tree, 115 leaves, max depth = 14, train loss: 0.37825, val loss: 0.37030, in 
0.016s 1 tree, 126 leaves, max depth = 14, train loss: 0.37657, val loss: 0.36861, in 0.031s 1 tree, 117 leaves, max depth = 16, train loss: 0.37407, val loss: 0.36619, in 0.031s 1 tree, 124 leaves, max depth = 14, train loss: 0.37258, val loss: 0.36472, in 0.031s 1 tree, 125 leaves, max depth = 14, train loss: 0.37100, val loss: 0.36312, in 0.016s 1 tree, 116 leaves, max depth = 14, train loss: 0.36861, val loss: 0.36079, in 0.031s 1 tree, 138 leaves, max depth = 14, train loss: 0.36632, val loss: 0.35857, in 0.031s 1 tree, 124 leaves, max depth = 14, train loss: 0.36480, val loss: 0.35705, in 0.031s 1 tree, 126 leaves, max depth = 14, train loss: 0.36331, val loss: 0.35554, in 0.031s 1 tree, 115 leaves, max depth = 14, train loss: 0.36107, val loss: 0.35336, in 0.031s 1 tree, 123 leaves, max depth = 16, train loss: 0.35962, val loss: 0.35193, in 0.031s 1 tree, 115 leaves, max depth = 13, train loss: 0.35744, val loss: 0.34982, in 0.031s 1 tree, 137 leaves, max depth = 13, train loss: 0.35536, val loss: 0.34780, in 0.031s 1 tree, 66 leaves, max depth = 12, train loss: 0.35340, val loss: 0.34582, in 0.016s 1 tree, 124 leaves, max depth = 16, train loss: 0.35203, val loss: 0.34447, in 0.031s 1 tree, 115 leaves, max depth = 14, train loss: 0.35001, val loss: 0.34252, in 0.016s 1 tree, 124 leaves, max depth = 16, train loss: 0.34868, val loss: 0.34122, in 0.031s 1 tree, 66 leaves, max depth = 12, train loss: 0.34683, val loss: 0.33936, in 0.016s 1 tree, 65 leaves, max depth = 12, train loss: 0.34504, val loss: 0.33754, in 0.031s 1 tree, 126 leaves, max depth = 16, train loss: 0.34353, val loss: 0.33609, in 0.031s 1 tree, 129 leaves, max depth = 16, train loss: 0.34204, val loss: 0.33464, in 0.016s 1 tree, 66 leaves, max depth = 12, train loss: 0.34031, val loss: 0.33290, in 0.031s 1 tree, 66 leaves, max depth = 12, train loss: 0.33862, val loss: 0.33120, in 0.016s Fit 100 trees in 3.049 s, (12317 total leaves) Time spent computing histograms: 0.946s Time spent finding 
best splits: 0.266s Time spent applying splits: 0.255s Time spent predicting: 0.031s Trial 82, Fold 5: Log loss = 0.34798127203454254, Average precision = 0.9529458646450839, ROC-AUC = 0.9496271854040095, Elapsed Time = 3.070813899999848 seconds
Optimization Progress: 83%|########2 | 83/100 [17:14<04:21, 15.41s/it]
Trial 83, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0398; Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0035
Trial 83, Fold 1: Log loss = 0.2419060173779556, Average precision = 0.9677093062637296, ROC-AUC = 0.962729584582001, Elapsed Time = 1.66 seconds (85 trees, 5881 total leaves)
Trial 83, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0236; Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0689
[Per-tree boosting output for Trial 83, Fold 2 elided; the log is truncated mid-fold.]
1 tree, 28 leaves, max depth = 9, train loss: 0.30086, val loss: 0.30031, in 0.000s 1 tree, 42 leaves, max depth = 11, train loss: 0.29801, val loss: 0.29771, in 0.016s 1 tree, 43 leaves, max depth = 11, train loss: 0.29538, val loss: 0.29528, in 0.016s 1 tree, 118 leaves, max depth = 14, train loss: 0.29265, val loss: 0.29278, in 0.031s 1 tree, 28 leaves, max depth = 9, train loss: 0.29014, val loss: 0.29030, in 0.016s 1 tree, 95 leaves, max depth = 12, train loss: 0.28825, val loss: 0.28861, in 0.016s 1 tree, 85 leaves, max depth = 19, train loss: 0.28653, val loss: 0.28706, in 0.016s 1 tree, 28 leaves, max depth = 10, train loss: 0.28429, val loss: 0.28487, in 0.016s 1 tree, 89 leaves, max depth = 13, train loss: 0.28204, val loss: 0.28276, in 0.016s 1 tree, 42 leaves, max depth = 11, train loss: 0.27991, val loss: 0.28083, in 0.016s 1 tree, 48 leaves, max depth = 14, train loss: 0.27770, val loss: 0.27879, in 0.016s 1 tree, 111 leaves, max depth = 14, train loss: 0.27504, val loss: 0.27611, in 0.016s 1 tree, 121 leaves, max depth = 13, train loss: 0.27296, val loss: 0.27428, in 0.031s 1 tree, 41 leaves, max depth = 10, train loss: 0.27114, val loss: 0.27264, in 0.016s 1 tree, 94 leaves, max depth = 13, train loss: 0.26928, val loss: 0.27085, in 0.016s 1 tree, 42 leaves, max depth = 10, train loss: 0.26760, val loss: 0.26936, in 0.016s 1 tree, 94 leaves, max depth = 14, train loss: 0.26547, val loss: 0.26709, in 0.016s 1 tree, 27 leaves, max depth = 10, train loss: 0.26391, val loss: 0.26557, in 0.016s 1 tree, 32 leaves, max depth = 11, train loss: 0.26236, val loss: 0.26402, in 0.016s 1 tree, 76 leaves, max depth = 11, train loss: 0.26116, val loss: 0.26296, in 0.016s 1 tree, 43 leaves, max depth = 11, train loss: 0.25979, val loss: 0.26175, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.25843, val loss: 0.26054, in 0.016s 1 tree, 93 leaves, max depth = 16, train loss: 0.25658, val loss: 0.25860, in 0.016s 1 tree, 26 leaves, max depth = 11, train 
loss: 0.25530, val loss: 0.25738, in 0.016s 1 tree, 32 leaves, max depth = 11, train loss: 0.25407, val loss: 0.25614, in 0.016s 1 tree, 111 leaves, max depth = 17, train loss: 0.25247, val loss: 0.25452, in 0.016s 1 tree, 91 leaves, max depth = 12, train loss: 0.25108, val loss: 0.25318, in 0.016s 1 tree, 45 leaves, max depth = 11, train loss: 0.24992, val loss: 0.25221, in 0.016s 1 tree, 43 leaves, max depth = 10, train loss: 0.24885, val loss: 0.25127, in 0.000s 1 tree, 25 leaves, max depth = 10, train loss: 0.24782, val loss: 0.25030, in 0.016s 1 tree, 108 leaves, max depth = 15, train loss: 0.24634, val loss: 0.24876, in 0.031s 1 tree, 123 leaves, max depth = 12, train loss: 0.24503, val loss: 0.24768, in 0.016s 1 tree, 72 leaves, max depth = 12, train loss: 0.24422, val loss: 0.24707, in 0.016s 1 tree, 63 leaves, max depth = 10, train loss: 0.24318, val loss: 0.24608, in 0.016s 1 tree, 111 leaves, max depth = 17, train loss: 0.24194, val loss: 0.24480, in 0.016s 1 tree, 26 leaves, max depth = 7, train loss: 0.24093, val loss: 0.24388, in 0.016s 1 tree, 32 leaves, max depth = 9, train loss: 0.23988, val loss: 0.24310, in 0.016s 1 tree, 43 leaves, max depth = 11, train loss: 0.23902, val loss: 0.24240, in 0.016s 1 tree, 32 leaves, max depth = 8, train loss: 0.23809, val loss: 0.24171, in 0.016s 1 tree, 42 leaves, max depth = 9, train loss: 0.23726, val loss: 0.24101, in 0.016s Fit 85 trees in 1.799 s, (5781 total leaves) Time spent computing histograms: 0.540s Time spent finding best splits: 0.114s Time spent applying splits: 0.105s Time spent predicting: 0.016s Trial 83, Fold 2: Log loss = 0.24140072050409492, Average precision = 0.9669439635515518, ROC-AUC = 0.9637237055231034, Elapsed Time = 1.8088112000004912 seconds Trial 83, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 83, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 Binning 0.040 GB of training data: 0.143 s 0.016 s 0.004 GB of 
validation data: Fitting gradient boosted rounds: 1 tree, 70 leaves, max depth = 15, train loss: 0.66953, val loss: 0.66918, in 0.016s 1 tree, 101 leaves, max depth = 13, train loss: 0.64822, val loss: 0.64789, in 0.031s 1 tree, 93 leaves, max depth = 14, train loss: 0.62710, val loss: 0.62678, in 0.016s 1 tree, 92 leaves, max depth = 12, train loss: 0.60768, val loss: 0.60728, in 0.016s 1 tree, 99 leaves, max depth = 12, train loss: 0.58997, val loss: 0.58949, in 0.016s 1 tree, 101 leaves, max depth = 11, train loss: 0.57456, val loss: 0.57402, in 0.016s 1 tree, 73 leaves, max depth = 12, train loss: 0.55938, val loss: 0.55869, in 0.016s 1 tree, 102 leaves, max depth = 12, train loss: 0.54516, val loss: 0.54443, in 0.031s 1 tree, 100 leaves, max depth = 13, train loss: 0.53182, val loss: 0.53116, in 0.016s 1 tree, 75 leaves, max depth = 13, train loss: 0.52012, val loss: 0.51944, in 0.016s 1 tree, 95 leaves, max depth = 12, train loss: 0.50826, val loss: 0.50759, in 0.031s 1 tree, 65 leaves, max depth = 10, train loss: 0.49435, val loss: 0.49448, in 0.016s 1 tree, 73 leaves, max depth = 13, train loss: 0.48409, val loss: 0.48427, in 0.016s 1 tree, 89 leaves, max depth = 15, train loss: 0.47427, val loss: 0.47447, in 0.016s 1 tree, 95 leaves, max depth = 13, train loss: 0.46493, val loss: 0.46527, in 0.016s 1 tree, 92 leaves, max depth = 11, train loss: 0.45636, val loss: 0.45691, in 0.031s 1 tree, 97 leaves, max depth = 13, train loss: 0.44809, val loss: 0.44875, in 0.016s 1 tree, 91 leaves, max depth = 12, train loss: 0.44114, val loss: 0.44188, in 0.016s 1 tree, 59 leaves, max depth = 10, train loss: 0.43063, val loss: 0.43208, in 0.016s 1 tree, 76 leaves, max depth = 12, train loss: 0.42401, val loss: 0.42541, in 0.031s 1 tree, 65 leaves, max depth = 9, train loss: 0.41479, val loss: 0.41690, in 0.016s 1 tree, 97 leaves, max depth = 11, train loss: 0.40856, val loss: 0.41070, in 0.016s 1 tree, 59 leaves, max depth = 10, train loss: 0.39995, val loss: 0.40284, 
in 0.016s 1 tree, 101 leaves, max depth = 12, train loss: 0.39478, val loss: 0.39781, in 0.016s 1 tree, 99 leaves, max depth = 11, train loss: 0.38939, val loss: 0.39245, in 0.031s 1 tree, 96 leaves, max depth = 12, train loss: 0.38459, val loss: 0.38776, in 0.016s 1 tree, 94 leaves, max depth = 11, train loss: 0.38009, val loss: 0.38333, in 0.016s 1 tree, 97 leaves, max depth = 11, train loss: 0.37550, val loss: 0.37881, in 0.031s 1 tree, 93 leaves, max depth = 17, train loss: 0.37088, val loss: 0.37439, in 0.016s 1 tree, 59 leaves, max depth = 11, train loss: 0.36411, val loss: 0.36829, in 0.016s 1 tree, 94 leaves, max depth = 16, train loss: 0.35950, val loss: 0.36357, in 0.016s 1 tree, 38 leaves, max depth = 10, train loss: 0.35344, val loss: 0.35812, in 0.016s 1 tree, 43 leaves, max depth = 12, train loss: 0.34810, val loss: 0.35364, in 0.016s 1 tree, 92 leaves, max depth = 14, train loss: 0.34439, val loss: 0.35010, in 0.031s 1 tree, 27 leaves, max depth = 10, train loss: 0.33931, val loss: 0.34552, in 0.016s 1 tree, 27 leaves, max depth = 9, train loss: 0.33459, val loss: 0.34125, in 0.000s 1 tree, 43 leaves, max depth = 12, train loss: 0.33012, val loss: 0.33752, in 0.016s 1 tree, 91 leaves, max depth = 13, train loss: 0.32723, val loss: 0.33476, in 0.031s 1 tree, 43 leaves, max depth = 11, train loss: 0.32312, val loss: 0.33139, in 0.000s 1 tree, 27 leaves, max depth = 10, train loss: 0.31919, val loss: 0.32787, in 0.000s 1 tree, 80 leaves, max depth = 11, train loss: 0.31641, val loss: 0.32521, in 0.031s 1 tree, 59 leaves, max depth = 11, train loss: 0.31248, val loss: 0.32181, in 0.016s 1 tree, 91 leaves, max depth = 14, train loss: 0.30931, val loss: 0.31854, in 0.016s 1 tree, 27 leaves, max depth = 9, train loss: 0.30602, val loss: 0.31560, in 0.016s 1 tree, 27 leaves, max depth = 9, train loss: 0.30298, val loss: 0.31289, in 0.016s 1 tree, 90 leaves, max depth = 12, train loss: 0.30066, val loss: 0.31072, in 0.016s 1 tree, 93 leaves, max depth = 15, 
train loss: 0.29784, val loss: 0.30782, in 0.016s 1 tree, 94 leaves, max depth = 15, train loss: 0.29519, val loss: 0.30510, in 0.016s 1 tree, 58 leaves, max depth = 12, train loss: 0.29217, val loss: 0.30245, in 0.016s 1 tree, 116 leaves, max depth = 13, train loss: 0.28965, val loss: 0.29973, in 0.031s 1 tree, 43 leaves, max depth = 11, train loss: 0.28698, val loss: 0.29760, in 0.016s 1 tree, 93 leaves, max depth = 16, train loss: 0.28470, val loss: 0.29524, in 0.016s 1 tree, 118 leaves, max depth = 16, train loss: 0.28248, val loss: 0.29283, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.28013, val loss: 0.29102, in 0.016s 1 tree, 27 leaves, max depth = 12, train loss: 0.27797, val loss: 0.28913, in 0.016s 1 tree, 43 leaves, max depth = 11, train loss: 0.27582, val loss: 0.28745, in 0.016s 1 tree, 94 leaves, max depth = 16, train loss: 0.27384, val loss: 0.28541, in 0.016s 1 tree, 27 leaves, max depth = 8, train loss: 0.27190, val loss: 0.28375, in 0.016s 1 tree, 45 leaves, max depth = 11, train loss: 0.27007, val loss: 0.28240, in 0.016s 1 tree, 116 leaves, max depth = 14, train loss: 0.26821, val loss: 0.28037, in 0.031s 1 tree, 95 leaves, max depth = 13, train loss: 0.26647, val loss: 0.27857, in 0.016s 1 tree, 95 leaves, max depth = 13, train loss: 0.26484, val loss: 0.27692, in 0.016s 1 tree, 43 leaves, max depth = 11, train loss: 0.26316, val loss: 0.27567, in 0.016s 1 tree, 47 leaves, max depth = 11, train loss: 0.26156, val loss: 0.27449, in 0.016s 1 tree, 32 leaves, max depth = 14, train loss: 0.25998, val loss: 0.27300, in 0.016s 1 tree, 44 leaves, max depth = 12, train loss: 0.25852, val loss: 0.27190, in 0.016s 1 tree, 44 leaves, max depth = 12, train loss: 0.25715, val loss: 0.27084, in 0.016s 1 tree, 114 leaves, max depth = 15, train loss: 0.25519, val loss: 0.26866, in 0.016s 1 tree, 27 leaves, max depth = 8, train loss: 0.25389, val loss: 0.26758, in 0.016s 1 tree, 85 leaves, max depth = 14, train loss: 0.25242, val loss: 0.26606, in 
0.031s 1 tree, 44 leaves, max depth = 12, train loss: 0.25123, val loss: 0.26515, in 0.000s 1 tree, 26 leaves, max depth = 9, train loss: 0.25009, val loss: 0.26415, in 0.016s 1 tree, 41 leaves, max depth = 14, train loss: 0.24875, val loss: 0.26281, in 0.016s 1 tree, 92 leaves, max depth = 13, train loss: 0.24750, val loss: 0.26156, in 0.016s 1 tree, 100 leaves, max depth = 18, train loss: 0.24606, val loss: 0.25990, in 0.031s 1 tree, 47 leaves, max depth = 11, train loss: 0.24501, val loss: 0.25916, in 0.016s 1 tree, 49 leaves, max depth = 9, train loss: 0.24434, val loss: 0.25852, in 0.016s 1 tree, 30 leaves, max depth = 13, train loss: 0.24337, val loss: 0.25754, in 0.016s 1 tree, 46 leaves, max depth = 11, train loss: 0.24238, val loss: 0.25690, in 0.016s 1 tree, 45 leaves, max depth = 12, train loss: 0.24145, val loss: 0.25630, in 0.016s 1 tree, 101 leaves, max depth = 18, train loss: 0.24015, val loss: 0.25478, in 0.016s 1 tree, 47 leaves, max depth = 10, train loss: 0.23929, val loss: 0.25421, in 0.016s 1 tree, 109 leaves, max depth = 16, train loss: 0.23820, val loss: 0.25287, in 0.016s 1 tree, 93 leaves, max depth = 13, train loss: 0.23744, val loss: 0.25207, in 0.016s 1 tree, 93 leaves, max depth = 13, train loss: 0.23640, val loss: 0.25103, in 0.016s Fit 85 trees in 1.846 s, (6094 total leaves) Time spent computing histograms: 0.573s Time spent finding best splits: 0.119s Time spent applying splits: 0.111s Time spent predicting: 0.000s Trial 83, Fold 3: Log loss = 0.23801422688332105, Average precision = 0.9672258396658524, ROC-AUC = 0.9640686397848426, Elapsed Time = 1.8602890000001935 seconds Trial 83, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 83, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 Binning 0.040 GB of training data: 0.158 s 0.016 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 70 leaves, max depth = 12, train loss: 0.66948, val loss: 
0.66858, in 0.016s 1 tree, 96 leaves, max depth = 14, train loss: 0.64844, val loss: 0.64635, in 0.016s 1 tree, 98 leaves, max depth = 13, train loss: 0.62726, val loss: 0.62432, in 0.016s 1 tree, 93 leaves, max depth = 14, train loss: 0.60783, val loss: 0.60410, in 0.031s 1 tree, 77 leaves, max depth = 13, train loss: 0.59014, val loss: 0.58560, in 0.016s 1 tree, 102 leaves, max depth = 13, train loss: 0.57445, val loss: 0.56915, in 0.016s 1 tree, 98 leaves, max depth = 14, train loss: 0.55878, val loss: 0.55279, in 0.031s 1 tree, 103 leaves, max depth = 13, train loss: 0.54445, val loss: 0.53768, in 0.016s 1 tree, 71 leaves, max depth = 12, train loss: 0.53130, val loss: 0.52388, in 0.016s 1 tree, 99 leaves, max depth = 14, train loss: 0.51951, val loss: 0.51148, in 0.016s 1 tree, 97 leaves, max depth = 14, train loss: 0.50757, val loss: 0.49899, in 0.031s 1 tree, 98 leaves, max depth = 14, train loss: 0.49643, val loss: 0.48730, in 0.016s 1 tree, 58 leaves, max depth = 10, train loss: 0.48306, val loss: 0.47364, in 0.016s 1 tree, 94 leaves, max depth = 13, train loss: 0.47371, val loss: 0.46392, in 0.016s 1 tree, 96 leaves, max depth = 12, train loss: 0.46438, val loss: 0.45414, in 0.031s 1 tree, 96 leaves, max depth = 14, train loss: 0.45563, val loss: 0.44496, in 0.016s 1 tree, 93 leaves, max depth = 12, train loss: 0.44748, val loss: 0.43636, in 0.016s 1 tree, 75 leaves, max depth = 14, train loss: 0.44011, val loss: 0.42852, in 0.016s 1 tree, 100 leaves, max depth = 13, train loss: 0.43309, val loss: 0.42099, in 0.016s 1 tree, 57 leaves, max depth = 10, train loss: 0.42308, val loss: 0.41082, in 0.016s 1 tree, 94 leaves, max depth = 11, train loss: 0.41699, val loss: 0.40454, in 0.031s 1 tree, 58 leaves, max depth = 12, train loss: 0.40800, val loss: 0.39543, in 0.016s 1 tree, 57 leaves, max depth = 10, train loss: 0.39959, val loss: 0.38689, in 0.016s 1 tree, 100 leaves, max depth = 12, train loss: 0.39413, val loss: 0.38102, in 0.016s 1 tree, 71 leaves, 
max depth = 13, train loss: 0.38914, val loss: 0.37555, in 0.016s 1 tree, 102 leaves, max depth = 13, train loss: 0.38424, val loss: 0.37027, in 0.016s 1 tree, 49 leaves, max depth = 11, train loss: 0.37720, val loss: 0.36312, in 0.016s 1 tree, 92 leaves, max depth = 13, train loss: 0.37282, val loss: 0.35858, in 0.016s 1 tree, 103 leaves, max depth = 15, train loss: 0.36867, val loss: 0.35407, in 0.031s 1 tree, 81 leaves, max depth = 12, train loss: 0.36476, val loss: 0.34987, in 0.016s 1 tree, 28 leaves, max depth = 10, train loss: 0.35886, val loss: 0.34369, in 0.016s 1 tree, 91 leaves, max depth = 12, train loss: 0.35442, val loss: 0.33917, in 0.016s 1 tree, 43 leaves, max depth = 11, train loss: 0.34914, val loss: 0.33403, in 0.016s 1 tree, 59 leaves, max depth = 11, train loss: 0.34368, val loss: 0.32853, in 0.016s 1 tree, 84 leaves, max depth = 15, train loss: 0.33977, val loss: 0.32465, in 0.031s 1 tree, 28 leaves, max depth = 11, train loss: 0.33520, val loss: 0.31988, in 0.000s 1 tree, 42 leaves, max depth = 11, train loss: 0.33080, val loss: 0.31566, in 0.016s 1 tree, 27 leaves, max depth = 10, train loss: 0.32666, val loss: 0.31136, in 0.016s 1 tree, 40 leaves, max depth = 11, train loss: 0.32279, val loss: 0.30762, in 0.016s 1 tree, 92 leaves, max depth = 15, train loss: 0.31933, val loss: 0.30413, in 0.016s 1 tree, 123 leaves, max depth = 14, train loss: 0.31597, val loss: 0.30064, in 0.031s 1 tree, 59 leaves, max depth = 9, train loss: 0.31207, val loss: 0.29668, in 0.016s 1 tree, 42 leaves, max depth = 11, train loss: 0.30874, val loss: 0.29353, in 0.016s 1 tree, 90 leaves, max depth = 15, train loss: 0.30645, val loss: 0.29108, in 0.016s 1 tree, 27 leaves, max depth = 10, train loss: 0.30332, val loss: 0.28784, in 0.016s 1 tree, 29 leaves, max depth = 9, train loss: 0.30047, val loss: 0.28482, in 0.016s 1 tree, 29 leaves, max depth = 10, train loss: 0.29778, val loss: 0.28197, in 0.016s 1 tree, 93 leaves, max depth = 14, train loss: 0.29507, val 
loss: 0.27928, in 0.016s 1 tree, 44 leaves, max depth = 11, train loss: 0.29246, val loss: 0.27678, in 0.016s 1 tree, 92 leaves, max depth = 15, train loss: 0.28990, val loss: 0.27422, in 0.031s 1 tree, 27 leaves, max depth = 10, train loss: 0.28755, val loss: 0.27179, in 0.016s 1 tree, 125 leaves, max depth = 12, train loss: 0.28507, val loss: 0.26924, in 0.016s 1 tree, 124 leaves, max depth = 16, train loss: 0.28277, val loss: 0.26689, in 0.031s 1 tree, 100 leaves, max depth = 12, train loss: 0.28109, val loss: 0.26529, in 0.016s 1 tree, 48 leaves, max depth = 12, train loss: 0.27897, val loss: 0.26336, in 0.016s 1 tree, 85 leaves, max depth = 12, train loss: 0.27750, val loss: 0.26185, in 0.016s 1 tree, 121 leaves, max depth = 13, train loss: 0.27548, val loss: 0.25974, in 0.016s 1 tree, 94 leaves, max depth = 13, train loss: 0.27361, val loss: 0.25789, in 0.031s 1 tree, 42 leaves, max depth = 11, train loss: 0.27165, val loss: 0.25610, in 0.016s 1 tree, 26 leaves, max depth = 11, train loss: 0.26988, val loss: 0.25422, in 0.016s 1 tree, 123 leaves, max depth = 13, train loss: 0.26807, val loss: 0.25234, in 0.031s 1 tree, 49 leaves, max depth = 12, train loss: 0.26637, val loss: 0.25079, in 0.016s 1 tree, 119 leaves, max depth = 14, train loss: 0.26469, val loss: 0.24903, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.26285, val loss: 0.24730, in 0.016s 1 tree, 42 leaves, max depth = 9, train loss: 0.26131, val loss: 0.24592, in 0.016s 1 tree, 88 leaves, max depth = 14, train loss: 0.25979, val loss: 0.24443, in 0.031s 1 tree, 45 leaves, max depth = 14, train loss: 0.25832, val loss: 0.24306, in 0.016s 1 tree, 122 leaves, max depth = 14, train loss: 0.25685, val loss: 0.24155, in 0.016s 1 tree, 89 leaves, max depth = 17, train loss: 0.25526, val loss: 0.24007, in 0.031s 1 tree, 46 leaves, max depth = 13, train loss: 0.25392, val loss: 0.23880, in 0.000s 1 tree, 91 leaves, max depth = 14, train loss: 0.25264, val loss: 0.23759, in 0.031s 1 tree, 74 
leaves, max depth = 12, train loss: 0.25171, val loss: 0.23681, in 0.016s 1 tree, 39 leaves, max depth = 11, train loss: 0.25036, val loss: 0.23553, in 0.016s 1 tree, 111 leaves, max depth = 17, train loss: 0.24895, val loss: 0.23416, in 0.016s 1 tree, 89 leaves, max depth = 14, train loss: 0.24761, val loss: 0.23294, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.24633, val loss: 0.23188, in 0.016s 1 tree, 45 leaves, max depth = 12, train loss: 0.24519, val loss: 0.23085, in 0.016s 1 tree, 30 leaves, max depth = 9, train loss: 0.24401, val loss: 0.22981, in 0.016s 1 tree, 74 leaves, max depth = 12, train loss: 0.24324, val loss: 0.22919, in 0.016s 1 tree, 107 leaves, max depth = 13, train loss: 0.24195, val loss: 0.22797, in 0.016s 1 tree, 27 leaves, max depth = 7, train loss: 0.24091, val loss: 0.22711, in 0.000s 1 tree, 110 leaves, max depth = 15, train loss: 0.23975, val loss: 0.22598, in 0.031s 1 tree, 37 leaves, max depth = 8, train loss: 0.23894, val loss: 0.22530, in 0.016s 1 tree, 47 leaves, max depth = 12, train loss: 0.23801, val loss: 0.22451, in 0.016s 1 tree, 72 leaves, max depth = 15, train loss: 0.23640, val loss: 0.22303, in 0.016s Fit 85 trees in 1.908 s, (6280 total leaves) Time spent computing histograms: 0.574s Time spent finding best splits: 0.121s Time spent applying splits: 0.114s Time spent predicting: 0.000s Trial 83, Fold 4: Log loss = 0.2407239049784999, Average precision = 0.9676442621459418, ROC-AUC = 0.9632780776173069, Elapsed Time = 1.9141239000000496 seconds Trial 83, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 83, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.143 s 0.016 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 69 leaves, max depth = 12, train loss: 0.66934, val loss: 0.66800, in 0.016s 1 tree, 103 leaves, max depth = 13, train loss: 0.64809, val loss: 0.64580, in 0.016s 1 
tree, 93 leaves, max depth = 14, train loss: 0.62700, val loss: 0.62397, in 0.016s 1 tree, 97 leaves, max depth = 14, train loss: 0.60752, val loss: 0.60378, in 0.016s 1 tree, 93 leaves, max depth = 13, train loss: 0.59071, val loss: 0.58659, in 0.016s 1 tree, 82 leaves, max depth = 12, train loss: 0.57489, val loss: 0.57017, in 0.031s 1 tree, 97 leaves, max depth = 12, train loss: 0.55910, val loss: 0.55387, in 0.016s 1 tree, 103 leaves, max depth = 12, train loss: 0.54464, val loss: 0.53891, in 0.016s 1 tree, 103 leaves, max depth = 13, train loss: 0.53119, val loss: 0.52498, in 0.016s 1 tree, 100 leaves, max depth = 15, train loss: 0.51919, val loss: 0.51243, in 0.031s 1 tree, 76 leaves, max depth = 14, train loss: 0.50758, val loss: 0.50027, in 0.016s 1 tree, 96 leaves, max depth = 15, train loss: 0.49681, val loss: 0.48898, in 0.016s 1 tree, 96 leaves, max depth = 14, train loss: 0.48697, val loss: 0.47872, in 0.016s 1 tree, 59 leaves, max depth = 10, train loss: 0.47409, val loss: 0.46573, in 0.016s 1 tree, 94 leaves, max depth = 12, train loss: 0.46530, val loss: 0.45680, in 0.016s 1 tree, 95 leaves, max depth = 15, train loss: 0.45686, val loss: 0.44800, in 0.031s 1 tree, 98 leaves, max depth = 13, train loss: 0.44842, val loss: 0.43934, in 0.016s 1 tree, 59 leaves, max depth = 10, train loss: 0.43771, val loss: 0.42861, in 0.016s 1 tree, 78 leaves, max depth = 13, train loss: 0.43051, val loss: 0.42100, in 0.016s 1 tree, 77 leaves, max depth = 13, train loss: 0.42368, val loss: 0.41387, in 0.016s 1 tree, 74 leaves, max depth = 14, train loss: 0.41785, val loss: 0.40760, in 0.031s 1 tree, 76 leaves, max depth = 15, train loss: 0.41184, val loss: 0.40126, in 0.016s 1 tree, 98 leaves, max depth = 14, train loss: 0.40583, val loss: 0.39514, in 0.016s 1 tree, 74 leaves, max depth = 13, train loss: 0.40045, val loss: 0.38950, in 0.016s 1 tree, 101 leaves, max depth = 13, train loss: 0.39525, val loss: 0.38418, in 0.016s 1 tree, 94 leaves, max depth = 12, train 
loss: 0.39060, val loss: 0.37938, in 0.031s 1 tree, 92 leaves, max depth = 12, train loss: 0.38614, val loss: 0.37491, in 0.016s 1 tree, 95 leaves, max depth = 13, train loss: 0.38154, val loss: 0.37025, in 0.016s 1 tree, 28 leaves, max depth = 11, train loss: 0.37429, val loss: 0.36297, in 0.016s 1 tree, 28 leaves, max depth = 9, train loss: 0.36758, val loss: 0.35622, in 0.016s 1 tree, 90 leaves, max depth = 15, train loss: 0.36284, val loss: 0.35153, in 0.016s 1 tree, 28 leaves, max depth = 9, train loss: 0.35673, val loss: 0.34538, in 0.016s 1 tree, 52 leaves, max depth = 12, train loss: 0.35067, val loss: 0.33988, in 0.000s 1 tree, 28 leaves, max depth = 9, train loss: 0.34533, val loss: 0.33439, in 0.016s 1 tree, 63 leaves, max depth = 11, train loss: 0.34021, val loss: 0.32932, in 0.031s 1 tree, 43 leaves, max depth = 11, train loss: 0.33544, val loss: 0.32503, in 0.016s 1 tree, 44 leaves, max depth = 11, train loss: 0.33105, val loss: 0.32114, in 0.047s 1 tree, 28 leaves, max depth = 10, train loss: 0.32682, val loss: 0.31684, in 0.016s 1 tree, 90 leaves, max depth = 13, train loss: 0.32313, val loss: 0.31321, in 0.031s 1 tree, 94 leaves, max depth = 13, train loss: 0.31961, val loss: 0.30974, in 0.016s 1 tree, 91 leaves, max depth = 14, train loss: 0.31625, val loss: 0.30648, in 0.031s 1 tree, 42 leaves, max depth = 11, train loss: 0.31252, val loss: 0.30326, in 0.016s 1 tree, 27 leaves, max depth = 10, train loss: 0.30899, val loss: 0.29968, in 0.016s 1 tree, 26 leaves, max depth = 10, train loss: 0.30573, val loss: 0.29641, in 0.016s 1 tree, 94 leaves, max depth = 14, train loss: 0.30279, val loss: 0.29358, in 0.016s 1 tree, 43 leaves, max depth = 11, train loss: 0.29971, val loss: 0.29091, in 0.031s 1 tree, 42 leaves, max depth = 11, train loss: 0.29677, val loss: 0.28843, in 0.016s 1 tree, 27 leaves, max depth = 10, train loss: 0.29400, val loss: 0.28553, in 0.016s 1 tree, 95 leaves, max depth = 13, train loss: 0.29137, val loss: 0.28297, in 0.031s 1 
tree, 91 leaves, max depth = 14, train loss: 0.28885, val loss: 0.28053, in 0.016s 1 tree, 27 leaves, max depth = 11, train loss: 0.28638, val loss: 0.27807, in 0.016s 1 tree, 27 leaves, max depth = 13, train loss: 0.28409, val loss: 0.27568, in 0.016s 1 tree, 30 leaves, max depth = 11, train loss: 0.28178, val loss: 0.27327, in 0.016s 1 tree, 120 leaves, max depth = 14, train loss: 0.27931, val loss: 0.27094, in 0.031s 1 tree, 28 leaves, max depth = 11, train loss: 0.27733, val loss: 0.26890, in 0.016s 1 tree, 68 leaves, max depth = 11, train loss: 0.27526, val loss: 0.26697, in 0.016s 1 tree, 42 leaves, max depth = 10, train loss: 0.27321, val loss: 0.26533, in 0.016s 1 tree, 42 leaves, max depth = 10, train loss: 0.27130, val loss: 0.26380, in 0.016s 1 tree, 117 leaves, max depth = 14, train loss: 0.26913, val loss: 0.26175, in 0.016s 1 tree, 89 leaves, max depth = 11, train loss: 0.26721, val loss: 0.25992, in 0.031s 1 tree, 26 leaves, max depth = 12, train loss: 0.26564, val loss: 0.25829, in 0.016s 1 tree, 121 leaves, max depth = 15, train loss: 0.26370, val loss: 0.25648, in 0.016s 1 tree, 107 leaves, max depth = 15, train loss: 0.26172, val loss: 0.25436, in 0.031s 1 tree, 41 leaves, max depth = 10, train loss: 0.26014, val loss: 0.25306, in 0.016s 1 tree, 42 leaves, max depth = 10, train loss: 0.25863, val loss: 0.25184, in 0.016s 1 tree, 69 leaves, max depth = 11, train loss: 0.25683, val loss: 0.25015, in 0.016s 1 tree, 89 leaves, max depth = 15, train loss: 0.25527, val loss: 0.24874, in 0.016s 1 tree, 90 leaves, max depth = 15, train loss: 0.25385, val loss: 0.24742, in 0.031s 1 tree, 118 leaves, max depth = 15, train loss: 0.25236, val loss: 0.24604, in 0.016s 1 tree, 42 leaves, max depth = 11, train loss: 0.25107, val loss: 0.24494, in 0.031s 1 tree, 31 leaves, max depth = 9, train loss: 0.24967, val loss: 0.24367, in 0.016s 1 tree, 40 leaves, max depth = 13, train loss: 0.24824, val loss: 0.24236, in 0.016s 1 tree, 42 leaves, max depth = 11, train 
loss: 0.24715, val loss: 0.24150, in 0.000s 1 tree, 80 leaves, max depth = 13, train loss: 0.24605, val loss: 0.24056, in 0.031s 1 tree, 94 leaves, max depth = 18, train loss: 0.24438, val loss: 0.23895, in 0.016s 1 tree, 104 leaves, max depth = 16, train loss: 0.24296, val loss: 0.23746, in 0.016s 1 tree, 121 leaves, max depth = 14, train loss: 0.24180, val loss: 0.23640, in 0.031s 1 tree, 115 leaves, max depth = 15, train loss: 0.24016, val loss: 0.23497, in 0.016s 1 tree, 88 leaves, max depth = 13, train loss: 0.23917, val loss: 0.23412, in 0.031s 1 tree, 107 leaves, max depth = 14, train loss: 0.23802, val loss: 0.23292, in 0.016s 1 tree, 107 leaves, max depth = 15, train loss: 0.23689, val loss: 0.23174, in 0.016s 1 tree, 31 leaves, max depth = 10, train loss: 0.23580, val loss: 0.23077, in 0.016s 1 tree, 47 leaves, max depth = 16, train loss: 0.23482, val loss: 0.22985, in 0.016s 1 tree, 41 leaves, max depth = 14, train loss: 0.23373, val loss: 0.22892, in 0.016s 1 tree, 34 leaves, max depth = 9, train loss: 0.23277, val loss: 0.22806, in 0.016s Fit 85 trees in 2.018 s, (6083 total leaves) Time spent computing histograms: 0.630s Time spent finding best splits: 0.141s Time spent applying splits: 0.131s Time spent predicting: 0.000s Trial 83, Fold 5: Log loss = 0.2437184623531028, Average precision = 0.9649102280878095, ROC-AUC = 0.9611272504319716, Elapsed Time = 2.023743600000671 seconds
Optimization Progress: 84%|########4 | 84/100 [17:30<04:08, 15.54s/it]
Trial 84 — per-fold cross-validation summary (verbose per-round tree logs condensed; each fold fit 75 trees):

Fold 1: Train = 20663 (0: 10533, 1: 10130, 0/1 = 1.0398) | Val = 5175 (0: 2592, 1: 2583, 0/1 = 1.0035) | 75 trees in 1.190 s (7250 leaves) | Log loss = 0.43437, Average precision = 0.91832, ROC-AUC = 0.92881, Elapsed = 1.20 s
Fold 2: Train = 20701 (0: 10471, 1: 10230, 0/1 = 1.0236) | Val = 5137 (0: 2654, 1: 2483, 0/1 = 1.0689) | 75 trees in 1.221 s (7371 leaves) | Log loss = 0.43684, Average precision = 0.91135, ROC-AUC = 0.92714, Elapsed = 1.23 s
Fold 3: Train = 20682 (0: 10517, 1: 10165, 0/1 = 1.0346) | Val = 5156 (0: 2608, 1: 2548, 0/1 = 1.0235) | 75 trees in 1.267 s (7409 leaves) | Log loss = 0.43330, Average precision = 0.91903, ROC-AUC = 0.93109, Elapsed = 1.27 s
Fold 4: Train = 20656 (0: 10479, 1: 10177, 0/1 = 1.0297) | Val = 5182 (0: 2646, 1: 2536, 0/1 = 1.0434) | 75 trees in 1.361 s (7308 leaves) | Log loss = 0.43477, Average precision = 0.91734, ROC-AUC = 0.92938, Elapsed = 1.37 s
Fold 5: Train = 20650 (0: 10500, 1: 10150, 0/1 = 1.0345) | Val = 5188 (0: 2625, 1: 2563, 0/1 = 1.0242) | 75 trees in 1.345 s (7286 leaves) | Log loss = 0.43986, Average precision = 0.91330, ROC-AUC = 0.92650, Elapsed = 1.35 s
Optimization Progress: 85%|########5 | 85/100 [17:43<03:42, 14.82s/it]
Trial 85, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 85, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Binning 0.040 GB of training data: 0.142 s; binning 0.004 GB of validation data: 0.000 s
[per-round fitting log truncated: 36 rounds, val loss 0.67335 → 0.36518]
Fit 36 trees in 1.017 s, (2826 total leaves)
Time spent: computing histograms 0.262s, finding best splits 0.065s, applying splits 0.052s, predicting 0.016s
Trial 85, Fold 1: Log loss = 0.3676550507126004, Average precision = 0.952459670240728, ROC-AUC = 0.9474979298404095, Elapsed Time = 1.0264220000008208 seconds

Trial 85, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 85, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Binning 0.040 GB of training data: 0.174 s; binning 0.004 GB of validation data: 0.000 s
[per-round fitting log truncated: 36 rounds, val loss 0.67284 → 0.35882]
Fit 36 trees in 1.111 s, (2776 total leaves)
Time spent: computing histograms 0.299s, finding best splits 0.073s, applying splits 0.056s, predicting 0.000s
Trial 85, Fold 2: Log loss = 0.36578762426480615, Average precision = 0.9499341777478105, ROC-AUC = 0.9485677437016323, Elapsed Time = 1.1277160000008735 seconds

Trial 85, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 85, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Binning 0.040 GB of training data: 0.157 s; binning 0.004 GB of validation data: 0.000 s
[per-round fitting log truncated: 36 rounds, val loss 0.67336 → 0.36705]
Fit 36 trees in 1.095 s, (2887 total leaves)
Time spent: computing histograms 0.287s, finding best splits 0.072s, applying splits 0.058s, predicting 0.000s
Trial 85, Fold 3: Log loss = 0.36100000465176585, Average precision = 0.9513218767088868, ROC-AUC = 0.9504091986015737, Elapsed Time = 1.107314300001235 seconds

Trial 85, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 85, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
Binning 0.040 GB of training data: 0.173 s; binning 0.004 GB of validation data: 0.000 s
[per-round fitting log truncated: 36 rounds, val loss 0.67281 → 0.35186]
Fit 36 trees in 1.158 s, (2799 total leaves)
Time spent: computing histograms 0.300s, finding best splits 0.076s, applying splits 0.059s, predicting 0.000s
Trial 85, Fold 4: Log loss = 0.3651511463110123, Average precision = 0.9530434339767936, ROC-AUC = 0.9493210840242161, Elapsed Time = 1.1653210000004037 seconds

Trial 85, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 85, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
Binning 0.040 GB of training data: 0.157 s; binning 0.004 GB of validation data: 0.016 s
[per-round fitting log truncated: 36 rounds, val loss 0.67264 → 0.35200]
Fit 36 trees in 1.142 s, (2874 total leaves)
Time spent: computing histograms 0.293s, finding best splits 0.072s, applying splits 0.058s, predicting 0.000s
Trial 85, Fold 5: Log loss = 0.36914487007096985, Average precision = 0.9498696078413703, ROC-AUC = 0.9474313211823941, Elapsed Time = 1.1547817999999097 seconds
Optimization Progress: 86%|########6 | 86/100 [17:55<03:16, 14.02s/it]
Trial 86, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 86, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Binning 0.040 GB of training data: 0.142 s; binning 0.004 GB of validation data: 0.000 s
[per-round fitting log truncated: 13 rounds, val loss 0.67715 → 0.54680]
Fit 13 trees in 0.361 s, (377 total leaves)
Time spent: computing histograms 0.051s, finding best splits 0.007s, applying splits 0.006s, predicting 0.000s
Trial 86, Fold 1: Log loss = 0.5523366704044703, Average precision = 0.8140478710455946, ROC-AUC = 0.8619023571739245, Elapsed Time = 0.3603768000011769 seconds

Trial 86, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 86, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Binning 0.040 GB of training data: 0.142 s; binning 0.004 GB of validation data: 0.016 s
[per-round fitting log truncated: 13 rounds, val loss 0.67702 → 0.54561]
Fit 13 trees in 0.376 s, (407 total leaves)
Time spent: computing histograms 0.046s, finding best splits 0.007s, applying splits 0.007s, predicting 0.000s
Trial 86, Fold 2: Log loss = 0.5521814942210168, Average precision = 0.8878644760545134, ROC-AUC = 0.9063182770192243, Elapsed Time = 0.37138969999978144 seconds

Trial 86, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 86, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Binning 0.040 GB of training data: 0.158 s; binning 0.004 GB of validation data: 0.000 s
[per-round fitting log truncated: 13 rounds, val loss 0.67739 → 0.54866]
Fit 13 trees in 0.377 s, (357 total leaves)
Time spent: computing histograms 0.057s, finding best splits 0.007s, applying splits 0.007s, predicting 0.000s
Trial 86, Fold 3: Log loss = 0.5495966806223601, Average precision = 0.8902275823415812, ROC-AUC = 0.9079231666120907, Elapsed Time = 0.38836579999951937 seconds

Trial 86, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 86, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
Binning 0.040 GB of training data: 0.142 s; binning 0.004 GB of validation data: 0.016 s
[per-round fitting log truncated: 13 rounds, val loss 0.67684 → 0.54326]
Fit 13 trees in 0.376 s, (378 total leaves)
Time spent: computing histograms 0.050s, finding best splits 0.007s, applying splits 0.007s, predicting 0.000s
Trial 86, Fold 4: Log loss = 0.5517180277152086, Average precision = 0.8204335615774272, ROC-AUC = 0.8677278482370867, Elapsed Time = 0.3881531000006362 seconds

Trial 86, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 86, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
Binning 0.040 GB of training data: 0.157 s; binning 0.004 GB of validation data: 0.000 s
[per-round fitting log truncated: 13 rounds, val loss 0.67656 → 0.54145]
Fit 13 trees in 0.376 s, (378 total leaves)
Time spent: computing histograms 0.059s, finding best splits 0.007s, applying splits 0.007s, predicting 0.000s
Trial 86, Fold 5: Log loss = 0.5551145966827552, Average precision = 0.8855353230294833, ROC-AUC = 0.9000196941827843, Elapsed Time = 0.39433289999942644 seconds
Optimization Progress: 87%|########7 | 87/100 [18:04<02:39, 12.30s/it]
Trial 87, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 87, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Binning 0.040 GB of training data: 0.143 s; binning 0.004 GB of validation data: 0.000 s
[per-round fitting log truncated: 52 rounds, val loss 0.68060 → 0.39643]
Fit 52 trees in 1.096 s, (2178 total leaves)
Time spent: computing histograms 0.341s, finding best splits 0.043s, applying splits 0.038s, predicting 0.000s
Trial 87, Fold 1: Log loss = 0.4021835494331139, Average precision = 0.9526312436607483, ROC-AUC = 0.9463635988873117, Elapsed Time = 1.0909823999991204 seconds

Trial 87, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 87, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Binning 0.040 GB of training data: 0.141 s; binning 0.004 GB of validation data: 0.000 s
[per-round fitting log truncated: 52 rounds, val loss 0.68068 → 0.38944]
Fit 52 trees in 1.141 s, (2254 total leaves)
Time spent: computing histograms 0.341s, finding best splits 0.047s, applying splits 0.042s, predicting 0.000s
Trial 87, Fold 2: Log loss = 0.396659992982574,
Average precision = 0.9505285581424341, ROC-AUC = 0.9472481449591964, Elapsed Time = 1.1540588999996544 seconds Trial 87, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 87, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 Binning 0.040 GB of training data: 0.157 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 40 leaves, max depth = 14, train loss: 0.68086, val loss: 0.68065, in 0.016s 1 tree, 40 leaves, max depth = 11, train loss: 0.66881, val loss: 0.66850, in 0.016s 1 tree, 50 leaves, max depth = 17, train loss: 0.65794, val loss: 0.65761, in 0.016s 1 tree, 34 leaves, max depth = 10, train loss: 0.64763, val loss: 0.64714, in 0.016s 1 tree, 42 leaves, max depth = 10, train loss: 0.63774, val loss: 0.63711, in 0.016s 1 tree, 43 leaves, max depth = 16, train loss: 0.62808, val loss: 0.62742, in 0.016s 1 tree, 46 leaves, max depth = 16, train loss: 0.61876, val loss: 0.61813, in 0.016s 1 tree, 58 leaves, max depth = 12, train loss: 0.60983, val loss: 0.60907, in 0.016s 1 tree, 55 leaves, max depth = 12, train loss: 0.60121, val loss: 0.60045, in 0.016s 1 tree, 51 leaves, max depth = 17, train loss: 0.59284, val loss: 0.59201, in 0.016s 1 tree, 60 leaves, max depth = 12, train loss: 0.58488, val loss: 0.58397, in 0.016s 1 tree, 38 leaves, max depth = 10, train loss: 0.57719, val loss: 0.57616, in 0.016s 1 tree, 49 leaves, max depth = 14, train loss: 0.56912, val loss: 0.56806, in 0.016s 1 tree, 44 leaves, max depth = 11, train loss: 0.56137, val loss: 0.56023, in 0.016s 1 tree, 48 leaves, max depth = 15, train loss: 0.55389, val loss: 0.55275, in 0.031s 1 tree, 40 leaves, max depth = 12, train loss: 0.54729, val loss: 0.54609, in 0.016s 1 tree, 56 leaves, max depth = 17, train loss: 0.54032, val loss: 0.53909, in 0.016s 1 tree, 49 leaves, max depth = 13, train loss: 0.53361, val loss: 0.53231, in 0.016s 1 tree, 45 leaves, max depth = 14, train loss: 0.52746, val 
loss: 0.52604, in 0.016s 1 tree, 57 leaves, max depth = 15, train loss: 0.52131, val loss: 0.51999, in 0.016s 1 tree, 49 leaves, max depth = 13, train loss: 0.51507, val loss: 0.51373, in 0.016s 1 tree, 45 leaves, max depth = 16, train loss: 0.50955, val loss: 0.50824, in 0.031s 1 tree, 49 leaves, max depth = 12, train loss: 0.50454, val loss: 0.50309, in 0.016s 1 tree, 48 leaves, max depth = 14, train loss: 0.49933, val loss: 0.49791, in 0.016s 1 tree, 58 leaves, max depth = 12, train loss: 0.49388, val loss: 0.49248, in 0.016s 1 tree, 50 leaves, max depth = 14, train loss: 0.48873, val loss: 0.48736, in 0.016s 1 tree, 49 leaves, max depth = 13, train loss: 0.48395, val loss: 0.48264, in 0.016s 1 tree, 56 leaves, max depth = 14, train loss: 0.47942, val loss: 0.47818, in 0.016s 1 tree, 45 leaves, max depth = 13, train loss: 0.47460, val loss: 0.47334, in 0.016s 1 tree, 6 leaves, max depth = 4, train loss: 0.47064, val loss: 0.46941, in 0.016s 1 tree, 49 leaves, max depth = 12, train loss: 0.46611, val loss: 0.46487, in 0.016s 1 tree, 45 leaves, max depth = 13, train loss: 0.46202, val loss: 0.46074, in 0.016s 1 tree, 54 leaves, max depth = 12, train loss: 0.45784, val loss: 0.45659, in 0.016s 1 tree, 46 leaves, max depth = 12, train loss: 0.45377, val loss: 0.45254, in 0.016s 1 tree, 41 leaves, max depth = 11, train loss: 0.45039, val loss: 0.44907, in 0.016s 1 tree, 51 leaves, max depth = 12, train loss: 0.44637, val loss: 0.44512, in 0.016s 1 tree, 23 leaves, max depth = 9, train loss: 0.44302, val loss: 0.44179, in 0.016s 1 tree, 46 leaves, max depth = 13, train loss: 0.43945, val loss: 0.43821, in 0.016s 1 tree, 26 leaves, max depth = 9, train loss: 0.43627, val loss: 0.43503, in 0.000s 1 tree, 45 leaves, max depth = 12, train loss: 0.43295, val loss: 0.43169, in 0.016s 1 tree, 41 leaves, max depth = 12, train loss: 0.43006, val loss: 0.42887, in 0.016s 1 tree, 44 leaves, max depth = 13, train loss: 0.42701, val loss: 0.42587, in 0.016s 1 tree, 53 leaves, max 
depth = 11, train loss: 0.42373, val loss: 0.42258, in 0.016s 1 tree, 24 leaves, max depth = 10, train loss: 0.41879, val loss: 0.41807, in 0.016s 1 tree, 45 leaves, max depth = 11, train loss: 0.41570, val loss: 0.41501, in 0.016s 1 tree, 10 leaves, max depth = 6, train loss: 0.41289, val loss: 0.41230, in 0.016s 1 tree, 50 leaves, max depth = 11, train loss: 0.41043, val loss: 0.40982, in 0.016s 1 tree, 48 leaves, max depth = 13, train loss: 0.40782, val loss: 0.40734, in 0.016s 1 tree, 65 leaves, max depth = 15, train loss: 0.40476, val loss: 0.40409, in 0.016s 1 tree, 11 leaves, max depth = 5, train loss: 0.40041, val loss: 0.40011, in 0.016s 1 tree, 49 leaves, max depth = 15, train loss: 0.39797, val loss: 0.39772, in 0.016s 1 tree, 46 leaves, max depth = 16, train loss: 0.39577, val loss: 0.39551, in 0.016s Fit 52 trees in 1.204 s, (2312 total leaves) Time spent computing histograms: 0.354s Time spent finding best splits: 0.049s Time spent applying splits: 0.044s Time spent predicting: 0.000s Trial 87, Fold 3: Log loss = 0.39246991409418885, Average precision = 0.9549524192568721, ROC-AUC = 0.9508018890071367, Elapsed Time = 1.2186887999996543 seconds Trial 87, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 87, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 Binning 0.040 GB of training data: 0.157 s 0.016 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 43 leaves, max depth = 12, train loss: 0.68110, val loss: 0.68057, in 0.016s 1 tree, 43 leaves, max depth = 11, train loss: 0.66909, val loss: 0.66809, in 0.016s 1 tree, 50 leaves, max depth = 12, train loss: 0.65825, val loss: 0.65675, in 0.016s 1 tree, 43 leaves, max depth = 11, train loss: 0.64793, val loss: 0.64594, in 0.016s 1 tree, 46 leaves, max depth = 16, train loss: 0.63791, val loss: 0.63538, in 0.016s 1 tree, 45 leaves, max depth = 15, train loss: 0.62835, val loss: 0.62525, in 0.016s 1 tree, 45 leaves, 
max depth = 13, train loss: 0.61912, val loss: 0.61549, in 0.016s 1 tree, 48 leaves, max depth = 12, train loss: 0.61031, val loss: 0.60631, in 0.016s 1 tree, 47 leaves, max depth = 13, train loss: 0.60174, val loss: 0.59731, in 0.016s 1 tree, 53 leaves, max depth = 10, train loss: 0.59361, val loss: 0.58888, in 0.031s 1 tree, 57 leaves, max depth = 12, train loss: 0.58505, val loss: 0.57983, in 0.016s 1 tree, 48 leaves, max depth = 11, train loss: 0.57758, val loss: 0.57202, in 0.016s 1 tree, 45 leaves, max depth = 15, train loss: 0.57013, val loss: 0.56425, in 0.016s 1 tree, 49 leaves, max depth = 13, train loss: 0.56239, val loss: 0.55616, in 0.016s 1 tree, 49 leaves, max depth = 15, train loss: 0.55474, val loss: 0.54810, in 0.016s 1 tree, 20 leaves, max depth = 8, train loss: 0.54831, val loss: 0.54125, in 0.016s 1 tree, 58 leaves, max depth = 13, train loss: 0.54135, val loss: 0.53385, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.53467, val loss: 0.52680, in 0.016s 1 tree, 47 leaves, max depth = 15, train loss: 0.52844, val loss: 0.52018, in 0.016s 1 tree, 47 leaves, max depth = 16, train loss: 0.52254, val loss: 0.51395, in 0.016s 1 tree, 49 leaves, max depth = 13, train loss: 0.51707, val loss: 0.50825, in 0.016s 1 tree, 44 leaves, max depth = 12, train loss: 0.51131, val loss: 0.50210, in 0.016s 1 tree, 46 leaves, max depth = 12, train loss: 0.50615, val loss: 0.49671, in 0.016s 1 tree, 46 leaves, max depth = 15, train loss: 0.50101, val loss: 0.49126, in 0.016s 1 tree, 54 leaves, max depth = 14, train loss: 0.49542, val loss: 0.48541, in 0.016s 1 tree, 34 leaves, max depth = 10, train loss: 0.49032, val loss: 0.48001, in 0.016s 1 tree, 58 leaves, max depth = 15, train loss: 0.48569, val loss: 0.47511, in 0.016s 1 tree, 52 leaves, max depth = 13, train loss: 0.48112, val loss: 0.47025, in 0.031s 1 tree, 45 leaves, max depth = 17, train loss: 0.47638, val loss: 0.46522, in 0.016s 1 tree, 6 leaves, max depth = 4, train loss: 0.47243, val loss: 
0.46100, in 0.016s 1 tree, 47 leaves, max depth = 12, train loss: 0.46792, val loss: 0.45622, in 0.016s 1 tree, 47 leaves, max depth = 12, train loss: 0.46382, val loss: 0.45186, in 0.016s 1 tree, 53 leaves, max depth = 13, train loss: 0.45961, val loss: 0.44741, in 0.016s 1 tree, 45 leaves, max depth = 11, train loss: 0.45582, val loss: 0.44340, in 0.016s 1 tree, 49 leaves, max depth = 13, train loss: 0.45193, val loss: 0.43928, in 0.016s 1 tree, 47 leaves, max depth = 15, train loss: 0.44799, val loss: 0.43508, in 0.031s 1 tree, 54 leaves, max depth = 14, train loss: 0.44445, val loss: 0.43136, in 0.016s 1 tree, 6 leaves, max depth = 3, train loss: 0.44097, val loss: 0.42765, in 0.016s 1 tree, 53 leaves, max depth = 12, train loss: 0.43724, val loss: 0.42374, in 0.016s 1 tree, 15 leaves, max depth = 11, train loss: 0.43239, val loss: 0.41882, in 0.016s 1 tree, 48 leaves, max depth = 14, train loss: 0.42914, val loss: 0.41529, in 0.016s 1 tree, 7 leaves, max depth = 5, train loss: 0.42637, val loss: 0.41232, in 0.000s 1 tree, 45 leaves, max depth = 13, train loss: 0.42349, val loss: 0.40922, in 0.016s 1 tree, 20 leaves, max depth = 9, train loss: 0.41867, val loss: 0.40435, in 0.031s 1 tree, 45 leaves, max depth = 11, train loss: 0.41559, val loss: 0.40110, in 0.016s 1 tree, 10 leaves, max depth = 6, train loss: 0.41286, val loss: 0.39815, in 0.016s 1 tree, 48 leaves, max depth = 13, train loss: 0.41035, val loss: 0.39550, in 0.016s 1 tree, 46 leaves, max depth = 12, train loss: 0.40789, val loss: 0.39282, in 0.016s 1 tree, 60 leaves, max depth = 10, train loss: 0.40522, val loss: 0.39005, in 0.016s 1 tree, 53 leaves, max depth = 16, train loss: 0.40210, val loss: 0.38692, in 0.016s 1 tree, 46 leaves, max depth = 15, train loss: 0.39983, val loss: 0.38445, in 0.016s 1 tree, 70 leaves, max depth = 14, train loss: 0.39681, val loss: 0.38137, in 0.016s Fit 52 trees in 1.220 s, (2272 total leaves) Time spent computing histograms: 0.367s Time spent finding best splits: 
0.050s Time spent applying splits: 0.045s Time spent predicting: 0.000s Trial 87, Fold 4: Log loss = 0.39664251896804825, Average precision = 0.9532641220458598, ROC-AUC = 0.9476337415442868, Elapsed Time = 1.2226099999988946 seconds Trial 87, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 87, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.158 s 0.016 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 43 leaves, max depth = 14, train loss: 0.68085, val loss: 0.68005, in 0.016s 1 tree, 42 leaves, max depth = 14, train loss: 0.66863, val loss: 0.66730, in 0.016s 1 tree, 46 leaves, max depth = 11, train loss: 0.65792, val loss: 0.65601, in 0.016s 1 tree, 41 leaves, max depth = 13, train loss: 0.64696, val loss: 0.64452, in 0.016s 1 tree, 49 leaves, max depth = 13, train loss: 0.63702, val loss: 0.63400, in 0.016s 1 tree, 45 leaves, max depth = 14, train loss: 0.62734, val loss: 0.62376, in 0.016s 1 tree, 46 leaves, max depth = 14, train loss: 0.61812, val loss: 0.61402, in 0.016s 1 tree, 49 leaves, max depth = 12, train loss: 0.60930, val loss: 0.60495, in 0.062s 1 tree, 51 leaves, max depth = 11, train loss: 0.60047, val loss: 0.59568, in 0.063s 1 tree, 48 leaves, max depth = 11, train loss: 0.59230, val loss: 0.58717, in 0.016s 1 tree, 33 leaves, max depth = 10, train loss: 0.58360, val loss: 0.57819, in 0.016s 1 tree, 25 leaves, max depth = 13, train loss: 0.57601, val loss: 0.57028, in 0.016s 1 tree, 52 leaves, max depth = 13, train loss: 0.56872, val loss: 0.56273, in 0.031s 1 tree, 43 leaves, max depth = 12, train loss: 0.56121, val loss: 0.55491, in 0.016s 1 tree, 13 leaves, max depth = 7, train loss: 0.55453, val loss: 0.54798, in 0.016s 1 tree, 28 leaves, max depth = 9, train loss: 0.54792, val loss: 0.54123, in 0.031s 1 tree, 28 leaves, max depth = 10, train loss: 0.54157, val loss: 0.53454, in 0.016s 1 tree, 48 leaves, max 
depth = 18, train loss: 0.53511, val loss: 0.52781, in 0.016s 1 tree, 45 leaves, max depth = 13, train loss: 0.52863, val loss: 0.52099, in 0.016s 1 tree, 43 leaves, max depth = 14, train loss: 0.52278, val loss: 0.51476, in 0.016s 1 tree, 52 leaves, max depth = 16, train loss: 0.51724, val loss: 0.50888, in 0.016s 1 tree, 42 leaves, max depth = 13, train loss: 0.51177, val loss: 0.50306, in 0.016s 1 tree, 46 leaves, max depth = 12, train loss: 0.50635, val loss: 0.49725, in 0.031s 1 tree, 50 leaves, max depth = 13, train loss: 0.50067, val loss: 0.49137, in 0.016s 1 tree, 45 leaves, max depth = 11, train loss: 0.49526, val loss: 0.48564, in 0.000s 1 tree, 62 leaves, max depth = 11, train loss: 0.48992, val loss: 0.48014, in 0.016s 1 tree, 38 leaves, max depth = 11, train loss: 0.48495, val loss: 0.47490, in 0.016s 1 tree, 43 leaves, max depth = 14, train loss: 0.48085, val loss: 0.47052, in 0.016s 1 tree, 46 leaves, max depth = 13, train loss: 0.47635, val loss: 0.46581, in 0.016s 1 tree, 50 leaves, max depth = 12, train loss: 0.47210, val loss: 0.46146, in 0.031s 1 tree, 50 leaves, max depth = 14, train loss: 0.46747, val loss: 0.45668, in 0.016s 1 tree, 7 leaves, max depth = 4, train loss: 0.46359, val loss: 0.45258, in 0.000s 1 tree, 9 leaves, max depth = 5, train loss: 0.45778, val loss: 0.44671, in 0.016s 1 tree, 39 leaves, max depth = 11, train loss: 0.45388, val loss: 0.44271, in 0.016s 1 tree, 8 leaves, max depth = 4, train loss: 0.44866, val loss: 0.43739, in 0.031s 1 tree, 50 leaves, max depth = 14, train loss: 0.44460, val loss: 0.43318, in 0.016s 1 tree, 51 leaves, max depth = 13, train loss: 0.44115, val loss: 0.42952, in 0.016s 1 tree, 43 leaves, max depth = 11, train loss: 0.43767, val loss: 0.42573, in 0.016s 1 tree, 50 leaves, max depth = 12, train loss: 0.43416, val loss: 0.42201, in 0.016s 1 tree, 45 leaves, max depth = 11, train loss: 0.43068, val loss: 0.41831, in 0.016s 1 tree, 44 leaves, max depth = 13, train loss: 0.42745, val loss: 
0.41488, in 0.016s 1 tree, 28 leaves, max depth = 9, train loss: 0.42453, val loss: 0.41193, in 0.016s 1 tree, 62 leaves, max depth = 11, train loss: 0.42143, val loss: 0.40860, in 0.031s 1 tree, 51 leaves, max depth = 11, train loss: 0.41812, val loss: 0.40519, in 0.016s 1 tree, 47 leaves, max depth = 13, train loss: 0.41515, val loss: 0.40205, in 0.016s 1 tree, 39 leaves, max depth = 10, train loss: 0.41265, val loss: 0.39948, in 0.016s 1 tree, 46 leaves, max depth = 12, train loss: 0.40966, val loss: 0.39636, in 0.031s 1 tree, 46 leaves, max depth = 12, train loss: 0.40714, val loss: 0.39377, in 0.016s 1 tree, 52 leaves, max depth = 16, train loss: 0.40455, val loss: 0.39110, in 0.031s 1 tree, 5 leaves, max depth = 3, train loss: 0.40208, val loss: 0.38846, in 0.000s 1 tree, 32 leaves, max depth = 8, train loss: 0.39999, val loss: 0.38622, in 0.016s 1 tree, 47 leaves, max depth = 14, train loss: 0.39760, val loss: 0.38375, in 0.031s Fit 52 trees in 1.377 s, (2143 total leaves) Time spent computing histograms: 0.435s Time spent finding best splits: 0.070s Time spent applying splits: 0.060s Time spent predicting: 0.016s Trial 87, Fold 5: Log loss = 0.4032658384987923, Average precision = 0.9493148596627419, ROC-AUC = 0.9450400014863534, Elapsed Time = 1.3754774999997608 seconds
Optimization Progress: 88%|########8 | 88/100 [18:16<02:29, 12.42s/it]
Trial 88, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 88, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[... binning and per-round fitting lines condensed ...]
Fit 12 trees in 0.486 s (740 total leaves)
Trial 88, Fold 1: Log loss = 0.49899187329362027, Average precision = 0.9206425128050846, ROC-AUC = 0.9312720458553791, Elapsed Time = 0.4972512000003917 seconds
Trial 88, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 88, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[... binning and per-round fitting lines condensed ...]
Fit 12 trees in 0.596 s (724 total leaves)
Trial 88, Fold 2: Log loss = 0.4988965466589309, Average precision = 0.914735125078572, ROC-AUC = 0.9307612791852722, Elapsed Time = 0.5944717000002129 seconds
Trial 88, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 88, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[... binning and per-round fitting lines condensed ...]
Fit 12 trees in 0.596 s (728 total leaves)
Trial 88, Fold 3: Log loss = 0.49575776300203145, Average precision = 0.9247529398996359, ROC-AUC = 0.9354105920919573, Elapsed Time = 0.59648659999948 seconds
Trial 88, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 88, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[... binning and per-round fitting lines condensed ...]
Fit 12 trees in 0.659 s (723 total leaves)
Trial 88, Fold 4: Log loss = 0.4991357854614416, Average precision = 0.9226061890411278, ROC-AUC = 0.9325586087922726, Elapsed Time = 0.669846900000266 seconds
Trial 88, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 88, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[... binning and per-round fitting lines condensed ...]
Fit 12 trees in 0.580 s (702 total leaves)
Trial 88, Fold 5: Log loss = 0.5017683253521471, Average precision = 0.9171203694323937, ROC-AUC = 0.9293585270237631, Elapsed Time = 0.5883169000007911 seconds
Optimization Progress: 89%|########9 | 89/100 [18:26<02:07, 11.58s/it]
Trial 89, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 89, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[... binning and per-round fitting lines condensed ...]
Fit 55 trees in 1.220 s (5283 total leaves)
Trial 89, Fold 1: Log loss = 0.2991288879401835, Average precision = 0.9549775656537438, ROC-AUC = 0.9511204850805122, Elapsed Time = 1.2206474000013259 seconds
Trial 89, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 89, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[... binning and per-round fitting lines condensed ...]
1 tree, 159
leaves, max depth = 23, train loss: 0.29394, val loss: 0.29403, in 0.031s 1 tree, 87 leaves, max depth = 15, train loss: 0.29276, val loss: 0.29300, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.29191, val loss: 0.29231, in 0.016s 1 tree, 127 leaves, max depth = 23, train loss: 0.29058, val loss: 0.29128, in 0.016s Fit 55 trees in 1.487 s, (5085 total leaves) Time spent computing histograms: 0.429s Time spent finding best splits: 0.119s Time spent applying splits: 0.145s Time spent predicting: 0.000s Trial 89, Fold 2: Log loss = 0.3012319816631841, Average precision = 0.9527567543609047, ROC-AUC = 0.950794187210029, Elapsed Time = 1.4833385000001726 seconds Trial 89, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 89, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 Binning 0.040 GB of training data: 0.206 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 77 leaves, max depth = 17, train loss: 0.64955, val loss: 0.64800, in 0.016s 1 tree, 89 leaves, max depth = 13, train loss: 0.61373, val loss: 0.61076, in 0.016s 1 tree, 86 leaves, max depth = 13, train loss: 0.58412, val loss: 0.57993, in 0.016s 1 tree, 77 leaves, max depth = 17, train loss: 0.55973, val loss: 0.55458, in 0.016s 1 tree, 126 leaves, max depth = 16, train loss: 0.53704, val loss: 0.53367, in 0.016s 1 tree, 89 leaves, max depth = 15, train loss: 0.51807, val loss: 0.51363, in 0.031s 1 tree, 126 leaves, max depth = 16, train loss: 0.50016, val loss: 0.49723, in 0.016s 1 tree, 76 leaves, max depth = 18, train loss: 0.48540, val loss: 0.48162, in 0.016s 1 tree, 88 leaves, max depth = 15, train loss: 0.47286, val loss: 0.46819, in 0.016s 1 tree, 125 leaves, max depth = 15, train loss: 0.45899, val loss: 0.45558, in 0.078s 1 tree, 125 leaves, max depth = 16, train loss: 0.44723, val loss: 0.44497, in 0.047s 1 tree, 76 leaves, max depth = 19, train loss: 0.43785, val loss: 0.43488, in 0.016s 1 
tree, 86 leaves, max depth = 14, train loss: 0.42952, val loss: 0.42596, in 0.031s 1 tree, 5 leaves, max depth = 3, train loss: 0.42267, val loss: 0.41961, in 0.000s 1 tree, 89 leaves, max depth = 15, train loss: 0.41608, val loss: 0.41252, in 0.031s 1 tree, 125 leaves, max depth = 14, train loss: 0.40679, val loss: 0.40423, in 0.031s 1 tree, 5 leaves, max depth = 3, train loss: 0.40121, val loss: 0.39906, in 0.000s 1 tree, 80 leaves, max depth = 15, train loss: 0.39577, val loss: 0.39338, in 0.031s 1 tree, 5 leaves, max depth = 3, train loss: 0.39115, val loss: 0.38909, in 0.000s 1 tree, 88 leaves, max depth = 14, train loss: 0.38642, val loss: 0.38416, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.38254, val loss: 0.38055, in 0.000s 1 tree, 196 leaves, max depth = 13, train loss: 0.37681, val loss: 0.37618, in 0.031s 1 tree, 126 leaves, max depth = 16, train loss: 0.36953, val loss: 0.36980, in 0.031s 1 tree, 88 leaves, max depth = 14, train loss: 0.36581, val loss: 0.36587, in 0.031s 1 tree, 5 leaves, max depth = 3, train loss: 0.36248, val loss: 0.36277, in 0.000s 1 tree, 126 leaves, max depth = 15, train loss: 0.35619, val loss: 0.35730, in 0.031s 1 tree, 125 leaves, max depth = 15, train loss: 0.35083, val loss: 0.35272, in 0.031s 1 tree, 77 leaves, max depth = 13, train loss: 0.34795, val loss: 0.34963, in 0.031s 1 tree, 126 leaves, max depth = 15, train loss: 0.34338, val loss: 0.34577, in 0.016s 1 tree, 125 leaves, max depth = 15, train loss: 0.33949, val loss: 0.34254, in 0.031s 1 tree, 112 leaves, max depth = 15, train loss: 0.33676, val loss: 0.33920, in 0.031s 1 tree, 5 leaves, max depth = 3, train loss: 0.33369, val loss: 0.33632, in 0.000s 1 tree, 88 leaves, max depth = 16, train loss: 0.33131, val loss: 0.33380, in 0.016s 1 tree, 126 leaves, max depth = 19, train loss: 0.32781, val loss: 0.33092, in 0.031s 1 tree, 112 leaves, max depth = 15, train loss: 0.32565, val loss: 0.32827, in 0.031s 1 tree, 145 leaves, max depth = 20, train loss: 
0.32301, val loss: 0.32680, in 0.031s 1 tree, 52 leaves, max depth = 13, train loss: 0.32126, val loss: 0.32503, in 0.016s 1 tree, 126 leaves, max depth = 17, train loss: 0.31847, val loss: 0.32274, in 0.031s 1 tree, 109 leaves, max depth = 25, train loss: 0.31665, val loss: 0.32035, in 0.031s 1 tree, 164 leaves, max depth = 16, train loss: 0.31439, val loss: 0.31881, in 0.031s 1 tree, 5 leaves, max depth = 3, train loss: 0.31154, val loss: 0.31613, in 0.016s 1 tree, 108 leaves, max depth = 16, train loss: 0.30971, val loss: 0.31396, in 0.031s 1 tree, 146 leaves, max depth = 23, train loss: 0.30761, val loss: 0.31301, in 0.031s 1 tree, 124 leaves, max depth = 20, train loss: 0.30546, val loss: 0.31129, in 0.031s 1 tree, 5 leaves, max depth = 3, train loss: 0.30291, val loss: 0.30890, in 0.016s 1 tree, 108 leaves, max depth = 16, train loss: 0.30128, val loss: 0.30695, in 0.031s 1 tree, 126 leaves, max depth = 18, train loss: 0.29934, val loss: 0.30552, in 0.031s 1 tree, 89 leaves, max depth = 16, train loss: 0.29786, val loss: 0.30388, in 0.016s 1 tree, 126 leaves, max depth = 20, train loss: 0.29616, val loss: 0.30259, in 0.031s 1 tree, 52 leaves, max depth = 12, train loss: 0.29505, val loss: 0.30145, in 0.016s 1 tree, 109 leaves, max depth = 23, train loss: 0.29383, val loss: 0.29982, in 0.031s 1 tree, 124 leaves, max depth = 20, train loss: 0.29234, val loss: 0.29869, in 0.031s 1 tree, 150 leaves, max depth = 18, train loss: 0.29072, val loss: 0.29763, in 0.031s 1 tree, 89 leaves, max depth = 13, train loss: 0.28960, val loss: 0.29650, in 0.016s 1 tree, 126 leaves, max depth = 18, train loss: 0.28829, val loss: 0.29551, in 0.031s Fit 55 trees in 1.706 s, (5168 total leaves) Time spent computing histograms: 0.495s Time spent finding best splits: 0.153s Time spent applying splits: 0.181s Time spent predicting: 0.000s Trial 89, Fold 3: Log loss = 0.2924300948581894, Average precision = 0.9579560866011733, ROC-AUC = 0.9552850605792104, Elapsed Time = 
1.7165589000014734 seconds Trial 89, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 89, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 0.205 s 0.040 GB of training data: 0.016 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 81 leaves, max depth = 15, train loss: 0.64942, val loss: 0.64717, in 0.016s 1 tree, 89 leaves, max depth = 16, train loss: 0.61341, val loss: 0.60887, in 0.016s 1 tree, 91 leaves, max depth = 16, train loss: 0.58364, val loss: 0.57720, in 0.031s 1 tree, 80 leaves, max depth = 15, train loss: 0.55910, val loss: 0.55094, in 0.016s 1 tree, 131 leaves, max depth = 16, train loss: 0.53671, val loss: 0.52931, in 0.016s 1 tree, 89 leaves, max depth = 16, train loss: 0.51759, val loss: 0.50852, in 0.016s 1 tree, 131 leaves, max depth = 17, train loss: 0.49994, val loss: 0.49155, in 0.016s 1 tree, 85 leaves, max depth = 16, train loss: 0.48508, val loss: 0.47527, in 0.016s 1 tree, 89 leaves, max depth = 14, train loss: 0.47242, val loss: 0.46123, in 0.031s 1 tree, 131 leaves, max depth = 17, train loss: 0.45877, val loss: 0.44821, in 0.016s 1 tree, 131 leaves, max depth = 17, train loss: 0.44719, val loss: 0.43720, in 0.031s 1 tree, 85 leaves, max depth = 18, train loss: 0.43773, val loss: 0.42657, in 0.016s 1 tree, 89 leaves, max depth = 15, train loss: 0.42934, val loss: 0.41727, in 0.016s 1 tree, 88 leaves, max depth = 15, train loss: 0.42212, val loss: 0.40904, in 0.016s 1 tree, 87 leaves, max depth = 17, train loss: 0.41626, val loss: 0.40227, in 0.016s 1 tree, 131 leaves, max depth = 15, train loss: 0.40722, val loss: 0.39380, in 0.031s 1 tree, 4 leaves, max depth = 3, train loss: 0.40108, val loss: 0.38718, in 0.000s 1 tree, 91 leaves, max depth = 15, train loss: 0.39647, val loss: 0.38197, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.39140, val loss: 0.37650, in 0.016s 1 tree, 88 leaves, max depth = 14, train loss: 0.38740, val loss: 
0.37186, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.38318, val loss: 0.36730, in 0.016s 1 tree, 195 leaves, max depth = 15, train loss: 0.37761, val loss: 0.36273, in 0.016s 1 tree, 130 leaves, max depth = 16, train loss: 0.37053, val loss: 0.35630, in 0.031s 1 tree, 88 leaves, max depth = 13, train loss: 0.36741, val loss: 0.35267, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.36381, val loss: 0.34877, in 0.000s 1 tree, 132 leaves, max depth = 14, train loss: 0.35766, val loss: 0.34327, in 0.031s 1 tree, 131 leaves, max depth = 14, train loss: 0.35244, val loss: 0.33861, in 0.031s 1 tree, 53 leaves, max depth = 11, train loss: 0.34997, val loss: 0.33613, in 0.000s 1 tree, 132 leaves, max depth = 16, train loss: 0.34555, val loss: 0.33222, in 0.031s 1 tree, 131 leaves, max depth = 16, train loss: 0.34180, val loss: 0.32892, in 0.016s 1 tree, 114 leaves, max depth = 16, train loss: 0.33919, val loss: 0.32622, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.33592, val loss: 0.32265, in 0.016s 1 tree, 114 leaves, max depth = 16, train loss: 0.33375, val loss: 0.32044, in 0.016s 1 tree, 130 leaves, max depth = 16, train loss: 0.33033, val loss: 0.31749, in 0.031s 1 tree, 83 leaves, max depth = 13, train loss: 0.32829, val loss: 0.31527, in 0.016s 1 tree, 145 leaves, max depth = 22, train loss: 0.32581, val loss: 0.31335, in 0.016s 1 tree, 164 leaves, max depth = 23, train loss: 0.32333, val loss: 0.31161, in 0.032s 1 tree, 132 leaves, max depth = 17, train loss: 0.32085, val loss: 0.30953, in 0.031s 1 tree, 113 leaves, max depth = 25, train loss: 0.31889, val loss: 0.30756, in 0.031s 1 tree, 105 leaves, max depth = 15, train loss: 0.31708, val loss: 0.30577, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.31406, val loss: 0.30247, in 0.000s 1 tree, 53 leaves, max depth = 11, train loss: 0.31269, val loss: 0.30116, in 0.016s 1 tree, 145 leaves, max depth = 23, train loss: 0.31071, val loss: 0.29969, in 0.031s 1 tree, 130 
leaves, max depth = 16, train loss: 0.30859, val loss: 0.29794, in 0.016s 1 tree, 4 leaves, max depth = 3, train loss: 0.30590, val loss: 0.29499, in 0.016s 1 tree, 105 leaves, max depth = 17, train loss: 0.30436, val loss: 0.29356, in 0.016s 1 tree, 131 leaves, max depth = 17, train loss: 0.30244, val loss: 0.29199, in 0.031s 1 tree, 107 leaves, max depth = 17, train loss: 0.30108, val loss: 0.29093, in 0.016s 1 tree, 132 leaves, max depth = 17, train loss: 0.29938, val loss: 0.28958, in 0.016s 1 tree, 152 leaves, max depth = 20, train loss: 0.29757, val loss: 0.28869, in 0.016s 1 tree, 53 leaves, max depth = 12, train loss: 0.29650, val loss: 0.28769, in 0.016s 1 tree, 131 leaves, max depth = 17, train loss: 0.29506, val loss: 0.28656, in 0.016s 1 tree, 87 leaves, max depth = 15, train loss: 0.29390, val loss: 0.28527, in 0.031s 1 tree, 110 leaves, max depth = 16, train loss: 0.29196, val loss: 0.28370, in 0.016s 1 tree, 105 leaves, max depth = 18, train loss: 0.29079, val loss: 0.28261, in 0.031s Fit 55 trees in 1.440 s, (5348 total leaves) Time spent computing histograms: 0.405s Time spent finding best splits: 0.114s Time spent applying splits: 0.135s Time spent predicting: 0.000s Trial 89, Fold 4: Log loss = 0.2984179934391109, Average precision = 0.9564092380626489, ROC-AUC = 0.9529589482130042, Elapsed Time = 1.4476890999994794 seconds Trial 89, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 89, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.189 s 0.016 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 79 leaves, max depth = 15, train loss: 0.64887, val loss: 0.64617, in 0.016s 1 tree, 77 leaves, max depth = 15, train loss: 0.61270, val loss: 0.60729, in 0.016s 1 tree, 86 leaves, max depth = 16, train loss: 0.58265, val loss: 0.57491, in 0.016s 1 tree, 74 leaves, max depth = 14, train loss: 0.55783, val loss: 0.54778, in 0.016s 
1 tree, 127 leaves, max depth = 20, train loss: 0.53545, val loss: 0.52651, in 0.031s 1 tree, 77 leaves, max depth = 15, train loss: 0.51620, val loss: 0.50536, in 0.031s 1 tree, 127 leaves, max depth = 18, train loss: 0.49852, val loss: 0.48872, in 0.016s 1 tree, 76 leaves, max depth = 15, train loss: 0.48349, val loss: 0.47216, in 0.016s 1 tree, 85 leaves, max depth = 15, train loss: 0.47065, val loss: 0.45794, in 0.016s 1 tree, 127 leaves, max depth = 16, train loss: 0.45695, val loss: 0.44527, in 0.016s 1 tree, 127 leaves, max depth = 17, train loss: 0.44533, val loss: 0.43459, in 0.031s 1 tree, 76 leaves, max depth = 15, train loss: 0.43576, val loss: 0.42380, in 0.016s 1 tree, 73 leaves, max depth = 16, train loss: 0.42766, val loss: 0.41459, in 0.016s 1 tree, 87 leaves, max depth = 15, train loss: 0.42034, val loss: 0.40623, in 0.016s 1 tree, 88 leaves, max depth = 17, train loss: 0.41437, val loss: 0.39923, in 0.016s 1 tree, 126 leaves, max depth = 19, train loss: 0.40521, val loss: 0.39111, in 0.031s 1 tree, 5 leaves, max depth = 3, train loss: 0.39901, val loss: 0.38516, in 0.000s 1 tree, 86 leaves, max depth = 15, train loss: 0.39435, val loss: 0.37976, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.38923, val loss: 0.37488, in 0.016s 1 tree, 87 leaves, max depth = 15, train loss: 0.38517, val loss: 0.37011, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.38090, val loss: 0.36607, in 0.000s 1 tree, 186 leaves, max depth = 13, train loss: 0.37521, val loss: 0.36211, in 0.031s 1 tree, 126 leaves, max depth = 19, train loss: 0.36803, val loss: 0.35594, in 0.016s 1 tree, 87 leaves, max depth = 14, train loss: 0.36484, val loss: 0.35216, in 0.031s 1 tree, 5 leaves, max depth = 3, train loss: 0.36121, val loss: 0.34873, in 0.000s 1 tree, 126 leaves, max depth = 17, train loss: 0.35498, val loss: 0.34345, in 0.031s 1 tree, 127 leaves, max depth = 19, train loss: 0.34968, val loss: 0.33902, in 0.016s 1 tree, 107 leaves, max depth = 16, train 
loss: 0.34672, val loss: 0.33593, in 0.016s 1 tree, 126 leaves, max depth = 18, train loss: 0.34217, val loss: 0.33217, in 0.031s 1 tree, 127 leaves, max depth = 17, train loss: 0.33831, val loss: 0.32899, in 0.016s 1 tree, 114 leaves, max depth = 16, train loss: 0.33576, val loss: 0.32640, in 0.031s 1 tree, 5 leaves, max depth = 3, train loss: 0.33244, val loss: 0.32323, in 0.000s 1 tree, 114 leaves, max depth = 14, train loss: 0.33032, val loss: 0.32110, in 0.016s 1 tree, 126 leaves, max depth = 20, train loss: 0.32682, val loss: 0.31826, in 0.031s 1 tree, 81 leaves, max depth = 13, train loss: 0.32475, val loss: 0.31589, in 0.016s 1 tree, 137 leaves, max depth = 26, train loss: 0.32233, val loss: 0.31414, in 0.031s 1 tree, 166 leaves, max depth = 16, train loss: 0.31982, val loss: 0.31227, in 0.016s 1 tree, 126 leaves, max depth = 22, train loss: 0.31729, val loss: 0.31027, in 0.016s 1 tree, 111 leaves, max depth = 16, train loss: 0.31540, val loss: 0.30833, in 0.016s 1 tree, 107 leaves, max depth = 17, train loss: 0.31365, val loss: 0.30654, in 0.031s 1 tree, 5 leaves, max depth = 3, train loss: 0.31058, val loss: 0.30358, in 0.000s 1 tree, 51 leaves, max depth = 12, train loss: 0.30922, val loss: 0.30198, in 0.016s 1 tree, 107 leaves, max depth = 14, train loss: 0.30673, val loss: 0.30020, in 0.031s 1 tree, 126 leaves, max depth = 23, train loss: 0.30460, val loss: 0.29857, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.30188, val loss: 0.29593, in 0.000s 1 tree, 107 leaves, max depth = 16, train loss: 0.30033, val loss: 0.29440, in 0.031s 1 tree, 126 leaves, max depth = 24, train loss: 0.29840, val loss: 0.29294, in 0.016s 1 tree, 81 leaves, max depth = 14, train loss: 0.29704, val loss: 0.29145, in 0.031s 1 tree, 127 leaves, max depth = 24, train loss: 0.29536, val loss: 0.29022, in 0.016s 1 tree, 159 leaves, max depth = 19, train loss: 0.29353, val loss: 0.28964, in 0.031s 1 tree, 111 leaves, max depth = 17, train loss: 0.29233, val loss: 0.28836, 
in 0.016s 1 tree, 126 leaves, max depth = 24, train loss: 0.29088, val loss: 0.28729, in 0.031s 1 tree, 89 leaves, max depth = 15, train loss: 0.28972, val loss: 0.28587, in 0.016s 1 tree, 106 leaves, max depth = 15, train loss: 0.28803, val loss: 0.28481, in 0.016s 1 tree, 107 leaves, max depth = 18, train loss: 0.28686, val loss: 0.28366, in 0.016s Fit 55 trees in 1.424 s, (5237 total leaves) Time spent computing histograms: 0.414s Time spent finding best splits: 0.117s Time spent applying splits: 0.142s Time spent predicting: 0.016s Trial 89, Fold 5: Log loss = 0.3030563569382536, Average precision = 0.9536670025897026, ROC-AUC = 0.950168441001059, Elapsed Time = 1.432476100000713 seconds
Optimization Progress: 90%|######### | 90/100 [18:41<02:07, 12.76s/it]
Trial 90, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.03978
Trial 90, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.00348
Trial 90, Fold 1: Fit 49 trees in 1.126 s (833 total leaves)
Trial 90, Fold 1: Log loss = 0.30327, Average precision = 0.95866, ROC-AUC = 0.95395, Elapsed Time = 1.12 seconds
Trial 90, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.02356
Trial 90, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.06887
Trial 90, Fold 2: Fit 49 trees in 1.127 s (833 total leaves)
Trial 90, Fold 2: Log loss = 0.30164, Average precision = 0.95635, ROC-AUC = 0.95468, Elapsed Time = 1.13 seconds
Trial 90, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.03463
Trial 90, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.02355
Trial 90, Fold 3: Fit 49 trees in 1.205 s (833 total leaves)
Trial 90, Fold 3: Log loss = 0.29727, Average precision = 0.95946, ROC-AUC = 0.95679, Elapsed Time = 1.22 seconds
Trial 90, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.02967
Trial 90, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.04338
0.45285, in 0.016s 1 tree, 17 leaves, max depth = 6, train loss: 0.45595, val loss: 0.44488, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.44881, val loss: 0.43730, in 0.016s 1 tree, 17 leaves, max depth = 6, train loss: 0.43875, val loss: 0.42708, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.43233, val loss: 0.42028, in 0.016s 1 tree, 17 leaves, max depth = 6, train loss: 0.42319, val loss: 0.41099, in 0.016s 1 tree, 17 leaves, max depth = 6, train loss: 0.41733, val loss: 0.40477, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.40905, val loss: 0.39637, in 0.016s 1 tree, 17 leaves, max depth = 8, train loss: 0.40383, val loss: 0.39080, in 0.016s 1 tree, 17 leaves, max depth = 6, train loss: 0.39629, val loss: 0.38315, in 0.016s 1 tree, 17 leaves, max depth = 8, train loss: 0.39156, val loss: 0.37817, in 0.016s 1 tree, 17 leaves, max depth = 6, train loss: 0.38460, val loss: 0.37110, in 0.016s 1 tree, 17 leaves, max depth = 8, train loss: 0.38029, val loss: 0.36653, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.37391, val loss: 0.36008, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.36993, val loss: 0.35577, in 0.016s 1 tree, 17 leaves, max depth = 6, train loss: 0.36413, val loss: 0.34992, in 0.016s 1 tree, 17 leaves, max depth = 6, train loss: 0.35865, val loss: 0.34435, in 0.016s 1 tree, 17 leaves, max depth = 8, train loss: 0.35526, val loss: 0.34067, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.35041, val loss: 0.33561, in 0.000s 1 tree, 17 leaves, max depth = 8, train loss: 0.34587, val loss: 0.33088, in 0.016s 1 tree, 17 leaves, max depth = 6, train loss: 0.34160, val loss: 0.32640, in 0.016s 1 tree, 17 leaves, max depth = 6, train loss: 0.33741, val loss: 0.32246, in 0.016s 1 tree, 17 leaves, max depth = 5, train loss: 0.33369, val loss: 0.31862, in 0.016s 1 tree, 17 leaves, max depth = 8, train loss: 0.32994, val loss: 0.31470, in 0.016s 1 tree, 17 leaves, max depth = 5, train loss: 
0.32649, val loss: 0.31116, in 0.016s 1 tree, 17 leaves, max depth = 6, train loss: 0.32285, val loss: 0.30775, in 0.016s 1 tree, 17 leaves, max depth = 5, train loss: 0.31965, val loss: 0.30446, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.31642, val loss: 0.30110, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.31319, val loss: 0.29810, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.31028, val loss: 0.29505, in 0.016s 1 tree, 17 leaves, max depth = 6, train loss: 0.30741, val loss: 0.29209, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.30453, val loss: 0.28942, in 0.016s 1 tree, 17 leaves, max depth = 8, train loss: 0.30178, val loss: 0.28687, in 0.016s Fit 49 trees in 1.127 s, (833 total leaves) Time spent computing histograms: 0.335s Time spent finding best splits: 0.056s Time spent applying splits: 0.022s Time spent predicting: 0.000s Trial 90, Fold 4: Log loss = 0.30193565745199424, Average precision = 0.9581189580828368, ROC-AUC = 0.9541854289910847, Elapsed Time = 1.1225206999988586 seconds Trial 90, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 90, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.174 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 17 leaves, max depth = 7, train loss: 0.67063, val loss: 0.66979, in 0.000s 1 tree, 17 leaves, max depth = 7, train loss: 0.64993, val loss: 0.64834, in 0.031s 1 tree, 17 leaves, max depth = 7, train loss: 0.63076, val loss: 0.62843, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.61268, val loss: 0.60966, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.59584, val loss: 0.59217, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.58014, val loss: 0.57586, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.56547, val loss: 0.56059, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.55175, 
val loss: 0.54632, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.53889, val loss: 0.53292, in 0.016s 1 tree, 17 leaves, max depth = 6, train loss: 0.52688, val loss: 0.52042, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.51555, val loss: 0.50867, in 0.016s 1 tree, 17 leaves, max depth = 6, train loss: 0.50490, val loss: 0.49754, in 0.016s 1 tree, 17 leaves, max depth = 6, train loss: 0.49488, val loss: 0.48707, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.48545, val loss: 0.47724, in 0.016s 1 tree, 17 leaves, max depth = 6, train loss: 0.47652, val loss: 0.46794, in 0.016s 1 tree, 17 leaves, max depth = 6, train loss: 0.46810, val loss: 0.45917, in 0.016s 1 tree, 17 leaves, max depth = 6, train loss: 0.46016, val loss: 0.45091, in 0.016s 1 tree, 17 leaves, max depth = 6, train loss: 0.45266, val loss: 0.44310, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.44557, val loss: 0.43574, in 0.016s 1 tree, 17 leaves, max depth = 6, train loss: 0.43546, val loss: 0.42563, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.42905, val loss: 0.41897, in 0.016s 1 tree, 17 leaves, max depth = 6, train loss: 0.41986, val loss: 0.40980, in 0.016s 1 tree, 17 leaves, max depth = 8, train loss: 0.41403, val loss: 0.40378, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.40567, val loss: 0.39548, in 0.016s 1 tree, 17 leaves, max depth = 6, train loss: 0.40039, val loss: 0.38998, in 0.016s 1 tree, 17 leaves, max depth = 6, train loss: 0.39274, val loss: 0.38237, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.38792, val loss: 0.37741, in 0.031s 1 tree, 17 leaves, max depth = 6, train loss: 0.38093, val loss: 0.37045, in 0.016s 1 tree, 17 leaves, max depth = 8, train loss: 0.37655, val loss: 0.36591, in 0.016s 1 tree, 17 leaves, max depth = 8, train loss: 0.37237, val loss: 0.36162, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.36611, val loss: 0.35542, in 0.016s 1 tree, 17 leaves, max depth = 7, 
train loss: 0.36023, val loss: 0.34959, in 0.016s 1 tree, 17 leaves, max depth = 8, train loss: 0.35654, val loss: 0.34584, in 0.016s 1 tree, 17 leaves, max depth = 6, train loss: 0.35124, val loss: 0.34050, in 0.016s 1 tree, 17 leaves, max depth = 8, train loss: 0.34640, val loss: 0.33558, in 0.016s 1 tree, 17 leaves, max depth = 6, train loss: 0.34187, val loss: 0.33096, in 0.016s 1 tree, 17 leaves, max depth = 6, train loss: 0.33797, val loss: 0.32712, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.33378, val loss: 0.32286, in 0.016s 1 tree, 17 leaves, max depth = 6, train loss: 0.33015, val loss: 0.31931, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.32616, val loss: 0.31571, in 0.016s 1 tree, 17 leaves, max depth = 6, train loss: 0.32248, val loss: 0.31199, in 0.016s 1 tree, 17 leaves, max depth = 6, train loss: 0.31919, val loss: 0.30877, in 0.016s 1 tree, 17 leaves, max depth = 8, train loss: 0.31593, val loss: 0.30546, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.31245, val loss: 0.30234, in 0.016s 1 tree, 17 leaves, max depth = 5, train loss: 0.30929, val loss: 0.29914, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.30617, val loss: 0.29640, in 0.016s 1 tree, 17 leaves, max depth = 6, train loss: 0.30331, val loss: 0.29342, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.30043, val loss: 0.29080, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.29783, val loss: 0.28815, in 0.016s Fit 49 trees in 1.143 s, (833 total leaves) Time spent computing histograms: 0.346s Time spent finding best splits: 0.060s Time spent applying splits: 0.022s Time spent predicting: 0.000s Trial 90, Fold 5: Log loss = 0.3066667178325203, Average precision = 0.9558979910376308, ROC-AUC = 0.9514793303977854, Elapsed Time = 1.152712999999494 seconds
Optimization Progress: 91%|#########1| 91/100 [18:55<01:55, 12.88s/it]
Trial 91, Fold 1: Train size = 20663 (0 = 10533, 1 = 10130); Validation size = 5175 (0 = 2592, 1 = 2583)

[Verbose per-round fitting log omitted: each Trial 91 fold fits 36 trees of 5–47 leaves (max depths 3–17), with validation loss falling from ~0.67 to ~0.39.]

Trial 91, Fold 1: Fit 36 trees (957 total leaves) in 0.502 s; Log loss = 0.40883, Average precision = 0.91611, ROC-AUC = 0.92312, Elapsed Time = 0.514 s
Trial 91, Fold 2: Train size = 20701 (0 = 10471, 1 = 10230); Validation size = 5137 (0 = 2654, 1 = 2483)
Trial 91, Fold 2: Fit 36 trees (975 total leaves) in 0.534 s; Log loss = 0.41046, Average precision = 0.91386, ROC-AUC = 0.92662, Elapsed Time = 0.532 s
Trial 91, Fold 3: Train size = 20682 (0 = 10517, 1 = 10165); Validation size = 5156 (0 = 2608, 1 = 2548)
Trial 91, Fold 3: Fit 36 trees (953 total leaves) in 0.549 s; Log loss = 0.40376, Average precision = 0.92060, ROC-AUC = 0.93140, Elapsed Time = 0.557 s
Trial 91, Fold 4: Train size = 20656 (0 = 10479, 1 = 10177); Validation size = 5182 (0 = 2646, 1 = 2536)
Trial 91, Fold 4: Fit 36 trees (987 total leaves) in 0.564 s; Log loss = 0.41067, Average precision = 0.91369, ROC-AUC = 0.92269, Elapsed Time = 0.581 s
Trial 91, Fold 5: Train size = 20650 (0 = 10500, 1 = 10150); Validation size = 5188 (0 = 2625, 1 = 2563)
Trial 91, Fold 5: Fit 36 trees (973 total leaves) in 0.596 s; Log loss = 0.41672, Average precision = 0.91448, ROC-AUC = 0.92368, Elapsed Time = 0.599 s
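Each trial's objective value aggregates its five fold scores. As a quick sanity check, the trial-level means can be recomputed from the Trial 91 fold metrics in the log (plain Python; whether the notebook's objective uses the plain mean or some other aggregate is not visible in this excerpt):

```python
from statistics import mean

# Fold-level metrics copied from the Trial 91 log lines above.
log_losses     = [0.40883, 0.41046, 0.40376, 0.41067, 0.41672]
avg_precisions = [0.91611, 0.91386, 0.92060, 0.91369, 0.91448]
roc_aucs       = [0.92312, 0.92662, 0.93140, 0.92269, 0.92368]

print(f"Trial 91 mean log loss:          {mean(log_losses):.5f}")
print(f"Trial 91 mean average precision: {mean(avg_precisions):.5f}")
print(f"Trial 91 mean ROC-AUC:           {mean(roc_aucs):.5f}")
```

Trial 91's mean log loss (~0.410) is clearly worse than Trial 90's fold losses (~0.30), so under a loss-minimizing objective the sampler would rank Trial 90's configuration higher.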
Optimization Progress: 92%|#########2| 92/100 [19:04<01:34, 11.76s/it]
Trial 92, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0398 | Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0035
Trial 92, Fold 1: Fit 21 trees (304 total leaves) in 0.470 s | Log loss = 0.4714, Average precision = 0.9164, ROC-AUC = 0.9268, Elapsed Time = 0.472 seconds
Trial 92, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0236 | Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0689
Trial 92, Fold 2: Fit 21 trees (295 total leaves) in 0.486 s | Log loss = 0.4728, Average precision = 0.9124, ROC-AUC = 0.9279, Elapsed Time = 0.500 seconds
Trial 92, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.0346 | Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235
Trial 92, Fold 3: Fit 21 trees (296 total leaves) in 0.486 s | Log loss = 0.4696, Average precision = 0.9200, ROC-AUC = 0.9318, Elapsed Time = 0.496 seconds
Trial 92, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0297 | Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0434
Trial 92, Fold 4: Fit 21 trees (320 total leaves) in 0.516 s | Log loss = 0.4709, Average precision = 0.9172, ROC-AUC = 0.9289, Elapsed Time = 0.519 seconds
Trial 92, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0345 | Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0242
Trial 92, Fold 5: Fit 21 trees (357 total leaves) in 0.454 s | Log loss = 0.4766, Average precision = 0.9146, ROC-AUC = 0.9262, Elapsed Time = 0.466 seconds
[Per-iteration boosting output (per-round tree size, depth, train/val loss, binning and split timings) omitted.]
Optimization Progress: 93%|#########3| 93/100 [19:13<01:17, 11.05s/it]
Trial 93, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0398 | Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0035
Trial 93, Fold 1: Fit 85 trees (17472 total leaves) in 2.595 s | Log loss = 0.2854, Average precision = 0.9620, ROC-AUC = 0.9569, Elapsed Time = 2.598 seconds
Trial 93, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0236 | Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0689
Trial 93, Fold 2: Fit 85 trees (17901 total leaves) in 2.955 s | Log loss = 0.2837, Average precision = 0.9613, ROC-AUC = 0.9583, Elapsed Time = 2.977 seconds
Trial 93, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.0346 | Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235
Trial 93, Fold 3: Fit 85 trees (17271 total leaves) in 2.800 s | Log loss = 0.2808, Average precision = 0.9617, ROC-AUC = 0.9583, Elapsed Time = 2.800 seconds
[Per-iteration boosting output (per-round tree size, depth, train/val loss, binning and split timings) omitted.]
Trial 93, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 93, Fold 4: Validation size = 5182 where 0 = 2646,
1 = 2536, 0/1 = 1.0433753943217665 Binning 0.040 GB of training data: 0.158 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 132 leaves, max depth = 14, train loss: 0.67813, val loss: 0.67759, in 0.016s 1 tree, 220 leaves, max depth = 18, train loss: 0.66367, val loss: 0.66276, in 0.031s 1 tree, 222 leaves, max depth = 17, train loss: 0.64993, val loss: 0.64865, in 0.031s 1 tree, 248 leaves, max depth = 18, train loss: 0.63756, val loss: 0.63597, in 0.016s 1 tree, 248 leaves, max depth = 18, train loss: 0.62585, val loss: 0.62386, in 0.031s 1 tree, 248 leaves, max depth = 17, train loss: 0.61386, val loss: 0.61152, in 0.031s 1 tree, 248 leaves, max depth = 16, train loss: 0.60227, val loss: 0.59964, in 0.031s 1 tree, 248 leaves, max depth = 19, train loss: 0.59126, val loss: 0.58832, in 0.047s 1 tree, 248 leaves, max depth = 17, train loss: 0.58094, val loss: 0.57756, in 0.016s 1 tree, 248 leaves, max depth = 16, train loss: 0.57097, val loss: 0.56727, in 0.031s 1 tree, 248 leaves, max depth = 17, train loss: 0.56155, val loss: 0.55747, in 0.031s 1 tree, 248 leaves, max depth = 16, train loss: 0.55239, val loss: 0.54796, in 0.031s 1 tree, 248 leaves, max depth = 16, train loss: 0.54347, val loss: 0.53879, in 0.031s 1 tree, 186 leaves, max depth = 16, train loss: 0.53517, val loss: 0.53014, in 0.031s 1 tree, 248 leaves, max depth = 16, train loss: 0.52699, val loss: 0.52175, in 0.031s 1 tree, 248 leaves, max depth = 16, train loss: 0.51913, val loss: 0.51366, in 0.016s 1 tree, 211 leaves, max depth = 16, train loss: 0.51180, val loss: 0.50603, in 0.031s 1 tree, 248 leaves, max depth = 16, train loss: 0.50456, val loss: 0.49856, in 0.031s 1 tree, 182 leaves, max depth = 16, train loss: 0.49783, val loss: 0.49156, in 0.031s 1 tree, 248 leaves, max depth = 17, train loss: 0.49113, val loss: 0.48471, in 0.031s 1 tree, 212 leaves, max depth = 21, train loss: 0.48527, val loss: 0.47863, in 0.016s 1 tree, 248 leaves, max depth = 16, train 
loss: 0.47913, val loss: 0.47226, in 0.016s 1 tree, 248 leaves, max depth = 17, train loss: 0.47315, val loss: 0.46614, in 0.047s 1 tree, 248 leaves, max depth = 16, train loss: 0.46741, val loss: 0.46022, in 0.016s 1 tree, 248 leaves, max depth = 16, train loss: 0.46186, val loss: 0.45456, in 0.016s 1 tree, 248 leaves, max depth = 17, train loss: 0.45653, val loss: 0.44911, in 0.031s 1 tree, 248 leaves, max depth = 16, train loss: 0.45140, val loss: 0.44386, in 0.016s 1 tree, 239 leaves, max depth = 19, train loss: 0.44439, val loss: 0.43676, in 0.047s 1 tree, 194 leaves, max depth = 16, train loss: 0.43765, val loss: 0.42998, in 0.016s 1 tree, 248 leaves, max depth = 17, train loss: 0.43296, val loss: 0.42520, in 0.031s 1 tree, 213 leaves, max depth = 17, train loss: 0.42865, val loss: 0.42073, in 0.031s 1 tree, 248 leaves, max depth = 16, train loss: 0.42429, val loss: 0.41627, in 0.031s 1 tree, 248 leaves, max depth = 24, train loss: 0.42031, val loss: 0.41228, in 0.047s 1 tree, 248 leaves, max depth = 18, train loss: 0.41621, val loss: 0.40807, in 0.031s 1 tree, 238 leaves, max depth = 20, train loss: 0.41038, val loss: 0.40220, in 0.031s 1 tree, 248 leaves, max depth = 18, train loss: 0.40667, val loss: 0.39838, in 0.032s 1 tree, 237 leaves, max depth = 20, train loss: 0.40118, val loss: 0.39286, in 0.031s 1 tree, 248 leaves, max depth = 17, train loss: 0.39766, val loss: 0.38924, in 0.031s 1 tree, 195 leaves, max depth = 17, train loss: 0.39241, val loss: 0.38405, in 0.031s 1 tree, 248 leaves, max depth = 25, train loss: 0.38921, val loss: 0.38088, in 0.031s 1 tree, 195 leaves, max depth = 17, train loss: 0.38428, val loss: 0.37597, in 0.031s 1 tree, 248 leaves, max depth = 18, train loss: 0.38111, val loss: 0.37273, in 0.031s 1 tree, 194 leaves, max depth = 20, train loss: 0.37644, val loss: 0.36811, in 0.031s 1 tree, 194 leaves, max depth = 17, train loss: 0.37196, val loss: 0.36366, in 0.047s 1 tree, 193 leaves, max depth = 17, train loss: 0.36765, val 
loss: 0.35939, in 0.016s 1 tree, 195 leaves, max depth = 17, train loss: 0.36348, val loss: 0.35522, in 0.031s 1 tree, 248 leaves, max depth = 17, train loss: 0.36070, val loss: 0.35235, in 0.031s 1 tree, 195 leaves, max depth = 17, train loss: 0.35676, val loss: 0.34841, in 0.031s 1 tree, 248 leaves, max depth = 16, train loss: 0.35421, val loss: 0.34578, in 0.031s 1 tree, 248 leaves, max depth = 18, train loss: 0.35163, val loss: 0.34319, in 0.031s 1 tree, 248 leaves, max depth = 18, train loss: 0.34923, val loss: 0.34072, in 0.031s 1 tree, 248 leaves, max depth = 24, train loss: 0.34640, val loss: 0.33795, in 0.031s 1 tree, 96 leaves, max depth = 19, train loss: 0.34310, val loss: 0.33455, in 0.016s 1 tree, 96 leaves, max depth = 15, train loss: 0.33994, val loss: 0.33133, in 0.016s 1 tree, 140 leaves, max depth = 15, train loss: 0.33683, val loss: 0.32844, in 0.031s 1 tree, 94 leaves, max depth = 14, train loss: 0.33393, val loss: 0.32541, in 0.016s 1 tree, 248 leaves, max depth = 21, train loss: 0.33138, val loss: 0.32291, in 0.047s 1 tree, 195 leaves, max depth = 17, train loss: 0.32837, val loss: 0.31995, in 0.031s 1 tree, 248 leaves, max depth = 21, train loss: 0.32592, val loss: 0.31755, in 0.031s 1 tree, 248 leaves, max depth = 20, train loss: 0.32346, val loss: 0.31509, in 0.031s 1 tree, 248 leaves, max depth = 19, train loss: 0.32116, val loss: 0.31286, in 0.031s 1 tree, 248 leaves, max depth = 21, train loss: 0.31893, val loss: 0.31071, in 0.031s 1 tree, 96 leaves, max depth = 14, train loss: 0.31642, val loss: 0.30812, in 0.016s 1 tree, 248 leaves, max depth = 20, train loss: 0.31429, val loss: 0.30607, in 0.031s 1 tree, 139 leaves, max depth = 15, train loss: 0.31179, val loss: 0.30374, in 0.016s 1 tree, 140 leaves, max depth = 15, train loss: 0.30937, val loss: 0.30152, in 0.031s 1 tree, 94 leaves, max depth = 17, train loss: 0.30711, val loss: 0.29917, in 0.031s 1 tree, 140 leaves, max depth = 15, train loss: 0.30485, val loss: 0.29710, in 0.016s 1 
tree, 139 leaves, max depth = 15, train loss: 0.30269, val loss: 0.29510, in 0.016s 1 tree, 248 leaves, max depth = 19, train loss: 0.30068, val loss: 0.29308, in 0.047s 1 tree, 94 leaves, max depth = 17, train loss: 0.29864, val loss: 0.29099, in 0.016s 1 tree, 139 leaves, max depth = 15, train loss: 0.29663, val loss: 0.28916, in 0.031s 1 tree, 94 leaves, max depth = 15, train loss: 0.29473, val loss: 0.28720, in 0.016s 1 tree, 141 leaves, max depth = 15, train loss: 0.29282, val loss: 0.28544, in 0.031s 1 tree, 94 leaves, max depth = 14, train loss: 0.29105, val loss: 0.28361, in 0.016s 1 tree, 248 leaves, max depth = 25, train loss: 0.28930, val loss: 0.28193, in 0.031s 1 tree, 141 leaves, max depth = 16, train loss: 0.28753, val loss: 0.28030, in 0.031s 1 tree, 94 leaves, max depth = 15, train loss: 0.28588, val loss: 0.27858, in 0.016s 1 tree, 95 leaves, max depth = 16, train loss: 0.28429, val loss: 0.27688, in 0.016s 1 tree, 141 leaves, max depth = 16, train loss: 0.28266, val loss: 0.27539, in 0.031s 1 tree, 141 leaves, max depth = 16, train loss: 0.28110, val loss: 0.27397, in 0.016s 1 tree, 248 leaves, max depth = 19, train loss: 0.27941, val loss: 0.27228, in 0.047s 1 tree, 142 leaves, max depth = 16, train loss: 0.27793, val loss: 0.27091, in 0.016s 1 tree, 248 leaves, max depth = 19, train loss: 0.27632, val loss: 0.26930, in 0.031s 1 tree, 93 leaves, max depth = 16, train loss: 0.27492, val loss: 0.26785, in 0.031s Fit 85 trees in 2.799 s, (17289 total leaves) Time spent computing histograms: 0.763s Time spent finding best splits: 0.335s Time spent applying splits: 0.290s Time spent predicting: 0.000s Trial 93, Fold 4: Log loss = 0.2826723555976151, Average precision = 0.9625027075561599, ROC-AUC = 0.9576732243896507, Elapsed Time = 2.8056574999991426 seconds Trial 93, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 93, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 
0.040 GB of training data: 0.141 s 0.016 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 126 leaves, max depth = 15, train loss: 0.67788, val loss: 0.67727, in 0.016s 1 tree, 233 leaves, max depth = 17, train loss: 0.66319, val loss: 0.66213, in 0.031s 1 tree, 235 leaves, max depth = 16, train loss: 0.64925, val loss: 0.64774, in 0.031s 1 tree, 236 leaves, max depth = 16, train loss: 0.63600, val loss: 0.63405, in 0.031s 1 tree, 248 leaves, max depth = 16, train loss: 0.62332, val loss: 0.62094, in 0.016s 1 tree, 246 leaves, max depth = 16, train loss: 0.61124, val loss: 0.60844, in 0.031s 1 tree, 248 leaves, max depth = 18, train loss: 0.59977, val loss: 0.59659, in 0.031s 1 tree, 248 leaves, max depth = 19, train loss: 0.58879, val loss: 0.58526, in 0.031s 1 tree, 248 leaves, max depth = 16, train loss: 0.57827, val loss: 0.57436, in 0.031s 1 tree, 248 leaves, max depth = 16, train loss: 0.56822, val loss: 0.56394, in 0.031s 1 tree, 248 leaves, max depth = 20, train loss: 0.55889, val loss: 0.55428, in 0.031s 1 tree, 248 leaves, max depth = 19, train loss: 0.54967, val loss: 0.54478, in 0.031s 1 tree, 248 leaves, max depth = 15, train loss: 0.54091, val loss: 0.53575, in 0.031s 1 tree, 245 leaves, max depth = 20, train loss: 0.53279, val loss: 0.52751, in 0.031s 1 tree, 248 leaves, max depth = 19, train loss: 0.52465, val loss: 0.51912, in 0.031s 1 tree, 183 leaves, max depth = 17, train loss: 0.51705, val loss: 0.51118, in 0.031s 1 tree, 248 leaves, max depth = 15, train loss: 0.50960, val loss: 0.50352, in 0.031s 1 tree, 248 leaves, max depth = 18, train loss: 0.50270, val loss: 0.49645, in 0.031s 1 tree, 248 leaves, max depth = 16, train loss: 0.49572, val loss: 0.48924, in 0.031s 1 tree, 248 leaves, max depth = 16, train loss: 0.48940, val loss: 0.48293, in 0.031s 1 tree, 248 leaves, max depth = 16, train loss: 0.48294, val loss: 0.47626, in 0.031s 1 tree, 248 leaves, max depth = 19, train loss: 0.47675, val loss: 0.47000, in 0.031s 1 
tree, 248 leaves, max depth = 17, train loss: 0.47107, val loss: 0.46417, in 0.016s 1 tree, 248 leaves, max depth = 16, train loss: 0.46538, val loss: 0.45835, in 0.031s 1 tree, 248 leaves, max depth = 18, train loss: 0.45983, val loss: 0.45265, in 0.031s 1 tree, 248 leaves, max depth = 18, train loss: 0.45448, val loss: 0.44714, in 0.031s 1 tree, 248 leaves, max depth = 18, train loss: 0.44934, val loss: 0.44190, in 0.031s 1 tree, 235 leaves, max depth = 18, train loss: 0.44230, val loss: 0.43494, in 0.031s 1 tree, 192 leaves, max depth = 19, train loss: 0.43545, val loss: 0.42814, in 0.031s 1 tree, 248 leaves, max depth = 18, train loss: 0.43076, val loss: 0.42333, in 0.016s 1 tree, 181 leaves, max depth = 21, train loss: 0.42647, val loss: 0.41895, in 0.031s 1 tree, 182 leaves, max depth = 18, train loss: 0.42232, val loss: 0.41469, in 0.031s 1 tree, 236 leaves, max depth = 18, train loss: 0.41627, val loss: 0.40874, in 0.031s 1 tree, 248 leaves, max depth = 22, train loss: 0.41222, val loss: 0.40464, in 0.031s 1 tree, 183 leaves, max depth = 17, train loss: 0.40866, val loss: 0.40098, in 0.016s 1 tree, 248 leaves, max depth = 19, train loss: 0.40482, val loss: 0.39707, in 0.016s 1 tree, 248 leaves, max depth = 21, train loss: 0.40109, val loss: 0.39330, in 0.031s 1 tree, 248 leaves, max depth = 18, train loss: 0.39749, val loss: 0.38971, in 0.031s 1 tree, 192 leaves, max depth = 18, train loss: 0.39206, val loss: 0.38440, in 0.031s 1 tree, 213 leaves, max depth = 15, train loss: 0.38885, val loss: 0.38100, in 0.031s 1 tree, 191 leaves, max depth = 18, train loss: 0.38378, val loss: 0.37604, in 0.031s 1 tree, 191 leaves, max depth = 18, train loss: 0.37891, val loss: 0.37127, in 0.016s 1 tree, 191 leaves, max depth = 19, train loss: 0.37424, val loss: 0.36669, in 0.016s 1 tree, 142 leaves, max depth = 16, train loss: 0.36990, val loss: 0.36233, in 0.031s 1 tree, 248 leaves, max depth = 14, train loss: 0.36699, val loss: 0.35946, in 0.031s 1 tree, 248 leaves, max 
depth = 18, train loss: 0.36410, val loss: 0.35657, in 0.031s 1 tree, 248 leaves, max depth = 16, train loss: 0.36136, val loss: 0.35380, in 0.016s 1 tree, 248 leaves, max depth = 19, train loss: 0.35875, val loss: 0.35130, in 0.031s 1 tree, 91 leaves, max depth = 16, train loss: 0.35495, val loss: 0.34742, in 0.031s 1 tree, 94 leaves, max depth = 16, train loss: 0.35129, val loss: 0.34374, in 0.016s 1 tree, 94 leaves, max depth = 16, train loss: 0.34779, val loss: 0.34022, in 0.016s 1 tree, 93 leaves, max depth = 16, train loss: 0.34444, val loss: 0.33684, in 0.016s 1 tree, 139 leaves, max depth = 18, train loss: 0.34117, val loss: 0.33387, in 0.031s 1 tree, 248 leaves, max depth = 18, train loss: 0.33834, val loss: 0.33115, in 0.031s 1 tree, 93 leaves, max depth = 17, train loss: 0.33528, val loss: 0.32806, in 0.016s 1 tree, 248 leaves, max depth = 19, train loss: 0.33257, val loss: 0.32547, in 0.031s 1 tree, 93 leaves, max depth = 16, train loss: 0.32967, val loss: 0.32254, in 0.022s 1 tree, 248 leaves, max depth = 18, train loss: 0.32714, val loss: 0.32006, in 0.025s 1 tree, 191 leaves, max depth = 18, train loss: 0.32413, val loss: 0.31721, in 0.031s 1 tree, 248 leaves, max depth = 18, train loss: 0.32169, val loss: 0.31485, in 0.031s 1 tree, 137 leaves, max depth = 17, train loss: 0.31899, val loss: 0.31239, in 0.031s 1 tree, 248 leaves, max depth = 18, train loss: 0.31666, val loss: 0.31015, in 0.031s 1 tree, 137 leaves, max depth = 17, train loss: 0.31410, val loss: 0.30786, in 0.016s 1 tree, 138 leaves, max depth = 17, train loss: 0.31163, val loss: 0.30566, in 0.031s 1 tree, 248 leaves, max depth = 15, train loss: 0.30984, val loss: 0.30400, in 0.031s 1 tree, 94 leaves, max depth = 16, train loss: 0.30751, val loss: 0.30166, in 0.016s 1 tree, 139 leaves, max depth = 17, train loss: 0.30520, val loss: 0.29959, in 0.031s 1 tree, 94 leaves, max depth = 22, train loss: 0.30303, val loss: 0.29741, in 0.016s 1 tree, 138 leaves, max depth = 17, train loss: 
0.30088, val loss: 0.29549, in 0.016s 1 tree, 92 leaves, max depth = 16, train loss: 0.29884, val loss: 0.29340, in 0.031s 1 tree, 248 leaves, max depth = 18, train loss: 0.29686, val loss: 0.29151, in 0.031s 1 tree, 137 leaves, max depth = 17, train loss: 0.29488, val loss: 0.28971, in 0.031s 1 tree, 93 leaves, max depth = 15, train loss: 0.29299, val loss: 0.28781, in 0.016s 1 tree, 138 leaves, max depth = 17, train loss: 0.29112, val loss: 0.28616, in 0.078s 1 tree, 248 leaves, max depth = 19, train loss: 0.28924, val loss: 0.28439, in 0.062s 1 tree, 92 leaves, max depth = 15, train loss: 0.28752, val loss: 0.28264, in 0.016s 1 tree, 93 leaves, max depth = 15, train loss: 0.28584, val loss: 0.28096, in 0.016s 1 tree, 241 leaves, max depth = 18, train loss: 0.28432, val loss: 0.27972, in 0.047s 1 tree, 92 leaves, max depth = 15, train loss: 0.28274, val loss: 0.27810, in 0.031s 1 tree, 248 leaves, max depth = 18, train loss: 0.28104, val loss: 0.27651, in 0.031s 1 tree, 137 leaves, max depth = 17, train loss: 0.27942, val loss: 0.27507, in 0.031s 1 tree, 137 leaves, max depth = 17, train loss: 0.27787, val loss: 0.27368, in 0.031s 1 tree, 248 leaves, max depth = 19, train loss: 0.27616, val loss: 0.27204, in 0.047s 1 tree, 134 leaves, max depth = 17, train loss: 0.27470, val loss: 0.27080, in 0.031s 1 tree, 92 leaves, max depth = 15, train loss: 0.27332, val loss: 0.26939, in 0.016s Fit 85 trees in 2.813 s, (16748 total leaves) Time spent computing histograms: 0.778s Time spent finding best splits: 0.348s Time spent applying splits: 0.309s Time spent predicting: 0.000s Trial 93, Fold 5: Log loss = 0.2892328390065862, Average precision = 0.9588432364280189, ROC-AUC = 0.9540768073128588, Elapsed Time = 2.826665499998853 seconds
Optimization Progress: 94%|#########3| 94/100 [19:34<01:24, 14.08s/it]
Trial 94, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371 Trial 94, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913 0.188 s 0.040 GB of training data: 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 25 leaves, max depth = 8, train loss: 0.68468, val loss: 0.68456, in 0.016s 1 tree, 25 leaves, max depth = 7, train loss: 0.67578, val loss: 0.67558, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.66743, val loss: 0.66703, in 0.016s 1 tree, 25 leaves, max depth = 7, train loss: 0.65936, val loss: 0.65894, in 0.031s 1 tree, 25 leaves, max depth = 11, train loss: 0.65200, val loss: 0.65144, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.64487, val loss: 0.64418, in 0.000s 1 tree, 25 leaves, max depth = 7, train loss: 0.63743, val loss: 0.63670, in 0.016s 1 tree, 18 leaves, max depth = 7, train loss: 0.63053, val loss: 0.62967, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.62364, val loss: 0.62264, in 0.016s 1 tree, 25 leaves, max depth = 7, train loss: 0.61613, val loss: 0.61515, in 0.016s 1 tree, 25 leaves, max depth = 11, train loss: 0.60934, val loss: 0.60819, in 0.016s 1 tree, 25 leaves, max depth = 7, train loss: 0.60293, val loss: 0.60174, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.59716, val loss: 0.59581, in 0.000s 1 tree, 25 leaves, max depth = 7, train loss: 0.59064, val loss: 0.58923, in 0.016s 1 tree, 25 leaves, max depth = 7, train loss: 0.58448, val loss: 0.58300, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.57855, val loss: 0.57696, in 0.016s 1 tree, 15 leaves, max depth = 7, train loss: 0.57325, val loss: 0.57155, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.56725, val loss: 0.56545, in 0.016s 1 tree, 4 leaves, max depth = 2, train loss: 0.56246, val loss: 0.56036, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.55687, val loss: 0.55468, in 0.016s 1 tree, 25 leaves, max depth = 
9, train loss: 0.55190, val loss: 0.54954, in 0.016s 1 tree, 25 leaves, max depth = 7, train loss: 0.54656, val loss: 0.54413, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.54135, val loss: 0.53885, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.53644, val loss: 0.53391, in 0.016s 1 tree, 9 leaves, max depth = 6, train loss: 0.53202, val loss: 0.52934, in 0.016s 1 tree, 20 leaves, max depth = 10, train loss: 0.52786, val loss: 0.52512, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.52338, val loss: 0.52060, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.51921, val loss: 0.51631, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.51509, val loss: 0.51205, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.51094, val loss: 0.50785, in 0.016s 1 tree, 25 leaves, max depth = 7, train loss: 0.50705, val loss: 0.50391, in 0.016s 1 tree, 25 leaves, max depth = 7, train loss: 0.50260, val loss: 0.49947, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.49855, val loss: 0.49536, in 0.016s 1 tree, 25 leaves, max depth = 7, train loss: 0.49475, val loss: 0.49154, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.49139, val loss: 0.48799, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.48805, val loss: 0.48460, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.48484, val loss: 0.48143, in 0.016s 1 tree, 25 leaves, max depth = 11, train loss: 0.48134, val loss: 0.47786, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.47804, val loss: 0.47448, in 0.000s 1 tree, 25 leaves, max depth = 10, train loss: 0.47457, val loss: 0.47095, in 0.016s 1 tree, 22 leaves, max depth = 9, train loss: 0.47139, val loss: 0.46764, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.46821, val loss: 0.46439, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.46513, val loss: 0.46133, in 0.016s 1 tree, 18 leaves, max depth = 9, train loss: 0.46249, val loss: 0.45861, in 0.016s 1 tree, 25 
leaves, max depth = 8, train loss: 0.45958, val loss: 0.45562, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.45694, val loss: 0.45288, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.45454, val loss: 0.45042, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.45156, val loss: 0.44738, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.44869, val loss: 0.44444, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.44557, val loss: 0.44137, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.44324, val loss: 0.43905, in 0.016s Fit 51 trees in 1.063 s, (1186 total leaves) Time spent computing histograms: 0.315s Time spent finding best splits: 0.028s Time spent applying splits: 0.024s Time spent predicting: 0.000s Trial 94, Fold 1: Log loss = 0.44411788190088125, Average precision = 0.9523659075828259, ROC-AUC = 0.9449224780497363, Elapsed Time = 1.0696496999989904 seconds Trial 94, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 94, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986 Binning 0.040 GB of training data: 0.141 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 25 leaves, max depth = 9, train loss: 0.68484, val loss: 0.68446, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.67619, val loss: 0.67571, in 0.016s 1 tree, 25 leaves, max depth = 13, train loss: 0.66761, val loss: 0.66698, in 0.016s 1 tree, 25 leaves, max depth = 7, train loss: 0.65949, val loss: 0.65852, in 0.016s 1 tree, 25 leaves, max depth = 12, train loss: 0.65215, val loss: 0.65083, in 0.000s 1 tree, 25 leaves, max depth = 10, train loss: 0.64512, val loss: 0.64348, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.63755, val loss: 0.63570, in 0.016s 1 tree, 25 leaves, max depth = 7, train loss: 0.62996, val loss: 0.62795, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.62318, val loss: 0.62092, in 0.016s 1 tree, 25 
leaves, max depth = 7, train loss: 0.61637, val loss: 0.61390, in 0.016s 1 tree, 25 leaves, max depth = 11, train loss: 0.60940, val loss: 0.60668, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.60298, val loss: 0.60015, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.59723, val loss: 0.59411, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.59049, val loss: 0.58729, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.58412, val loss: 0.58078, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.57831, val loss: 0.57494, in 0.016s 1 tree, 11 leaves, max depth = 6, train loss: 0.57297, val loss: 0.56941, in 0.000s 1 tree, 25 leaves, max depth = 8, train loss: 0.56723, val loss: 0.56362, in 0.016s 1 tree, 25 leaves, max depth = 11, train loss: 0.56161, val loss: 0.55790, in 0.016s 1 tree, 25 leaves, max depth = 7, train loss: 0.55603, val loss: 0.55222, in 0.016s 1 tree, 25 leaves, max depth = 7, train loss: 0.55103, val loss: 0.54714, in 0.016s 1 tree, 25 leaves, max depth = 7, train loss: 0.54546, val loss: 0.54147, in 0.016s 1 tree, 25 leaves, max depth = 6, train loss: 0.54017, val loss: 0.53608, in 0.016s 1 tree, 25 leaves, max depth = 7, train loss: 0.53566, val loss: 0.53147, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.53139, val loss: 0.52700, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.52703, val loss: 0.52248, in 0.000s 1 tree, 25 leaves, max depth = 11, train loss: 0.52254, val loss: 0.51798, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.51861, val loss: 0.51396, in 0.016s 1 tree, 25 leaves, max depth = 7, train loss: 0.51455, val loss: 0.50971, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.51075, val loss: 0.50574, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.50645, val loss: 0.50136, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.50265, val loss: 0.49742, in 0.000s 1 tree, 25 leaves, max depth = 10, train loss: 0.49838, val loss: 
0.49308, in 0.016s 1 tree, 25 leaves, max depth = 11, train loss: 0.49468, val loss: 0.48934, in 0.015s 1 tree, 25 leaves, max depth = 8, train loss: 0.49102, val loss: 0.48564, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.48764, val loss: 0.48216, in 0.016s 1 tree, 25 leaves, max depth = 6, train loss: 0.48413, val loss: 0.47865, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.48061, val loss: 0.47505, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.47729, val loss: 0.47165, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.47375, val loss: 0.46806, in 0.000s 1 tree, 18 leaves, max depth = 9, train loss: 0.47058, val loss: 0.46488, in 0.031s 1 tree, 25 leaves, max depth = 8, train loss: 0.46746, val loss: 0.46169, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.46443, val loss: 0.45858, in 0.016s 1 tree, 11 leaves, max depth = 7, train loss: 0.46176, val loss: 0.45584, in 0.063s 1 tree, 25 leaves, max depth = 10, train loss: 0.45887, val loss: 0.45294, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.45631, val loss: 0.45028, in 0.031s 1 tree, 25 leaves, max depth = 15, train loss: 0.45394, val loss: 0.44777, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.45093, val loss: 0.44473, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.44806, val loss: 0.44176, in 0.031s 1 tree, 25 leaves, max depth = 10, train loss: 0.44561, val loss: 0.43918, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.44334, val loss: 0.43682, in 0.016s Fit 51 trees in 1.079 s, (1220 total leaves) Time spent computing histograms: 0.348s Time spent finding best splits: 0.043s Time spent applying splits: 0.033s Time spent predicting: 0.000s Trial 94, Fold 2: Log loss = 0.4446945884712925, Average precision = 0.9498656301917219, ROC-AUC = 0.9459203518363455, Elapsed Time = 1.083164300000135 seconds Trial 94, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 94, Fold 3: 
Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 Binning 0.040 GB of training data: 0.158 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 25 leaves, max depth = 8, train loss: 0.68478, val loss: 0.68467, in 0.000s 1 tree, 25 leaves, max depth = 8, train loss: 0.67572, val loss: 0.67551, in 0.016s 1 tree, 25 leaves, max depth = 11, train loss: 0.66732, val loss: 0.66711, in 0.031s 1 tree, 25 leaves, max depth = 7, train loss: 0.65930, val loss: 0.65893, in 0.000s 1 tree, 25 leaves, max depth = 10, train loss: 0.65199, val loss: 0.65142, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.64489, val loss: 0.64418, in 0.016s 1 tree, 25 leaves, max depth = 7, train loss: 0.63753, val loss: 0.63672, in 0.078s 1 tree, 17 leaves, max depth = 8, train loss: 0.63067, val loss: 0.62969, in 0.047s 1 tree, 25 leaves, max depth = 8, train loss: 0.62376, val loss: 0.62266, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.61627, val loss: 0.61513, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.60948, val loss: 0.60833, in 0.016s 1 tree, 25 leaves, max depth = 6, train loss: 0.60310, val loss: 0.60185, in 0.016s 1 tree, 25 leaves, max depth = 11, train loss: 0.59735, val loss: 0.59599, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.59066, val loss: 0.58928, in 0.016s 1 tree, 25 leaves, max depth = 7, train loss: 0.58451, val loss: 0.58306, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.57861, val loss: 0.57716, in 0.016s 1 tree, 11 leaves, max depth = 6, train loss: 0.57330, val loss: 0.57169, in 0.016s 1 tree, 20 leaves, max depth = 7, train loss: 0.56758, val loss: 0.56598, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.56204, val loss: 0.56032, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.55660, val loss: 0.55471, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.55159, val loss: 0.54960, in 0.016s 1 tree, 25 leaves, max depth = 7, train 
[... per-round boosting logs for Trial 94, Fold 3 truncated ...]
Fit 51 trees in 1.221 s, (1213 total leaves)
Trial 94, Fold 3: Log loss = 0.43768270158311073, Average precision = 0.9558801258586747, ROC-AUC = 0.9505323705107338, Elapsed Time = 1.2125231999998505 seconds
Trial 94, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 94, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[... per-round boosting logs for Trial 94, Fold 4 truncated ...]
Fit 51 trees in 1.127 s, (1223 total leaves)
Trial 94, Fold 4: Log loss = 0.4435109531285845, Average precision = 0.9534755101482932, ROC-AUC = 0.9469630517822271, Elapsed Time = 1.126024299999699 seconds
Trial 94, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 94, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[... per-round boosting logs for Trial 94, Fold 5 truncated ...]
Fit 51 trees in 1.173 s, (1191 total leaves)
Trial 94, Fold 5: Log loss = 0.4475411045755514, Average precision = 0.9518987120846267, ROC-AUC = 0.9456943685784887, Elapsed Time = 1.1737415000006877 seconds
Optimization Progress: 95%|#########5| 95/100 [19:47<01:07, 13.55s/it]
Trial 95, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 95, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[... per-round boosting logs for Trial 95, Fold 1 truncated ...]
Fit 58 trees in 1.157 s, (1435 total leaves)
Trial 95, Fold 1: Log loss = 0.3234746455634853, Average precision = 0.9569561453670864, ROC-AUC = 0.950871124948022, Elapsed Time = 1.1658042000017304 seconds
Trial 95, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 95, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[... per-round boosting logs for Trial 95, Fold 2 truncated ...]
Fit 58 trees in 1.220 s, (1488 total leaves)
Trial 95, Fold 2: Log loss = 0.3229884704946013, Average precision = 0.9535500219361897, ROC-AUC = 0.9504192943060286, Elapsed Time = 1.2325305000013032 seconds
Trial 95, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 95, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[... per-round boosting logs for Trial 95, Fold 3 truncated ...]
Fit 58 trees in 1.251 s, (1446 total leaves)
Trial 95, Fold 3: Log loss = 0.3254918967385569, Average precision = 0.958430490667888, ROC-AUC = 0.9538103083375871, Elapsed Time = 1.2599082999986422 seconds
Trial 95, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 95, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[... per-round boosting logs for Trial 95, Fold 4 truncated ...]
Fit 58 trees in 1.282 s, (1459 total leaves)
Trial 95, Fold 4: Log loss = 0.33272664317341477, Average precision = 0.955768917238855, ROC-AUC = 0.9497438696824682, Elapsed Time = 1.2968270999990636 seconds
Trial 95, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 95, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[... per-round boosting logs for Trial 95, Fold 5 truncated ...]
Fit 58 trees in 1.298 s, (1422 total leaves)
Trial 95, Fold 5: Log loss = 0.3282869013454593, Average precision = 0.9549812129532772, ROC-AUC = 0.9502090927670328, Elapsed Time = 1.3063646999999037 seconds
Optimization Progress: 96%|#########6| 96/100 [20:00<00:53, 13.40s/it]
Trial 96, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371 Trial 96, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913 0.142 s 0.040 GB of training data: 0.016 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 101 leaves, max depth = 15, train loss: 0.68483, val loss: 0.68463, in 0.016s 1 tree, 103 leaves, max depth = 17, train loss: 0.67663, val loss: 0.67620, in 0.016s 1 tree, 103 leaves, max depth = 17, train loss: 0.66871, val loss: 0.66805, in 0.016s 1 tree, 106 leaves, max depth = 14, train loss: 0.66116, val loss: 0.66033, in 0.000s 1 tree, 105 leaves, max depth = 19, train loss: 0.65386, val loss: 0.65283, in 0.016s 1 tree, 105 leaves, max depth = 19, train loss: 0.64681, val loss: 0.64558, in 0.016s 1 tree, 104 leaves, max depth = 21, train loss: 0.64007, val loss: 0.63868, in 0.016s 1 tree, 103 leaves, max depth = 19, train loss: 0.63356, val loss: 0.63201, in 0.016s 1 tree, 105 leaves, max depth = 19, train loss: 0.62719, val loss: 0.62545, in 0.016s 1 tree, 105 leaves, max depth = 18, train loss: 0.62095, val loss: 0.61901, in 0.016s 1 tree, 104 leaves, max depth = 18, train loss: 0.61505, val loss: 0.61289, in 0.016s 1 tree, 105 leaves, max depth = 19, train loss: 0.60928, val loss: 0.60695, in 0.000s 1 tree, 105 leaves, max depth = 19, train loss: 0.60371, val loss: 0.60120, in 0.016s 1 tree, 104 leaves, max depth = 18, train loss: 0.59824, val loss: 0.59554, in 0.016s 1 tree, 105 leaves, max depth = 19, train loss: 0.59302, val loss: 0.59015, in 0.016s 1 tree, 105 leaves, max depth = 19, train loss: 0.58797, val loss: 0.58494, in 0.016s 1 tree, 105 leaves, max depth = 18, train loss: 0.58300, val loss: 0.57979, in 0.016s 1 tree, 105 leaves, max depth = 16, train loss: 0.57827, val loss: 0.57493, in 0.000s 1 tree, 103 leaves, max depth = 20, train loss: 0.57372, val loss: 0.57026, in 0.016s 1 tree, 105 leaves, max depth = 18, train loss: 0.56920, val loss: 0.56557, in 
0.016s 1 tree, 105 leaves, max depth = 18, train loss: 0.56483, val loss: 0.56102, in 0.016s 1 tree, 105 leaves, max depth = 18, train loss: 0.56059, val loss: 0.55661, in 0.016s 1 tree, 105 leaves, max depth = 18, train loss: 0.55648, val loss: 0.55233, in 0.016s 1 tree, 105 leaves, max depth = 18, train loss: 0.55250, val loss: 0.54818, in 0.016s 1 tree, 105 leaves, max depth = 20, train loss: 0.54870, val loss: 0.54425, in 0.016s 1 tree, 163 leaves, max depth = 16, train loss: 0.54448, val loss: 0.54029, in 0.016s 1 tree, 163 leaves, max depth = 16, train loss: 0.54039, val loss: 0.53647, in 0.031s 1 tree, 106 leaves, max depth = 17, train loss: 0.53684, val loss: 0.53276, in 0.000s 1 tree, 163 leaves, max depth = 16, train loss: 0.53292, val loss: 0.52909, in 0.016s 1 tree, 163 leaves, max depth = 16, train loss: 0.52913, val loss: 0.52555, in 0.016s 1 tree, 104 leaves, max depth = 16, train loss: 0.52578, val loss: 0.52210, in 0.016s 1 tree, 106 leaves, max depth = 16, train loss: 0.52253, val loss: 0.51869, in 0.016s 1 tree, 105 leaves, max depth = 20, train loss: 0.51931, val loss: 0.51530, in 0.016s 1 tree, 163 leaves, max depth = 16, train loss: 0.51573, val loss: 0.51197, in 0.031s 1 tree, 163 leaves, max depth = 16, train loss: 0.51227, val loss: 0.50875, in 0.016s 1 tree, 104 leaves, max depth = 19, train loss: 0.50928, val loss: 0.50562, in 0.016s 1 tree, 105 leaves, max depth = 20, train loss: 0.50632, val loss: 0.50251, in 0.016s 1 tree, 163 leaves, max depth = 16, train loss: 0.50302, val loss: 0.49945, in 0.031s 1 tree, 105 leaves, max depth = 19, train loss: 0.50025, val loss: 0.49654, in 0.000s 1 tree, 106 leaves, max depth = 17, train loss: 0.49757, val loss: 0.49371, in 0.000s 1 tree, 106 leaves, max depth = 21, train loss: 0.49495, val loss: 0.49101, in 0.016s 1 tree, 163 leaves, max depth = 16, train loss: 0.49183, val loss: 0.48812, in 0.031s 1 tree, 105 leaves, max depth = 19, train loss: 0.48933, val loss: 0.48550, in 0.016s 1 tree, 163 
leaves, max depth = 16, train loss: 0.48632, val loss: 0.48271, in 0.031s 1 tree, 163 leaves, max depth = 16, train loss: 0.48340, val loss: 0.48001, in 0.016s 1 tree, 163 leaves, max depth = 16, train loss: 0.48057, val loss: 0.47739, in 0.016s 1 tree, 163 leaves, max depth = 16, train loss: 0.47782, val loss: 0.47485, in 0.016s 1 tree, 105 leaves, max depth = 17, train loss: 0.47546, val loss: 0.47234, in 0.016s 1 tree, 163 leaves, max depth = 16, train loss: 0.47281, val loss: 0.46989, in 0.031s 1 tree, 163 leaves, max depth = 16, train loss: 0.47024, val loss: 0.46752, in 0.016s 1 tree, 105 leaves, max depth = 17, train loss: 0.46805, val loss: 0.46519, in 0.016s 1 tree, 163 leaves, max depth = 16, train loss: 0.46557, val loss: 0.46291, in 0.016s 1 tree, 163 leaves, max depth = 16, train loss: 0.46316, val loss: 0.46069, in 0.031s 1 tree, 105 leaves, max depth = 17, train loss: 0.46102, val loss: 0.45841, in 0.016s 1 tree, 163 leaves, max depth = 16, train loss: 0.45870, val loss: 0.45628, in 0.016s 1 tree, 163 leaves, max depth = 16, train loss: 0.45644, val loss: 0.45421, in 0.031s 1 tree, 163 leaves, max depth = 16, train loss: 0.45424, val loss: 0.45220, in 0.016s 1 tree, 106 leaves, max depth = 14, train loss: 0.45228, val loss: 0.45009, in 0.016s 1 tree, 163 leaves, max depth = 16, train loss: 0.45016, val loss: 0.44816, in 0.031s 1 tree, 104 leaves, max depth = 18, train loss: 0.44821, val loss: 0.44611, in 0.000s 1 tree, 106 leaves, max depth = 14, train loss: 0.44638, val loss: 0.44414, in 0.016s 1 tree, 163 leaves, max depth = 16, train loss: 0.44434, val loss: 0.44229, in 0.031s 1 tree, 163 leaves, max depth = 16, train loss: 0.44236, val loss: 0.44049, in 0.016s 1 tree, 104 leaves, max depth = 20, train loss: 0.44062, val loss: 0.43867, in 0.016s 1 tree, 163 leaves, max depth = 16, train loss: 0.43871, val loss: 0.43693, in 0.016s 1 tree, 103 leaves, max depth = 14, train loss: 0.43697, val loss: 0.43508, in 0.016s 1 tree, 106 leaves, max depth = 
14, train loss: 0.43533, val loss: 0.43331, in 0.018s 1 tree, 104 leaves, max depth = 17, train loss: 0.43369, val loss: 0.43158, in 0.014s 1 tree, 163 leaves, max depth = 16, train loss: 0.43186, val loss: 0.42992, in 0.016s 1 tree, 163 leaves, max depth = 16, train loss: 0.43008, val loss: 0.42832, in 0.031s 1 tree, 163 leaves, max depth = 16, train loss: 0.42835, val loss: 0.42675, in 0.016s 1 tree, 163 leaves, max depth = 16, train loss: 0.42667, val loss: 0.42523, in 0.016s 1 tree, 106 leaves, max depth = 14, train loss: 0.42517, val loss: 0.42360, in 0.016s 1 tree, 104 leaves, max depth = 21, train loss: 0.42372, val loss: 0.42209, in 0.016s 1 tree, 104 leaves, max depth = 16, train loss: 0.42230, val loss: 0.42057, in 0.016s 1 tree, 105 leaves, max depth = 14, train loss: 0.42093, val loss: 0.41907, in 0.016s 1 tree, 105 leaves, max depth = 17, train loss: 0.41955, val loss: 0.41757, in 0.016s 1 tree, 104 leaves, max depth = 17, train loss: 0.41821, val loss: 0.41615, in 0.016s 1 tree, 104 leaves, max depth = 14, train loss: 0.41690, val loss: 0.41475, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.41565, val loss: 0.41337, in 0.016s 1 tree, 105 leaves, max depth = 17, train loss: 0.41439, val loss: 0.41200, in 0.016s 1 tree, 163 leaves, max depth = 16, train loss: 0.41280, val loss: 0.41058, in 0.016s 1 tree, 105 leaves, max depth = 17, train loss: 0.41159, val loss: 0.40925, in 0.016s 1 tree, 163 leaves, max depth = 16, train loss: 0.41005, val loss: 0.40788, in 0.016s 1 tree, 163 leaves, max depth = 16, train loss: 0.40856, val loss: 0.40655, in 0.031s 1 tree, 5 leaves, max depth = 3, train loss: 0.40737, val loss: 0.40524, in 0.000s 1 tree, 163 leaves, max depth = 16, train loss: 0.40592, val loss: 0.40395, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.40476, val loss: 0.40268, in 0.016s 1 tree, 105 leaves, max depth = 18, train loss: 0.40362, val loss: 0.40143, in 0.016s 1 tree, 163 leaves, max depth = 16, train loss: 0.40222, val 
loss: 0.40018, in 0.016s 1 tree, 105 leaves, max depth = 18, train loss: 0.40112, val loss: 0.39897, in 0.016s Fit 91 trees in 1.799 s, (11094 total leaves) Time spent computing histograms: 0.554s Time spent finding best splits: 0.170s Time spent applying splits: 0.207s Time spent predicting: 0.000s Trial 96, Fold 1: Log loss = 0.40533120527321725, Average precision = 0.9457526290398302, ROC-AUC = 0.9421818317058832, Elapsed Time = 1.8046297000000777 seconds Trial 96, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 96, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986 Binning 0.040 GB of training data: 0.158 s 0.016 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 105 leaves, max depth = 18, train loss: 0.68479, val loss: 0.68447, in 0.016s 1 tree, 105 leaves, max depth = 15, train loss: 0.67669, val loss: 0.67601, in 0.016s 1 tree, 105 leaves, max depth = 15, train loss: 0.66888, val loss: 0.66784, in 0.016s 1 tree, 105 leaves, max depth = 15, train loss: 0.66134, val loss: 0.65995, in 0.016s 1 tree, 105 leaves, max depth = 17, train loss: 0.65408, val loss: 0.65235, in 0.016s 1 tree, 105 leaves, max depth = 17, train loss: 0.64707, val loss: 0.64501, in 0.016s 1 tree, 105 leaves, max depth = 20, train loss: 0.64030, val loss: 0.63796, in 0.016s 1 tree, 104 leaves, max depth = 21, train loss: 0.63376, val loss: 0.63114, in 0.016s 1 tree, 8 leaves, max depth = 6, train loss: 0.62743, val loss: 0.62449, in 0.000s 1 tree, 105 leaves, max depth = 14, train loss: 0.62123, val loss: 0.61798, in 0.016s 1 tree, 104 leaves, max depth = 16, train loss: 0.61531, val loss: 0.61177, in 0.016s 1 tree, 106 leaves, max depth = 15, train loss: 0.60956, val loss: 0.60573, in 0.016s 1 tree, 104 leaves, max depth = 16, train loss: 0.60402, val loss: 0.59991, in 0.016s 1 tree, 106 leaves, max depth = 14, train loss: 0.59858, val loss: 0.59419, in 0.016s 1 tree, 104 leaves, max depth = 19, 
train loss: 0.59340, val loss: 0.58877, in 0.016s 1 tree, 106 leaves, max depth = 17, train loss: 0.58837, val loss: 0.58353, in 0.016s 1 tree, 106 leaves, max depth = 14, train loss: 0.58342, val loss: 0.57832, in 0.016s 1 tree, 106 leaves, max depth = 15, train loss: 0.57869, val loss: 0.57334, in 0.016s 1 tree, 104 leaves, max depth = 23, train loss: 0.57413, val loss: 0.56857, in 0.016s 1 tree, 106 leaves, max depth = 14, train loss: 0.56963, val loss: 0.56383, in 0.016s 1 tree, 106 leaves, max depth = 14, train loss: 0.56528, val loss: 0.55923, in 0.016s 1 tree, 106 leaves, max depth = 14, train loss: 0.56106, val loss: 0.55478, in 0.016s 1 tree, 106 leaves, max depth = 14, train loss: 0.55697, val loss: 0.55046, in 0.016s 1 tree, 106 leaves, max depth = 14, train loss: 0.55300, val loss: 0.54628, in 0.016s 1 tree, 106 leaves, max depth = 15, train loss: 0.54921, val loss: 0.54227, in 0.016s 1 tree, 161 leaves, max depth = 17, train loss: 0.54499, val loss: 0.53820, in 0.016s 1 tree, 161 leaves, max depth = 17, train loss: 0.54089, val loss: 0.53427, in 0.031s 1 tree, 104 leaves, max depth = 17, train loss: 0.53734, val loss: 0.53051, in 0.016s 1 tree, 161 leaves, max depth = 17, train loss: 0.53342, val loss: 0.52674, in 0.016s 1 tree, 161 leaves, max depth = 17, train loss: 0.52961, val loss: 0.52309, in 0.016s 1 tree, 106 leaves, max depth = 18, train loss: 0.52626, val loss: 0.51957, in 0.016s 1 tree, 104 leaves, max depth = 17, train loss: 0.52301, val loss: 0.51613, in 0.031s 1 tree, 106 leaves, max depth = 14, train loss: 0.51980, val loss: 0.51272, in 0.016s 1 tree, 161 leaves, max depth = 16, train loss: 0.51622, val loss: 0.50930, in 0.016s 1 tree, 161 leaves, max depth = 16, train loss: 0.51275, val loss: 0.50598, in 0.016s 1 tree, 106 leaves, max depth = 15, train loss: 0.50977, val loss: 0.50281, in 0.016s 1 tree, 105 leaves, max depth = 14, train loss: 0.50683, val loss: 0.49968, in 0.016s 1 tree, 161 leaves, max depth = 16, train loss: 0.50352, 
val loss: 0.49652, in 0.016s 1 tree, 106 leaves, max depth = 15, train loss: 0.50075, val loss: 0.49358, in 0.016s 1 tree, 105 leaves, max depth = 16, train loss: 0.49807, val loss: 0.49073, in 0.016s 1 tree, 106 leaves, max depth = 15, train loss: 0.49547, val loss: 0.48795, in 0.016s 1 tree, 161 leaves, max depth = 17, train loss: 0.49234, val loss: 0.48496, in 0.016s 1 tree, 106 leaves, max depth = 15, train loss: 0.48984, val loss: 0.48230, in 0.016s 1 tree, 161 leaves, max depth = 17, train loss: 0.48683, val loss: 0.47943, in 0.031s 1 tree, 161 leaves, max depth = 17, train loss: 0.48391, val loss: 0.47665, in 0.016s 1 tree, 161 leaves, max depth = 17, train loss: 0.48107, val loss: 0.47394, in 0.031s 1 tree, 161 leaves, max depth = 17, train loss: 0.47832, val loss: 0.47132, in 0.016s 1 tree, 106 leaves, max depth = 15, train loss: 0.47597, val loss: 0.46882, in 0.016s 1 tree, 161 leaves, max depth = 16, train loss: 0.47331, val loss: 0.46629, in 0.016s 1 tree, 161 leaves, max depth = 16, train loss: 0.47074, val loss: 0.46384, in 0.016s 1 tree, 105 leaves, max depth = 14, train loss: 0.46855, val loss: 0.46151, in 0.016s 1 tree, 161 leaves, max depth = 16, train loss: 0.46606, val loss: 0.45915, in 0.031s 1 tree, 161 leaves, max depth = 16, train loss: 0.46365, val loss: 0.45686, in 0.016s 1 tree, 106 leaves, max depth = 14, train loss: 0.46153, val loss: 0.45459, in 0.016s 1 tree, 161 leaves, max depth = 16, train loss: 0.45920, val loss: 0.45238, in 0.016s 1 tree, 161 leaves, max depth = 16, train loss: 0.45694, val loss: 0.45024, in 0.031s 1 tree, 161 leaves, max depth = 16, train loss: 0.45474, val loss: 0.44816, in 0.016s 1 tree, 105 leaves, max depth = 14, train loss: 0.45278, val loss: 0.44607, in 0.016s 1 tree, 161 leaves, max depth = 16, train loss: 0.45066, val loss: 0.44406, in 0.016s 1 tree, 106 leaves, max depth = 13, train loss: 0.44877, val loss: 0.44204, in 0.016s 1 tree, 105 leaves, max depth = 14, train loss: 0.44694, val loss: 0.44008, in 
0.016s 1 tree, 161 leaves, max depth = 16, train loss: 0.44490, val loss: 0.43816, in 0.016s 1 tree, 161 leaves, max depth = 16, train loss: 0.44292, val loss: 0.43629, in 0.031s 1 tree, 105 leaves, max depth = 20, train loss: 0.44117, val loss: 0.43444, in 0.016s 1 tree, 161 leaves, max depth = 15, train loss: 0.43925, val loss: 0.43264, in 0.016s 1 tree, 105 leaves, max depth = 14, train loss: 0.43757, val loss: 0.43084, in 0.016s 1 tree, 105 leaves, max depth = 14, train loss: 0.43594, val loss: 0.42909, in 0.016s 1 tree, 104 leaves, max depth = 21, train loss: 0.43436, val loss: 0.42742, in 0.016s 1 tree, 161 leaves, max depth = 16, train loss: 0.43252, val loss: 0.42569, in 0.016s 1 tree, 161 leaves, max depth = 16, train loss: 0.43073, val loss: 0.42402, in 0.031s 1 tree, 161 leaves, max depth = 16, train loss: 0.42899, val loss: 0.42239, in 0.016s 1 tree, 161 leaves, max depth = 16, train loss: 0.42730, val loss: 0.42081, in 0.031s 1 tree, 105 leaves, max depth = 15, train loss: 0.42581, val loss: 0.41921, in 0.016s 1 tree, 105 leaves, max depth = 19, train loss: 0.42436, val loss: 0.41768, in 0.016s 1 tree, 105 leaves, max depth = 15, train loss: 0.42295, val loss: 0.41617, in 0.000s 1 tree, 105 leaves, max depth = 15, train loss: 0.42159, val loss: 0.41471, in 0.031s 1 tree, 106 leaves, max depth = 17, train loss: 0.42023, val loss: 0.41324, in 0.016s 1 tree, 106 leaves, max depth = 13, train loss: 0.41894, val loss: 0.41185, in 0.000s 1 tree, 106 leaves, max depth = 13, train loss: 0.41770, val loss: 0.41051, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.41648, val loss: 0.40924, in 0.016s 1 tree, 106 leaves, max depth = 18, train loss: 0.41524, val loss: 0.40790, in 0.016s 1 tree, 161 leaves, max depth = 15, train loss: 0.41364, val loss: 0.40642, in 0.016s 1 tree, 106 leaves, max depth = 18, train loss: 0.41245, val loss: 0.40512, in 0.016s 1 tree, 161 leaves, max depth = 15, train loss: 0.41089, val loss: 0.40368, in 0.016s 1 tree, 161 
leaves, max depth = 15, train loss: 0.40939, val loss: 0.40228, in 0.031s 1 tree, 5 leaves, max depth = 3, train loss: 0.40822, val loss: 0.40108, in 0.016s 1 tree, 160 leaves, max depth = 15, train loss: 0.40676, val loss: 0.39972, in 0.016s 1 tree, 5 leaves, max depth = 3, train loss: 0.40563, val loss: 0.39855, in 0.016s 1 tree, 105 leaves, max depth = 13, train loss: 0.40452, val loss: 0.39736, in 0.016s 1 tree, 160 leaves, max depth = 15, train loss: 0.40310, val loss: 0.39604, in 0.016s 1 tree, 105 leaves, max depth = 18, train loss: 0.40198, val loss: 0.39488, in 0.031s Fit 91 trees in 1.877 s, (10965 total leaves) Time spent computing histograms: 0.573s Time spent finding best splits: 0.171s Time spent applying splits: 0.208s Time spent predicting: 0.000s Trial 96, Fold 2: Log loss = 0.4061720059908957, Average precision = 0.9421895019687312, ROC-AUC = 0.9427703713055864, Elapsed Time = 1.8791829000001599 seconds Trial 96, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 96, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 Binning 0.040 GB of training data: 0.172 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 99 leaves, max depth = 15, train loss: 0.68486, val loss: 0.68458, in 0.016s 1 tree, 101 leaves, max depth = 13, train loss: 0.67682, val loss: 0.67626, in 0.016s 1 tree, 101 leaves, max depth = 13, train loss: 0.66905, val loss: 0.66824, in 0.016s 1 tree, 101 leaves, max depth = 13, train loss: 0.66155, val loss: 0.66049, in 0.016s 1 tree, 102 leaves, max depth = 16, train loss: 0.65433, val loss: 0.65307, in 0.016s 1 tree, 102 leaves, max depth = 16, train loss: 0.64736, val loss: 0.64591, in 0.016s 1 tree, 100 leaves, max depth = 15, train loss: 0.64065, val loss: 0.63895, in 0.016s 1 tree, 103 leaves, max depth = 19, train loss: 0.63417, val loss: 0.63224, in 0.016s 1 tree, 104 leaves, max depth = 15, train loss: 0.62786, val loss: 0.62569, in 
0.016s 1 tree, 103 leaves, max depth = 14, train loss: 0.62168, val loss: 0.61932, in 0.016s 1 tree, 102 leaves, max depth = 20, train loss: 0.61582, val loss: 0.61326, in 0.016s 1 tree, 104 leaves, max depth = 14, train loss: 0.61011, val loss: 0.60733, in 0.016s 1 tree, 103 leaves, max depth = 16, train loss: 0.60460, val loss: 0.60167, in 0.016s 1 tree, 105 leaves, max depth = 14, train loss: 0.59917, val loss: 0.59608, in 0.016s 1 tree, 102 leaves, max depth = 20, train loss: 0.59404, val loss: 0.59076, in 0.016s 1 tree, 105 leaves, max depth = 14, train loss: 0.58903, val loss: 0.58553, in 0.016s 1 tree, 104 leaves, max depth = 14, train loss: 0.58411, val loss: 0.58045, in 0.016s 1 tree, 103 leaves, max depth = 14, train loss: 0.57941, val loss: 0.57557, in 0.000s 1 tree, 103 leaves, max depth = 20, train loss: 0.57489, val loss: 0.57087, in 0.031s 1 tree, 104 leaves, max depth = 14, train loss: 0.57041, val loss: 0.56624, in 0.016s 1 tree, 104 leaves, max depth = 14, train loss: 0.56607, val loss: 0.56176, in 0.016s 1 tree, 104 leaves, max depth = 14, train loss: 0.56187, val loss: 0.55742, in 0.016s 1 tree, 163 leaves, max depth = 18, train loss: 0.55739, val loss: 0.55329, in 0.016s 1 tree, 104 leaves, max depth = 14, train loss: 0.55338, val loss: 0.54914, in 0.016s 1 tree, 105 leaves, max depth = 15, train loss: 0.54955, val loss: 0.54515, in 0.016s 1 tree, 162 leaves, max depth = 18, train loss: 0.54531, val loss: 0.54125, in 0.031s 1 tree, 162 leaves, max depth = 18, train loss: 0.54121, val loss: 0.53748, in 0.016s 1 tree, 105 leaves, max depth = 16, train loss: 0.53761, val loss: 0.53376, in 0.016s 1 tree, 163 leaves, max depth = 18, train loss: 0.53368, val loss: 0.53015, in 0.016s 1 tree, 163 leaves, max depth = 18, train loss: 0.52987, val loss: 0.52665, in 0.032s 1 tree, 100 leaves, max depth = 15, train loss: 0.52649, val loss: 0.52312, in 0.016s 1 tree, 105 leaves, max depth = 16, train loss: 0.52321, val loss: 0.51972, in 0.016s 1 tree, 104 
leaves, max depth = 14, train loss: 0.51996, val loss: 0.51634, in 0.016s 1 tree, 163 leaves, max depth = 18, train loss: 0.51637, val loss: 0.51305, in 0.016s 1 tree, 163 leaves, max depth = 18, train loss: 0.51289, val loss: 0.50987, in 0.031s 1 tree, 104 leaves, max depth = 15, train loss: 0.50989, val loss: 0.50671, in 0.016s 1 tree, 105 leaves, max depth = 14, train loss: 0.50692, val loss: 0.50361, in 0.016s 1 tree, 162 leaves, max depth = 18, train loss: 0.50360, val loss: 0.50058, in 0.016s 1 tree, 104 leaves, max depth = 15, train loss: 0.50081, val loss: 0.49764, in 0.031s 1 tree, 105 leaves, max depth = 16, train loss: 0.49810, val loss: 0.49482, in 0.016s 1 tree, 104 leaves, max depth = 15, train loss: 0.49548, val loss: 0.49204, in 0.016s 1 tree, 163 leaves, max depth = 17, train loss: 0.49234, val loss: 0.48918, in 0.016s 1 tree, 104 leaves, max depth = 15, train loss: 0.48983, val loss: 0.48652, in 0.016s 1 tree, 163 leaves, max depth = 18, train loss: 0.48681, val loss: 0.48378, in 0.031s 1 tree, 163 leaves, max depth = 18, train loss: 0.48389, val loss: 0.48111, in 0.016s 1 tree, 163 leaves, max depth = 18, train loss: 0.48104, val loss: 0.47853, in 0.016s 1 tree, 163 leaves, max depth = 18, train loss: 0.47828, val loss: 0.47602, in 0.031s 1 tree, 105 leaves, max depth = 14, train loss: 0.47591, val loss: 0.47353, in 0.016s 1 tree, 163 leaves, max depth = 18, train loss: 0.47325, val loss: 0.47112, in 0.031s 1 tree, 163 leaves, max depth = 18, train loss: 0.47067, val loss: 0.46878, in 0.016s 1 tree, 104 leaves, max depth = 15, train loss: 0.46846, val loss: 0.46647, in 0.016s 1 tree, 162 leaves, max depth = 18, train loss: 0.46596, val loss: 0.46421, in 0.031s 1 tree, 162 leaves, max depth = 18, train loss: 0.46354, val loss: 0.46203, in 0.016s 1 tree, 105 leaves, max depth = 14, train loss: 0.46141, val loss: 0.45977, in 0.016s 1 tree, 162 leaves, max depth = 15, train loss: 0.45907, val loss: 0.45766, in 0.031s 1 tree, 163 leaves, max depth = 
15, train loss: 0.45680, val loss: 0.45562, in 0.016s 1 tree, 163 leaves, max depth = 18, train loss: 0.45459, val loss: 0.45364, in 0.016s 1 tree, 104 leaves, max depth = 16, train loss: 0.45261, val loss: 0.45156, in 0.016s 1 tree, 163 leaves, max depth = 17, train loss: 0.45048, val loss: 0.44965, in 0.031s 1 tree, 105 leaves, max depth = 15, train loss: 0.44859, val loss: 0.44761, in 0.016s 1 tree, 105 leaves, max depth = 18, train loss: 0.44674, val loss: 0.44567, in 0.016s 1 tree, 162 leaves, max depth = 16, train loss: 0.44469, val loss: 0.44383, in 0.016s 1 tree, 162 leaves, max depth = 16, train loss: 0.44270, val loss: 0.44205, in 0.031s 1 tree, 105 leaves, max depth = 15, train loss: 0.44095, val loss: 0.44016, in 0.016s 1 tree, 163 leaves, max depth = 16, train loss: 0.43903, val loss: 0.43845, in 0.031s 1 tree, 105 leaves, max depth = 15, train loss: 0.43735, val loss: 0.43663, in 0.016s 1 tree, 105 leaves, max depth = 15, train loss: 0.43571, val loss: 0.43486, in 0.016s 1 tree, 104 leaves, max depth = 14, train loss: 0.43407, val loss: 0.43310, in 0.016s 1 tree, 162 leaves, max depth = 16, train loss: 0.43222, val loss: 0.43146, in 0.016s 1 tree, 162 leaves, max depth = 16, train loss: 0.43043, val loss: 0.42986, in 0.031s 1 tree, 162 leaves, max depth = 16, train loss: 0.42869, val loss: 0.42832, in 0.031s 1 tree, 163 leaves, max depth = 16, train loss: 0.42700, val loss: 0.42682, in 0.016s 1 tree, 104 leaves, max depth = 15, train loss: 0.42551, val loss: 0.42520, in 0.016s 1 tree, 104 leaves, max depth = 16, train loss: 0.42406, val loss: 0.42364, in 0.016s 1 tree, 103 leaves, max depth = 15, train loss: 0.42266, val loss: 0.42211, in 0.031s 1 tree, 102 leaves, max depth = 13, train loss: 0.42128, val loss: 0.42062, in 0.000s 1 tree, 5 leaves, max depth = 3, train loss: 0.42000, val loss: 0.41943, in 0.000s 1 tree, 104 leaves, max depth = 13, train loss: 0.41863, val loss: 0.41795, in 0.016s 1 tree, 104 leaves, max depth = 15, train loss: 0.41735, 
val loss: 0.41656, in 0.031s 1 tree, 5 leaves, max depth = 3, train loss: 0.41612, val loss: 0.41541, in 0.000s 1 tree, 104 leaves, max depth = 13, train loss: 0.41483, val loss: 0.41403, in 0.016s 1 tree, 162 leaves, max depth = 16, train loss: 0.41323, val loss: 0.41262, in 0.031s 1 tree, 105 leaves, max depth = 14, train loss: 0.41200, val loss: 0.41129, in 0.016s 1 tree, 162 leaves, max depth = 16, train loss: 0.41045, val loss: 0.40993, in 0.016s 1 tree, 162 leaves, max depth = 16, train loss: 0.40895, val loss: 0.40861, in 0.031s 1 tree, 5 leaves, max depth = 3, train loss: 0.40777, val loss: 0.40751, in 0.000s 1 tree, 163 leaves, max depth = 16, train loss: 0.40631, val loss: 0.40623, in 0.031s 1 tree, 5 leaves, max depth = 3, train loss: 0.40517, val loss: 0.40517, in 0.000s 1 tree, 104 leaves, max depth = 13, train loss: 0.40404, val loss: 0.40394, in 0.016s 1 tree, 163 leaves, max depth = 16, train loss: 0.40262, val loss: 0.40271, in 0.031s 1 tree, 103 leaves, max depth = 13, train loss: 0.40150, val loss: 0.40149, in 0.016s Fit 91 trees in 1.986 s, (10977 total leaves) Time spent computing histograms: 0.635s Time spent finding best splits: 0.181s Time spent applying splits: 0.218s Time spent predicting: 0.016s Trial 96, Fold 3: Log loss = 0.4005417862920448, Average precision = 0.9481568863042917, ROC-AUC = 0.9469454118952915, Elapsed Time = 1.9903591000002052 seconds Trial 96, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 96, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 Binning 0.040 GB of training data: 0.142 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 96 leaves, max depth = 16, train loss: 0.68486, val loss: 0.68446, in 0.016s 1 tree, 99 leaves, max depth = 19, train loss: 0.67674, val loss: 0.67586, in 0.000s 1 tree, 99 leaves, max depth = 19, train loss: 0.66890, val loss: 0.66754, in 0.031s 1 tree, 103 leaves, max depth = 19, train 
[verbose per-round fitting output for Trial 96 truncated; per-fold summaries retained below]

Trial 96, Fold 4: Fit 91 trees in 2.126 s (10942 total leaves)
Trial 96, Fold 4: Log loss = 0.40511379728362673, Average precision = 0.9467197122543686, ROC-AUC = 0.9438916339406427, Elapsed Time = 2.1346135000003414 seconds
Trial 96, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 96, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
Trial 96, Fold 5: Fit 91 trees in 1.955 s (10906 total leaves)
Trial 96, Fold 5: Log loss = 0.4107916499987752, Average precision = 0.9446465371422839, ROC-AUC = 0.941807182802891, Elapsed Time = 1.9732861000011326 seconds
Optimization Progress: 97%|#########7| 97/100 [20:16<00:42, 14.27s/it]
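Each fold also logs its class balance as "size = N where 0 = a, 1 = b, 0/1 = r". A minimal sketch of producing such a line with `collections.Counter` (the helper name is hypothetical; the counts below are copied from Trial 97, Fold 1):

```python
from collections import Counter
import numpy as np

def class_balance_line(label, y_subset):
    """Format a class-balance log line: '<label> size = N where 0 = a, 1 = b, 0/1 = r'."""
    counts = Counter(y_subset)
    ratio = counts[0] / counts[1]  # majority/minority ratio of the binary target
    return (f"{label} size = {len(y_subset)} where 0 = {counts[0]}, "
            f"1 = {counts[1]}, 0/1 = {ratio}")

# Counts taken from the Trial 97, Fold 1 training split logged above.
y_train = np.array([0] * 10533 + [1] * 10130)
print(class_balance_line("Train", y_train))
```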
Trial 97, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 97, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Trial 97, Fold 1: Fit 70 trees in 0.798 s (2569 total leaves)
Trial 97, Fold 1: Log loss = 0.3761059926294201, Average precision = 0.9467077987386714, ROC-AUC = 0.943012434698862, Elapsed Time = 0.7945130000007339 seconds
Trial 97, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 97, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Trial 97, Fold 2: Fit 70 trees in 0.845 s (2631 total leaves)
Trial 97, Fold 2: Log loss = 0.3764287364805963, Average precision = 0.9427200995296994, ROC-AUC = 0.9438946706481238, Elapsed Time = 0.8514697000009619 seconds
Trial 97, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 97, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Trial 97, Fold 3: Fit 70 trees in 0.907 s (2565 total leaves)
Trial 97, Fold 3: Log loss = 0.3713377352370851, Average precision = 0.9484391318896641, ROC-AUC = 0.9480855609114812, Elapsed Time = 0.9261010999998689 seconds
Trial 97, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 97, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665

[verbose per-round fitting output for Trial 97 truncated; only per-fold summaries retained]
leaves, max depth = 10, train loss: 0.49202, val loss: 0.48103, in 0.016s [25/70] 1 tree, 56 leaves, max depth = 12, train loss: 0.48720, val loss: 0.47631, in 0.016s 1 tree, 55 leaves, max depth = 12, train loss: 0.48264, val loss: 0.47182, in 0.000s 1 tree, 34 leaves, max depth = 14, train loss: 0.47832, val loss: 0.46705, in 0.016s 1 tree, 56 leaves, max depth = 12, train loss: 0.47405, val loss: 0.46286, in 0.000s 1 tree, 34 leaves, max depth = 14, train loss: 0.47002, val loss: 0.45841, in 0.016s 1 tree, 56 leaves, max depth = 12, train loss: 0.46601, val loss: 0.45449, in 0.016s 1 tree, 32 leaves, max depth = 11, train loss: 0.46227, val loss: 0.45036, in 0.001s 1 tree, 57 leaves, max depth = 13, train loss: 0.45851, val loss: 0.44669, in 0.016s 1 tree, 33 leaves, max depth = 10, train loss: 0.45492, val loss: 0.44272, in 0.000s 1 tree, 57 leaves, max depth = 13, train loss: 0.45139, val loss: 0.43928, in 0.016s 1 tree, 34 leaves, max depth = 14, train loss: 0.44812, val loss: 0.43562, in 0.000s 1 tree, 57 leaves, max depth = 11, train loss: 0.44480, val loss: 0.43238, in 0.016s 1 tree, 56 leaves, max depth = 11, train loss: 0.44166, val loss: 0.42931, in 0.016s 1 tree, 35 leaves, max depth = 16, train loss: 0.43863, val loss: 0.42590, in 0.000s 1 tree, 33 leaves, max depth = 10, train loss: 0.43569, val loss: 0.42262, in 0.016s 1 tree, 56 leaves, max depth = 12, train loss: 0.43274, val loss: 0.41976, in 0.000s 1 tree, 34 leaves, max depth = 13, train loss: 0.43006, val loss: 0.41672, in 0.016s 1 tree, 56 leaves, max depth = 12, train loss: 0.42727, val loss: 0.41402, in 0.000s 1 tree, 35 leaves, max depth = 16, train loss: 0.42476, val loss: 0.41115, in 0.016s 1 tree, 57 leaves, max depth = 12, train loss: 0.42213, val loss: 0.40858, in 0.016s 1 tree, 55 leaves, max depth = 12, train loss: 0.41963, val loss: 0.40614, in 0.000s 1 tree, 56 leaves, max depth = 12, train loss: 0.41726, val loss: 0.40383, in 0.016s 1 tree, 32 leaves, max depth = 12, train loss: 
0.41497, val loss: 0.40124, in 0.000s 1 tree, 58 leaves, max depth = 11, train loss: 0.41272, val loss: 0.39905, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.41071, val loss: 0.39692, in 0.000s 1 tree, 32 leaves, max depth = 12, train loss: 0.40855, val loss: 0.39446, in 0.016s 1 tree, 58 leaves, max depth = 11, train loss: 0.40643, val loss: 0.39239, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.40454, val loss: 0.39039, in 0.000s 1 tree, 58 leaves, max depth = 11, train loss: 0.40254, val loss: 0.38845, in 0.016s 1 tree, 33 leaves, max depth = 12, train loss: 0.40054, val loss: 0.38617, in 0.000s 1 tree, 58 leaves, max depth = 11, train loss: 0.39864, val loss: 0.38432, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.39687, val loss: 0.38244, in 0.000s 1 tree, 33 leaves, max depth = 12, train loss: 0.39500, val loss: 0.38031, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.39333, val loss: 0.37854, in 0.000s 1 tree, 2 leaves, max depth = 1, train loss: 0.39175, val loss: 0.37686, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.39025, val loss: 0.37526, in 0.000s 1 tree, 2 leaves, max depth = 1, train loss: 0.38883, val loss: 0.37375, in 0.016s 1 tree, 56 leaves, max depth = 12, train loss: 0.38701, val loss: 0.37202, in 0.000s 1 tree, 56 leaves, max depth = 13, train loss: 0.38529, val loss: 0.37035, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.38393, val loss: 0.36890, in 0.000s 1 tree, 33 leaves, max depth = 12, train loss: 0.38223, val loss: 0.36697, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.38094, val loss: 0.36560, in 0.000s 1 tree, 56 leaves, max depth = 12, train loss: 0.37929, val loss: 0.36404, in 0.016s 1 tree, 33 leaves, max depth = 12, train loss: 0.37769, val loss: 0.36222, in 0.000s 1 tree, 57 leaves, max depth = 12, train loss: 0.37612, val loss: 0.36073, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.37489, val loss: 0.35942, in 0.000s Fit 70 trees in 0.908 s, 
(2623 total leaves) Time spent computing histograms: 0.323s Time spent finding best splits: 0.046s Time spent applying splits: 0.050s Time spent predicting: 0.000s Trial 97, Fold 4: Log loss = 0.37434479667596815, Average precision = 0.9483072080684369, ROC-AUC = 0.9461314143603464, Elapsed Time = 0.9189717000008386 seconds Trial 97, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 97, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.173 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 34 leaves, max depth = 10, train loss: 0.67829, val loss: 0.67728, in 0.016s 1 tree, 31 leaves, max depth = 12, train loss: 0.66450, val loss: 0.66250, in 0.000s 1 tree, 32 leaves, max depth = 11, train loss: 0.65143, val loss: 0.64849, in 0.016s 1 tree, 34 leaves, max depth = 10, train loss: 0.63924, val loss: 0.63539, in 0.000s 1 tree, 32 leaves, max depth = 11, train loss: 0.62767, val loss: 0.62294, in 0.016s 1 tree, 33 leaves, max depth = 12, train loss: 0.61688, val loss: 0.61127, in 0.000s 1 tree, 32 leaves, max depth = 11, train loss: 0.60661, val loss: 0.60017, in 0.016s 1 tree, 34 leaves, max depth = 12, train loss: 0.59700, val loss: 0.58983, in 0.016s 1 tree, 32 leaves, max depth = 11, train loss: 0.58786, val loss: 0.57990, in 0.000s 1 tree, 34 leaves, max depth = 12, train loss: 0.57930, val loss: 0.57065, in 0.016s 1 tree, 34 leaves, max depth = 12, train loss: 0.57122, val loss: 0.56189, in 0.000s 1 tree, 32 leaves, max depth = 11, train loss: 0.56351, val loss: 0.55345, in 0.016s 1 tree, 33 leaves, max depth = 12, train loss: 0.55629, val loss: 0.54552, in 0.000s 1 tree, 32 leaves, max depth = 11, train loss: 0.54939, val loss: 0.53793, in 0.016s 1 tree, 34 leaves, max depth = 12, train loss: 0.54292, val loss: 0.53086, in 0.000s 1 tree, 54 leaves, max depth = 13, train loss: 0.53594, val loss: 0.52418, in 0.016s 1 tree, 34 leaves, 
max depth = 11, train loss: 0.52998, val loss: 0.51764, in 0.000s 1 tree, 54 leaves, max depth = 13, train loss: 0.52350, val loss: 0.51145, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.51799, val loss: 0.50537, in 0.016s 1 tree, 53 leaves, max depth = 13, train loss: 0.51196, val loss: 0.49963, in 0.000s 1 tree, 53 leaves, max depth = 13, train loss: 0.50625, val loss: 0.49420, in 0.016s 1 tree, 53 leaves, max depth = 13, train loss: 0.50085, val loss: 0.48907, in 0.016s 1 tree, 57 leaves, max depth = 13, train loss: 0.49572, val loss: 0.48422, in 0.000s 1 tree, 34 leaves, max depth = 12, train loss: 0.49095, val loss: 0.47892, in 0.016s 1 tree, 55 leaves, max depth = 13, train loss: 0.48616, val loss: 0.47440, in 0.000s 1 tree, 36 leaves, max depth = 10, train loss: 0.48171, val loss: 0.46944, in 0.016s 1 tree, 55 leaves, max depth = 13, train loss: 0.47723, val loss: 0.46524, in 0.016s 1 tree, 36 leaves, max depth = 10, train loss: 0.47308, val loss: 0.46061, in 0.000s 1 tree, 56 leaves, max depth = 14, train loss: 0.46888, val loss: 0.45668, in 0.016s 1 tree, 36 leaves, max depth = 10, train loss: 0.46502, val loss: 0.45236, in 0.000s 1 tree, 35 leaves, max depth = 13, train loss: 0.46137, val loss: 0.44826, in 0.016s 1 tree, 55 leaves, max depth = 13, train loss: 0.45747, val loss: 0.44463, in 0.000s 1 tree, 36 leaves, max depth = 10, train loss: 0.45405, val loss: 0.44078, in 0.016s 1 tree, 53 leaves, max depth = 13, train loss: 0.45038, val loss: 0.43737, in 0.016s 1 tree, 53 leaves, max depth = 13, train loss: 0.44690, val loss: 0.43413, in 0.000s 1 tree, 53 leaves, max depth = 13, train loss: 0.44359, val loss: 0.43107, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.44051, val loss: 0.42759, in 0.000s 1 tree, 54 leaves, max depth = 13, train loss: 0.43740, val loss: 0.42472, in 0.000s 1 tree, 34 leaves, max depth = 9, train loss: 0.43449, val loss: 0.42142, in 0.016s 1 tree, 53 leaves, max depth = 13, train loss: 0.43155, val 
loss: 0.41873, in 0.000s 1 tree, 33 leaves, max depth = 9, train loss: 0.42883, val loss: 0.41563, in 0.016s 1 tree, 54 leaves, max depth = 13, train loss: 0.42606, val loss: 0.41310, in 0.016s 1 tree, 33 leaves, max depth = 9, train loss: 0.42351, val loss: 0.41018, in 0.000s 1 tree, 53 leaves, max depth = 14, train loss: 0.42088, val loss: 0.40780, in 0.016s 1 tree, 55 leaves, max depth = 14, train loss: 0.41838, val loss: 0.40554, in 0.000s 1 tree, 36 leaves, max depth = 9, train loss: 0.41601, val loss: 0.40280, in 0.016s 1 tree, 54 leaves, max depth = 14, train loss: 0.41365, val loss: 0.40068, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.41144, val loss: 0.39814, in 0.000s 1 tree, 3 leaves, max depth = 2, train loss: 0.40935, val loss: 0.39611, in 0.016s 1 tree, 54 leaves, max depth = 12, train loss: 0.40712, val loss: 0.39412, in 0.000s 1 tree, 55 leaves, max depth = 12, train loss: 0.40499, val loss: 0.39223, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.40302, val loss: 0.39033, in 0.000s 1 tree, 36 leaves, max depth = 12, train loss: 0.40100, val loss: 0.38799, in 0.016s 1 tree, 54 leaves, max depth = 13, train loss: 0.39900, val loss: 0.38622, in 0.000s 1 tree, 36 leaves, max depth = 9, train loss: 0.39707, val loss: 0.38397, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.39523, val loss: 0.38220, in 0.016s 1 tree, 55 leaves, max depth = 12, train loss: 0.39333, val loss: 0.38053, in 0.000s 1 tree, 3 leaves, max depth = 2, train loss: 0.39159, val loss: 0.37885, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.38994, val loss: 0.37727, in 0.000s 1 tree, 2 leaves, max depth = 1, train loss: 0.38844, val loss: 0.37578, in 0.000s 1 tree, 34 leaves, max depth = 11, train loss: 0.38661, val loss: 0.37367, in 0.016s 1 tree, 55 leaves, max depth = 14, train loss: 0.38480, val loss: 0.37210, in 0.016s 1 tree, 56 leaves, max depth = 13, train loss: 0.38308, val loss: 0.37061, in 0.000s 1 tree, 2 leaves, max depth = 1, 
train loss: 0.38166, val loss: 0.36920, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.38032, val loss: 0.36787, in 0.000s 1 tree, 33 leaves, max depth = 11, train loss: 0.37866, val loss: 0.36598, in 0.016s 1 tree, 54 leaves, max depth = 13, train loss: 0.37702, val loss: 0.36457, in 0.000s 1 tree, 2 leaves, max depth = 1, train loss: 0.37574, val loss: 0.36331, in 0.016s 1 tree, 33 leaves, max depth = 10, train loss: 0.37414, val loss: 0.36145, in 0.000s 1 tree, 55 leaves, max depth = 13, train loss: 0.37257, val loss: 0.36011, in 0.016s Fit 70 trees in 0.923 s, (2641 total leaves) Time spent computing histograms: 0.330s Time spent finding best splits: 0.047s Time spent applying splits: 0.051s Time spent predicting: 0.000s Trial 97, Fold 5: Log loss = 0.37853788756599877, Average precision = 0.9464296262456326, ROC-AUC = 0.9434605280270516, Elapsed Time = 0.9309966000000713 seconds
Optimization Progress: 98%|#########8| 98/100 [20:27<00:26, 13.34s/it]
Trial 98, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 98, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[verbose per-tree fitting log truncated]
Fit 95 trees in 1.064 s (1953 total leaves)
Trial 98, Fold 1: Log loss = 0.34746178444472975, Average precision = 0.9454165208062872, ROC-AUC = 0.9436222057326394, Elapsed Time = 1.0571799000008468 seconds
Trial 98, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 98, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[verbose per-tree fitting log truncated]
Fit 95 trees in 1.111 s (1972 total leaves)
Trial 98, Fold 2: Log loss = 0.34651320081061987, Average precision = 0.9441432964982537, ROC-AUC = 0.946327187649187, Elapsed Time = 1.1205337000010331 seconds
Trial 98, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 98, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[verbose per-tree fitting log truncated]
depth = 2, train loss: 0.36437, val loss: 0.36384, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.36307, val loss: 0.36242, in 0.000s 1 tree, 25 leaves, max depth = 7, train loss: 0.36184, val loss: 0.36105, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.36082, val loss: 0.36010, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.35963, val loss: 0.35882, in 0.000s 1 tree, 25 leaves, max depth = 8, train loss: 0.35813, val loss: 0.35754, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.35701, val loss: 0.35632, in 0.000s 1 tree, 25 leaves, max depth = 8, train loss: 0.35558, val loss: 0.35511, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.35422, val loss: 0.35396, in 0.000s 1 tree, 2 leaves, max depth = 1, train loss: 0.35322, val loss: 0.35304, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.35192, val loss: 0.35195, in 0.000s 1 tree, 3 leaves, max depth = 2, train loss: 0.35097, val loss: 0.35106, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.35006, val loss: 0.35021, in 0.000s 1 tree, 25 leaves, max depth = 8, train loss: 0.34901, val loss: 0.34905, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.34800, val loss: 0.34783, in 0.000s 1 tree, 25 leaves, max depth = 8, train loss: 0.34674, val loss: 0.34679, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.34586, val loss: 0.34597, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.34465, val loss: 0.34497, in 0.000s 1 tree, 2 leaves, max depth = 1, train loss: 0.34381, val loss: 0.34419, in 0.016s Fit 95 trees in 1.064 s, (1953 total leaves) Time spent computing histograms: 0.420s Time spent finding best splits: 0.053s Time spent applying splits: 0.044s Time spent predicting: 0.000s Trial 98, Fold 3: Log loss = 0.340985015897022, Average precision = 0.9499141215918279, ROC-AUC = 0.9504775939989021, Elapsed Time = 1.0630522000010387 seconds Trial 98, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 
1.0296747568045592 Trial 98, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 Binning 0.040 GB of training data: 0.189 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 25 leaves, max depth = 9, train loss: 0.67834, val loss: 0.67748, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.66456, val loss: 0.66287, in 0.016s 1 tree, 25 leaves, max depth = 13, train loss: 0.65162, val loss: 0.64913, in 0.000s 1 tree, 25 leaves, max depth = 12, train loss: 0.63954, val loss: 0.63634, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.62810, val loss: 0.62414, in 0.000s 1 tree, 25 leaves, max depth = 12, train loss: 0.61733, val loss: 0.61263, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.60723, val loss: 0.60192, in 0.016s 1 tree, 25 leaves, max depth = 11, train loss: 0.59766, val loss: 0.59164, in 0.000s 1 tree, 25 leaves, max depth = 11, train loss: 0.58863, val loss: 0.58194, in 0.016s 1 tree, 25 leaves, max depth = 15, train loss: 0.58015, val loss: 0.57280, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.57214, val loss: 0.56422, in 0.016s 1 tree, 25 leaves, max depth = 13, train loss: 0.56452, val loss: 0.55598, in 0.016s 1 tree, 25 leaves, max depth = 15, train loss: 0.55737, val loss: 0.54822, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.55056, val loss: 0.54081, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.54415, val loss: 0.53389, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.53706, val loss: 0.52691, in 0.000s 1 tree, 25 leaves, max depth = 13, train loss: 0.53112, val loss: 0.52042, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.52453, val loss: 0.51395, in 0.016s 1 tree, 25 leaves, max depth = 15, train loss: 0.51909, val loss: 0.50797, in 0.000s 1 tree, 25 leaves, max depth = 7, train loss: 0.51296, val loss: 0.50194, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.50716, val loss: 0.49623, in 0.016s 
1 tree, 25 leaves, max depth = 7, train loss: 0.50167, val loss: 0.49081, in 0.000s 1 tree, 25 leaves, max depth = 7, train loss: 0.49646, val loss: 0.48569, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.49176, val loss: 0.48056, in 0.016s 1 tree, 25 leaves, max depth = 7, train loss: 0.48689, val loss: 0.47577, in 0.000s 1 tree, 25 leaves, max depth = 7, train loss: 0.48228, val loss: 0.47124, in 0.016s 1 tree, 25 leaves, max depth = 14, train loss: 0.47798, val loss: 0.46649, in 0.000s 1 tree, 25 leaves, max depth = 14, train loss: 0.47391, val loss: 0.46198, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.46963, val loss: 0.45778, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.46558, val loss: 0.45380, in 0.000s 1 tree, 25 leaves, max depth = 10, train loss: 0.46185, val loss: 0.44970, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.45805, val loss: 0.44595, in 0.000s 1 tree, 25 leaves, max depth = 8, train loss: 0.45447, val loss: 0.44199, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.45090, val loss: 0.43848, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.44750, val loss: 0.43514, in 0.000s 1 tree, 25 leaves, max depth = 11, train loss: 0.44430, val loss: 0.43155, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.44110, val loss: 0.42845, in 0.016s 1 tree, 25 leaves, max depth = 15, train loss: 0.43811, val loss: 0.42506, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.43518, val loss: 0.42180, in 0.000s 1 tree, 25 leaves, max depth = 8, train loss: 0.43220, val loss: 0.41890, in 0.016s 1 tree, 25 leaves, max depth = 12, train loss: 0.42954, val loss: 0.41589, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.42672, val loss: 0.41315, in 0.000s 1 tree, 25 leaves, max depth = 12, train loss: 0.42422, val loss: 0.41032, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.42155, val loss: 0.40772, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.41902, val 
loss: 0.40526, in 0.000s 1 tree, 25 leaves, max depth = 8, train loss: 0.41661, val loss: 0.40292, in 0.016s 1 tree, 25 leaves, max depth = 12, train loss: 0.41434, val loss: 0.40036, in 0.016s 1 tree, 25 leaves, max depth = 7, train loss: 0.41206, val loss: 0.39812, in 0.000s 1 tree, 3 leaves, max depth = 2, train loss: 0.40996, val loss: 0.39590, in 0.016s 1 tree, 25 leaves, max depth = 7, train loss: 0.40779, val loss: 0.39377, in 0.000s 1 tree, 25 leaves, max depth = 12, train loss: 0.40567, val loss: 0.39134, in 0.016s 1 tree, 25 leaves, max depth = 7, train loss: 0.40361, val loss: 0.38933, in 0.016s 1 tree, 25 leaves, max depth = 12, train loss: 0.40163, val loss: 0.38705, in 0.000s 1 tree, 25 leaves, max depth = 10, train loss: 0.39973, val loss: 0.38485, in 0.016s 1 tree, 25 leaves, max depth = 7, train loss: 0.39779, val loss: 0.38295, in 0.000s 1 tree, 3 leaves, max depth = 2, train loss: 0.39586, val loss: 0.38091, in 0.000s 1 tree, 25 leaves, max depth = 8, train loss: 0.39401, val loss: 0.37911, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.39219, val loss: 0.37718, in 0.000s 1 tree, 3 leaves, max depth = 2, train loss: 0.39047, val loss: 0.37535, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.38890, val loss: 0.37368, in 0.000s 1 tree, 2 leaves, max depth = 1, train loss: 0.38741, val loss: 0.37209, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.38564, val loss: 0.37044, in 0.000s 1 tree, 25 leaves, max depth = 8, train loss: 0.38396, val loss: 0.36885, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.38254, val loss: 0.36734, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.38119, val loss: 0.36591, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.37948, val loss: 0.36399, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.37820, val loss: 0.36263, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.37662, val loss: 0.36083, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 
0.37501, val loss: 0.35933, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.37379, val loss: 0.35804, in 0.000s 1 tree, 25 leaves, max depth = 11, train loss: 0.37227, val loss: 0.35630, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.37073, val loss: 0.35488, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.36957, val loss: 0.35364, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.36817, val loss: 0.35204, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.36707, val loss: 0.35087, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.36574, val loss: 0.34934, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.36445, val loss: 0.34786, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.36321, val loss: 0.34646, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.36204, val loss: 0.34514, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.36100, val loss: 0.34402, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.35990, val loss: 0.34275, in 0.000s 1 tree, 25 leaves, max depth = 8, train loss: 0.35839, val loss: 0.34137, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.35735, val loss: 0.34016, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.35591, val loss: 0.33886, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.35454, val loss: 0.33760, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.35353, val loss: 0.33652, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.35222, val loss: 0.33533, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.35125, val loss: 0.33429, in 0.000s 1 tree, 2 leaves, max depth = 1, train loss: 0.35033, val loss: 0.33330, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.34907, val loss: 0.33215, in 0.000s 1 tree, 25 leaves, max depth = 11, train loss: 0.34804, val loss: 0.33108, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.34683, val loss: 0.32998, in 0.016s 1 tree, 2 leaves, max depth = 1, 
train loss: 0.34594, val loss: 0.32902, in 0.000s 1 tree, 25 leaves, max depth = 11, train loss: 0.34496, val loss: 0.32800, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.34411, val loss: 0.32709, in 0.000s Fit 95 trees in 1.236 s, (1965 total leaves) Time spent computing histograms: 0.507s Time spent finding best splits: 0.066s Time spent applying splits: 0.055s Time spent predicting: 0.031s Trial 98, Fold 4: Log loss = 0.34290472175752834, Average precision = 0.9498242747729755, ROC-AUC = 0.9485274481331263, Elapsed Time = 1.2537620999992214 seconds Trial 98, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 98, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.174 s 0.016 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 25 leaves, max depth = 9, train loss: 0.67829, val loss: 0.67729, in 0.000s 1 tree, 25 leaves, max depth = 11, train loss: 0.66451, val loss: 0.66249, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.65143, val loss: 0.64848, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.63925, val loss: 0.63538, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.62768, val loss: 0.62294, in 0.016s 1 tree, 25 leaves, max depth = 12, train loss: 0.61688, val loss: 0.61135, in 0.000s 1 tree, 25 leaves, max depth = 12, train loss: 0.60671, val loss: 0.60033, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.59701, val loss: 0.58983, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.58787, val loss: 0.57991, in 0.016s 1 tree, 25 leaves, max depth = 12, train loss: 0.57932, val loss: 0.57066, in 0.016s 1 tree, 25 leaves, max depth = 12, train loss: 0.57124, val loss: 0.56191, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.56353, val loss: 0.55348, in 0.016s 1 tree, 25 leaves, max depth = 12, train loss: 0.55631, val loss: 0.54555, in 0.016s 1 tree, 25 leaves, max depth = 9, 
train loss: 0.54941, val loss: 0.53796, in 0.000s 1 tree, 25 leaves, max depth = 12, train loss: 0.54295, val loss: 0.53088, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.53595, val loss: 0.52417, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.52999, val loss: 0.51764, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.52349, val loss: 0.51143, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.51799, val loss: 0.50538, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.51193, val loss: 0.49961, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.50620, val loss: 0.49417, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.50077, val loss: 0.48900, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.49563, val loss: 0.48411, in 0.016s 1 tree, 25 leaves, max depth = 11, train loss: 0.49086, val loss: 0.47883, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.48605, val loss: 0.47427, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.48161, val loss: 0.46933, in 0.000s 1 tree, 25 leaves, max depth = 10, train loss: 0.47710, val loss: 0.46509, in 0.024s 1 tree, 25 leaves, max depth = 10, train loss: 0.47297, val loss: 0.46047, in 0.008s 1 tree, 25 leaves, max depth = 10, train loss: 0.46875, val loss: 0.45648, in 0.000s 1 tree, 25 leaves, max depth = 10, train loss: 0.46490, val loss: 0.45216, in 0.016s 1 tree, 25 leaves, max depth = 13, train loss: 0.46125, val loss: 0.44807, in 0.000s 1 tree, 25 leaves, max depth = 10, train loss: 0.45732, val loss: 0.44438, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.45386, val loss: 0.44045, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.45017, val loss: 0.43700, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.44666, val loss: 0.43372, in 0.000s 1 tree, 25 leaves, max depth = 10, train loss: 0.44334, val loss: 0.43062, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.44027, val loss: 0.42714, in 0.016s 1 
tree, 25 leaves, max depth = 9, train loss: 0.43713, val loss: 0.42424, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.43424, val loss: 0.42095, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.43127, val loss: 0.41823, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.42856, val loss: 0.41514, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.42576, val loss: 0.41257, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.42322, val loss: 0.40966, in 0.000s 1 tree, 25 leaves, max depth = 10, train loss: 0.42057, val loss: 0.40726, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.41805, val loss: 0.40495, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.41569, val loss: 0.40223, in 0.000s 1 tree, 25 leaves, max depth = 10, train loss: 0.41331, val loss: 0.40008, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.41104, val loss: 0.39804, in 0.000s 1 tree, 3 leaves, max depth = 2, train loss: 0.40894, val loss: 0.39601, in 0.016s 1 tree, 25 leaves, max depth = 13, train loss: 0.40678, val loss: 0.39352, in 0.000s 1 tree, 25 leaves, max depth = 10, train loss: 0.40463, val loss: 0.39160, in 0.000s 1 tree, 25 leaves, max depth = 11, train loss: 0.40260, val loss: 0.38926, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.40057, val loss: 0.38744, in 0.000s 1 tree, 25 leaves, max depth = 10, train loss: 0.39863, val loss: 0.38518, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.39669, val loss: 0.38346, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.39477, val loss: 0.38160, in 0.000s 1 tree, 25 leaves, max depth = 10, train loss: 0.39293, val loss: 0.37998, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.39110, val loss: 0.37822, in 0.016s 1 tree, 3 leaves, max depth = 2, train loss: 0.38937, val loss: 0.37655, in 0.000s 1 tree, 2 leaves, max depth = 1, train loss: 0.38781, val loss: 0.37500, in 0.000s 1 tree, 2 leaves, max depth = 1, train loss: 0.38632, val 
loss: 0.37352, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.38457, val loss: 0.37199, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.38282, val loss: 0.36995, in 0.000s 1 tree, 2 leaves, max depth = 1, train loss: 0.38141, val loss: 0.36855, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.37974, val loss: 0.36711, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.37840, val loss: 0.36578, in 0.000s 1 tree, 25 leaves, max depth = 8, train loss: 0.37673, val loss: 0.36384, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.37546, val loss: 0.36259, in 0.000s 1 tree, 25 leaves, max depth = 10, train loss: 0.37386, val loss: 0.36121, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.37228, val loss: 0.35938, in 0.000s 1 tree, 25 leaves, max depth = 10, train loss: 0.37075, val loss: 0.35808, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.36926, val loss: 0.35631, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.36805, val loss: 0.35512, in 0.000s 1 tree, 25 leaves, max depth = 10, train loss: 0.36659, val loss: 0.35388, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.36544, val loss: 0.35274, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.36406, val loss: 0.35111, in 0.000s 1 tree, 2 leaves, max depth = 1, train loss: 0.36297, val loss: 0.35003, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.36163, val loss: 0.34845, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.36036, val loss: 0.34696, in 0.000s 1 tree, 2 leaves, max depth = 1, train loss: 0.35932, val loss: 0.34593, in 0.016s 1 tree, 25 leaves, max depth = 7, train loss: 0.35811, val loss: 0.34451, in 0.000s 1 tree, 25 leaves, max depth = 10, train loss: 0.35669, val loss: 0.34332, in 0.016s 1 tree, 25 leaves, max depth = 8, train loss: 0.35554, val loss: 0.34197, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.35419, val loss: 0.34083, in 0.000s 1 tree, 2 leaves, max depth = 1, train 
loss: 0.35319, val loss: 0.33984, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.35224, val loss: 0.33890, in 0.000s 1 tree, 25 leaves, max depth = 10, train loss: 0.35093, val loss: 0.33781, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.34985, val loss: 0.33665, in 0.000s 1 tree, 2 leaves, max depth = 1, train loss: 0.34894, val loss: 0.33575, in 0.016s 1 tree, 25 leaves, max depth = 10, train loss: 0.34768, val loss: 0.33471, in 0.016s 1 tree, 25 leaves, max depth = 9, train loss: 0.34660, val loss: 0.33352, in 0.000s 1 tree, 25 leaves, max depth = 10, train loss: 0.34540, val loss: 0.33252, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.34452, val loss: 0.33165, in 0.000s 1 tree, 25 leaves, max depth = 9, train loss: 0.34348, val loss: 0.33052, in 0.016s 1 tree, 2 leaves, max depth = 1, train loss: 0.34264, val loss: 0.32969, in 0.016s Fit 95 trees in 1.253 s, (1965 total leaves) Time spent computing histograms: 0.505s Time spent finding best splits: 0.065s Time spent applying splits: 0.054s Time spent predicting: 0.016s Trial 98, Fold 5: Log loss = 0.34892619304748446, Average precision = 0.9478084946031027, ROC-AUC = 0.9457619233413225, Elapsed Time = 1.2518749000009848 seconds
Optimization Progress: 99%|#########9| 99/100 [20:40<00:13, 13.36s/it]
Trial 99, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 99, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[Per-round verbose output omitted; fold summaries retained below.]
Trial 99, Fold 1: Fit 48 trees in 1.049 s (1707 total leaves)
Trial 99, Fold 1: Log loss = 0.27620585109547047, Average precision = 0.9619353206628051, ROC-AUC = 0.9561175456331283, Elapsed Time = 1.0621444000007614 seconds
Trial 99, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 99, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Trial 99, Fold 2: Fit 48 trees in 1.111 s (1585 total leaves)
Trial 99, Fold 2: Log loss = 0.27319669112073436, Average precision = 0.9622910927957704, ROC-AUC = 0.9599752317264558, Elapsed Time = 1.1303246000006766 seconds
Trial 99, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 99, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Trial 99, Fold 3: Fit 48 trees in 1.235 s (1805 total leaves)
Trial 99, Fold 3: Log loss =
0.26933763869940724, Average precision = 0.9635613369023321, ROC-AUC = 0.9604339774489314, Elapsed Time = 1.2420631000004505 seconds Trial 99, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 99, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 Binning 0.040 GB of training data: 0.158 s 0.000 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 35 leaves, max depth = 10, train loss: 0.66352, val loss: 0.66248, in 0.016s 1 tree, 33 leaves, max depth = 10, train loss: 0.63681, val loss: 0.63479, in 0.016s 1 tree, 34 leaves, max depth = 11, train loss: 0.61259, val loss: 0.60969, in 0.016s 1 tree, 29 leaves, max depth = 9, train loss: 0.59065, val loss: 0.58691, in 0.016s 1 tree, 35 leaves, max depth = 11, train loss: 0.57057, val loss: 0.56605, in 0.016s 1 tree, 34 leaves, max depth = 11, train loss: 0.55223, val loss: 0.54695, in 0.016s 1 tree, 33 leaves, max depth = 11, train loss: 0.53543, val loss: 0.52944, in 0.016s 1 tree, 40 leaves, max depth = 10, train loss: 0.52003, val loss: 0.51334, in 0.016s 1 tree, 39 leaves, max depth = 10, train loss: 0.50562, val loss: 0.49832, in 0.016s 1 tree, 39 leaves, max depth = 10, train loss: 0.49231, val loss: 0.48444, in 0.016s 1 tree, 35 leaves, max depth = 10, train loss: 0.48001, val loss: 0.47152, in 0.000s 1 tree, 37 leaves, max depth = 10, train loss: 0.46864, val loss: 0.45958, in 0.031s 1 tree, 39 leaves, max depth = 10, train loss: 0.45812, val loss: 0.44857, in 0.016s 1 tree, 38 leaves, max depth = 10, train loss: 0.44835, val loss: 0.43830, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.43928, val loss: 0.42877, in 0.016s 1 tree, 26 leaves, max depth = 9, train loss: 0.42646, val loss: 0.41578, in 0.016s 1 tree, 57 leaves, max depth = 10, train loss: 0.41848, val loss: 0.40742, in 0.016s 1 tree, 33 leaves, max depth = 9, train loss: 0.40717, val loss: 0.39598, in 0.016s 1 tree, 54 leaves, max depth = 11, train 
loss: 0.40012, val loss: 0.38864, in 0.016s 1 tree, 34 leaves, max depth = 11, train loss: 0.39012, val loss: 0.37853, in 0.016s 1 tree, 58 leaves, max depth = 11, train loss: 0.38390, val loss: 0.37199, in 0.031s 1 tree, 57 leaves, max depth = 11, train loss: 0.37806, val loss: 0.36593, in 0.016s 1 tree, 39 leaves, max depth = 12, train loss: 0.36948, val loss: 0.35731, in 0.016s 1 tree, 43 leaves, max depth = 12, train loss: 0.36160, val loss: 0.34940, in 0.016s 1 tree, 38 leaves, max depth = 9, train loss: 0.35425, val loss: 0.34196, in 0.016s 1 tree, 55 leaves, max depth = 11, train loss: 0.34943, val loss: 0.33692, in 0.016s 1 tree, 13 leaves, max depth = 7, train loss: 0.34325, val loss: 0.33051, in 0.016s 1 tree, 13 leaves, max depth = 7, train loss: 0.33759, val loss: 0.32461, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.33274, val loss: 0.31967, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.32822, val loss: 0.31506, in 0.031s 1 tree, 33 leaves, max depth = 9, train loss: 0.32313, val loss: 0.31023, in 0.016s 1 tree, 16 leaves, max depth = 7, train loss: 0.31847, val loss: 0.30537, in 0.016s 1 tree, 36 leaves, max depth = 9, train loss: 0.31395, val loss: 0.30116, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.30988, val loss: 0.29691, in 0.000s 1 tree, 62 leaves, max depth = 13, train loss: 0.30588, val loss: 0.29271, in 0.031s 1 tree, 36 leaves, max depth = 9, train loss: 0.30203, val loss: 0.28913, in 0.016s 1 tree, 17 leaves, max depth = 7, train loss: 0.29856, val loss: 0.28552, in 0.016s 1 tree, 63 leaves, max depth = 13, train loss: 0.29500, val loss: 0.28177, in 0.016s 1 tree, 41 leaves, max depth = 10, train loss: 0.29170, val loss: 0.27874, in 0.016s 1 tree, 18 leaves, max depth = 7, train loss: 0.28875, val loss: 0.27563, in 0.016s 1 tree, 46 leaves, max depth = 12, train loss: 0.28572, val loss: 0.27255, in 0.016s 1 tree, 40 leaves, max depth = 10, train loss: 0.28288, val loss: 0.26995, in 0.016s 1 tree, 39 
leaves, max depth = 9, train loss: 0.28025, val loss: 0.26751, in 0.016s 1 tree, 18 leaves, max depth = 7, train loss: 0.27783, val loss: 0.26495, in 0.016s 1 tree, 40 leaves, max depth = 9, train loss: 0.27549, val loss: 0.26281, in 0.016s 1 tree, 24 leaves, max depth = 8, train loss: 0.27335, val loss: 0.26051, in 0.016s 1 tree, 61 leaves, max depth = 12, train loss: 0.27065, val loss: 0.25761, in 0.016s 1 tree, 37 leaves, max depth = 9, train loss: 0.26862, val loss: 0.25578, in 0.016s Fit 48 trees in 1.143 s, (1787 total leaves) Time spent computing histograms: 0.334s Time spent finding best splits: 0.057s Time spent applying splits: 0.038s Time spent predicting: 0.000s Trial 99, Fold 4: Log loss = 0.26912938002939735, Average precision = 0.9637844512354639, ROC-AUC = 0.9605351867350514, Elapsed Time = 1.1649399999987509 seconds Trial 99, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 99, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 Binning 0.040 GB of training data: 0.157 s 0.016 s 0.004 GB of validation data: Fitting gradient boosted rounds: 1 tree, 28 leaves, max depth = 9, train loss: 0.66315, val loss: 0.66194, in 0.016s 1 tree, 30 leaves, max depth = 9, train loss: 0.63607, val loss: 0.63374, in 0.016s 1 tree, 31 leaves, max depth = 9, train loss: 0.61158, val loss: 0.60823, in 0.016s 1 tree, 32 leaves, max depth = 10, train loss: 0.58934, val loss: 0.58506, in 0.016s 1 tree, 36 leaves, max depth = 10, train loss: 0.56905, val loss: 0.56390, in 0.016s 1 tree, 37 leaves, max depth = 10, train loss: 0.55051, val loss: 0.54460, in 0.016s 1 tree, 40 leaves, max depth = 10, train loss: 0.53347, val loss: 0.52680, in 0.016s 1 tree, 46 leaves, max depth = 11, train loss: 0.51779, val loss: 0.51051, in 0.016s 1 tree, 49 leaves, max depth = 11, train loss: 0.50336, val loss: 0.49553, in 0.016s 1 tree, 45 leaves, max depth = 10, train loss: 0.49010, val loss: 0.48174, in 0.016s 1 tree, 50 
leaves, max depth = 11, train loss: 0.47781, val loss: 0.46896, in 0.016s 1 tree, 29 leaves, max depth = 9, train loss: 0.46695, val loss: 0.45763, in 0.016s 1 tree, 48 leaves, max depth = 10, train loss: 0.45638, val loss: 0.44664, in 0.016s 1 tree, 46 leaves, max depth = 11, train loss: 0.44656, val loss: 0.43644, in 0.016s 1 tree, 53 leaves, max depth = 10, train loss: 0.43745, val loss: 0.42701, in 0.016s 1 tree, 27 leaves, max depth = 9, train loss: 0.42455, val loss: 0.41420, in 0.016s 1 tree, 52 leaves, max depth = 11, train loss: 0.41654, val loss: 0.40594, in 0.016s 1 tree, 28 leaves, max depth = 8, train loss: 0.40517, val loss: 0.39463, in 0.016s 1 tree, 60 leaves, max depth = 12, train loss: 0.39826, val loss: 0.38739, in 0.016s 1 tree, 33 leaves, max depth = 12, train loss: 0.38820, val loss: 0.37740, in 0.016s 1 tree, 58 leaves, max depth = 13, train loss: 0.38194, val loss: 0.37098, in 0.016s 1 tree, 46 leaves, max depth = 10, train loss: 0.37605, val loss: 0.36496, in 0.016s 1 tree, 36 leaves, max depth = 13, train loss: 0.36738, val loss: 0.35645, in 0.016s 1 tree, 32 leaves, max depth = 8, train loss: 0.35942, val loss: 0.34862, in 0.016s 1 tree, 60 leaves, max depth = 12, train loss: 0.35440, val loss: 0.34353, in 0.031s 1 tree, 10 leaves, max depth = 4, train loss: 0.34762, val loss: 0.33662, in 0.000s 1 tree, 10 leaves, max depth = 4, train loss: 0.34141, val loss: 0.33028, in 0.016s 1 tree, 47 leaves, max depth = 11, train loss: 0.33624, val loss: 0.32518, in 0.016s 1 tree, 50 leaves, max depth = 13, train loss: 0.33030, val loss: 0.31946, in 0.016s 1 tree, 39 leaves, max depth = 12, train loss: 0.32498, val loss: 0.31465, in 0.031s 1 tree, 9 leaves, max depth = 4, train loss: 0.32018, val loss: 0.30972, in 0.000s 1 tree, 44 leaves, max depth = 11, train loss: 0.31567, val loss: 0.30519, in 0.031s 1 tree, 37 leaves, max depth = 10, train loss: 0.31114, val loss: 0.30113, in 0.016s 1 tree, 9 leaves, max depth = 4, train loss: 0.30707, val loss: 
0.29694, in 0.016s 1 tree, 46 leaves, max depth = 12, train loss: 0.30307, val loss: 0.29292, in 0.016s 1 tree, 39 leaves, max depth = 11, train loss: 0.29921, val loss: 0.28948, in 0.016s 1 tree, 14 leaves, max depth = 5, train loss: 0.29576, val loss: 0.28583, in 0.016s 1 tree, 49 leaves, max depth = 12, train loss: 0.29230, val loss: 0.28248, in 0.016s 1 tree, 42 leaves, max depth = 12, train loss: 0.28900, val loss: 0.27947, in 0.016s 1 tree, 11 leaves, max depth = 4, train loss: 0.28605, val loss: 0.27643, in 0.016s 1 tree, 49 leaves, max depth = 10, train loss: 0.28284, val loss: 0.27322, in 0.016s 1 tree, 38 leaves, max depth = 11, train loss: 0.27998, val loss: 0.27067, in 0.016s 1 tree, 42 leaves, max depth = 11, train loss: 0.27734, val loss: 0.26830, in 0.016s 1 tree, 14 leaves, max depth = 5, train loss: 0.27492, val loss: 0.26581, in 0.016s 1 tree, 47 leaves, max depth = 13, train loss: 0.27215, val loss: 0.26320, in 0.016s 1 tree, 37 leaves, max depth = 11, train loss: 0.26986, val loss: 0.26118, in 0.000s 1 tree, 19 leaves, max depth = 8, train loss: 0.26777, val loss: 0.25908, in 0.016s 1 tree, 23 leaves, max depth = 9, train loss: 0.26578, val loss: 0.25709, in 0.016s Fit 48 trees in 1.111 s, (1757 total leaves) Time spent computing histograms: 0.302s Time spent finding best splits: 0.053s Time spent applying splits: 0.035s Time spent predicting: 0.031s Trial 99, Fold 5: Log loss = 0.2757542983872139, Average precision = 0.9617069684914026, ROC-AUC = 0.956636010627427, Elapsed Time = 1.1231758999983867 seconds
Optimization Progress: 100%|##########| 100/100 [20:54<00:00, 13.38s/it]
Optuna Optimization Elapsed Time: 1254.3631370999992 seconds
Training with Best Trial 69:
Full_params: {'loss': 'log_loss', 'verbose': 2, 'random_state': 42, 'categorical_features': 'from_dtype', 'max_depth': None, 'learning_rate': 0.09615350775493679, 'max_iter': 64, 'max_leaf_nodes': 81, 'min_samples_leaf': 47, 'l2_regularization': 1.1831955846640462e-05, 'class_weight': None, 'max_features': 0.7736302453809708, 'max_bins': 217, 'interaction_cst': 'pairwise', 'warm_start': False}
Binning 0.050 GB of training data: 0.236 s
0.016 s 0.006 GB of validation data:
Fitting gradient boosted rounds:
1 tree, 81 leaves, max depth = 12, train loss: 0.63949, val loss: 0.64037, in 0.031s
1 tree, 81 leaves, max depth = 12, train loss: 0.59379, val loss: 0.59532, in 0.031s
1 tree, 81 leaves, max depth = 12, train loss: 0.55562, val loss: 0.55774, in 0.031s
1 tree, 81 leaves, max depth = 12, train loss: 0.52336, val loss: 0.52629, in 0.031s
1 tree, 81 leaves, max depth = 12, train loss: 0.49575, val loss: 0.49935, in 0.031s
1 tree, 81 leaves, max depth = 13, train loss: 0.47204, val loss: 0.47629, in 0.078s
1 tree, 81 leaves, max depth = 13, train loss: 0.45162, val loss: 0.45640, in 0.062s
1 tree, 81 leaves, max depth = 13, train loss: 0.43389, val loss: 0.43922, in 0.047s
1 tree, 81 leaves, max depth = 10, train loss: 0.41858, val loss: 0.42506, in 0.031s
1 tree, 81 leaves, max depth = 11, train loss: 0.39676, val loss: 0.40394, in 0.063s
1 tree, 81 leaves, max depth = 15, train loss: 0.37807, val loss: 0.38536, in 0.031s
1 tree, 81 leaves, max depth = 13, train loss: 0.36692, val loss: 0.37464, in 0.031s
1 tree, 81 leaves, max depth = 14, train loss: 0.35195, val loss: 0.35972, in 0.031s
1 tree, 81 leaves, max depth = 16, train loss: 0.33931, val loss: 0.34788, in 0.031s
1 tree, 81 leaves, max depth = 17, train loss: 0.32960, val loss: 0.33878, in 0.047s
1 tree, 81 leaves, max depth = 17, train loss: 0.31972, val loss: 0.32946, in 0.047s
1 tree, 81 leaves, max depth = 17, train loss: 0.31039, val loss: 0.31987, in 0.047s
1 tree, 81 leaves, max depth = 12, train loss: 0.30230, val loss: 0.31217, in 0.047s
1 tree, 81 leaves, max depth = 18, train loss: 0.29508, val loss: 0.30544, in 0.063s
1 tree, 81 leaves, max depth = 13, train loss: 0.28860, val loss: 0.29949, in 0.031s
1 tree, 81 leaves, max depth = 24, train loss: 0.28268, val loss: 0.29397, in 0.047s
1 tree, 81 leaves, max depth = 13, train loss: 0.27669, val loss: 0.28837, in 0.047s
1 tree, 81 leaves, max depth = 22, train loss: 0.27160, val loss: 0.28329, in 0.047s
1 tree, 81 leaves, max depth = 20, train loss: 0.26719, val loss: 0.27917, in 0.031s
1 tree, 81 leaves, max depth = 19, train loss: 0.26227, val loss: 0.27397, in 0.031s
1 tree, 81 leaves, max depth = 16, train loss: 0.25786, val loss: 0.26993, in 0.047s
1 tree, 81 leaves, max depth = 18, train loss: 0.25383, val loss: 0.26561, in 0.047s
1 tree, 81 leaves, max depth = 15, train loss: 0.24968, val loss: 0.26176, in 0.047s
1 tree, 81 leaves, max depth = 15, train loss: 0.24599, val loss: 0.25848, in 0.031s
1 tree, 81 leaves, max depth = 17, train loss: 0.24283, val loss: 0.25510, in 0.031s
1 tree, 81 leaves, max depth = 16, train loss: 0.23968, val loss: 0.25236, in 0.047s
1 tree, 81 leaves, max depth = 16, train loss: 0.23722, val loss: 0.25021, in 0.031s
1 tree, 81 leaves, max depth = 15, train loss: 0.23498, val loss: 0.24820, in 0.047s
1 tree, 81 leaves, max depth = 18, train loss: 0.23270, val loss: 0.24695, in 0.031s
1 tree, 81 leaves, max depth = 15, train loss: 0.22979, val loss: 0.24434, in 0.047s
1 tree, 81 leaves, max depth = 11, train loss: 0.22731, val loss: 0.24223, in 0.031s
1 tree, 81 leaves, max depth = 13, train loss: 0.22489, val loss: 0.24001, in 0.031s
1 tree, 81 leaves, max depth = 16, train loss: 0.22285, val loss: 0.23828, in 0.047s
1 tree, 81 leaves, max depth = 12, train loss: 0.22092, val loss: 0.23663, in 0.031s
1 tree, 81 leaves, max depth = 16, train loss: 0.21934, val loss: 0.23550, in 0.031s
1 tree, 81 leaves, max depth = 16, train loss: 0.21791, val loss: 0.23431, in 0.047s
1 tree, 81 leaves, max depth = 14, train loss: 0.21581, val loss: 0.23244, in 0.031s
1 tree, 81 leaves, max depth = 15, train loss: 0.21437, val loss: 0.23194, in 0.047s
1 tree, 81 leaves, max depth = 18, train loss: 0.21294, val loss: 0.23029, in 0.031s
1 tree, 81 leaves, max depth = 11, train loss: 0.21077, val loss: 0.22851, in 0.047s
1 tree, 81 leaves, max depth = 15, train loss: 0.20965, val loss: 0.22791, in 0.031s
1 tree, 81 leaves, max depth = 25, train loss: 0.20836, val loss: 0.22675, in 0.031s
1 tree, 81 leaves, max depth = 19, train loss: 0.20725, val loss: 0.22551, in 0.047s
1 tree, 81 leaves, max depth = 16, train loss: 0.20525, val loss: 0.22395, in 0.031s
1 tree, 81 leaves, max depth = 17, train loss: 0.20415, val loss: 0.22374, in 0.031s
1 tree, 81 leaves, max depth = 15, train loss: 0.20240, val loss: 0.22248, in 0.047s
1 tree, 81 leaves, max depth = 23, train loss: 0.20149, val loss: 0.22146, in 0.031s
1 tree, 81 leaves, max depth = 14, train loss: 0.20066, val loss: 0.22105, in 0.031s
1 tree, 81 leaves, max depth = 16, train loss: 0.19913, val loss: 0.21995, in 0.047s
1 tree, 81 leaves, max depth = 15, train loss: 0.19817, val loss: 0.21925, in 0.047s
1 tree, 81 leaves, max depth = 15, train loss: 0.19729, val loss: 0.21914, in 0.032s
1 tree, 81 leaves, max depth = 14, train loss: 0.19584, val loss: 0.21804, in 0.047s
1 tree, 81 leaves, max depth = 17, train loss: 0.19456, val loss: 0.21737, in 0.031s
1 tree, 36 leaves, max depth = 8, train loss: 0.19387, val loss: 0.21670, in 0.047s
1 tree, 81 leaves, max depth = 15, train loss: 0.19237, val loss: 0.21644, in 0.031s
1 tree, 81 leaves, max depth = 12, train loss: 0.19123, val loss: 0.21586, in 0.047s
1 tree, 81 leaves, max depth = 19, train loss: 0.18998, val loss: 0.21554, in 0.047s
1 tree, 81 leaves, max depth = 27, train loss: 0.18925, val loss: 0.21481, in 0.031s
1 tree, 81 leaves, max depth = 18, train loss: 0.18814, val loss: 0.21431, in 0.047s
Fit 64 trees in 3.112 s, (5139 total leaves)
Time spent computing histograms: 0.902s
Time spent finding best splits: 0.300s
Time spent applying splits: 0.238s
Time spent predicting: 0.031s
Training Elapsed Time: 3.1261192000001756 seconds
Log loss: (Train) 0.2112895608019953 vs (Test) 0.22135196393197867
PR-AUC: (Train) 0.9714329280365582 vs (Test) 0.9678213476186919
ROC-AUC: (Train) 0.9668834736715238 vs (Test) 0.9627144738851556
save_results(clf_name = "HistGradientBoostingClassifier",
best_trials = best_trials_hgbc,
exec_time = exec_time_hgbc,
lloss_auc_train = lloss_auc_train_hgbc,
lloss_auc_test = lloss_auc_test_hgbc,
df_metrics = df_metrics_hgbc,
cm_final = cm_final_hgbc,
cm_all = cm_hgbc_all,
cm_labels = cm_labels_hgbc_all)
Optuna with XGBoost¶
gc.collect();
X_df = clean_df.drop(columns = ["target", "anon_ssn"])
y_df = clean_df.target
anon_ssn = clean_df.anon_ssn;
# A single 80/20 train-test split using GroupShuffleSplit, ensuring that no anon_ssn group appears in both sets
gss = GroupShuffleSplit(n_splits = 1, test_size = 0.2, random_state = seed)
train_idx, test_idx = next(gss.split(X_df, y_df, groups = anon_ssn))
X_train, X_test = X_df.iloc[train_idx], X_df.iloc[test_idx]
y_train, y_test = y_df.iloc[train_idx], y_df.iloc[test_idx]
anon_ssn_train = anon_ssn.iloc[train_idx] # Keeping track of anon_ssn for cross-validation (positional indexing, matching the split's indices)
# Estimate the scale_pos_weight value, to be used only if the data is imbalanced
estimate = Counter(y_train)[0] / Counter(y_train)[1]
del X_df, y_df, gss, train_idx, test_idx;
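Both ideas in the cell above can be verified on toy data: GroupShuffleSplit keeps every group entirely on one side of the split, and the scale_pos_weight starting point is simply the negative-to-positive count ratio. The groups and labels below are illustrative, not from the loan dataset:

```python
from collections import Counter

import numpy as np
from sklearn.model_selection import GroupShuffleSplit

# Toy data: 12 samples from 4 borrower groups, 2 negatives and 1 positive per group
X = np.arange(12).reshape(-1, 1)
y = np.array([0, 0, 1] * 4)
groups = np.repeat(["g1", "g2", "g3", "g4"], 3)

gss = GroupShuffleSplit(n_splits=1, test_size=0.25, random_state=42)
train_idx, test_idx = next(gss.split(X, y, groups=groups))

# GroupShuffleSplit assigns whole groups, so train and test groups never overlap
assert set(groups[train_idx]).isdisjoint(set(groups[test_idx]))

# scale_pos_weight starting point: count of negatives over count of positives
counts = Counter(y[train_idx])
estimate = counts[0] / counts[1]
print(estimate)  # 2.0 (6 negatives / 3 positives in the 3 training groups)
```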
# Define the objective function
def objective(trial):
# https://xgboost.readthedocs.io/en/stable/parameter.html#learning-task-parameters
# https://xgboost.readthedocs.io/en/latest/treemethod.html
# https://xgboost.readthedocs.io/en/stable/python/python_api.html#module-xgboost.training
# https://xgboost.readthedocs.io/en/stable/parameter.html#parameters-for-tree-booster
param = {
"objective": "binary:logistic",
"booster": trial.suggest_categorical("booster", ["gbtree", "dart"]),
"device": "cpu",
"verbosity": 2,
"validate_parameters": True,
"eval_metric": ["logloss", "auc", "aucpr"],
"seed": seed,
"eta": trial.suggest_float("eta", 1e-2, 1e-1, log = True),
"gamma": trial.suggest_float("gamma", 1e-8, 5e0, log = True),
"max_depth": 0, # No limit on depth
"min_child_weight": trial.suggest_float("min_child_weight", 1e-10, 1e1, log = True),
"max_delta_step": trial.suggest_float("max_delta_step", 1, 100),
"subsample": trial.suggest_float("subsample", 1e-1, 1.0),
"sampling_method": "uniform", # Only 'uniform' is supported on CPU
"colsample_bytree": trial.suggest_float("colsample_bytree", 1e-1, 1e0),
"colsample_bylevel": trial.suggest_float("colsample_bylevel", 1e-1, 1e0),
"colsample_bynode": trial.suggest_float("colsample_bynode", 1e-1, 1e0),
"lambda": trial.suggest_float("lambda", 1e-8, 1e1, log = True),
"alpha": trial.suggest_float("alpha", 1e-8, 1e1, log = True),
"tree_method": trial.suggest_categorical("tree_method", ["auto", "approx", "hist"]), # CPU-supported methods
"scale_pos_weight": trial.suggest_categorical("scale_pos_weight", [1, estimate]),
"grow_policy": trial.suggest_categorical("grow_policy", ["depthwise", "lossguide"]),
"max_leaves": trial.suggest_int("max_leaves", 2, 256),
"max_bin": trial.suggest_int("max_bin", 40, 255),
"num_parallel_tree": 1,
}
sgkf = StratifiedGroupKFold(n_splits = 5, shuffle = True, random_state = seed)
lloss_scores, pr_auc_scores, roc_auc_scores = [], [], []
for fold_idx, (train_index, valid_index) in enumerate(sgkf.split(X_train, y_train, groups = anon_ssn_train), start = 1):
# Split data into training and validation sets
X_train_fold, X_valid_fold = X_train.iloc[train_index], X_train.iloc[valid_index]
y_train_fold, y_valid_fold = y_train.iloc[train_index], y_train.iloc[valid_index]
# Summarize the composition of classes in the train and validation sets
train_0, train_1 = len(y_train_fold[y_train_fold == 0]), len(y_train_fold[y_train_fold == 1])
valid_0, valid_1 = len(y_valid_fold[y_valid_fold == 0]), len(y_valid_fold[y_valid_fold == 1])
print(f'Trial {trial.number}, Fold {fold_idx}: Train size = {len(train_index)} where 0 = {train_0}, 1 = {train_1}, 0/1 = {train_0/train_1}')
print(f'Trial {trial.number}, Fold {fold_idx}: Validation size = {len(valid_index)} where 0 = {valid_0}, 1 = {valid_1}, 0/1 = {valid_0/valid_1}')
# Create XGBoost datasets
# https://xgboost.readthedocs.io/en/stable/python/python_api.html
dtrain_fold = xgb.DMatrix(X_train_fold, label = y_train_fold, enable_categorical = True, nthread = -1)
dvalid_fold = xgb.DMatrix(X_valid_fold, label = y_valid_fold, enable_categorical = True, nthread = -1)
num_round = trial.suggest_int("num_boost_round", 5, 100)
start_fold = time.perf_counter()
# Train the model with the specified parameters and data
# https://xgboost.readthedocs.io/en/stable/python/python_api.html#xgboost.train
clf = xgb.train(param,
dtrain_fold,
num_boost_round = num_round,
evals = [(dvalid_fold, "validation")], # ValueError: Must have at least 1 validation dataset for early stopping
early_stopping_rounds = 50
)
end_fold = time.perf_counter()
print(clf.attributes()) # Check model attributes
y_prob_fold = clf.predict(dvalid_fold)
y_pred_fold = np.rint(y_prob_fold) # Round probabilities to hard labels at the 0.5 threshold (note: np.rint rounds exactly 0.5 to 0, half-to-even)
print(f'Trial {trial.number}, Fold {fold_idx}: '
f'Log loss = {log_loss(y_valid_fold, y_prob_fold)}, '
f'Average precision = {average_precision_score(y_valid_fold, y_prob_fold)}, '
f'ROC-AUC = {roc_auc_score(y_valid_fold, y_prob_fold)}, '
f'Elapsed Time = {end_fold - start_fold} seconds')
lloss_scores.append(log_loss(y_valid_fold, y_prob_fold))
pr_auc_scores.append(average_precision_score(y_valid_fold, y_prob_fold))
roc_auc_scores.append(roc_auc_score(y_valid_fold, y_prob_fold))
del X_train_fold, X_valid_fold, y_train_fold, y_valid_fold, dtrain_fold, dvalid_fold, clf, start_fold, end_fold
gc.collect()
mean_lloss = np.mean(lloss_scores)
mean_pr_auc = np.mean(pr_auc_scores)
mean_roc_auc = np.mean(roc_auc_scores)
del lloss_scores, pr_auc_scores, roc_auc_scores
gc.collect()
return mean_lloss, mean_pr_auc, mean_roc_auc
trial_progress = tqdm(total = n_trials, desc = "Optimization Progress", leave = True,
ascii = True, # Plain text mode
dynamic_ncols = True # Auto-fit width
)
def update_progress(study_xgb, trial):
trial_progress.update(1)
# Disable Optuna's stdout handler so notebook isn’t spammed
optuna.logging.disable_default_handler()
# Enable propagation to Python’s logging
optuna.logging.enable_propagation()
optuna.logging.set_verbosity(optuna.logging.DEBUG)
# Configure Python logging
logging.basicConfig(filename = "optuna_debug_XGBClassifier.log", filemode = "w", level = logging.DEBUG, format="%(asctime)s %(levelname)s %(message)s")
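The three calls above follow a common pattern: silence the library's own stdout handler, let its records propagate, and let the root logger (configured by basicConfig) write them to a file. The propagation mechanics can be demonstrated with stdlib logging alone; the logger name mylib and the list-based handler are illustrative stand-ins for Optuna and the file handler:

```python
import logging

records = []

class ListHandler(logging.Handler):
    # Collects formatted messages into a list (stand-in for basicConfig's file handler)
    def emit(self, record):
        records.append(record.getMessage())

# Root logger plays the role of the basicConfig-created sink
root = logging.getLogger()
root.addHandler(ListHandler())
root.setLevel(logging.DEBUG)

# Library-style logger: no handler of its own, relies on propagation to the root
lib_logger = logging.getLogger("mylib")
lib_logger.setLevel(logging.DEBUG)
lib_logger.propagate = True

lib_logger.debug("trial finished")
print(records)
```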
study_xgb = optuna.create_study(study_name = "Optuna with XGBClassifier",
directions=["minimize", "maximize", "maximize"],
sampler = module.AutoSampler(seed = seed))
start_optuna = time.perf_counter()
study_xgb.optimize(objective, n_trials = n_trials, n_jobs = 1, callbacks = [update_progress])
end_optuna = time.perf_counter()
print(f'Optuna Optimization Elapsed Time: {end_optuna - start_optuna} seconds')
gc.collect()
fig = plot_pareto_front(study_xgb, target_names = ["Log loss", "PR-AUC", "ROC-AUC"])
fig.update_layout(width = 900, height = 400)
fig.show()
trial_progress.close()
# Plot optimization history for each objective
metrics = ["Log loss", "PR-AUC", "ROC-AUC"]
for i, obj in enumerate(metrics):
optuna.visualization.plot_optimization_history(study_xgb,
target = lambda t, i = i: t.values[i], # Bind i at definition time so each plot targets its own objective
target_name = obj).show()
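Targeting each objective through a lambda defined inside a loop touches a classic Python pitfall: closures bind variables late, so lambdas that are stored and called after the loop all see the loop variable's final value. Building each plot eagerly sidesteps it, and binding the value as a default argument makes the code robust either way. A stdlib-only illustration:

```python
# Late binding: every lambda reads i at call time, after the loop has finished
late = [lambda: i for i in range(3)]
print([f() for f in late])  # [2, 2, 2]

# Fix: capture the current value of i as a default argument at definition time
bound = [lambda i=i: i for i in range(3)]
print([f() for f in bound])  # [0, 1, 2]
```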
best_trials = study_xgb.best_trials
best_trials_xgb = {}
exec_time_xgb, lloss_auc_train_xgb, lloss_auc_test_xgb, all_metrics = [], [], [], []
cm_xgb_all, cm_labels_xgb_all = [], []
for i, trial in enumerate(best_trials):
display(Markdown(f"<span style = 'font-size: 18px; font-weight: bold;'> Training with Best Trial {trial.number} </span>"))
best_params = trial.params
# Non-optimized and best Optuna optimized parameters
full_params = {"objective": "binary:logistic",
"device": "cpu",
"verbosity": 1,
"validate_parameters": True,
"eval_metric": ["logloss", "auc", "aucpr"],
"seed": seed,
"max_depth": 0,
"sampling_method": "uniform", # Only 'uniform' is supported on CPU
"num_parallel_tree": 1,
**best_params
}
print("Full_params:", full_params)
best_trials_xgb[trial.number] = full_params
dtrain_all = xgb.DMatrix(X_train, label = y_train, enable_categorical = True, nthread = -1)
dtest_all = xgb.DMatrix(X_test, label = y_test, enable_categorical = True, nthread = -1)
# Extract `num_boost_round` separately (default to 10 if not found)
num_boost_round = full_params.pop("num_boost_round", 10)
start_train = time.perf_counter()
# https://xgboost.readthedocs.io/en/stable/python/python_api.html#xgboost.Booster
final_xgb = xgb.train(params = full_params,
dtrain = dtrain_all,
num_boost_round = num_boost_round, # Pass explicitly
#early_stopping_rounds = 50 # ValueError: Must have at least 1 validation dataset for early stopping
)
end_train = time.perf_counter()
print(f'Training Elapsed Time: {end_train - start_train} seconds')
y_prob_all = final_xgb.predict(dtest_all)
y_pred_all = np.rint(y_prob_all)
print(f'Log loss: (Train) {trial.values[0]} vs (Test) {log_loss(y_test, y_prob_all)}')
print(f'PR-AUC: (Train) {trial.values[1]} vs (Test) {average_precision_score(y_test, y_prob_all)}')
print(f'ROC-AUC: (Train) {trial.values[2]} vs (Test) {roc_auc_score(y_test, y_prob_all)}')
exec_time_xgb.append({"Classifier": "XGBClassifier",
"Best Trial": trial.number,
"Optimization Elapsed Time (s)": end_optuna - start_optuna,
"Training Elapsed Time (s)": end_train - start_train})
lloss_auc_train_xgb.append({"Classifier": "XGBClassifier",
"Best Trial": trial.number,
"Set": "Training",
"Log loss": trial.values[0],
"PR-AUC": trial.values[1],
"ROC-AUC": trial.values[2]})
lloss_auc_test_xgb.append({"Classifier": "XGBClassifier",
"Best Trial": trial.number,
"Set": "Test",
"Log loss": log_loss(y_test, y_prob_all),
"PR-AUC": average_precision_score(y_test, y_prob_all),
"ROC-AUC": roc_auc_score(y_test, y_prob_all)})
report = classification_report(y_test, y_pred_all, target_names = ["Safe", "Risky"], output_dict = True)
all_metrics.append({"Classifier": "XGBClassifier",
"Trial": trial.number,
"Accuracy": accuracy_score(y_test, y_pred_all),
"Precision (Safe)": report["Safe"]["precision"],
"Recall (Safe)": report["Safe"]["recall"],
"F1-score (Safe)": report["Safe"]["f1-score"],
"Precision (Risky)": report["Risky"]["precision"],
"Recall (Risky)": report["Risky"]["recall"],
"F1-score (Risky)": report["Risky"]["f1-score"],
"Precision (Macro avg)": report["macro avg"]["precision"],
"Recall (Macro avg)": report["macro avg"]["recall"],
"F1-score (Macro avg)": report["macro avg"]["f1-score"],
"Precision (Weighted avg)": report["weighted avg"]["precision"],
"Recall (Weighted avg)": report["weighted avg"]["recall"],
"F1-score (Weighted avg)": report["weighted avg"]["f1-score"]})
# Store confusion matrix
cm_final_xgb = confusion_matrix(y_test, y_pred_all)
cm_xgb_all.append(cm_final_xgb)
cm_labels_xgb_all.append(f'XGBClassifier Confusion Matrix for Best Trial {trial.number}') # Store label for subplots
df_metrics_xgb = pd.DataFrame(all_metrics)
gc.collect();
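Two details of the evaluation block above are worth pinning down: np.rint rounds half-to-even, so a probability of exactly 0.5 maps to 0 and an explicit >= 0.5 comparison is the unambiguous form; and classification_report with output_dict = True returns a nested dict keyed by target_names, which is what lookups like report["Safe"]["precision"] rely on. A small self-contained check with dummy values:

```python
import numpy as np
from sklearn.metrics import classification_report

y_prob = np.array([0.2, 0.5, 0.7, 0.9, 0.4, 0.1])
y_true = np.array([0, 1, 1, 1, 0, 0])

pred_rint = np.rint(y_prob)                # half-to-even: exactly 0.5 -> 0
pred_thresh = (y_prob >= 0.5).astype(int)  # exactly 0.5 -> 1
print(pred_rint[1], pred_thresh[1])        # 0.0 1

report = classification_report(y_true, pred_thresh,
                               target_names=["Safe", "Risky"],
                               output_dict=True)
# Per-class metrics are nested under the class name
print(report["Risky"]["recall"])
```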
Optimization Progress: 0%| | 0/100 [00:00<?, ?it/s]
Trial 0, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 0, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[17:58:56] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[0] validation-logloss:0.68714 validation-auc:0.91258 validation-aucpr:0.91700
[1] validation-logloss:0.68097 validation-auc:0.93166 validation-aucpr:0.93703
[2] validation-logloss:0.67488 validation-auc:0.93897 validation-aucpr:0.94349
[... iterations 3-85 omitted; the INFO line above repeats before every iteration ...]
[86] validation-logloss:0.41705 validation-auc:0.95598 validation-aucpr:0.96309
[87] validation-logloss:0.41571 validation-auc:0.95594 validation-aucpr:0.96305
[88] validation-logloss:0.41413 validation-auc:0.95588 validation-aucpr:0.96303
[89] validation-logloss:0.41265 validation-auc:0.95577 validation-aucpr:0.96292
[90] validation-logloss:0.41126 validation-auc:0.95577 validation-aucpr:0.96289
[91] validation-logloss:0.40991 validation-auc:0.95572 validation-aucpr:0.96284
[92] validation-logloss:0.40839 validation-auc:0.95572 validation-aucpr:0.96288
{'best_iteration': '86', 'best_score': '0.9630916850735398'}
Trial 0, Fold 1: Log loss = 0.40839214376528166, Average precision = 0.9628826257850801, ROC-AUC = 0.9557242750558017, Elapsed Time = 16.0506733000002 seconds
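Each fold summary line above reports log loss, average precision, and ROC-AUC computed from the fold's held-out predicted probabilities. A hedged sketch of how those three numbers can be produced with sklearn — the toy labels, probabilities, and variable names here are illustrative, not taken from the notebook:

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

# Toy validation-fold labels and predicted positive-class probabilities
y_val = np.array([0, 0, 1, 1, 1, 0, 1, 0])
p_val = np.array([0.2, 0.6, 0.8, 0.9, 0.4, 0.1, 0.7, 0.3])

ll = log_loss(y_val, p_val)                 # penalizes confident wrong probabilities
ap = average_precision_score(y_val, p_val)  # area under the precision-recall curve
auc = roc_auc_score(y_val, p_val)           # rank-based class separability

print(f"Log loss = {ll:.4f}, Average precision = {ap:.4f}, ROC-AUC = {auc:.4f}")
```

All three take probabilities rather than hard labels, which is why they are evaluated before any classification threshold is applied.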
Trial 0, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 0, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[17:59:12] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[0] validation-logloss:0.68722 validation-auc:0.92238 validation-aucpr:0.91976
[17:59:12] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[1] validation-logloss:0.68057 validation-auc:0.93376 validation-aucpr:0.93009
[17:59:12] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[2] validation-logloss:0.67468 validation-auc:0.93503 validation-aucpr:0.93226
[17:59:12] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[3] validation-logloss:0.66906 validation-auc:0.93381 validation-aucpr:0.93174
[17:59:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[4] validation-logloss:0.66332 validation-auc:0.93762 validation-aucpr:0.93605
[17:59:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[5] validation-logloss:0.65765 validation-auc:0.94463 validation-aucpr:0.94623
[17:59:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[6] validation-logloss:0.65242 validation-auc:0.94435 validation-aucpr:0.94558
[17:59:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[7] validation-logloss:0.64731 validation-auc:0.94458 validation-aucpr:0.94603
[17:59:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[8] validation-logloss:0.64077 validation-auc:0.95344 validation-aucpr:0.95696
[17:59:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[9] validation-logloss:0.63520 validation-auc:0.95361 validation-aucpr:0.95701
[17:59:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[10] validation-logloss:0.63017 validation-auc:0.95375 validation-aucpr:0.95741
[17:59:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[11] validation-logloss:0.62523 validation-auc:0.95345 validation-aucpr:0.95719
[17:59:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[12] validation-logloss:0.62083 validation-auc:0.95319 validation-aucpr:0.95706
[17:59:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[13] validation-logloss:0.61576 validation-auc:0.95346 validation-aucpr:0.95717
[17:59:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[14] validation-logloss:0.61100 validation-auc:0.95345 validation-aucpr:0.95709
[17:59:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[15] validation-logloss:0.60754 validation-auc:0.95300 validation-aucpr:0.95655
[17:59:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[16] validation-logloss:0.60322 validation-auc:0.95284 validation-aucpr:0.95643
[17:59:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[17] validation-logloss:0.59908 validation-auc:0.95290 validation-aucpr:0.95649
[17:59:14] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[18] validation-logloss:0.59496 validation-auc:0.95294 validation-aucpr:0.95642
[17:59:14] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[19] validation-logloss:0.59093 validation-auc:0.95291 validation-aucpr:0.95655
[17:59:14] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[20] validation-logloss:0.58644 validation-auc:0.95290 validation-aucpr:0.95669
[17:59:14] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[21] validation-logloss:0.58271 validation-auc:0.95235 validation-aucpr:0.95609
[17:59:14] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[22] validation-logloss:0.57866 validation-auc:0.95271 validation-aucpr:0.95622
[17:59:14] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[23] validation-logloss:0.57452 validation-auc:0.95252 validation-aucpr:0.95595
[17:59:14] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[24] validation-logloss:0.56971 validation-auc:0.95413 validation-aucpr:0.95777
[25]	validation-logloss:0.56637	validation-auc:0.95432	validation-aucpr:0.95801
[26]	validation-logloss:0.56238	validation-auc:0.95442	validation-aucpr:0.95806
[27]	validation-logloss:0.55875	validation-auc:0.95422	validation-aucpr:0.95785
[28]	validation-logloss:0.55505	validation-auc:0.95399	validation-aucpr:0.95755
[29]	validation-logloss:0.55035	validation-auc:0.95517	validation-aucpr:0.95909
[30]	validation-logloss:0.54688	validation-auc:0.95529	validation-aucpr:0.95923
[31]	validation-logloss:0.54329	validation-auc:0.95536	validation-aucpr:0.95934
[32]	validation-logloss:0.53984	validation-auc:0.95520	validation-aucpr:0.95920
[33]	validation-logloss:0.53650	validation-auc:0.95518	validation-aucpr:0.95918
[34]	validation-logloss:0.53271	validation-auc:0.95553	validation-aucpr:0.95965
[35]	validation-logloss:0.52927	validation-auc:0.95555	validation-aucpr:0.95974
[36]	validation-logloss:0.52596	validation-auc:0.95544	validation-aucpr:0.95959
[37]	validation-logloss:0.52265	validation-auc:0.95540	validation-aucpr:0.95949
[38]	validation-logloss:0.51958	validation-auc:0.95541	validation-aucpr:0.95951
[39]	validation-logloss:0.51728	validation-auc:0.95529	validation-aucpr:0.95943
[40]	validation-logloss:0.51343	validation-auc:0.95596	validation-aucpr:0.96029
[41]	validation-logloss:0.51084	validation-auc:0.95587	validation-aucpr:0.96015
[42]	validation-logloss:0.50844	validation-auc:0.95598	validation-aucpr:0.96023
[43]	validation-logloss:0.50514	validation-auc:0.95593	validation-aucpr:0.96008
[44]	validation-logloss:0.50258	validation-auc:0.95596	validation-aucpr:0.96014
[45]	validation-logloss:0.49968	validation-auc:0.95610	validation-aucpr:0.96021
[46]	validation-logloss:0.49691	validation-auc:0.95609	validation-aucpr:0.96015
[47]	validation-logloss:0.49417	validation-auc:0.95607	validation-aucpr:0.96012
[48]	validation-logloss:0.49190	validation-auc:0.95602	validation-aucpr:0.96016
[49]	validation-logloss:0.48934	validation-auc:0.95621	validation-aucpr:0.96034
[50]	validation-logloss:0.48589	validation-auc:0.95663	validation-aucpr:0.96089
[51]	validation-logloss:0.48337	validation-auc:0.95666	validation-aucpr:0.96105
[52]	validation-logloss:0.48079	validation-auc:0.95662	validation-aucpr:0.96099
[53]	validation-logloss:0.47823	validation-auc:0.95668	validation-aucpr:0.96102
[54]	validation-logloss:0.47540	validation-auc:0.95658	validation-aucpr:0.96088
[55]	validation-logloss:0.47293	validation-auc:0.95650	validation-aucpr:0.96073
[56]	validation-logloss:0.47089	validation-auc:0.95642	validation-aucpr:0.96064
[57]	validation-logloss:0.46923	validation-auc:0.95647	validation-aucpr:0.96064
[58]	validation-logloss:0.46689	validation-auc:0.95647	validation-aucpr:0.96064
[59]	validation-logloss:0.46462	validation-auc:0.95658	validation-aucpr:0.96082
[60]	validation-logloss:0.46276	validation-auc:0.95661	validation-aucpr:0.96085
[61]	validation-logloss:0.46060	validation-auc:0.95648	validation-aucpr:0.96070
[62]	validation-logloss:0.45874	validation-auc:0.95646	validation-aucpr:0.96067
[63]	validation-logloss:0.45647	validation-auc:0.95647	validation-aucpr:0.96072
[64]	validation-logloss:0.45459	validation-auc:0.95642	validation-aucpr:0.96067
[65]	validation-logloss:0.45263	validation-auc:0.95639	validation-aucpr:0.96059
[66]	validation-logloss:0.45063	validation-auc:0.95639	validation-aucpr:0.96058
[67]	validation-logloss:0.44853	validation-auc:0.95635	validation-aucpr:0.96076
[68]	validation-logloss:0.44633	validation-auc:0.95640	validation-aucpr:0.96080
[69]	validation-logloss:0.44443	validation-auc:0.95620	validation-aucpr:0.96058
[70]	validation-logloss:0.44251	validation-auc:0.95616	validation-aucpr:0.96055
[71]	validation-logloss:0.44064	validation-auc:0.95600	validation-aucpr:0.96039
[72]	validation-logloss:0.43873	validation-auc:0.95605	validation-aucpr:0.96040
[73]	validation-logloss:0.43696	validation-auc:0.95604	validation-aucpr:0.96037
[74]	validation-logloss:0.43493	validation-auc:0.95604	validation-aucpr:0.96038
[75]	validation-logloss:0.43312	validation-auc:0.95597	validation-aucpr:0.96033
[76]	validation-logloss:0.43032	validation-auc:0.95652	validation-aucpr:0.96099
[77]	validation-logloss:0.42902	validation-auc:0.95647	validation-aucpr:0.96103
[78]	validation-logloss:0.42751	validation-auc:0.95635	validation-aucpr:0.96093
[79]	validation-logloss:0.42599	validation-auc:0.95623	validation-aucpr:0.96084
[80]	validation-logloss:0.42432	validation-auc:0.95618	validation-aucpr:0.96076
[81]	validation-logloss:0.42266	validation-auc:0.95616	validation-aucpr:0.96073
[82]	validation-logloss:0.42067	validation-auc:0.95620	validation-aucpr:0.96074
[83]	validation-logloss:0.41793	validation-auc:0.95660	validation-aucpr:0.96120
[84]	validation-logloss:0.41665	validation-auc:0.95659	validation-aucpr:0.96119
[85]	validation-logloss:0.41489	validation-auc:0.95673	validation-aucpr:0.96129
[86]	validation-logloss:0.41292	validation-auc:0.95690	validation-aucpr:0.96152
[87]	validation-logloss:0.41136	validation-auc:0.95689	validation-aucpr:0.96149
[88]	validation-logloss:0.40989	validation-auc:0.95687	validation-aucpr:0.96146
[89]	validation-logloss:0.40852	validation-auc:0.95683	validation-aucpr:0.96143
[90]	validation-logloss:0.40715	validation-auc:0.95677	validation-aucpr:0.96137
[91]	validation-logloss:0.40541	validation-auc:0.95673	validation-aucpr:0.96130
[92]	validation-logloss:0.40400	validation-auc:0.95674	validation-aucpr:0.96130
{'best_iteration': '86', 'best_score': '0.9615151857978635'}
Trial 0, Fold 2: Log loss = 0.40400137854890755, Average precision = 0.9613053529150961, ROC-AUC = 0.9567406518052979, Elapsed Time = 15.39552619999995 seconds
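The per-fold summary line above reports three scikit-learn metrics on the fold's validation set. A minimal sketch of how such a line can be produced, using hypothetical stand-ins `y_val` (validation labels) and `p_val` (predicted probabilities of the positive class) rather than the notebook's actual fold data:

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

# Hypothetical fold data: binary labels and well-separated toy probabilities.
rng = np.random.default_rng(42)
y_val = rng.integers(0, 2, size=1000)
p_val = np.clip(y_val * 0.6 + rng.random(1000) * 0.4, 1e-6, 1 - 1e-6)

lloss = log_loss(y_val, p_val)                # penalizes confident wrong probabilities
ap = average_precision_score(y_val, p_val)    # area under the precision-recall curve
auc = roc_auc_score(y_val, p_val)             # area under the ROC curve
print(f"Log loss = {lloss}, Average precision = {ap}, ROC-AUC = {auc}")
```

Note that `average_precision_score` corresponds to the `validation-aucpr` column in the XGBoost log and `roc_auc_score` to `validation-auc`, while the elapsed time in the summary line is measured separately around the fold's training call.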
Trial 0, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 0, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[17:59:28] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[0] validation-logloss:0.68770 validation-auc:0.89543 validation-aucpr:0.89711
[17:59:28] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[1] validation-logloss:0.68144 validation-auc:0.93117 validation-aucpr:0.93039
[17:59:28] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[2] validation-logloss:0.67547 validation-auc:0.93600 validation-aucpr:0.93526
[17:59:28] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[3] validation-logloss:0.66957 validation-auc:0.93921 validation-aucpr:0.94207
[17:59:28] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[4] validation-logloss:0.66413 validation-auc:0.94041 validation-aucpr:0.94403
[17:59:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[5] validation-logloss:0.65897 validation-auc:0.94172 validation-aucpr:0.94548
[17:59:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[6] validation-logloss:0.65351 validation-auc:0.94416 validation-aucpr:0.94793
[17:59:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[7] validation-logloss:0.64713 validation-auc:0.95457 validation-aucpr:0.95955
[17:59:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[8] validation-logloss:0.64069 validation-auc:0.95716 validation-aucpr:0.96258
[17:59:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[9] validation-logloss:0.63613 validation-auc:0.95675 validation-aucpr:0.96218
[17:59:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[10] validation-logloss:0.63105 validation-auc:0.95675 validation-aucpr:0.96225
[17:59:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[11] validation-logloss:0.62614 validation-auc:0.95662 validation-aucpr:0.96285
[17:59:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[12] validation-logloss:0.62136 validation-auc:0.95627 validation-aucpr:0.96246
[17:59:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[13] validation-logloss:0.61664 validation-auc:0.95638 validation-aucpr:0.96247
[17:59:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[14] validation-logloss:0.61182 validation-auc:0.95641 validation-aucpr:0.96236
[17:59:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[15] validation-logloss:0.60817 validation-auc:0.95643 validation-aucpr:0.96243
[17:59:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[16] validation-logloss:0.60358 validation-auc:0.95662 validation-aucpr:0.96268
[17:59:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[17] validation-logloss:0.59950 validation-auc:0.95638 validation-aucpr:0.96243
[17:59:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[18] validation-logloss:0.59486 validation-auc:0.95675 validation-aucpr:0.96272
[17:59:30] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[19] validation-logloss:0.59167 validation-auc:0.95675 validation-aucpr:0.96279
[17:59:30] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[20] validation-logloss:0.58842 validation-auc:0.95635 validation-aucpr:0.96225
[17:59:30] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[21] validation-logloss:0.58429 validation-auc:0.95622 validation-aucpr:0.96205
[17:59:30] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[22] validation-logloss:0.58023 validation-auc:0.95629 validation-aucpr:0.96204
[17:59:30] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[23] validation-logloss:0.57639 validation-auc:0.95604 validation-aucpr:0.96182
[17:59:30] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[24] validation-logloss:0.57262 validation-auc:0.95587 validation-aucpr:0.96165
[17:59:30] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[25] validation-logloss:0.56826 validation-auc:0.95614 validation-aucpr:0.96191
[17:59:30] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[26] validation-logloss:0.56421 validation-auc:0.95623 validation-aucpr:0.96200
[17:59:30] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[27] validation-logloss:0.56028 validation-auc:0.95642 validation-aucpr:0.96224
[17:59:30] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[28] validation-logloss:0.55638 validation-auc:0.95624 validation-aucpr:0.96202
[17:59:30] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[29] validation-logloss:0.55257 validation-auc:0.95632 validation-aucpr:0.96209
[17:59:31] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[30] validation-logloss:0.54904 validation-auc:0.95631 validation-aucpr:0.96207
[17:59:31] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[31] validation-logloss:0.54531 validation-auc:0.95648 validation-aucpr:0.96212
[17:59:31] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[32] validation-logloss:0.54160 validation-auc:0.95649 validation-aucpr:0.96209
[17:59:31] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[33] validation-logloss:0.53832 validation-auc:0.95624 validation-aucpr:0.96183
[17:59:31] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[34] validation-logloss:0.53525 validation-auc:0.95603 validation-aucpr:0.96162
[17:59:31] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[35] validation-logloss:0.53207 validation-auc:0.95611 validation-aucpr:0.96178
[17:59:31] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[36] validation-logloss:0.52874 validation-auc:0.95611 validation-aucpr:0.96181
[17:59:31] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[37] validation-logloss:0.52549 validation-auc:0.95616 validation-aucpr:0.96181
[17:59:32] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[38] validation-logloss:0.52114 validation-auc:0.95757 validation-aucpr:0.96337
[17:59:32] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[39] validation-logloss:0.51747 validation-auc:0.95767 validation-aucpr:0.96342
[17:59:32] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[40] validation-logloss:0.51414 validation-auc:0.95778 validation-aucpr:0.96351
[17:59:32] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[41] validation-logloss:0.51169 validation-auc:0.95772 validation-aucpr:0.96342
[17:59:32] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[42] validation-logloss:0.50916 validation-auc:0.95772 validation-aucpr:0.96336
[17:59:32] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[43] validation-logloss:0.50618 validation-auc:0.95787 validation-aucpr:0.96350
[17:59:33] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[44] validation-logloss:0.50318 validation-auc:0.95775 validation-aucpr:0.96339
[17:59:33] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[45] validation-logloss:0.50028 validation-auc:0.95764 validation-aucpr:0.96323
[17:59:33] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[46] validation-logloss:0.49787 validation-auc:0.95746 validation-aucpr:0.96304
[17:59:33] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[47] validation-logloss:0.49413 validation-auc:0.95840 validation-aucpr:0.96416
[17:59:33] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[48] validation-logloss:0.49162 validation-auc:0.95837 validation-aucpr:0.96413
[17:59:33] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[49] validation-logloss:0.48898 validation-auc:0.95841 validation-aucpr:0.96418
[17:59:33] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[50] validation-logloss:0.48659 validation-auc:0.95837 validation-aucpr:0.96417
[17:59:34] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[51] validation-logloss:0.48449 validation-auc:0.95844 validation-aucpr:0.96416
[17:59:34] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[52] validation-logloss:0.48238 validation-auc:0.95834 validation-aucpr:0.96410
[17:59:34] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[53] validation-logloss:0.47999 validation-auc:0.95823 validation-aucpr:0.96396
[54]	validation-logloss:0.47735	validation-auc:0.95818	validation-aucpr:0.96388
[55]	validation-logloss:0.47486	validation-auc:0.95817	validation-aucpr:0.96384
[56]	validation-logloss:0.47259	validation-auc:0.95812	validation-aucpr:0.96379
[57]	validation-logloss:0.47017	validation-auc:0.95807	validation-aucpr:0.96371
[58]	validation-logloss:0.46814	validation-auc:0.95794	validation-aucpr:0.96356
[59]	validation-logloss:0.46575	validation-auc:0.95794	validation-aucpr:0.96352
[60]	validation-logloss:0.46349	validation-auc:0.95798	validation-aucpr:0.96354
[61]	validation-logloss:0.46206	validation-auc:0.95799	validation-aucpr:0.96355
[62]	validation-logloss:0.45945	validation-auc:0.95806	validation-aucpr:0.96358
[63]	validation-logloss:0.45742	validation-auc:0.95801	validation-aucpr:0.96353
[64]	validation-logloss:0.45519	validation-auc:0.95797	validation-aucpr:0.96348
[65]	validation-logloss:0.45255	validation-auc:0.95839	validation-aucpr:0.96394
[66]	validation-logloss:0.45061	validation-auc:0.95836	validation-aucpr:0.96391
[67]	validation-logloss:0.44853	validation-auc:0.95836	validation-aucpr:0.96390
[68]	validation-logloss:0.44627	validation-auc:0.95828	validation-aucpr:0.96377
[69]	validation-logloss:0.44439	validation-auc:0.95822	validation-aucpr:0.96366
[70]	validation-logloss:0.44215	validation-auc:0.95821	validation-aucpr:0.96362
[71]	validation-logloss:0.44033	validation-auc:0.95813	validation-aucpr:0.96354
[72]	validation-logloss:0.43840	validation-auc:0.95804	validation-aucpr:0.96345
[73]	validation-logloss:0.43687	validation-auc:0.95811	validation-aucpr:0.96348
[74]	validation-logloss:0.43507	validation-auc:0.95801	validation-aucpr:0.96338
[75]	validation-logloss:0.43297	validation-auc:0.95808	validation-aucpr:0.96346
[76]	validation-logloss:0.43071	validation-auc:0.95869	validation-aucpr:0.96418
[77]	validation-logloss:0.42900	validation-auc:0.95864	validation-aucpr:0.96411
[78]	validation-logloss:0.42733	validation-auc:0.95863	validation-aucpr:0.96410
[79]	validation-logloss:0.42552	validation-auc:0.95862	validation-aucpr:0.96407
[80]	validation-logloss:0.42382	validation-auc:0.95854	validation-aucpr:0.96399
[81]	validation-logloss:0.42118	validation-auc:0.95896	validation-aucpr:0.96451
[82]	validation-logloss:0.41929	validation-auc:0.95903	validation-aucpr:0.96455
[83]	validation-logloss:0.41760	validation-auc:0.95903	validation-aucpr:0.96455
[84]	validation-logloss:0.41632	validation-auc:0.95898	validation-aucpr:0.96447
[85]	validation-logloss:0.41392	validation-auc:0.95931	validation-aucpr:0.96487
[86]	validation-logloss:0.41240	validation-auc:0.95926	validation-aucpr:0.96483
[87]	validation-logloss:0.41087	validation-auc:0.95921	validation-aucpr:0.96478
[88]	validation-logloss:0.40843	validation-auc:0.95952	validation-aucpr:0.96515
[89]	validation-logloss:0.40713	validation-auc:0.95953	validation-aucpr:0.96514
[90]	validation-logloss:0.40576	validation-auc:0.95947	validation-aucpr:0.96509
[91]	validation-logloss:0.40446	validation-auc:0.95940	validation-aucpr:0.96502
[92]	validation-logloss:0.40206	validation-auc:0.95970	validation-aucpr:0.96535
{'best_iteration': '92', 'best_score': '0.9653474941847704'}
Trial 0, Fold 3: Log loss = 0.4020617406521426, Average precision = 0.9653535765256415, ROC-AUC = 0.9596951717213549, Elapsed Time = 15.429512600001544 seconds
Trial 0, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 0, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0]	validation-logloss:0.68706	validation-auc:0.91637	validation-aucpr:0.91737
[1]	validation-logloss:0.68094	validation-auc:0.93038	validation-aucpr:0.93196
[2]	validation-logloss:0.67547	validation-auc:0.93114	validation-aucpr:0.93244
[3]	validation-logloss:0.66815	validation-auc:0.95085	validation-aucpr:0.95608
[4]	validation-logloss:0.66237	validation-auc:0.95106	validation-aucpr:0.95656
[5]	validation-logloss:0.65672	validation-auc:0.95187	validation-aucpr:0.95764
[6]	validation-logloss:0.65127	validation-auc:0.95158	validation-aucpr:0.95733
[7]	validation-logloss:0.64621	validation-auc:0.95184	validation-aucpr:0.95763
[8]	validation-logloss:0.64073	validation-auc:0.95207	validation-aucpr:0.95796
[9]	validation-logloss:0.63549	validation-auc:0.95244	validation-aucpr:0.95862
[10]	validation-logloss:0.63030	validation-auc:0.95252	validation-aucpr:0.95848
[11]	validation-logloss:0.62549	validation-auc:0.95216	validation-aucpr:0.95814
[12]	validation-logloss:0.62080	validation-auc:0.95190	validation-aucpr:0.95802
[13]	validation-logloss:0.61613	validation-auc:0.95157	validation-aucpr:0.95759
[14]	validation-logloss:0.61161	validation-auc:0.95102	validation-aucpr:0.95695
[15]	validation-logloss:0.60661	validation-auc:0.95135	validation-aucpr:0.95713
[16]	validation-logloss:0.60225	validation-auc:0.95106	validation-aucpr:0.95664
[17]	validation-logloss:0.59829	validation-auc:0.95026	validation-aucpr:0.95568
[18]	validation-logloss:0.59407	validation-auc:0.95008	validation-aucpr:0.95555
[19]	validation-logloss:0.58988	validation-auc:0.94980	validation-aucpr:0.95523
[20]	validation-logloss:0.58417	validation-auc:0.95299	validation-aucpr:0.95923
[21]	validation-logloss:0.57998	validation-auc:0.95318	validation-aucpr:0.95940
[22]	validation-logloss:0.57598	validation-auc:0.95300	validation-aucpr:0.95920
[23]	validation-logloss:0.57200	validation-auc:0.95296	validation-aucpr:0.95918
[24]	validation-logloss:0.56770	validation-auc:0.95286	validation-aucpr:0.95904
[25]	validation-logloss:0.56367	validation-auc:0.95290	validation-aucpr:0.95904
[26]	validation-logloss:0.56011	validation-auc:0.95311	validation-aucpr:0.95919
[27]	validation-logloss:0.55663	validation-auc:0.95273	validation-aucpr:0.95873
[28]	validation-logloss:0.55305	validation-auc:0.95267	validation-aucpr:0.95869
[29]	validation-logloss:0.54972	validation-auc:0.95270	validation-aucpr:0.95876
[30]	validation-logloss:0.54642	validation-auc:0.95251	validation-aucpr:0.95858
[31]	validation-logloss:0.54329	validation-auc:0.95232	validation-aucpr:0.95841
[32]	validation-logloss:0.53988	validation-auc:0.95186	validation-aucpr:0.95793
[33]	validation-logloss:0.53667	validation-auc:0.95188	validation-aucpr:0.95794
[34]	validation-logloss:0.53239	validation-auc:0.95344	validation-aucpr:0.95991
[35]	validation-logloss:0.52889	validation-auc:0.95354	validation-aucpr:0.96005
[36]	validation-logloss:0.52594	validation-auc:0.95348	validation-aucpr:0.95997
[37]	validation-logloss:0.52259	validation-auc:0.95358	validation-aucpr:0.96007
[38]	validation-logloss:0.51935	validation-auc:0.95370	validation-aucpr:0.96020
[39]	validation-logloss:0.51657	validation-auc:0.95363	validation-aucpr:0.96009
[40]	validation-logloss:0.51254	validation-auc:0.95470	validation-aucpr:0.96133
[41]	validation-logloss:0.50989	validation-auc:0.95465	validation-aucpr:0.96126
[42]	validation-logloss:0.50705	validation-auc:0.95464	validation-aucpr:0.96115
[43]	validation-logloss:0.50419	validation-auc:0.95448	validation-aucpr:0.96105
[44]	validation-logloss:0.50158	validation-auc:0.95437	validation-aucpr:0.96092
[45]	validation-logloss:0.49795	validation-auc:0.95500	validation-aucpr:0.96172
[46]	validation-logloss:0.49569	validation-auc:0.95496	validation-aucpr:0.96165
[47]	validation-logloss:0.49339	validation-auc:0.95492	validation-aucpr:0.96159
[48]	validation-logloss:0.49052	validation-auc:0.95500	validation-aucpr:0.96158
[49]	validation-logloss:0.48725	validation-auc:0.95533	validation-aucpr:0.96194
[50]	validation-logloss:0.48476	validation-auc:0.95521	validation-aucpr:0.96183
[51]	validation-logloss:0.48142	validation-auc:0.95563	validation-aucpr:0.96236
[52]	validation-logloss:0.47821	validation-auc:0.95585	validation-aucpr:0.96266
[53]	validation-logloss:0.47579	validation-auc:0.95588	validation-aucpr:0.96266
[54]	validation-logloss:0.47311	validation-auc:0.95590	validation-aucpr:0.96265
[55]	validation-logloss:0.47112	validation-auc:0.95571	validation-aucpr:0.96250
[56]	validation-logloss:0.46903	validation-auc:0.95563	validation-aucpr:0.96243
[57]	validation-logloss:0.46685	validation-auc:0.95568	validation-aucpr:0.96243
[58]	validation-logloss:0.46453	validation-auc:0.95568	validation-aucpr:0.96242
[59]	validation-logloss:0.46145	validation-auc:0.95597	validation-aucpr:0.96277
[60]	validation-logloss:0.45937	validation-auc:0.95591	validation-aucpr:0.96272
[61]	validation-logloss:0.45696	validation-auc:0.95594	validation-aucpr:0.96271
[62]	validation-logloss:0.45495	validation-auc:0.95598	validation-aucpr:0.96272
[63]	validation-logloss:0.45292	validation-auc:0.95600	validation-aucpr:0.96271
[64]	validation-logloss:0.45063	validation-auc:0.95605	validation-aucpr:0.96270
[65]	validation-logloss:0.44889	validation-auc:0.95599	validation-aucpr:0.96262
[66]	validation-logloss:0.44701	validation-auc:0.95597	validation-aucpr:0.96260
[67]	validation-logloss:0.44505	validation-auc:0.95600	validation-aucpr:0.96263
[68]	validation-logloss:0.44315	validation-auc:0.95605	validation-aucpr:0.96268
[69]	validation-logloss:0.44145	validation-auc:0.95607	validation-aucpr:0.96271
[70]	validation-logloss:0.43946	validation-auc:0.95610	validation-aucpr:0.96270
[71]	validation-logloss:0.43774	validation-auc:0.95601	validation-aucpr:0.96261
[72]	validation-logloss:0.43589	validation-auc:0.95609	validation-aucpr:0.96264
[73]	validation-logloss:0.43404	validation-auc:0.95609	validation-aucpr:0.96266
[74]	validation-logloss:0.43255	validation-auc:0.95604	validation-aucpr:0.96262
[75]	validation-logloss:0.43070	validation-auc:0.95604	validation-aucpr:0.96259
[76]	validation-logloss:0.42906	validation-auc:0.95600	validation-aucpr:0.96257
[77]	validation-logloss:0.42709	validation-auc:0.95601	validation-aucpr:0.96259
[78]	validation-logloss:0.42538	validation-auc:0.95600	validation-aucpr:0.96257
[79]	validation-logloss:0.42399	validation-auc:0.95598	validation-aucpr:0.96254
[80]	validation-logloss:0.42250	validation-auc:0.95595	validation-aucpr:0.96252
[81]	validation-logloss:0.41995	validation-auc:0.95634	validation-aucpr:0.96293
[82]	validation-logloss:0.41745	validation-auc:0.95655	validation-aucpr:0.96320
[83]	validation-logloss:0.41560	validation-auc:0.95658	validation-aucpr:0.96322
[84]	validation-logloss:0.41410	validation-auc:0.95652	validation-aucpr:0.96316
[85]	validation-logloss:0.41222	validation-auc:0.95659	validation-aucpr:0.96320
[86]	validation-logloss:0.41049	validation-auc:0.95647	validation-aucpr:0.96312
[87]	validation-logloss:0.40895	validation-auc:0.95647	validation-aucpr:0.96310
[88]	validation-logloss:0.40762	validation-auc:0.95650	validation-aucpr:0.96315
[89]	validation-logloss:0.40614	validation-auc:0.95646	validation-aucpr:0.96309
[90]	validation-logloss:0.40481	validation-auc:0.95641	validation-aucpr:0.96305
[91]	validation-logloss:0.40334	validation-auc:0.95642	validation-aucpr:0.96304
[92]	validation-logloss:0.40220	validation-auc:0.95632	validation-aucpr:0.96297
{'best_iteration': '83', 'best_score': '0.9632182894831647'}
Trial 0, Fold 4: Log loss = 0.4022018041340452, Average precision = 0.9629778336083984, ROC-AUC = 0.9563161226635765, Elapsed Time = 15.438011500000357 seconds
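The per-fold summary line above reports three validation metrics. As a minimal sketch of how such numbers are obtained from the `sklearn.metrics` functions imported at the top of this notebook (the toy labels and probabilities below are hypothetical, standing in for a fold's validation data):

```python
# Sketch: computing the three per-fold metrics reported in the log above.
# y_val / p_val are hypothetical stand-ins for a validation fold's labels
# and predicted probabilities P(y=1).
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

y_val = [0, 0, 1, 1]
p_val = [0.1, 0.4, 0.35, 0.8]

lloss = log_loss(y_val, p_val)              # lower is better
ap = average_precision_score(y_val, p_val)  # area under the PR curve
auc = roc_auc_score(y_val, p_val)           # area under the ROC curve
print(f"Log loss = {lloss}, Average precision = {ap}, ROC-AUC = {auc}")
```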
Trial 0, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 0, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.68753 validation-auc:0.89140 validation-aucpr:0.86380
[1] validation-logloss:0.68146 validation-auc:0.92771 validation-aucpr:0.93034
[2] validation-logloss:0.67620 validation-auc:0.92589 validation-aucpr:0.93015
[3] validation-logloss:0.67043 validation-auc:0.93133 validation-aucpr:0.93622
[4] validation-logloss:0.66431 validation-auc:0.94552 validation-aucpr:0.95293
[5] validation-logloss:0.66011 validation-auc:0.94665 validation-aucpr:0.95325
[6] validation-logloss:0.65412 validation-auc:0.94997 validation-aucpr:0.95638
[7] validation-logloss:0.64803 validation-auc:0.95180 validation-aucpr:0.95821
[8] validation-logloss:0.64283 validation-auc:0.95159 validation-aucpr:0.95799
[9] validation-logloss:0.63748 validation-auc:0.95198 validation-aucpr:0.95828
[10] validation-logloss:0.63273 validation-auc:0.95181 validation-aucpr:0.95816
[11] validation-logloss:0.62795 validation-auc:0.95149 validation-aucpr:0.95787
[12] validation-logloss:0.62317 validation-auc:0.95153 validation-aucpr:0.95789
[13] validation-logloss:0.61846 validation-auc:0.95144 validation-aucpr:0.95775
[14] validation-logloss:0.61508 validation-auc:0.95123 validation-aucpr:0.95739
[15] validation-logloss:0.61057 validation-auc:0.95109 validation-aucpr:0.95734
[16] validation-logloss:0.60444 validation-auc:0.95351 validation-aucpr:0.96023
[17] validation-logloss:0.60028 validation-auc:0.95374 validation-aucpr:0.96051
[18] validation-logloss:0.59644 validation-auc:0.95353 validation-aucpr:0.96025
[19] validation-logloss:0.59326 validation-auc:0.95312 validation-aucpr:0.95977
[20] validation-logloss:0.58928 validation-auc:0.95288 validation-aucpr:0.95952
[21] validation-logloss:0.58539 validation-auc:0.95273 validation-aucpr:0.95934
[22] validation-logloss:0.58042 validation-auc:0.95352 validation-aucpr:0.96027
[23] validation-logloss:0.57656 validation-auc:0.95350 validation-aucpr:0.96027
[24] validation-logloss:0.57352 validation-auc:0.95344 validation-aucpr:0.96021
[25] validation-logloss:0.56965 validation-auc:0.95341 validation-aucpr:0.96002
[26] validation-logloss:0.56559 validation-auc:0.95378 validation-aucpr:0.96048
[27] validation-logloss:0.56186 validation-auc:0.95373 validation-aucpr:0.96034
[28] validation-logloss:0.55839 validation-auc:0.95359 validation-aucpr:0.96008
[29] validation-logloss:0.55497 validation-auc:0.95371 validation-aucpr:0.96023
[30] validation-logloss:0.55070 validation-auc:0.95411 validation-aucpr:0.96057
[31] validation-logloss:0.54690 validation-auc:0.95416 validation-aucpr:0.96066
[32] validation-logloss:0.54243 validation-auc:0.95432 validation-aucpr:0.96094
[33] validation-logloss:0.53915 validation-auc:0.95438 validation-aucpr:0.96096
[34] validation-logloss:0.53610 validation-auc:0.95436 validation-aucpr:0.96095
[35] validation-logloss:0.53343 validation-auc:0.95426 validation-aucpr:0.96077
[36] validation-logloss:0.52925 validation-auc:0.95463 validation-aucpr:0.96114
[37] validation-logloss:0.52563 validation-auc:0.95476 validation-aucpr:0.96122
[38] validation-logloss:0.52269 validation-auc:0.95480 validation-aucpr:0.96118
[39] validation-logloss:0.51956 validation-auc:0.95483 validation-aucpr:0.96119
[40] validation-logloss:0.51683 validation-auc:0.95481 validation-aucpr:0.96109
[41] validation-logloss:0.51382 validation-auc:0.95477 validation-aucpr:0.96107
[42] validation-logloss:0.51128 validation-auc:0.95479 validation-aucpr:0.96111
[43] validation-logloss:0.50775 validation-auc:0.95485 validation-aucpr:0.96123
[44] validation-logloss:0.50486 validation-auc:0.95482 validation-aucpr:0.96119
[45] validation-logloss:0.50206 validation-auc:0.95476 validation-aucpr:0.96108
[46] validation-logloss:0.49940 validation-auc:0.95477 validation-aucpr:0.96111
[47] validation-logloss:0.49656 validation-auc:0.95488 validation-aucpr:0.96121
[48] validation-logloss:0.49429 validation-auc:0.95480 validation-aucpr:0.96113
[49] validation-logloss:0.49157 validation-auc:0.95470 validation-aucpr:0.96107
[50] validation-logloss:0.48906 validation-auc:0.95466 validation-aucpr:0.96099
[51] validation-logloss:0.48614 validation-auc:0.95480 validation-aucpr:0.96114
[52] validation-logloss:0.48386 validation-auc:0.95479 validation-aucpr:0.96108
[53] validation-logloss:0.48147 validation-auc:0.95477 validation-aucpr:0.96110
[54] validation-logloss:0.47917 validation-auc:0.95474 validation-aucpr:0.96109
[55] validation-logloss:0.47708 validation-auc:0.95469 validation-aucpr:0.96106
[56] validation-logloss:0.47488 validation-auc:0.95477 validation-aucpr:0.96111
[57] validation-logloss:0.47260 validation-auc:0.95482 validation-aucpr:0.96112
[58] validation-logloss:0.47038 validation-auc:0.95481 validation-aucpr:0.96108
[59] validation-logloss:0.46774 validation-auc:0.95493 validation-aucpr:0.96115
[60] validation-logloss:0.46438 validation-auc:0.95518 validation-aucpr:0.96139
[61] validation-logloss:0.46225 validation-auc:0.95516 validation-aucpr:0.96139
[62] validation-logloss:0.46016 validation-auc:0.95514 validation-aucpr:0.96135
[63] validation-logloss:0.45807 validation-auc:0.95516 validation-aucpr:0.96137
[64] validation-logloss:0.45603 validation-auc:0.95514 validation-aucpr:0.96143
[65] validation-logloss:0.45402 validation-auc:0.95504 validation-aucpr:0.96136
[66] validation-logloss:0.45191 validation-auc:0.95500 validation-aucpr:0.96129
[67] validation-logloss:0.44887 validation-auc:0.95519 validation-aucpr:0.96144
[68] validation-logloss:0.44610 validation-auc:0.95525 validation-aucpr:0.96155
[69] validation-logloss:0.44427 validation-auc:0.95520 validation-aucpr:0.96151
[70] validation-logloss:0.44184 validation-auc:0.95531 validation-aucpr:0.96159
[71] validation-logloss:0.43987 validation-auc:0.95529 validation-aucpr:0.96156
[72] validation-logloss:0.43790 validation-auc:0.95530 validation-aucpr:0.96160
[73] validation-logloss:0.43610 validation-auc:0.95524 validation-aucpr:0.96159
[74] validation-logloss:0.43426 validation-auc:0.95523 validation-aucpr:0.96157
[75] validation-logloss:0.43247 validation-auc:0.95528 validation-aucpr:0.96160
[76] validation-logloss:0.42998 validation-auc:0.95543 validation-aucpr:0.96173
[77] validation-logloss:0.42864 validation-auc:0.95538 validation-aucpr:0.96167
[78] validation-logloss:0.42688 validation-auc:0.95535 validation-aucpr:0.96162
[79] validation-logloss:0.42583 validation-auc:0.95535 validation-aucpr:0.96159
[80] validation-logloss:0.42420 validation-auc:0.95529 validation-aucpr:0.96152
[81] validation-logloss:0.42243 validation-auc:0.95532 validation-aucpr:0.96155
[82] validation-logloss:0.42065 validation-auc:0.95529 validation-aucpr:0.96160
[83] validation-logloss:0.41924 validation-auc:0.95531 validation-aucpr:0.96161
[84] validation-logloss:0.41796 validation-auc:0.95531 validation-aucpr:0.96161
[85] validation-logloss:0.41638 validation-auc:0.95526 validation-aucpr:0.96157
[86] validation-logloss:0.41508 validation-auc:0.95524 validation-aucpr:0.96153
[87] validation-logloss:0.41381 validation-auc:0.95531 validation-aucpr:0.96161
[88] validation-logloss:0.41228 validation-auc:0.95529 validation-aucpr:0.96163
[89] validation-logloss:0.41099 validation-auc:0.95527 validation-aucpr:0.96169
[90] validation-logloss:0.40904 validation-auc:0.95532 validation-aucpr:0.96178
[91] validation-logloss:0.40759 validation-auc:0.95532 validation-aucpr:0.96178
[92] validation-logloss:0.40635 validation-auc:0.95529 validation-aucpr:0.96174
{'best_iteration': '90', 'best_score': '0.9617848063106246'}
Trial 0, Fold 5: Log loss = 0.4063536315120741, Average precision = 0.961749182442062, ROC-AUC = 0.9552881110306002, Elapsed Time = 15.487014800000907 seconds
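Each fold above trains with early stopping, and the `{'best_iteration': ..., 'best_score': ...}` lines record the boosting round where the monitored metric (PR-AUC here) peaked. As a pure-Python sketch of the bookkeeping behind such a rule (a hypothetical helper for illustration; the notebook relies on XGBoost's built-in early stopping rather than this function):

```python
# Sketch of early-stopping bookkeeping: track the best score seen so far
# and stop once it has not improved for `patience` consecutive rounds.
# (Hypothetical helper; XGBoost implements this internally via
# early_stopping_rounds.)
def early_stop(scores, patience):
    """Return (best_iteration, best_score) for a maximized metric."""
    best_iter, best_score = 0, float("-inf")
    for i, s in enumerate(scores):
        if s > best_score:
            best_iter, best_score = i, s
        elif i - best_iter >= patience:
            break  # no improvement for `patience` rounds: stop training
    return best_iter, best_score

# e.g. the metric peaks at round 2, then stalls long enough to trigger a stop
print(early_stop([0.90, 0.95, 0.96, 0.955, 0.950, 0.949], patience=3))  # → (2, 0.96)
```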
Optimization Progress: 1%|1 | 1/100 [01:28<2:25:19, 88.08s/it]
Trial 1, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 1, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.63392 validation-auc:0.94384 validation-aucpr:0.94999
[1] validation-logloss:0.58832 validation-auc:0.94990 validation-aucpr:0.94299
[2] validation-logloss:0.54919 validation-auc:0.95494 validation-aucpr:0.94824
[3] validation-logloss:0.51654 validation-auc:0.95652 validation-aucpr:0.95592
[4] validation-logloss:0.48251 validation-auc:0.95997 validation-aucpr:0.96125
{'best_iteration': '4', 'best_score': '0.9612493689860694'}
Trial 1, Fold 1: Log loss = 0.4825125440000191, Average precision = 0.9622372581737065, ROC-AUC = 0.9599684756217052, Elapsed Time = 1.3910605000000942 seconds
Trial 1, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 1, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.63285 validation-auc:0.93630 validation-aucpr:0.94394
[1] validation-logloss:0.58116 validation-auc:0.95561 validation-aucpr:0.95181
[2] validation-logloss:0.54115 validation-auc:0.96053 validation-aucpr:0.95841
[3] validation-logloss:0.50679 validation-auc:0.96251 validation-aucpr:0.96537
[4] validation-logloss:0.47790 validation-auc:0.96320 validation-aucpr:0.96723
{'best_iteration': '4', 'best_score': '0.9672330098919696'}
Trial 1, Fold 2: Log loss = 0.4778954946598143, Average precision = 0.9670471551250688, ROC-AUC = 0.963202831249482, Elapsed Time = 1.3718076999994082 seconds
Trial 1, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 1, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.63230 validation-auc:0.93998 validation-aucpr:0.95407
[1] validation-logloss:0.58061 validation-auc:0.95589 validation-aucpr:0.94565
[2] validation-logloss:0.54078 validation-auc:0.96280 validation-aucpr:0.96040
[3] validation-logloss:0.50346 validation-auc:0.96485 validation-aucpr:0.96751
[4] validation-logloss:0.46977 validation-auc:0.96618 validation-aucpr:0.97034
{'best_iteration': '4', 'best_score': '0.9703432338893858'}
Trial 1, Fold 3: Log loss = 0.4697678746716907, Average precision = 0.9706581052575195, ROC-AUC = 0.9661814480983523, Elapsed Time = 1.581263799998851 seconds
Trial 1, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 1, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[18:00:28] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[0] validation-logloss:0.63334 validation-auc:0.92408 validation-aucpr:0.91700
[18:00:28] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[1] validation-logloss:0.58359 validation-auc:0.94845 validation-aucpr:0.93471
[18:00:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[2] validation-logloss:0.54465 validation-auc:0.95682 validation-aucpr:0.95295
[18:00:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[3] validation-logloss:0.50549 validation-auc:0.96154 validation-aucpr:0.96622
[4] validation-logloss:0.47218 validation-auc:0.96303 validation-aucpr:0.96621
{'best_iteration': '3', 'best_score': '0.966221599980624'}
Trial 1, Fold 4: Log loss = 0.4721788659488989, Average precision = 0.9668029424338604, ROC-AUC = 0.9630322598720525, Elapsed Time = 1.341411400000652 seconds
Trial 1, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 1, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.63208 validation-auc:0.93269 validation-aucpr:0.94766
[1] validation-logloss:0.58183 validation-auc:0.95642 validation-aucpr:0.95017
[2] validation-logloss:0.53918 validation-auc:0.95850 validation-aucpr:0.95651
[3] validation-logloss:0.50252 validation-auc:0.96053 validation-aucpr:0.96324
[4] validation-logloss:0.47027 validation-auc:0.96212 validation-aucpr:0.96623
{'best_iteration': '4', 'best_score': '0.9662340562252816'}
Trial 1, Fold 5: Log loss = 0.47026629801259107, Average precision = 0.9665882911261429, ROC-AUC = 0.962115898407744, Elapsed Time = 1.2950529999998253 seconds
Optimization Progress:   2% | 2/100 [01:43<1:13:57, 45.28s/it]
Trial 2, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 2, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.68353 validation-auc:0.91262 validation-aucpr:0.91609
[1] validation-logloss:0.67234 validation-auc:0.94708 validation-aucpr:0.95478
[2] validation-logloss:0.66292 validation-auc:0.95054 validation-aucpr:0.95893
[3] validation-logloss:0.65436 validation-auc:0.94975 validation-aucpr:0.95791
[4] validation-logloss:0.64550 validation-auc:0.95050 validation-aucpr:0.95875
[5] validation-logloss:0.63603 validation-auc:0.95166 validation-aucpr:0.95911
[6] validation-logloss:0.62775 validation-auc:0.95168 validation-aucpr:0.95896
[7] validation-logloss:0.62005 validation-auc:0.95170 validation-aucpr:0.95889
[8] validation-logloss:0.61279 validation-auc:0.95124 validation-aucpr:0.95832
[9] validation-logloss:0.60512 validation-auc:0.95178 validation-aucpr:0.95881
[10] validation-logloss:0.59861 validation-auc:0.95132 validation-aucpr:0.95826
[11] validation-logloss:0.59203 validation-auc:0.95058 validation-aucpr:0.95749
[12] validation-logloss:0.58474 validation-auc:0.95131 validation-aucpr:0.95848
[13] validation-logloss:0.57802 validation-auc:0.95106 validation-aucpr:0.95827
[14] validation-logloss:0.57120 validation-auc:0.95149 validation-aucpr:0.95893
[15] validation-logloss:0.56375 validation-auc:0.95204 validation-aucpr:0.95958
[16] validation-logloss:0.55670 validation-auc:0.95233 validation-aucpr:0.96010
[17] validation-logloss:0.55124 validation-auc:0.95205 validation-aucpr:0.95977
[18] validation-logloss:0.54516 validation-auc:0.95276 validation-aucpr:0.96034
[19] validation-logloss:0.53921 validation-auc:0.95289 validation-aucpr:0.96018
[20] validation-logloss:0.53406 validation-auc:0.95284 validation-aucpr:0.96014
[21] validation-logloss:0.52856 validation-auc:0.95288 validation-aucpr:0.96005
{'best_iteration': '18', 'best_score': '0.9603435077097092'}
Trial 2, Fold 1: Log loss = 0.5285589149145459, Average precision = 0.9599034058322928, ROC-AUC = 0.9528846762784207, Elapsed Time = 1.5365930999996635 seconds
Trial 2, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 2, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.68363 validation-auc:0.91583 validation-aucpr:0.91126
[1] validation-logloss:0.67451 validation-auc:0.92725 validation-aucpr:0.92492
[2] validation-logloss:0.66501 validation-auc:0.93310 validation-aucpr:0.93311
[3] validation-logloss:0.65645 validation-auc:0.93276 validation-aucpr:0.93250
[4] validation-logloss:0.64741 validation-auc:0.93954 validation-aucpr:0.94073
[5] validation-logloss:0.63955 validation-auc:0.94139 validation-aucpr:0.94358
[6] validation-logloss:0.63133 validation-auc:0.94242 validation-aucpr:0.94462
[7] validation-logloss:0.62367 validation-auc:0.94334 validation-aucpr:0.94536
[8] validation-logloss:0.61641 validation-auc:0.94318 validation-aucpr:0.94531
[9] validation-logloss:0.60882 validation-auc:0.94351 validation-aucpr:0.94579
[10] validation-logloss:0.60229 validation-auc:0.94309 validation-aucpr:0.94541
[11] validation-logloss:0.59566 validation-auc:0.94290 validation-aucpr:0.94513
[12] validation-logloss:0.58841 validation-auc:0.94387 validation-aucpr:0.94651
[13] validation-logloss:0.58162 validation-auc:0.94394 validation-aucpr:0.94659
[14] validation-logloss:0.57482 validation-auc:0.94452 validation-aucpr:0.94742
[15] validation-logloss:0.56723 validation-auc:0.95083 validation-aucpr:0.95501
[16] validation-logloss:0.56008 validation-auc:0.95258 validation-aucpr:0.95742
[17] validation-logloss:0.55448 validation-auc:0.95270 validation-aucpr:0.95784
[18] validation-logloss:0.54846 validation-auc:0.95332 validation-aucpr:0.95830
[19] validation-logloss:0.54241 validation-auc:0.95380 validation-aucpr:0.95825
[20] validation-logloss:0.53708 validation-auc:0.95368 validation-aucpr:0.95823
[21] validation-logloss:0.53168 validation-auc:0.95359 validation-aucpr:0.95806
{'best_iteration': '18', 'best_score': '0.9582963343506642'}
Trial 2, Fold 2: Log loss = 0.531684667525449, Average precision = 0.957925966584543, ROC-AUC = 0.9535940855997118, Elapsed Time = 1.873345200001495 seconds
Trial 2, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 2, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.68134 validation-auc:0.94037 validation-aucpr:0.93717
[1] validation-logloss:0.67031 validation-auc:0.95352 validation-aucpr:0.95412
[2] validation-logloss:0.66074 validation-auc:0.95657 validation-aucpr:0.96157
[3] validation-logloss:0.65196 validation-auc:0.95585 validation-aucpr:0.96089
[4] validation-logloss:0.64278 validation-auc:0.95889 validation-aucpr:0.96347
[5] validation-logloss:0.63391 validation-auc:0.95837 validation-aucpr:0.96310
[6] validation-logloss:0.62601 validation-auc:0.95815 validation-aucpr:0.96281
[7] validation-logloss:0.62166 validation-auc:0.95836 validation-aucpr:0.96298
[8] validation-logloss:0.61422 validation-auc:0.95780 validation-aucpr:0.96302
[9] validation-logloss:0.60690 validation-auc:0.95724 validation-aucpr:0.96235
[10] validation-logloss:0.59931 validation-auc:0.95768 validation-aucpr:0.96253
[11] validation-logloss:0.59238 validation-auc:0.95781 validation-aucpr:0.96246
[12] validation-logloss:0.58582 validation-auc:0.95784 validation-aucpr:0.96261
[13] validation-logloss:0.57913 validation-auc:0.95781 validation-aucpr:0.96244
[14] validation-logloss:0.57286 validation-auc:0.95759 validation-aucpr:0.96229
[15] validation-logloss:0.56661 validation-auc:0.95782 validation-aucpr:0.96254
[16] validation-logloss:0.56078 validation-auc:0.95787 validation-aucpr:0.96279
[17] validation-logloss:0.55301 validation-auc:0.95892 validation-aucpr:0.96404
[18] validation-logloss:0.54702 validation-auc:0.95879 validation-aucpr:0.96390
[19] validation-logloss:0.54167 validation-auc:0.95887 validation-aucpr:0.96411
[20] validation-logloss:0.53642 validation-auc:0.95868 validation-aucpr:0.96390
[21] validation-logloss:0.52959 validation-auc:0.95933 validation-aucpr:0.96467
{'best_iteration': '21', 'best_score': '0.9646701594022623'}
Trial 2, Fold 3: Log loss = 0.5295945618795187, Average precision = 0.9644928261022662, ROC-AUC = 0.9593264836609491, Elapsed Time = 1.8513464000006934 seconds
Trial 2, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 2, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.68152 validation-auc:0.94070 validation-aucpr:0.94479
[1] validation-logloss:0.67048 validation-auc:0.94815 validation-aucpr:0.95350
[2] validation-logloss:0.66102 validation-auc:0.95153 validation-aucpr:0.95926
[3] validation-logloss:0.65216 validation-auc:0.95090 validation-aucpr:0.95798
[4] validation-logloss:0.64325 validation-auc:0.95219 validation-aucpr:0.95918
[5] validation-logloss:0.63529 validation-auc:0.95192 validation-aucpr:0.95929
[6] validation-logloss:0.62694 validation-auc:0.95320 validation-aucpr:0.96008
[7] validation-logloss:0.61915 validation-auc:0.95280 validation-aucpr:0.95979
[8] validation-logloss:0.61180 validation-auc:0.95276 validation-aucpr:0.95947
[9] validation-logloss:0.60462 validation-auc:0.95194 validation-aucpr:0.95871
[10] validation-logloss:0.59717 validation-auc:0.95284 validation-aucpr:0.95903
[11] validation-logloss:0.59054 validation-auc:0.95395 validation-aucpr:0.95991
[12] validation-logloss:0.58411 validation-auc:0.95353 validation-aucpr:0.95980
[13] validation-logloss:0.57772 validation-auc:0.95308 validation-aucpr:0.95942
[14] validation-logloss:0.57136 validation-auc:0.95375 validation-aucpr:0.96010
[15] validation-logloss:0.56518 validation-auc:0.95423 validation-aucpr:0.96066
[16] validation-logloss:0.55945 validation-auc:0.95389 validation-aucpr:0.96035
[17] validation-logloss:0.55176 validation-auc:0.95459 validation-aucpr:0.96128
[18] validation-logloss:0.54599 validation-auc:0.95468 validation-aucpr:0.96129
[19] validation-logloss:0.54079 validation-auc:0.95501 validation-aucpr:0.96150
[20] validation-logloss:0.53568 validation-auc:0.95469 validation-aucpr:0.96121
[21] validation-logloss:0.52909 validation-auc:0.95507 validation-aucpr:0.96164
{'best_iteration': '21', 'best_score': '0.9616435554777185'}
Trial 2, Fold 4: Log loss = 0.52909184951774, Average precision = 0.961437936932935, ROC-AUC = 0.9550681076847143, Elapsed Time = 1.8316604000010557 seconds
Trial 2, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 2, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.68143 validation-auc:0.93842 validation-aucpr:0.94365
[1] validation-logloss:0.67050 validation-auc:0.94675 validation-aucpr:0.95040
[2] validation-logloss:0.66113 validation-auc:0.94800 validation-aucpr:0.95419
[3] validation-logloss:0.65252 validation-auc:0.94793 validation-aucpr:0.95343
[4] validation-logloss:0.64373 validation-auc:0.95174 validation-aucpr:0.95681
[5] validation-logloss:0.63496 validation-auc:0.95247 validation-aucpr:0.95770
[6] validation-logloss:0.62714 validation-auc:0.95206 validation-aucpr:0.95746
[7] validation-logloss:0.62275 validation-auc:0.95204 validation-aucpr:0.95729
[8] validation-logloss:0.61510 validation-auc:0.95156 validation-aucpr:0.95661
[9] validation-logloss:0.60807 validation-auc:0.95122 validation-aucpr:0.95616
[10] validation-logloss:0.60071 validation-auc:0.95169 validation-aucpr:0.95680
[11] validation-logloss:0.59420 validation-auc:0.95135 validation-aucpr:0.95662
[12] validation-logloss:0.58782 validation-auc:0.95151 validation-aucpr:0.95707
[13] validation-logloss:0.58148 validation-auc:0.95131 validation-aucpr:0.95696
[14] validation-logloss:0.57524 validation-auc:0.95164 validation-aucpr:0.95725
[15] validation-logloss:0.56916 validation-auc:0.95224 validation-aucpr:0.95787
[16] validation-logloss:0.56337 validation-auc:0.95244 validation-aucpr:0.95802
[17] validation-logloss:0.55568 validation-auc:0.95336 validation-aucpr:0.95948
[18] validation-logloss:0.54982 validation-auc:0.95347 validation-aucpr:0.95946
[19] validation-logloss:0.54483 validation-auc:0.95392 validation-aucpr:0.95977
[20] validation-logloss:0.53981 validation-auc:0.95366 validation-aucpr:0.95942
[21] validation-logloss:0.53325 validation-auc:0.95402 validation-aucpr:0.96005
{'best_iteration': '21', 'best_score': '0.9600526891124913'}
Trial 2, Fold 5: Log loss = 0.5332477529706764, Average precision = 0.9599245768391352, ROC-AUC = 0.9540233729074932, Elapsed Time = 1.8385123999996722 seconds
Optimization Progress:   3% | 3/100 [02:00<52:27, 32.45s/it]
Trial 3, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 3, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.68648 validation-auc:0.92196 validation-aucpr:0.92137
[1] validation-logloss:0.67979 validation-auc:0.93993 validation-aucpr:0.94332
[2] validation-logloss:0.67481 validation-auc:0.93782 validation-aucpr:0.94263
[3] validation-logloss:0.67037 validation-auc:0.94815 validation-aucpr:0.95520
[4] validation-logloss:0.66325 validation-auc:0.95313 validation-aucpr:0.96064
[5] validation-logloss:0.65757 validation-auc:0.95367 validation-aucpr:0.96074
[6] validation-logloss:0.65164 validation-auc:0.95371 validation-aucpr:0.96076
[7] validation-logloss:0.64613 validation-auc:0.95296 validation-aucpr:0.95999
[8] validation-logloss:0.63929 validation-auc:0.95503 validation-aucpr:0.96216
[9] validation-logloss:0.63400 validation-auc:0.95465 validation-aucpr:0.96159
[10] validation-logloss:0.62856 validation-auc:0.95467 validation-aucpr:0.96143
[11] validation-logloss:0.62336 validation-auc:0.95547 validation-aucpr:0.96198
[12] validation-logloss:0.61810 validation-auc:0.95542 validation-aucpr:0.96221
[13] validation-logloss:0.61454 validation-auc:0.95524 validation-aucpr:0.96216
[14] validation-logloss:0.60945 validation-auc:0.95545 validation-aucpr:0.96244
[15] validation-logloss:0.60455 validation-auc:0.95551 validation-aucpr:0.96232
[16] validation-logloss:0.59971 validation-auc:0.95561 validation-aucpr:0.96230
[17] validation-logloss:0.59507 validation-auc:0.95588 validation-aucpr:0.96268
[18] validation-logloss:0.59096 validation-auc:0.95576 validation-aucpr:0.96253
[19] validation-logloss:0.58663 validation-auc:0.95559 validation-aucpr:0.96248
[20] validation-logloss:0.58216 validation-auc:0.95558 validation-aucpr:0.96244
[21] validation-logloss:0.57705 validation-auc:0.95626 validation-aucpr:0.96306
[22] validation-logloss:0.57264 validation-auc:0.95622 validation-aucpr:0.96300
[23] validation-logloss:0.56832 validation-auc:0.95613 validation-aucpr:0.96291
[24] validation-logloss:0.56404 validation-auc:0.95620 validation-aucpr:0.96287
[25] validation-logloss:0.56015 validation-auc:0.95604 validation-aucpr:0.96269
[26] validation-logloss:0.55595 validation-auc:0.95590 validation-aucpr:0.96254
[27] validation-logloss:0.55209 validation-auc:0.95562 validation-aucpr:0.96222
[28] validation-logloss:0.54827 validation-auc:0.95578 validation-aucpr:0.96249
[29] validation-logloss:0.54546 validation-auc:0.95586 validation-aucpr:0.96257
[30] validation-logloss:0.54056 validation-auc:0.95655 validation-aucpr:0.96335
[31] validation-logloss:0.53725 validation-auc:0.95638 validation-aucpr:0.96318
[32] validation-logloss:0.53406 validation-auc:0.95641 validation-aucpr:0.96320
[33] validation-logloss:0.53054 validation-auc:0.95637 validation-aucpr:0.96316
[34] validation-logloss:0.52707 validation-auc:0.95630 validation-aucpr:0.96310
[35] validation-logloss:0.52277 validation-auc:0.95666 validation-aucpr:0.96350
[36] validation-logloss:0.51957 validation-auc:0.95649 validation-aucpr:0.96334
[37] validation-logloss:0.51637 validation-auc:0.95649 validation-aucpr:0.96334
[38] validation-logloss:0.51229 validation-auc:0.95659 validation-aucpr:0.96342
[39] validation-logloss:0.50904 validation-auc:0.95681 validation-aucpr:0.96358
[40] validation-logloss:0.50655 validation-auc:0.95683 validation-aucpr:0.96355
[41] validation-logloss:0.50273 validation-auc:0.95698 validation-aucpr:0.96370
[42] validation-logloss:0.49980 validation-auc:0.95690 validation-aucpr:0.96363
[43] validation-logloss:0.49695 validation-auc:0.95687 validation-aucpr:0.96359
[44] validation-logloss:0.49428 validation-auc:0.95680 validation-aucpr:0.96350
[45] validation-logloss:0.49116 validation-auc:0.95685 validation-aucpr:0.96354
[46] validation-logloss:0.48822 validation-auc:0.95672 validation-aucpr:0.96342
[47] validation-logloss:0.48540 validation-auc:0.95663 validation-aucpr:0.96334
[48] validation-logloss:0.48298 validation-auc:0.95654 validation-aucpr:0.96326
[49] validation-logloss:0.48015 validation-auc:0.95654 validation-aucpr:0.96329
[50] validation-logloss:0.47744 validation-auc:0.95641 validation-aucpr:0.96319
[51] validation-logloss:0.47453 validation-auc:0.95648 validation-aucpr:0.96323
[52] validation-logloss:0.47209 validation-auc:0.95646 validation-aucpr:0.96319
[53] validation-logloss:0.46993 validation-auc:0.95650 validation-aucpr:0.96322
[54] validation-logloss:0.46865 validation-auc:0.95653 validation-aucpr:0.96326
[55] validation-logloss:0.46506 validation-auc:0.95667 validation-aucpr:0.96344
[56] validation-logloss:0.46277 validation-auc:0.95667 validation-aucpr:0.96344
[57] validation-logloss:0.46066 validation-auc:0.95663 validation-aucpr:0.96343
[58] validation-logloss:0.45836 validation-auc:0.95651 validation-aucpr:0.96332
[59] validation-logloss:0.45611 validation-auc:0.95654 validation-aucpr:0.96333
[60] validation-logloss:0.45582 validation-auc:0.95657 validation-aucpr:0.96336
[61] validation-logloss:0.45406 validation-auc:0.95660 validation-aucpr:0.96342
[62] validation-logloss:0.45193 validation-auc:0.95664 validation-aucpr:0.96341
[63] validation-logloss:0.44985 validation-auc:0.95661 validation-aucpr:0.96335
[64] validation-logloss:0.44796 validation-auc:0.95671 validation-aucpr:0.96346
[65] validation-logloss:0.44593 validation-auc:0.95667 validation-aucpr:0.96343
[66] validation-logloss:0.44410 validation-auc:0.95673 validation-aucpr:0.96346
[67] validation-logloss:0.44186 validation-auc:0.95676 validation-aucpr:0.96345
[68] validation-logloss:0.43990 validation-auc:0.95669 validation-aucpr:0.96339
[69] validation-logloss:0.43801 validation-auc:0.95665 validation-aucpr:0.96336
[70] validation-logloss:0.43578 validation-auc:0.95671 validation-aucpr:0.96341
[71] validation-logloss:0.43273 validation-auc:0.95693 validation-aucpr:0.96363
[72] validation-logloss:0.43068 validation-auc:0.95696 validation-aucpr:0.96366
[73] validation-logloss:0.42894 validation-auc:0.95691 validation-aucpr:0.96359
[74] validation-logloss:0.42739 validation-auc:0.95698 validation-aucpr:0.96364
[75] validation-logloss:0.42569 validation-auc:0.95700 validation-aucpr:0.96364
[76] validation-logloss:0.42396 validation-auc:0.95705 validation-aucpr:0.96369
[77] validation-logloss:0.42233 validation-auc:0.95705 validation-aucpr:0.96369
[78] validation-logloss:0.42070 validation-auc:0.95705 validation-aucpr:0.96367
[79] validation-logloss:0.41917 validation-auc:0.95703 validation-aucpr:0.96366
[80] validation-logloss:0.41726 validation-auc:0.95706 validation-aucpr:0.96368
[81] validation-logloss:0.41545 validation-auc:0.95709 validation-aucpr:0.96368
[82] validation-logloss:0.41369 validation-auc:0.95701 validation-aucpr:0.96362
[83] validation-logloss:0.41209 validation-auc:0.95696 validation-aucpr:0.96359
[84] validation-logloss:0.41013 validation-auc:0.95701 validation-aucpr:0.96363
[85] validation-logloss:0.40853 validation-auc:0.95708 validation-aucpr:0.96368
[86] validation-logloss:0.40618 validation-auc:0.95720 validation-aucpr:0.96385
[87] validation-logloss:0.40457 validation-auc:0.95721 validation-aucpr:0.96385
[88] validation-logloss:0.40296 validation-auc:0.95719 validation-aucpr:0.96383
[89] validation-logloss:0.40150 validation-auc:0.95715 validation-aucpr:0.96378
[90] validation-logloss:0.40007 validation-auc:0.95706 validation-aucpr:0.96372
[91] validation-logloss:0.39855 validation-auc:0.95708 validation-aucpr:0.96374
[92] validation-logloss:0.39658 validation-auc:0.95715 validation-aucpr:0.96376
[93] validation-logloss:0.39545 validation-auc:0.95710 validation-aucpr:0.96370
[94] validation-logloss:0.39386 validation-auc:0.95710 validation-aucpr:0.96368
{'best_iteration': '87', 'best_score': '0.9638543290754388'}
Trial 3, Fold 1: Log loss = 0.3938565476281746, Average precision = 0.9636843615734073, ROC-AUC = 0.957102141017001, Elapsed Time = 2.7186390999995638 seconds
Trial 3, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 3, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.68674 validation-auc:0.91252 validation-aucpr:0.91239
[1] validation-logloss:0.68039 validation-auc:0.92215 validation-aucpr:0.92220
[2] validation-logloss:0.67433 validation-auc:0.92483 validation-aucpr:0.92480
[3] validation-logloss:0.66802 validation-auc:0.92934 validation-aucpr:0.92810
[4] validation-logloss:0.66176 validation-auc:0.93242 validation-aucpr:0.93143
[5] validation-logloss:0.65601 validation-auc:0.93170 validation-aucpr:0.93022
[6] validation-logloss:0.64917 validation-auc:0.94908 validation-aucpr:0.95162
[7] validation-logloss:0.64310 validation-auc:0.95003 validation-aucpr:0.95237
[8] validation-logloss:0.63788 validation-auc:0.94970 validation-aucpr:0.95273
[9] validation-logloss:0.63344 validation-auc:0.94830 validation-aucpr:0.95040
[10] validation-logloss:0.62848 validation-auc:0.94864 validation-aucpr:0.95117
[11] validation-logloss:0.62404 validation-auc:0.94934 validation-aucpr:0.95200
[12] validation-logloss:0.61965 validation-auc:0.94871 validation-aucpr:0.95112
[13] validation-logloss:0.61347 validation-auc:0.95379 validation-aucpr:0.95793
[14] validation-logloss:0.60851 validation-auc:0.95415 validation-aucpr:0.95834
[15] validation-logloss:0.60376 validation-auc:0.95383 validation-aucpr:0.95810
[16] validation-logloss:0.59908 validation-auc:0.95385 validation-aucpr:0.95820
[17] validation-logloss:0.59466 validation-auc:0.95366 validation-aucpr:0.95813
[18] validation-logloss:0.59066 validation-auc:0.95372 validation-aucpr:0.95829
[19] validation-logloss:0.58502 validation-auc:0.95545 validation-aucpr:0.96043
[20] validation-logloss:0.57953 validation-auc:0.95649 validation-aucpr:0.96159
[21] validation-logloss:0.57560 validation-auc:0.95649 validation-aucpr:0.96162
[22] validation-logloss:0.57118 validation-auc:0.95677 validation-aucpr:0.96178
[23] validation-logloss:0.56723 validation-auc:0.95678 validation-aucpr:0.96165
[24] validation-logloss:0.56202 validation-auc:0.95735 validation-aucpr:0.96245
[25] validation-logloss:0.55826 validation-auc:0.95720 validation-aucpr:0.96226
[26] validation-logloss:0.55383 validation-auc:0.95742 validation-aucpr:0.96257
[27] validation-logloss:0.55008 validation-auc:0.95733 validation-aucpr:0.96242
[28] validation-logloss:0.54654 validation-auc:0.95729 validation-aucpr:0.96233
[29] validation-logloss:0.54262 validation-auc:0.95732 validation-aucpr:0.96231
[30] validation-logloss:0.53918 validation-auc:0.95710 validation-aucpr:0.96213
[31] validation-logloss:0.53458 validation-auc:0.95760 validation-aucpr:0.96258
[32] validation-logloss:0.53096 validation-auc:0.95760 validation-aucpr:0.96257
[33] validation-logloss:0.52740 validation-auc:0.95779 validation-aucpr:0.96271
[34] validation-logloss:0.52421 validation-auc:0.95769 validation-aucpr:0.96261
[35] validation-logloss:0.52063 validation-auc:0.95777 validation-aucpr:0.96267
[36] validation-logloss:0.51691 validation-auc:0.95793 validation-aucpr:0.96275
[37] validation-logloss:0.51367 validation-auc:0.95795 validation-aucpr:0.96272
[38] validation-logloss:0.51082 validation-auc:0.95784 validation-aucpr:0.96259
[39] validation-logloss:0.50749 validation-auc:0.95781 validation-aucpr:0.96254
[40] validation-logloss:0.50459 validation-auc:0.95778 validation-aucpr:0.96249
[41] validation-logloss:0.50120 validation-auc:0.95776 validation-aucpr:0.96246
[42] validation-logloss:0.49809 validation-auc:0.95776 validation-aucpr:0.96244
[43] validation-logloss:0.49520 validation-auc:0.95787 validation-aucpr:0.96254
[44] validation-logloss:0.49222 validation-auc:0.95793 validation-aucpr:0.96262
[45] validation-logloss:0.48890 validation-auc:0.95804 validation-aucpr:0.96268
[46] validation-logloss:0.48604 validation-auc:0.95804 validation-aucpr:0.96262
[47] validation-logloss:0.48285 validation-auc:0.95807 validation-aucpr:0.96267
[48] validation-logloss:0.48036 validation-auc:0.95802 validation-aucpr:0.96260
[49] validation-logloss:0.47753 validation-auc:0.95798 validation-aucpr:0.96262
[50] validation-logloss:0.47474 validation-auc:0.95789 validation-aucpr:0.96253
[51] validation-logloss:0.47210 validation-auc:0.95782 validation-aucpr:0.96248
[52] validation-logloss:0.46940 validation-auc:0.95781 validation-aucpr:0.96250
[53] validation-logloss:0.46680 validation-auc:0.95784 validation-aucpr:0.96246
[54] validation-logloss:0.46442 validation-auc:0.95825 validation-aucpr:0.96289
[55] validation-logloss:0.46219 validation-auc:0.95812 validation-aucpr:0.96277
[56] validation-logloss:0.45988 validation-auc:0.95812 validation-aucpr:0.96273
[57] validation-logloss:0.45765 validation-auc:0.95814 validation-aucpr:0.96278
[58] validation-logloss:0.45521 validation-auc:0.95817 validation-aucpr:0.96278
[59] validation-logloss:0.45293 validation-auc:0.95813 validation-aucpr:0.96275
[60] validation-logloss:0.45100 validation-auc:0.95811 validation-aucpr:0.96270
[61] validation-logloss:0.44877 validation-auc:0.95804 validation-aucpr:0.96265
[62] validation-logloss:0.44689 validation-auc:0.95800 validation-aucpr:0.96257
[63] validation-logloss:0.44483 validation-auc:0.95801 validation-aucpr:0.96258
[64] validation-logloss:0.44305 validation-auc:0.95797 validation-aucpr:0.96253
[65] validation-logloss:0.44111 validation-auc:0.95796 validation-aucpr:0.96255
[66] validation-logloss:0.43910 validation-auc:0.95789 validation-aucpr:0.96247
[67] validation-logloss:0.43666 validation-auc:0.95789 validation-aucpr:0.96242
[68] validation-logloss:0.43462 validation-auc:0.95796 validation-aucpr:0.96245
[69] validation-logloss:0.43285 validation-auc:0.95786 validation-aucpr:0.96234
[70] validation-logloss:0.43097 validation-auc:0.95792 validation-aucpr:0.96245
[71] validation-logloss:0.42872 validation-auc:0.95792 validation-aucpr:0.96239
[72] validation-logloss:0.42701 validation-auc:0.95794 validation-aucpr:0.96240
[73] validation-logloss:0.42505 validation-auc:0.95795 validation-aucpr:0.96237
[74] validation-logloss:0.42348 validation-auc:0.95793 validation-aucpr:0.96233
[75] validation-logloss:0.42178 validation-auc:0.95797 validation-aucpr:0.96232
[76] validation-logloss:0.41989 validation-auc:0.95795 validation-aucpr:0.96235
[77] validation-logloss:0.41848 validation-auc:0.95790 validation-aucpr:0.96225
[78] validation-logloss:0.41675 validation-auc:0.95788 validation-aucpr:0.96225
[79] validation-logloss:0.41503 validation-auc:0.95793 validation-aucpr:0.96226
[80] validation-logloss:0.41369 validation-auc:0.95790 validation-aucpr:0.96223
[81] validation-logloss:0.41191 validation-auc:0.95791 validation-aucpr:0.96226
[82] validation-logloss:0.41051 validation-auc:0.95785 validation-aucpr:0.96220
[83] validation-logloss:0.40890 validation-auc:0.95783 validation-aucpr:0.96217
[84] validation-logloss:0.40747 validation-auc:0.95777 validation-aucpr:0.96212
[85] validation-logloss:0.40597 validation-auc:0.95777 validation-aucpr:0.96211
[86] validation-logloss:0.40467 validation-auc:0.95781 validation-aucpr:0.96209
[87] validation-logloss:0.40331 validation-auc:0.95776 validation-aucpr:0.96205
[88] validation-logloss:0.40164 validation-auc:0.95773 validation-aucpr:0.96201
[89] validation-logloss:0.40013 validation-auc:0.95768 validation-aucpr:0.96193
[90] validation-logloss:0.39886 validation-auc:0.95765 validation-aucpr:0.96189
[91] validation-logloss:0.39738 validation-auc:0.95765 validation-aucpr:0.96185
[92] validation-logloss:0.39623 validation-auc:0.95759 validation-aucpr:0.96178
[93] validation-logloss:0.39481 validation-auc:0.95764 validation-aucpr:0.96184
[94] validation-logloss:0.39321 validation-auc:0.95762 validation-aucpr:0.96178
{'best_iteration': '54', 'best_score': '0.9628922856828451'}
Trial 3, Fold 2: Log loss = 0.3932079132355882, Average precision = 0.9617895947599515, ROC-AUC = 0.9576229134300127, Elapsed Time = 2.8691792999998142 seconds
Trial 3, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 3, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.68637 validation-auc:0.91589 validation-aucpr:0.90796
[1] validation-logloss:0.68078 validation-auc:0.92850 validation-aucpr:0.92889
[2] validation-logloss:0.67416 validation-auc:0.94698 validation-aucpr:0.95374
[3] validation-logloss:0.66803 validation-auc:0.94846 validation-aucpr:0.95492
[4] validation-logloss:0.66274 validation-auc:0.94739 validation-aucpr:0.95377
[5] validation-logloss:0.65644 validation-auc:0.95036 validation-aucpr:0.95687
[6] validation-logloss:0.64935 validation-auc:0.95312 validation-aucpr:0.95948
[7] validation-logloss:0.64360 validation-auc:0.95329 validation-aucpr:0.95977
[8] validation-logloss:0.63840 validation-auc:0.95338 validation-aucpr:0.95971
[9] validation-logloss:0.63291 validation-auc:0.95463 validation-aucpr:0.96039
[10] validation-logloss:0.62713 validation-auc:0.95512 validation-aucpr:0.96082
[11] validation-logloss:0.62123 validation-auc:0.95592 validation-aucpr:0.96243
... [iterations 12-93 omitted: logloss fell steadily from 0.61643 to 0.39366; auc plateaued near 0.958-0.960] ...
[94] validation-logloss:0.39203 validation-auc:0.95937 validation-aucpr:0.96519
{'best_iteration': '75', 'best_score': '0.965536071811244'}
Trial 3, Fold 3: Log loss = 0.3920261647812823, Average precision = 0.9651939229642484, ROC-AUC = 0.9593688451666651, Elapsed Time = 2.9667830999987927 seconds
Trial 3, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 3, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.68650 validation-auc:0.91122 validation-aucpr:0.91672
... [iterations 1-93 omitted: logloss fell steadily from 0.68023 to 0.40213; auc climbed from 0.924 to about 0.955] ...
[94] validation-logloss:0.40036 validation-auc:0.95535 validation-aucpr:0.96205
{'best_iteration': '79', 'best_score': '0.9622923502065677'}
Trial 3, Fold 4: Log loss = 0.4003591014363517, Average precision = 0.9620560602631516, ROC-AUC = 0.9553464130131547, Elapsed Time = 2.7846226999990904 seconds
Trial 3, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 3, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.68639 validation-auc:0.91537 validation-aucpr:0.91728
... [iterations 1-93 omitted: logloss fell steadily from 0.68013 to 0.40207; auc climbed from 0.928 to about 0.953] ...
[94] validation-logloss:0.40052 validation-auc:0.95326 validation-aucpr:0.95957
{'best_iteration': '89', 'best_score': '0.959639811688674'}
Trial 3, Fold 5: Log loss = 0.4005184527327331, Average precision = 0.959576679824926, ROC-AUC = 0.9532623599576391, Elapsed Time = 2.835710700001073 seconds
Optimization Progress: 4% | 4/100 [02:22<45:23, 28.37s/it]
Trial 4, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 4, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.65332 validation-auc:0.91488 validation-aucpr:0.93009
... [iterations 1-87 omitted: logloss fell steadily from 0.61677 to 0.19684; auc climbed from 0.950 to about 0.972] ...
[88] validation-logloss:0.19681 validation-auc:0.97153 validation-aucpr:0.97551
{'best_iteration': '78', 'best_score': '0.97572330288339'}
Trial 4, Fold 1: Log loss = 0.1968148526228069, Average precision = 0.97551845493452, ROC-AUC = 0.9715284349712985, Elapsed Time = 2.971843600000284 seconds
Trial 4, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 4, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.65239 validation-auc:0.94171 validation-aucpr:0.94549
... [iterations 1-65 omitted: logloss fell steadily from 0.61997 to 0.18666; auc climbed from 0.953 to about 0.974] ...
[66] validation-logloss:0.18650 validation-auc:0.97398 validation-aucpr:0.97626
[67] validation-logloss:0.18604 validation-auc:0.97403 validation-aucpr:0.97628
[68] validation-logloss:0.18576 validation-auc:0.97395 validation-aucpr:0.97611
[69] validation-logloss:0.18522 validation-auc:0.97402 validation-aucpr:0.97611
[70] validation-logloss:0.18485 validation-auc:0.97410 validation-aucpr:0.97581
[71] validation-logloss:0.18459 validation-auc:0.97412 validation-aucpr:0.97572
[72] validation-logloss:0.18446 validation-auc:0.97403 validation-aucpr:0.97566
[73] validation-logloss:0.18390 validation-auc:0.97421 validation-aucpr:0.97653
[74] validation-logloss:0.18348 validation-auc:0.97426 validation-aucpr:0.97657
[75] validation-logloss:0.18344 validation-auc:0.97425 validation-aucpr:0.97666
[76] validation-logloss:0.18301 validation-auc:0.97434 validation-aucpr:0.97672
[77] validation-logloss:0.18272 validation-auc:0.97429 validation-aucpr:0.97667
[78] validation-logloss:0.18223 validation-auc:0.97447 validation-aucpr:0.97680
[79] validation-logloss:0.18196 validation-auc:0.97450 validation-aucpr:0.97681
[80] validation-logloss:0.18192 validation-auc:0.97449 validation-aucpr:0.97682
[81] validation-logloss:0.18180 validation-auc:0.97444 validation-aucpr:0.97674
[82] validation-logloss:0.18151 validation-auc:0.97460 validation-aucpr:0.97692
[83] validation-logloss:0.18131 validation-auc:0.97468 validation-aucpr:0.97695
[84] validation-logloss:0.18141 validation-auc:0.97464 validation-aucpr:0.97697
[85] validation-logloss:0.18153 validation-auc:0.97464 validation-aucpr:0.97717
[86] validation-logloss:0.18159 validation-auc:0.97462 validation-aucpr:0.97715
[87] validation-logloss:0.18154 validation-auc:0.97455 validation-aucpr:0.97707
[88] validation-logloss:0.18135 validation-auc:0.97457 validation-aucpr:0.97706
{'best_iteration': '85', 'best_score': '0.9771727738107567'}
Trial 4, Fold 2: Log loss = 0.18135254688458874, Average precision = 0.9770647111537439, ROC-AUC = 0.974569954363371, Elapsed Time = 3.2144864999991114 seconds
Trial 4, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 4, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.65257 validation-auc:0.94874 validation-aucpr:0.95522
[1] validation-logloss:0.61594 validation-auc:0.95668 validation-aucpr:0.95192
[2] validation-logloss:0.58267 validation-auc:0.96405 validation-aucpr:0.96123
[3] validation-logloss:0.55282 validation-auc:0.96524 validation-aucpr:0.96189
[4] validation-logloss:0.52565 validation-auc:0.96659 validation-aucpr:0.96425
[5] validation-logloss:0.50078 validation-auc:0.96730 validation-aucpr:0.96627
[6] validation-logloss:0.47832 validation-auc:0.96850 validation-aucpr:0.97182
[7] validation-logloss:0.45804 validation-auc:0.96910 validation-aucpr:0.97369
[8] validation-logloss:0.43904 validation-auc:0.96973 validation-aucpr:0.97414
[9] validation-logloss:0.42171 validation-auc:0.96970 validation-aucpr:0.97278
[10] validation-logloss:0.40580 validation-auc:0.96998 validation-aucpr:0.97426
[11] validation-logloss:0.39125 validation-auc:0.96989 validation-aucpr:0.97399
[12] validation-logloss:0.37749 validation-auc:0.97049 validation-aucpr:0.97433
[13] validation-logloss:0.36449 validation-auc:0.97102 validation-aucpr:0.97474
[14] validation-logloss:0.35295 validation-auc:0.97095 validation-aucpr:0.97468
[15] validation-logloss:0.34236 validation-auc:0.97109 validation-aucpr:0.97499
[16] validation-logloss:0.33249 validation-auc:0.97092 validation-aucpr:0.97462
[17] validation-logloss:0.32348 validation-auc:0.97061 validation-aucpr:0.97440
[18] validation-logloss:0.31490 validation-auc:0.97077 validation-aucpr:0.97490
[19] validation-logloss:0.30675 validation-auc:0.97069 validation-aucpr:0.97487
[20] validation-logloss:0.29915 validation-auc:0.97105 validation-aucpr:0.97505
[21] validation-logloss:0.29200 validation-auc:0.97122 validation-aucpr:0.97514
[22] validation-logloss:0.28552 validation-auc:0.97105 validation-aucpr:0.97505
[23] validation-logloss:0.27946 validation-auc:0.97113 validation-aucpr:0.97503
[24] validation-logloss:0.27368 validation-auc:0.97105 validation-aucpr:0.97495
[25] validation-logloss:0.26799 validation-auc:0.97116 validation-aucpr:0.97502
[26] validation-logloss:0.26261 validation-auc:0.97149 validation-aucpr:0.97528
[27] validation-logloss:0.25793 validation-auc:0.97132 validation-aucpr:0.97507
[28] validation-logloss:0.25362 validation-auc:0.97127 validation-aucpr:0.97507
[29] validation-logloss:0.24973 validation-auc:0.97119 validation-aucpr:0.97495
[30] validation-logloss:0.24571 validation-auc:0.97113 validation-aucpr:0.97491
[31] validation-logloss:0.24218 validation-auc:0.97104 validation-aucpr:0.97489
[32] validation-logloss:0.23851 validation-auc:0.97124 validation-aucpr:0.97508
[33] validation-logloss:0.23542 validation-auc:0.97117 validation-aucpr:0.97408
[34] validation-logloss:0.23267 validation-auc:0.97099 validation-aucpr:0.97393
[35] validation-logloss:0.22966 validation-auc:0.97107 validation-aucpr:0.97396
[36] validation-logloss:0.22675 validation-auc:0.97121 validation-aucpr:0.97409
[37] validation-logloss:0.22434 validation-auc:0.97121 validation-aucpr:0.97399
[38] validation-logloss:0.22197 validation-auc:0.97123 validation-aucpr:0.97396
[39] validation-logloss:0.21961 validation-auc:0.97156 validation-aucpr:0.97434
[40] validation-logloss:0.21773 validation-auc:0.97155 validation-aucpr:0.97431
[41] validation-logloss:0.21536 validation-auc:0.97181 validation-aucpr:0.97441
[42] validation-logloss:0.21351 validation-auc:0.97181 validation-aucpr:0.97440
[43] validation-logloss:0.21176 validation-auc:0.97183 validation-aucpr:0.97444
[44] validation-logloss:0.21005 validation-auc:0.97198 validation-aucpr:0.97550
[45] validation-logloss:0.20869 validation-auc:0.97189 validation-aucpr:0.97495
[46] validation-logloss:0.20768 validation-auc:0.97174 validation-aucpr:0.97472
[47] validation-logloss:0.20620 validation-auc:0.97194 validation-aucpr:0.97516
[48] validation-logloss:0.20471 validation-auc:0.97212 validation-aucpr:0.97525
[49] validation-logloss:0.20322 validation-auc:0.97236 validation-aucpr:0.97558
[50] validation-logloss:0.20205 validation-auc:0.97242 validation-aucpr:0.97590
[51] validation-logloss:0.20088 validation-auc:0.97252 validation-aucpr:0.97598
[52] validation-logloss:0.20008 validation-auc:0.97256 validation-aucpr:0.97624
[53] validation-logloss:0.19906 validation-auc:0.97258 validation-aucpr:0.97624
[54] validation-logloss:0.19863 validation-auc:0.97244 validation-aucpr:0.97612
[55] validation-logloss:0.19790 validation-auc:0.97242 validation-aucpr:0.97608
[56] validation-logloss:0.19694 validation-auc:0.97237 validation-aucpr:0.97610
[57] validation-logloss:0.19640 validation-auc:0.97239 validation-aucpr:0.97608
[58] validation-logloss:0.19572 validation-auc:0.97243 validation-aucpr:0.97611
[59] validation-logloss:0.19525 validation-auc:0.97247 validation-aucpr:0.97612
[60] validation-logloss:0.19455 validation-auc:0.97248 validation-aucpr:0.97614
[61] validation-logloss:0.19392 validation-auc:0.97271 validation-aucpr:0.97622
[62] validation-logloss:0.19347 validation-auc:0.97278 validation-aucpr:0.97608
[63] validation-logloss:0.19313 validation-auc:0.97281 validation-aucpr:0.97609
[64] validation-logloss:0.19293 validation-auc:0.97280 validation-aucpr:0.97624
[65] validation-logloss:0.19259 validation-auc:0.97282 validation-aucpr:0.97607
[66] validation-logloss:0.19227 validation-auc:0.97288 validation-aucpr:0.97603
[67] validation-logloss:0.19212 validation-auc:0.97293 validation-aucpr:0.97615
[68] validation-logloss:0.19177 validation-auc:0.97287 validation-aucpr:0.97607
[69] validation-logloss:0.19117 validation-auc:0.97299 validation-aucpr:0.97613
[70] validation-logloss:0.19101 validation-auc:0.97302 validation-aucpr:0.97632
[71] validation-logloss:0.19080 validation-auc:0.97303 validation-aucpr:0.97634
[72] validation-logloss:0.19057 validation-auc:0.97306 validation-aucpr:0.97638
[73] validation-logloss:0.19022 validation-auc:0.97311 validation-aucpr:0.97637
[74] validation-logloss:0.19005 validation-auc:0.97310 validation-aucpr:0.97631
[75] validation-logloss:0.18975 validation-auc:0.97311 validation-aucpr:0.97641
[76] validation-logloss:0.18942 validation-auc:0.97324 validation-aucpr:0.97659
[77] validation-logloss:0.18960 validation-auc:0.97313 validation-aucpr:0.97647
[78] validation-logloss:0.18948 validation-auc:0.97305 validation-aucpr:0.97640
[79] validation-logloss:0.18917 validation-auc:0.97310 validation-aucpr:0.97641
[80] validation-logloss:0.18942 validation-auc:0.97302 validation-aucpr:0.97634
[81] validation-logloss:0.18926 validation-auc:0.97309 validation-aucpr:0.97637
[82] validation-logloss:0.18941 validation-auc:0.97299 validation-aucpr:0.97619
[83] validation-logloss:0.18964 validation-auc:0.97291 validation-aucpr:0.97616
[84] validation-logloss:0.18960 validation-auc:0.97292 validation-aucpr:0.97622
[85] validation-logloss:0.18950 validation-auc:0.97294 validation-aucpr:0.97617
[86] validation-logloss:0.18940 validation-auc:0.97294 validation-aucpr:0.97610
[87] validation-logloss:0.18945 validation-auc:0.97295 validation-aucpr:0.97600
[88] validation-logloss:0.18922 validation-auc:0.97301 validation-aucpr:0.97602
{'best_iteration': '76', 'best_score': '0.9765877426803053'}
Trial 4, Fold 3: Log loss = 0.18921685039483901, Average precision = 0.9760259892551311, ROC-AUC = 0.9730138397973631, Elapsed Time = 3.222346100001232 seconds
Trial 4, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 4, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.65218 validation-auc:0.93087 validation-aucpr:0.92133
[1] validation-logloss:0.61615 validation-auc:0.95108 validation-aucpr:0.93605
[2] validation-logloss:0.58374 validation-auc:0.96028 validation-aucpr:0.95399
[3] validation-logloss:0.55701 validation-auc:0.96363 validation-aucpr:0.96372
[4] validation-logloss:0.52946 validation-auc:0.96594 validation-aucpr:0.96859
[5] validation-logloss:0.50448 validation-auc:0.96686 validation-aucpr:0.97103
[6] validation-logloss:0.48197 validation-auc:0.96722 validation-aucpr:0.96976
[7] validation-logloss:0.46334 validation-auc:0.96722 validation-aucpr:0.96986
[8] validation-logloss:0.44405 validation-auc:0.96798 validation-aucpr:0.97051
[9] validation-logloss:0.42880 validation-auc:0.96722 validation-aucpr:0.96973
[10] validation-logloss:0.41224 validation-auc:0.96813 validation-aucpr:0.97165
[11] validation-logloss:0.39697 validation-auc:0.96840 validation-aucpr:0.97174
[12] validation-logloss:0.38253 validation-auc:0.96924 validation-aucpr:0.97242
[13] validation-logloss:0.36932 validation-auc:0.96970 validation-aucpr:0.97278
[14] validation-logloss:0.35877 validation-auc:0.96980 validation-aucpr:0.97287
[15] validation-logloss:0.34759 validation-auc:0.96979 validation-aucpr:0.97297
[16] validation-logloss:0.33695 validation-auc:0.97002 validation-aucpr:0.97318
[17] validation-logloss:0.32702 validation-auc:0.97042 validation-aucpr:0.97351
[18] validation-logloss:0.31823 validation-auc:0.97069 validation-aucpr:0.97546
[19] validation-logloss:0.30953 validation-auc:0.97097 validation-aucpr:0.97570
[20] validation-logloss:0.30191 validation-auc:0.97083 validation-aucpr:0.97557
[21] validation-logloss:0.29446 validation-auc:0.97089 validation-aucpr:0.97564
[22] validation-logloss:0.28791 validation-auc:0.97082 validation-aucpr:0.97556
[23] validation-logloss:0.28090 validation-auc:0.97121 validation-aucpr:0.97585
[24] validation-logloss:0.27475 validation-auc:0.97140 validation-aucpr:0.97598
[25] validation-logloss:0.26969 validation-auc:0.97143 validation-aucpr:0.97596
[26] validation-logloss:0.26453 validation-auc:0.97147 validation-aucpr:0.97598
[27] validation-logloss:0.25957 validation-auc:0.97152 validation-aucpr:0.97603
[28] validation-logloss:0.25493 validation-auc:0.97175 validation-aucpr:0.97620
[29] validation-logloss:0.25018 validation-auc:0.97219 validation-aucpr:0.97642
[30] validation-logloss:0.24673 validation-auc:0.97213 validation-aucpr:0.97636
[31] validation-logloss:0.24274 validation-auc:0.97233 validation-aucpr:0.97649
[32] validation-logloss:0.23901 validation-auc:0.97248 validation-aucpr:0.97657
[33] validation-logloss:0.23534 validation-auc:0.97253 validation-aucpr:0.97661
[34] validation-logloss:0.23205 validation-auc:0.97275 validation-aucpr:0.97689
[35] validation-logloss:0.22914 validation-auc:0.97285 validation-aucpr:0.97699
[36] validation-logloss:0.22629 validation-auc:0.97281 validation-aucpr:0.97698
[37] validation-logloss:0.22385 validation-auc:0.97266 validation-aucpr:0.97688
[38] validation-logloss:0.22127 validation-auc:0.97271 validation-aucpr:0.97695
[39] validation-logloss:0.21895 validation-auc:0.97274 validation-aucpr:0.97699
[40] validation-logloss:0.21709 validation-auc:0.97253 validation-aucpr:0.97686
[41] validation-logloss:0.21538 validation-auc:0.97239 validation-aucpr:0.97674
[42] validation-logloss:0.21353 validation-auc:0.97238 validation-aucpr:0.97676
[43] validation-logloss:0.21158 validation-auc:0.97245 validation-aucpr:0.97681
[44] validation-logloss:0.20968 validation-auc:0.97248 validation-aucpr:0.97683
[45] validation-logloss:0.20810 validation-auc:0.97250 validation-aucpr:0.97682
[46] validation-logloss:0.20676 validation-auc:0.97238 validation-aucpr:0.97676
[47] validation-logloss:0.20568 validation-auc:0.97222 validation-aucpr:0.97662
[48] validation-logloss:0.20416 validation-auc:0.97230 validation-aucpr:0.97667
[49] validation-logloss:0.20300 validation-auc:0.97237 validation-aucpr:0.97673
[50] validation-logloss:0.20175 validation-auc:0.97250 validation-aucpr:0.97681
[51] validation-logloss:0.20079 validation-auc:0.97246 validation-aucpr:0.97675
[52] validation-logloss:0.20006 validation-auc:0.97233 validation-aucpr:0.97663
[53] validation-logloss:0.19926 validation-auc:0.97233 validation-aucpr:0.97663
[54] validation-logloss:0.19870 validation-auc:0.97217 validation-aucpr:0.97651
[55] validation-logloss:0.19805 validation-auc:0.97209 validation-aucpr:0.97642
[56] validation-logloss:0.19706 validation-auc:0.97216 validation-aucpr:0.97646
[57] validation-logloss:0.19626 validation-auc:0.97214 validation-aucpr:0.97645
[58] validation-logloss:0.19579 validation-auc:0.97204 validation-aucpr:0.97640
[59] validation-logloss:0.19544 validation-auc:0.97190 validation-aucpr:0.97628
[60] validation-logloss:0.19504 validation-auc:0.97178 validation-aucpr:0.97619
[61] validation-logloss:0.19453 validation-auc:0.97181 validation-aucpr:0.97621
[62] validation-logloss:0.19392 validation-auc:0.97189 validation-aucpr:0.97626
[63] validation-logloss:0.19362 validation-auc:0.97180 validation-aucpr:0.97623
[64] validation-logloss:0.19309 validation-auc:0.97193 validation-aucpr:0.97632
[65] validation-logloss:0.19237 validation-auc:0.97208 validation-aucpr:0.97643
[66] validation-logloss:0.19205 validation-auc:0.97215 validation-aucpr:0.97644
[67] validation-logloss:0.19177 validation-auc:0.97219 validation-aucpr:0.97647
[68] validation-logloss:0.19152 validation-auc:0.97207 validation-aucpr:0.97638
[69] validation-logloss:0.19132 validation-auc:0.97200 validation-aucpr:0.97633
[70] validation-logloss:0.19122 validation-auc:0.97195 validation-aucpr:0.97631
[71] validation-logloss:0.19092 validation-auc:0.97192 validation-aucpr:0.97629
[72] validation-logloss:0.19062 validation-auc:0.97196 validation-aucpr:0.97632
[73] validation-logloss:0.19051 validation-auc:0.97194 validation-aucpr:0.97631
[74] validation-logloss:0.19048 validation-auc:0.97181 validation-aucpr:0.97621
[75] validation-logloss:0.19040 validation-auc:0.97170 validation-aucpr:0.97613
[76] validation-logloss:0.19033 validation-auc:0.97171 validation-aucpr:0.97611
[77] validation-logloss:0.19027 validation-auc:0.97176 validation-aucpr:0.97615
[78] validation-logloss:0.19046 validation-auc:0.97163 validation-aucpr:0.97606
[79] validation-logloss:0.19064 validation-auc:0.97155 validation-aucpr:0.97598
[80] validation-logloss:0.19064 validation-auc:0.97155 validation-aucpr:0.97596
[81] validation-logloss:0.19054 validation-auc:0.97157 validation-aucpr:0.97598
[82] validation-logloss:0.19054 validation-auc:0.97148 validation-aucpr:0.97592
[83] validation-logloss:0.19041 validation-auc:0.97157 validation-aucpr:0.97598
[84] validation-logloss:0.19043 validation-auc:0.97158 validation-aucpr:0.97599
[85] validation-logloss:0.19021 validation-auc:0.97170 validation-aucpr:0.97608
[86] validation-logloss:0.19001 validation-auc:0.97179 validation-aucpr:0.97614
[87] validation-logloss:0.18979 validation-auc:0.97182 validation-aucpr:0.97619
[88] validation-logloss:0.18988 validation-auc:0.97183 validation-aucpr:0.97619
{'best_iteration': '39', 'best_score': '0.9769865175482205'}
Trial 4, Fold 4: Log loss = 0.1898793489588116, Average precision = 0.9761969319498418, ROC-AUC = 0.9718313280447124, Elapsed Time = 3.1994386999995186 seconds
Trial 4, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 4, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.65278 validation-auc:0.93567 validation-aucpr:0.93547
[1] validation-logloss:0.62061 validation-auc:0.94959 validation-aucpr:0.92957
[2] validation-logloss:0.58765 validation-auc:0.95713 validation-aucpr:0.94956
[3] validation-logloss:0.55951 validation-auc:0.96044 validation-aucpr:0.96057
[4] validation-logloss:0.53240 validation-auc:0.96270 validation-aucpr:0.96259
[5] validation-logloss:0.50762 validation-auc:0.96477 validation-aucpr:0.96957
[6] validation-logloss:0.48488 validation-auc:0.96603 validation-aucpr:0.97029
[7] validation-logloss:0.46434 validation-auc:0.96644 validation-aucpr:0.97096
[8] validation-logloss:0.44581 validation-auc:0.96622 validation-aucpr:0.97089
[9] validation-logloss:0.42847 validation-auc:0.96638 validation-aucpr:0.97110
[10] validation-logloss:0.41269 validation-auc:0.96628 validation-aucpr:0.97105
[11] validation-logloss:0.39811 validation-auc:0.96664 validation-aucpr:0.97119
[12] validation-logloss:0.38447 validation-auc:0.96702 validation-aucpr:0.97145
[13] validation-logloss:0.37203 validation-auc:0.96718 validation-aucpr:0.97154
[14] validation-logloss:0.35984 validation-auc:0.96761 validation-aucpr:0.97146
[15] validation-logloss:0.34851 validation-auc:0.96822 validation-aucpr:0.97228
[16] validation-logloss:0.33808 validation-auc:0.96879 validation-aucpr:0.97307
[17] validation-logloss:0.32890 validation-auc:0.96890 validation-aucpr:0.97314
[18] validation-logloss:0.32097 validation-auc:0.96865 validation-aucpr:0.97290
[19] validation-logloss:0.31286 validation-auc:0.96905 validation-aucpr:0.97307
[20] validation-logloss:0.30531 validation-auc:0.96916 validation-aucpr:0.97320
[21] validation-logloss:0.29899 validation-auc:0.96920 validation-aucpr:0.97338
[22] validation-logloss:0.29218 validation-auc:0.96935 validation-aucpr:0.97344
[23] validation-logloss:0.28713 validation-auc:0.96918 validation-aucpr:0.97322
[24] validation-logloss:0.28154 validation-auc:0.96919 validation-aucpr:0.97322
[25] validation-logloss:0.27590 validation-auc:0.96930 validation-aucpr:0.97327
[26] validation-logloss:0.27055 validation-auc:0.96942 validation-aucpr:0.97344
[27] validation-logloss:0.26589 validation-auc:0.96929 validation-aucpr:0.97329
[28] validation-logloss:0.26135 validation-auc:0.96930 validation-aucpr:0.97329
[29] validation-logloss:0.25704 validation-auc:0.96963 validation-aucpr:0.97376
[30] validation-logloss:0.25315 validation-auc:0.96962 validation-aucpr:0.97368
[31] validation-logloss:0.24938 validation-auc:0.96977 validation-aucpr:0.97384
[32] validation-logloss:0.24572 validation-auc:0.96986 validation-aucpr:0.97389
[33] validation-logloss:0.24250 validation-auc:0.96993 validation-aucpr:0.97397
[34] validation-logloss:0.23956 validation-auc:0.97011 validation-aucpr:0.97426
[35] validation-logloss:0.23685 validation-auc:0.97014 validation-aucpr:0.97426
[36] validation-logloss:0.23399 validation-auc:0.97034 validation-aucpr:0.97441
[37] validation-logloss:0.23188 validation-auc:0.97010 validation-aucpr:0.97420
[38] validation-logloss:0.22930 validation-auc:0.97022 validation-aucpr:0.97431
[39] validation-logloss:0.22710 validation-auc:0.97034 validation-aucpr:0.97437
[40] validation-logloss:0.22502 validation-auc:0.97028 validation-aucpr:0.97432
[41] validation-logloss:0.22324 validation-auc:0.97023 validation-aucpr:0.97428
[42] validation-logloss:0.22138 validation-auc:0.97032 validation-aucpr:0.97435
[43] validation-logloss:0.21987 validation-auc:0.97022 validation-aucpr:0.97427
[44] validation-logloss:0.21803 validation-auc:0.97038 validation-aucpr:0.97434
[45] validation-logloss:0.21674 validation-auc:0.97037 validation-aucpr:0.97431
[46] validation-logloss:0.21561 validation-auc:0.97029 validation-aucpr:0.97424
[47] validation-logloss:0.21408 validation-auc:0.97039 validation-aucpr:0.97432
[48] validation-logloss:0.21273 validation-auc:0.97051 validation-aucpr:0.97439
[49] validation-logloss:0.21134 validation-auc:0.97055 validation-aucpr:0.97451
[50] validation-logloss:0.21020 validation-auc:0.97060 validation-aucpr:0.97457
[51] validation-logloss:0.20920 validation-auc:0.97064 validation-aucpr:0.97459
[52] validation-logloss:0.20822 validation-auc:0.97069 validation-aucpr:0.97463
[53] validation-logloss:0.20716 validation-auc:0.97085 validation-aucpr:0.97484
[54] validation-logloss:0.20613 validation-auc:0.97100 validation-aucpr:0.97490
[55] validation-logloss:0.20481 validation-auc:0.97128 validation-aucpr:0.97510
[56] validation-logloss:0.20399 validation-auc:0.97136 validation-aucpr:0.97515
[57] validation-logloss:0.20360 validation-auc:0.97126 validation-aucpr:0.97499
[58] validation-logloss:0.20262 validation-auc:0.97145 validation-aucpr:0.97512
[59] validation-logloss:0.20217 validation-auc:0.97145 validation-aucpr:0.97511
[60] validation-logloss:0.20139 validation-auc:0.97158 validation-aucpr:0.97518
[61] validation-logloss:0.20084 validation-auc:0.97151 validation-aucpr:0.97510
[62] validation-logloss:0.20007 validation-auc:0.97168 validation-aucpr:0.97523
[63] validation-logloss:0.19965 validation-auc:0.97175 validation-aucpr:0.97521
[64] validation-logloss:0.19924 validation-auc:0.97177 validation-aucpr:0.97519
[65] validation-logloss:0.19871 validation-auc:0.97188 validation-aucpr:0.97524
[66] validation-logloss:0.19829 validation-auc:0.97193 validation-aucpr:0.97528
[67] validation-logloss:0.19776 validation-auc:0.97197 validation-aucpr:0.97531
[68] validation-logloss:0.19735 validation-auc:0.97202 validation-aucpr:0.97536
[69] validation-logloss:0.19713 validation-auc:0.97198 validation-aucpr:0.97532
[70] validation-logloss:0.19700 validation-auc:0.97195 validation-aucpr:0.97527
[71] validation-logloss:0.19694 validation-auc:0.97187 validation-aucpr:0.97509
[72] validation-logloss:0.19695 validation-auc:0.97180 validation-aucpr:0.97499
[73] validation-logloss:0.19664 validation-auc:0.97186 validation-aucpr:0.97498
[74] validation-logloss:0.19684 validation-auc:0.97177 validation-aucpr:0.97479
[75] validation-logloss:0.19669 validation-auc:0.97175 validation-aucpr:0.97476
[76] validation-logloss:0.19639 validation-auc:0.97184 validation-aucpr:0.97506
[77] validation-logloss:0.19636 validation-auc:0.97179 validation-aucpr:0.97495
[78] validation-logloss:0.19651 validation-auc:0.97173 validation-aucpr:0.97495
[79] validation-logloss:0.19639 validation-auc:0.97179 validation-aucpr:0.97522
[80] validation-logloss:0.19611 validation-auc:0.97185 validation-aucpr:0.97527
[81] validation-logloss:0.19611 validation-auc:0.97189 validation-aucpr:0.97540
[82] validation-logloss:0.19577 validation-auc:0.97196 validation-aucpr:0.97546
[83] validation-logloss:0.19592 validation-auc:0.97180 validation-aucpr:0.97533
[84] validation-logloss:0.19609 validation-auc:0.97172 validation-aucpr:0.97526
[85] validation-logloss:0.19604 validation-auc:0.97172 validation-aucpr:0.97521
[86] validation-logloss:0.19610 validation-auc:0.97166 validation-aucpr:0.97521
[87] validation-logloss:0.19595 validation-auc:0.97169 validation-aucpr:0.97523
[88] validation-logloss:0.19613 validation-auc:0.97160 validation-aucpr:0.97525
{'best_iteration': '82', 'best_score': '0.9754620734892333'}
Trial 4, Fold 5: Log loss = 0.19613147834100697, Average precision = 0.9752497992069998, ROC-AUC = 0.9716035895435038, Elapsed Time = 3.3549910000001546 seconds
Optimization Progress: 5%|5 | 5/100 [02:47<42:36, 26.91s/it]
Trial 5, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 5, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.68411 validation-auc:0.92495 validation-aucpr:0.92667
[1] validation-logloss:0.67398 validation-auc:0.95069 validation-aucpr:0.95639
[2] validation-logloss:0.66382 validation-auc:0.95533 validation-aucpr:0.96185
[3] validation-logloss:0.65600 validation-auc:0.95352 validation-aucpr:0.95998
[4] validation-logloss:0.64691 validation-auc:0.95499 validation-aucpr:0.96191
[5] validation-logloss:0.63940 validation-auc:0.95475 validation-aucpr:0.96156
[6] validation-logloss:0.63110 validation-auc:0.95566 validation-aucpr:0.96202
[7] validation-logloss:0.62281 validation-auc:0.95646 validation-aucpr:0.96287
[8] validation-logloss:0.61529 validation-auc:0.95652 validation-aucpr:0.96296
[9] validation-logloss:0.60838 validation-auc:0.95570 validation-aucpr:0.96218
[10] validation-logloss:0.60150 validation-auc:0.95552 validation-aucpr:0.96184
[11] validation-logloss:0.59507 validation-auc:0.95605 validation-aucpr:0.96273
[12] validation-logloss:0.58692 validation-auc:0.95727 validation-aucpr:0.96403
[13] validation-logloss:0.57943 validation-auc:0.95806 validation-aucpr:0.96461
[14] validation-logloss:0.57197 validation-auc:0.95849 validation-aucpr:0.96513
[15] validation-logloss:0.56615 validation-auc:0.95846 validation-aucpr:0.96538
[16] validation-logloss:0.55961 validation-auc:0.95860 validation-aucpr:0.96549
[17] validation-logloss:0.55281 validation-auc:0.95902 validation-aucpr:0.96586
{'best_iteration': '17', 'best_score': '0.9658596269830234'}
Trial 5, Fold 1: Log loss = 0.552814188535648, Average precision = 0.9658570155340371, ROC-AUC = 0.9590218929085236, Elapsed Time = 2.8525215000008757 seconds
Trial 5, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 5, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.68338 validation-auc:0.93597 validation-aucpr:0.93907
[1] validation-logloss:0.67327 validation-auc:0.95048 validation-aucpr:0.95611
[2] validation-logloss:0.66397 validation-auc:0.95365 validation-aucpr:0.96019
[3] validation-logloss:0.65405 validation-auc:0.95823 validation-aucpr:0.96382
[4] validation-logloss:0.64544 validation-auc:0.95959 validation-aucpr:0.96459
[5] validation-logloss:0.63633 validation-auc:0.96052 validation-aucpr:0.96537
[6] validation-logloss:0.62716 validation-auc:0.96203 validation-aucpr:0.96648
[7] validation-logloss:0.61860 validation-auc:0.96195 validation-aucpr:0.96676
[8] validation-logloss:0.61064 validation-auc:0.96220 validation-aucpr:0.96676
[9] validation-logloss:0.60334 validation-auc:0.96178 validation-aucpr:0.96629
[10] validation-logloss:0.59626 validation-auc:0.96172 validation-aucpr:0.96615
[11] validation-logloss:0.58932 validation-auc:0.96150 validation-aucpr:0.96585
[12] validation-logloss:0.58274 validation-auc:0.96117 validation-aucpr:0.96546
[13] validation-logloss:0.57548 validation-auc:0.96186 validation-aucpr:0.96596
[14] validation-logloss:0.56897 validation-auc:0.96226 validation-aucpr:0.96625
[15] validation-logloss:0.56278 validation-auc:0.96213 validation-aucpr:0.96607
[16] validation-logloss:0.55681 validation-auc:0.96160 validation-aucpr:0.96556
[17] validation-logloss:0.55075 validation-auc:0.96167 validation-aucpr:0.96557
{'best_iteration': '8', 'best_score': '0.9667615147131858'}
Trial 5, Fold 2: Log loss = 0.5507514087219968, Average precision = 0.9655134610344999, ROC-AUC = 0.9616703303640338, Elapsed Time = 3.293433200000436 seconds
Trial 5, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 5, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.68452 validation-auc:0.91430 validation-aucpr:0.89637
[1] validation-logloss:0.67552 validation-auc:0.93844 validation-aucpr:0.94050
[2] validation-logloss:0.66523 validation-auc:0.95468 validation-aucpr:0.95576
[3] validation-logloss:0.65679 validation-auc:0.95555 validation-aucpr:0.95943
[4] validation-logloss:0.64875 validation-auc:0.95470 validation-aucpr:0.95925
[5] validation-logloss:0.63933 validation-auc:0.95731 validation-aucpr:0.96235
[6] validation-logloss:0.63020 validation-auc:0.95787 validation-aucpr:0.96329
[7] validation-logloss:0.62145 validation-auc:0.95855 validation-aucpr:0.96401
[8] validation-logloss:0.61409 validation-auc:0.95906 validation-aucpr:0.96401
[9] validation-logloss:0.60568 validation-auc:0.95958 validation-aucpr:0.96450
[10] validation-logloss:0.59785 validation-auc:0.95937 validation-aucpr:0.96426
[11] validation-logloss:0.59003 validation-auc:0.95997 validation-aucpr:0.96469
[12] validation-logloss:0.58349 validation-auc:0.95958 validation-aucpr:0.96488
[13] validation-logloss:0.57626 validation-auc:0.96036 validation-aucpr:0.96628
[14] validation-logloss:0.56964 validation-auc:0.96067 validation-aucpr:0.96649
[15] validation-logloss:0.56273 validation-auc:0.96066 validation-aucpr:0.96657
[16] validation-logloss:0.55549 validation-auc:0.96143 validation-aucpr:0.96721
[17] validation-logloss:0.54823 validation-auc:0.96246 validation-aucpr:0.96796
{'best_iteration': '17', 'best_score': '0.9679629925673398'}
Trial 5, Fold 3: Log loss = 0.5482264868254652, Average precision = 0.967963204762582, ROC-AUC = 0.9624570967485625, Elapsed Time = 3.393403899999612 seconds
Trial 5, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 5, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.68407 validation-auc:0.92159 validation-aucpr:0.89391
[1] validation-logloss:0.67386 validation-auc:0.95333 validation-aucpr:0.95828
[2] validation-logloss:0.66508 validation-auc:0.95452 validation-aucpr:0.96148
[3] validation-logloss:0.65704 validation-auc:0.95389 validation-aucpr:0.96080
[4] validation-logloss:0.64866 validation-auc:0.95435 validation-aucpr:0.96096
[5] validation-logloss:0.63980 validation-auc:0.95546 validation-aucpr:0.96243
[6] validation-logloss:0.63050 validation-auc:0.95681 validation-aucpr:0.96371
[7] validation-logloss:0.62330 validation-auc:0.95603 validation-aucpr:0.96301
[8] validation-logloss:0.61628 validation-auc:0.95542 validation-aucpr:0.96259
[9] validation-logloss:0.60927 validation-auc:0.95504 validation-aucpr:0.96231
[10] validation-logloss:0.60115 validation-auc:0.95644 validation-aucpr:0.96353
[11] validation-logloss:0.59427 validation-auc:0.95678 validation-aucpr:0.96361
[12] validation-logloss:0.58693 validation-auc:0.95685 validation-aucpr:0.96361
[13] validation-logloss:0.57984 validation-auc:0.95708 validation-aucpr:0.96395
[14] validation-logloss:0.57315 validation-auc:0.95754 validation-aucpr:0.96420
[15] validation-logloss:0.56671 validation-auc:0.95756 validation-aucpr:0.96425
[16] validation-logloss:0.55984 validation-auc:0.95781 validation-aucpr:0.96465
[17] validation-logloss:0.55305 validation-auc:0.95809 validation-aucpr:0.96504
{'best_iteration': '17', 'best_score': '0.9650388062047341'}
Trial 5, Fold 4: Log loss = 0.5530506863842228, Average precision = 0.965038797513921, ROC-AUC = 0.9580878136393007, Elapsed Time = 3.3468467999991844 seconds
Trial 5, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 5, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.68377 validation-auc:0.92746 validation-aucpr:0.92197
[1] validation-logloss:0.67479 validation-auc:0.94111 validation-aucpr:0.94618
[2] validation-logloss:0.66475 validation-auc:0.95406 validation-aucpr:0.95977
[3] validation-logloss:0.65642 validation-auc:0.95316 validation-aucpr:0.95912
[4] validation-logloss:0.64750 validation-auc:0.95445 validation-aucpr:0.96017
[5] validation-logloss:0.63949 validation-auc:0.95412 validation-aucpr:0.96001
[6] validation-logloss:0.63138 validation-auc:0.95485 validation-aucpr:0.96028
[7] validation-logloss:0.62322 validation-auc:0.95570 validation-aucpr:0.96099
[8] validation-logloss:0.61643 validation-auc:0.95518 validation-aucpr:0.96037
[9] validation-logloss:0.60766 validation-auc:0.95737 validation-aucpr:0.96278
[10] validation-logloss:0.59931 validation-auc:0.95828 validation-aucpr:0.96403
[11] validation-logloss:0.59182 validation-auc:0.95888 validation-aucpr:0.96353
[12] validation-logloss:0.58400 validation-auc:0.95903 validation-aucpr:0.96385
[13] validation-logloss:0.57641 validation-auc:0.95936 validation-aucpr:0.96457
[14] validation-logloss:0.57065 validation-auc:0.95904 validation-aucpr:0.96416
[15] validation-logloss:0.56348 validation-auc:0.95951 validation-aucpr:0.96463
[16] validation-logloss:0.55800 validation-auc:0.95910 validation-aucpr:0.96442
[17] validation-logloss:0.55198 validation-auc:0.95917 validation-aucpr:0.96378
{'best_iteration': '15', 'best_score': '0.9646284351019664'}
Trial 5, Fold 5: Log loss = 0.551975421982808, Average precision = 0.9641766244099718, ROC-AUC = 0.9591747022648309, Elapsed Time = 3.3459750000001804 seconds
Optimization Progress: 6%|6 | 6/100 [03:11<41:00, 26.18s/it]
Trial 6, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 6, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.66868 validation-auc:0.92656 validation-aucpr:0.89275
[1] validation-logloss:0.64642 validation-auc:0.94301 validation-aucpr:0.93566
[2] validation-logloss:0.62475 validation-auc:0.95046 validation-aucpr:0.95092
[3] validation-logloss:0.60542 validation-auc:0.95212 validation-aucpr:0.95692
[4] validation-logloss:0.58687 validation-auc:0.95403 validation-aucpr:0.95893
[5] validation-logloss:0.56954 validation-auc:0.95452 validation-aucpr:0.95945
[6] validation-logloss:0.54956 validation-auc:0.96044 validation-aucpr:0.96491
[7] validation-logloss:0.53461 validation-auc:0.96030 validation-aucpr:0.96489
[8] validation-logloss:0.51724 validation-auc:0.96261 validation-aucpr:0.96724
[9] validation-logloss:0.50330 validation-auc:0.96294 validation-aucpr:0.96786
[10] validation-logloss:0.49033 validation-auc:0.96300 validation-aucpr:0.96842
[11] validation-logloss:0.47571 validation-auc:0.96388 validation-aucpr:0.96945
[12] validation-logloss:0.46129 validation-auc:0.96477 validation-aucpr:0.97030
[13] validation-logloss:0.45096 validation-auc:0.96456 validation-aucpr:0.97006
[14] validation-logloss:0.43848 validation-auc:0.96512 validation-aucpr:0.97062
[15] validation-logloss:0.42864 validation-auc:0.96520 validation-aucpr:0.97064
[16] validation-logloss:0.41941 validation-auc:0.96511 validation-aucpr:0.97053
[17] validation-logloss:0.40882 validation-auc:0.96548 validation-aucpr:0.97091
[18] validation-logloss:0.39866 validation-auc:0.96566 validation-aucpr:0.97107
[19] validation-logloss:0.39133 validation-auc:0.96546 validation-aucpr:0.97088
[20] validation-logloss:0.38188 validation-auc:0.96588 validation-aucpr:0.97126
[21] validation-logloss:0.37457 validation-auc:0.96611 validation-aucpr:0.97142
[22] validation-logloss:0.36786 validation-auc:0.96617 validation-aucpr:0.97146
[23] validation-logloss:0.36001 validation-auc:0.96653 validation-aucpr:0.97178
[24] validation-logloss:0.35400 validation-auc:0.96645 validation-aucpr:0.97169
[25] validation-logloss:0.34681 validation-auc:0.96670 validation-aucpr:0.97189
[26] validation-logloss:0.34125 validation-auc:0.96685 validation-aucpr:0.97196
[27] validation-logloss:0.33596 validation-auc:0.96682 validation-aucpr:0.97192
[28] validation-logloss:0.33077 validation-auc:0.96693 validation-aucpr:0.97197
[29] validation-logloss:0.32597 validation-auc:0.96695 validation-aucpr:0.97199
[30] validation-logloss:0.31973 validation-auc:0.96723 validation-aucpr:0.97224
[31] validation-logloss:0.31548 validation-auc:0.96742 validation-aucpr:0.97236
[32] validation-logloss:0.30959 validation-auc:0.96778 validation-aucpr:0.97266
[33] validation-logloss:0.30441 validation-auc:0.96795 validation-aucpr:0.97280
[34] validation-logloss:0.30063 validation-auc:0.96797 validation-aucpr:0.97283
[35] validation-logloss:0.29591 validation-auc:0.96806 validation-aucpr:0.97292
[36] validation-logloss:0.29211 validation-auc:0.96818 validation-aucpr:0.97298
[37] validation-logloss:0.28771 validation-auc:0.96837 validation-aucpr:0.97312
[38] validation-logloss:0.28446 validation-auc:0.96833 validation-aucpr:0.97308
[39] validation-logloss:0.28170 validation-auc:0.96830 validation-aucpr:0.97300
[40] validation-logloss:0.27742 validation-auc:0.96855 validation-aucpr:0.97324
[41] validation-logloss:0.27466 validation-auc:0.96849 validation-aucpr:0.97315
[42] validation-logloss:0.27089 validation-auc:0.96868 validation-aucpr:0.97330
[43] validation-logloss:0.26827 validation-auc:0.96872 validation-aucpr:0.97333
[44] validation-logloss:0.26586 validation-auc:0.96876 validation-aucpr:0.97331
[45] validation-logloss:0.26238 validation-auc:0.96897 validation-aucpr:0.97359
[46] validation-logloss:0.25999 validation-auc:0.96908 validation-aucpr:0.97365
[47] validation-logloss:0.25790 validation-auc:0.96897 validation-aucpr:0.97357
[48] validation-logloss:0.25562 validation-auc:0.96914 validation-aucpr:0.97370
[49] validation-logloss:0.25375 validation-auc:0.96907 validation-aucpr:0.97360
[50] validation-logloss:0.25073 validation-auc:0.96929 validation-aucpr:0.97382
[51] validation-logloss:0.24888 validation-auc:0.96934 validation-aucpr:0.97386
{'best_iteration': '51', 'best_score': '0.9738558070482932'}
Trial 6, Fold 1: Log loss = 0.24888475472946808, Average precision = 0.9738601882875679, ROC-AUC = 0.9693418475741195, Elapsed Time = 1.6845219000006182 seconds
Trial 6, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 6, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.66788 validation-auc:0.93467 validation-aucpr:0.91059
[1] validation-logloss:0.64453 validation-auc:0.95338 validation-aucpr:0.95073
[2] validation-logloss:0.62375 validation-auc:0.95511 validation-aucpr:0.95729
[3] validation-logloss:0.60063 validation-auc:0.96283 validation-aucpr:0.96462
[4] validation-logloss:0.57891 validation-auc:0.96523 validation-aucpr:0.96826
[5] validation-logloss:0.56156 validation-auc:0.96566 validation-aucpr:0.96894
[6] validation-logloss:0.54520 validation-auc:0.96588 validation-aucpr:0.96890
[7] validation-logloss:0.52685 validation-auc:0.96672 validation-aucpr:0.96999
[8] validation-logloss:0.51011 validation-auc:0.96704 validation-aucpr:0.97042
[9] validation-logloss:0.49444 validation-auc:0.96748 validation-aucpr:0.97081
[10] validation-logloss:0.47964 validation-auc:0.96765 validation-aucpr:0.97100
[11] validation-logloss:0.46581 validation-auc:0.96809 validation-aucpr:0.97139
[12] validation-logloss:0.45496 validation-auc:0.96793 validation-aucpr:0.97124
[13] validation-logloss:0.44395 validation-auc:0.96799 validation-aucpr:0.97125
[14] validation-logloss:0.43378 validation-auc:0.96798 validation-aucpr:0.97133
[15] validation-logloss:0.42528 validation-auc:0.96748 validation-aucpr:0.97077
[16] validation-logloss:0.41465 validation-auc:0.96762 validation-aucpr:0.97090
[17] validation-logloss:0.40565 validation-auc:0.96757 validation-aucpr:0.97028
[18] validation-logloss:0.39571 validation-auc:0.96767 validation-aucpr:0.97044
[19] validation-logloss:0.38781 validation-auc:0.96774 validation-aucpr:0.97046
[20] validation-logloss:0.38028 validation-auc:0.96781 validation-aucpr:0.97050
[21] validation-logloss:0.37130 validation-auc:0.96824 validation-aucpr:0.97085
[22] validation-logloss:0.36476 validation-auc:0.96815 validation-aucpr:0.97078
[23] validation-logloss:0.35668 validation-auc:0.96842 validation-aucpr:0.97101
[24] validation-logloss:0.35027 validation-auc:0.96858 validation-aucpr:0.97114
[25] validation-logloss:0.34288 validation-auc:0.96874 validation-aucpr:0.97130
[26] validation-logloss:0.33586 validation-auc:0.96892 validation-aucpr:0.97144
[27] validation-logloss:0.33056 validation-auc:0.96895 validation-aucpr:0.97150
[28] validation-logloss:0.32402 validation-auc:0.96926 validation-aucpr:0.97168
[29] validation-logloss:0.31906 validation-auc:0.96920 validation-aucpr:0.97165
[30] validation-logloss:0.31305 validation-auc:0.96948 validation-aucpr:0.97205
[31] validation-logloss:0.30734 validation-auc:0.96974 validation-aucpr:0.97234
[32] validation-logloss:0.30292 validation-auc:0.96984 validation-aucpr:0.97240
[33] validation-logloss:0.29776 validation-auc:0.96992 validation-aucpr:0.97251
[34] validation-logloss:0.29272 validation-auc:0.96999 validation-aucpr:0.97261
[35] validation-logloss:0.28898 validation-auc:0.97005 validation-aucpr:0.97264
[36] validation-logloss:0.28518 validation-auc:0.97010 validation-aucpr:0.97267
[37] validation-logloss:0.28164 validation-auc:0.97020 validation-aucpr:0.97269
[38] validation-logloss:0.27842 validation-auc:0.97032 validation-aucpr:0.97276
[39] validation-logloss:0.27516 validation-auc:0.97038 validation-aucpr:0.97280
[40] validation-logloss:0.27111 validation-auc:0.97049 validation-aucpr:0.97291
[41] validation-logloss:0.26738 validation-auc:0.97060 validation-aucpr:0.97299
[42] validation-logloss:0.26459 validation-auc:0.97062 validation-aucpr:0.97301
[43] validation-logloss:0.26107 validation-auc:0.97082 validation-aucpr:0.97323
[44] validation-logloss:0.25834 validation-auc:0.97089 validation-aucpr:0.97393
[45] validation-logloss:0.25590 validation-auc:0.97095 validation-aucpr:0.97396
[46] validation-logloss:0.25344 validation-auc:0.97103 validation-aucpr:0.97399
[47] validation-logloss:0.25101 validation-auc:0.97109 validation-aucpr:0.97403
[48] validation-logloss:0.24893 validation-auc:0.97107 validation-aucpr:0.97399
[49] validation-logloss:0.24667 validation-auc:0.97108 validation-aucpr:0.97400
[50] validation-logloss:0.24461 validation-auc:0.97118 validation-aucpr:0.97406
[51] validation-logloss:0.24163 validation-auc:0.97143 validation-aucpr:0.97427
{'best_iteration': '51', 'best_score': '0.9742744916003644'}
Trial 6, Fold 2: Log loss = 0.2416254566667652, Average precision = 0.974289737756206, ROC-AUC = 0.9714251332573178, Elapsed Time = 1.8510905000002822 seconds
Trial 6, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 6, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.66832 validation-auc:0.92346 validation-aucpr:0.88843
[1] validation-logloss:0.64380 validation-auc:0.95272 validation-aucpr:0.95000
[2] validation-logloss:0.62157 validation-auc:0.95911 validation-aucpr:0.96307
[3] validation-logloss:0.59860 validation-auc:0.96419 validation-aucpr:0.96711
[4] validation-logloss:0.57704 validation-auc:0.96581 validation-aucpr:0.96937
[5] validation-logloss:0.55930 validation-auc:0.96675 validation-aucpr:0.97002
[6] validation-logloss:0.54293 validation-auc:0.96719 validation-aucpr:0.97030
[7] validation-logloss:0.52767 validation-auc:0.96737 validation-aucpr:0.96963
[8] validation-logloss:0.51299 validation-auc:0.96722 validation-aucpr:0.96950
[9] validation-logloss:0.50055 validation-auc:0.96708 validation-aucpr:0.96921
[10] validation-logloss:0.48829 validation-auc:0.96690 validation-aucpr:0.96913
[11] validation-logloss:0.47310 validation-auc:0.96760 validation-aucpr:0.97226
[12] validation-logloss:0.45907 validation-auc:0.96797 validation-aucpr:0.97263
[13] validation-logloss:0.44667 validation-auc:0.96809 validation-aucpr:0.97279
[14] validation-logloss:0.43562 validation-auc:0.96826 validation-aucpr:0.97294
[15] validation-logloss:0.42385 validation-auc:0.96841 validation-aucpr:0.97310
[16] validation-logloss:0.41436 validation-auc:0.96874 validation-aucpr:0.97338
[17] validation-logloss:0.40374 validation-auc:0.96897 validation-aucpr:0.97358
[18] validation-logloss:0.39549 validation-auc:0.96893 validation-aucpr:0.97354
[19] validation-logloss:0.38716 validation-auc:0.96896 validation-aucpr:0.97357
[20] validation-logloss:0.37922 validation-auc:0.96919 validation-aucpr:0.97372
[21] validation-logloss:0.37037 validation-auc:0.96939 validation-aucpr:0.97387
[22] validation-logloss:0.36176 validation-auc:0.96949 validation-aucpr:0.97397
[23] validation-logloss:0.35367 validation-auc:0.96976 validation-aucpr:0.97421
[24] validation-logloss:0.34586 validation-auc:0.96988 validation-aucpr:0.97433
[25] validation-logloss:0.33841 validation-auc:0.97016 validation-aucpr:0.97457
[26] validation-logloss:0.33269 validation-auc:0.97030 validation-aucpr:0.97463
[27] validation-logloss:0.32732 validation-auc:0.97022 validation-aucpr:0.97457
[28] validation-logloss:0.32193 validation-auc:0.97045 validation-aucpr:0.97475
[29] validation-logloss:0.31622 validation-auc:0.97057 validation-aucpr:0.97485
[30] validation-logloss:0.31038 validation-auc:0.97082 validation-aucpr:0.97507
[31] validation-logloss:0.30544 validation-auc:0.97110 validation-aucpr:0.97523
[32] validation-logloss:0.30030 validation-auc:0.97105 validation-aucpr:0.97521
[33] validation-logloss:0.29628 validation-auc:0.97101 validation-aucpr:0.97515
[34] validation-logloss:0.29245 validation-auc:0.97107 validation-aucpr:0.97520
[35] validation-logloss:0.28883 validation-auc:0.97104 validation-aucpr:0.97519
[36] validation-logloss:0.28389 validation-auc:0.97127 validation-aucpr:0.97537
[37] validation-logloss:0.28063 validation-auc:0.97119 validation-aucpr:0.97530
[38] validation-logloss:0.27750 validation-auc:0.97116 validation-aucpr:0.97526
[39] validation-logloss:0.27325 validation-auc:0.97123 validation-aucpr:0.97533
[40] validation-logloss:0.27030 validation-auc:0.97116 validation-aucpr:0.97527
[41] validation-logloss:0.26646 validation-auc:0.97130 validation-aucpr:0.97537
[42] validation-logloss:0.26342 validation-auc:0.97149 validation-aucpr:0.97549
[43] validation-logloss:0.26088 validation-auc:0.97148 validation-aucpr:0.97547
[44] validation-logloss:0.25833 validation-auc:0.97154 validation-aucpr:0.97552
[45] validation-logloss:0.25584 validation-auc:0.97153 validation-aucpr:0.97548
[46] validation-logloss:0.25349 validation-auc:0.97150 validation-aucpr:0.97543
[47] validation-logloss:0.25019 validation-auc:0.97160 validation-aucpr:0.97553
[48] validation-logloss:0.24808 validation-auc:0.97167 validation-aucpr:0.97563
[49] validation-logloss:0.24598 validation-auc:0.97165 validation-aucpr:0.97560
[50] validation-logloss:0.24396 validation-auc:0.97163 validation-aucpr:0.97559
[51] validation-logloss:0.24117 validation-auc:0.97175 validation-aucpr:0.97569
{'best_iteration': '51', 'best_score': '0.9756914292499006'}
Trial 6, Fold 3: Log loss = 0.2411696065616846, Average precision = 0.9756956642097862, ROC-AUC = 0.9717455528695669, Elapsed Time = 1.7352725000000646 seconds
Trial 6, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 6, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.66848 validation-auc:0.92531 validation-aucpr:0.88966
[1] validation-logloss:0.64595 validation-auc:0.94503 validation-aucpr:0.93222
[2] validation-logloss:0.62460 validation-auc:0.95218 validation-aucpr:0.95507
[3] validation-logloss:0.60120 validation-auc:0.96106 validation-aucpr:0.96514
[4] validation-logloss:0.58273 validation-auc:0.96159 validation-aucpr:0.96633
[5] validation-logloss:0.56187 validation-auc:0.96428 validation-aucpr:0.96930
[6] validation-logloss:0.54578 validation-auc:0.96443 validation-aucpr:0.96933
[7] validation-logloss:0.52969 validation-auc:0.96527 validation-aucpr:0.97000
[8] validation-logloss:0.51239 validation-auc:0.96593 validation-aucpr:0.97084
[9] validation-logloss:0.49680 validation-auc:0.96612 validation-aucpr:0.97112
[10] validation-logloss:0.48098 validation-auc:0.96711 validation-aucpr:0.97201
[11] validation-logloss:0.46852 validation-auc:0.96726 validation-aucpr:0.97218
[12] validation-logloss:0.45710 validation-auc:0.96716 validation-aucpr:0.97206
[13] validation-logloss:0.44662 validation-auc:0.96680 validation-aucpr:0.97177
[14] validation-logloss:0.43433 validation-auc:0.96699 validation-aucpr:0.97206
[15] validation-logloss:0.42454 validation-auc:0.96679 validation-aucpr:0.97190
[16] validation-logloss:0.41541 validation-auc:0.96685 validation-aucpr:0.97198
[17] validation-logloss:0.40681 validation-auc:0.96682 validation-aucpr:0.97196
[18] validation-logloss:0.39676 validation-auc:0.96712 validation-aucpr:0.97225
[19] validation-logloss:0.38865 validation-auc:0.96741 validation-aucpr:0.97245
[20] validation-logloss:0.38153 validation-auc:0.96710 validation-aucpr:0.97219
[21] validation-logloss:0.37415 validation-auc:0.96708 validation-aucpr:0.97212
[22] validation-logloss:0.36591 validation-auc:0.96724 validation-aucpr:0.97234
[23] validation-logloss:0.35958 validation-auc:0.96712 validation-aucpr:0.97222
[24] validation-logloss:0.35205 validation-auc:0.96713 validation-aucpr:0.97230
[25] validation-logloss:0.34635 validation-auc:0.96702 validation-aucpr:0.97222
[26] validation-logloss:0.34088 validation-auc:0.96699 validation-aucpr:0.97220
[27] validation-logloss:0.33484 validation-auc:0.96711 validation-aucpr:0.97231
[28] validation-logloss:0.32844 validation-auc:0.96722 validation-aucpr:0.97246
[29] validation-logloss:0.32361 validation-auc:0.96730 validation-aucpr:0.97246
[30] validation-logloss:0.31878 validation-auc:0.96740 validation-aucpr:0.97252
[31] validation-logloss:0.31279 validation-auc:0.96761 validation-aucpr:0.97270
[32] validation-logloss:0.30723 validation-auc:0.96787 validation-aucpr:0.97293
[33] validation-logloss:0.30309 validation-auc:0.96795 validation-aucpr:0.97296
[34] validation-logloss:0.29896 validation-auc:0.96813 validation-aucpr:0.97308
[35] validation-logloss:0.29486 validation-auc:0.96828 validation-aucpr:0.97318
[36] validation-logloss:0.29129 validation-auc:0.96829 validation-aucpr:0.97317
[37] validation-logloss:0.28664 validation-auc:0.96853 validation-aucpr:0.97340
[38] validation-logloss:0.28238 validation-auc:0.96861 validation-aucpr:0.97350
[39] validation-logloss:0.27820 validation-auc:0.96871 validation-aucpr:0.97360
[40] validation-logloss:0.27520 validation-auc:0.96873 validation-aucpr:0.97359
[41] validation-logloss:0.27256 validation-auc:0.96860 validation-aucpr:0.97348
[42] validation-logloss:0.27001 validation-auc:0.96859 validation-aucpr:0.97349
[43] validation-logloss:0.26621 validation-auc:0.96885 validation-aucpr:0.97370
[44] validation-logloss:0.26271 validation-auc:0.96896 validation-aucpr:0.97380
[45] validation-logloss:0.26011 validation-auc:0.96904 validation-aucpr:0.97385
[46] validation-logloss:0.25675 validation-auc:0.96924 validation-aucpr:0.97402
[47] validation-logloss:0.25437 validation-auc:0.96928 validation-aucpr:0.97407
[48] validation-logloss:0.25136 validation-auc:0.96942 validation-aucpr:0.97420
[49] validation-logloss:0.24929 validation-auc:0.96944 validation-aucpr:0.97421
[50] validation-logloss:0.24666 validation-auc:0.96955 validation-aucpr:0.97432
[51] validation-logloss:0.24464 validation-auc:0.96963 validation-aucpr:0.97439
{'best_iteration': '51', 'best_score': '0.9743919537729241'}
Trial 6, Fold 4: Log loss = 0.24464498648864635, Average precision = 0.9743956571983957, ROC-AUC = 0.9696342434625445, Elapsed Time = 1.8642994000001636 seconds
Trial 6, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 6, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.66837 validation-auc:0.92013 validation-aucpr:0.88776
[1] validation-logloss:0.64569 validation-auc:0.94533 validation-aucpr:0.93629
[2] validation-logloss:0.62488 validation-auc:0.94923 validation-aucpr:0.94806
[3] validation-logloss:0.60140 validation-auc:0.95964 validation-aucpr:0.96366
[4] validation-logloss:0.57992 validation-auc:0.96210 validation-aucpr:0.96599
[5] validation-logloss:0.56293 validation-auc:0.96179 validation-aucpr:0.96558
[6] validation-logloss:0.54669 validation-auc:0.96209 validation-aucpr:0.96568
[7] validation-logloss:0.53101 validation-auc:0.96269 validation-aucpr:0.96581
[8] validation-logloss:0.51657 validation-auc:0.96266 validation-aucpr:0.96579
[9] validation-logloss:0.50308 validation-auc:0.96248 validation-aucpr:0.96588
[10] validation-logloss:0.48987 validation-auc:0.96302 validation-aucpr:0.96623
[11] validation-logloss:0.47787 validation-auc:0.96293 validation-aucpr:0.96608
[12] validation-logloss:0.46411 validation-auc:0.96347 validation-aucpr:0.96622
[13] validation-logloss:0.45101 validation-auc:0.96426 validation-aucpr:0.96697
[14] validation-logloss:0.44053 validation-auc:0.96436 validation-aucpr:0.96700
[15] validation-logloss:0.42948 validation-auc:0.96442 validation-aucpr:0.96680
[16] validation-logloss:0.42044 validation-auc:0.96439 validation-aucpr:0.96674
[17] validation-logloss:0.40995 validation-auc:0.96452 validation-aucpr:0.96692
[18] validation-logloss:0.40207 validation-auc:0.96447 validation-aucpr:0.96670
[19] validation-logloss:0.39202 validation-auc:0.96500 validation-aucpr:0.96742
[20] validation-logloss:0.38485 validation-auc:0.96500 validation-aucpr:0.96747
[21] validation-logloss:0.37770 validation-auc:0.96507 validation-aucpr:0.96747
[22] validation-logloss:0.36904 validation-auc:0.96559 validation-aucpr:0.96790
[23] validation-logloss:0.36094 validation-auc:0.96599 validation-aucpr:0.96941
[24] validation-logloss:0.35346 validation-auc:0.96619 validation-aucpr:0.96930
[25] validation-logloss:0.34712 validation-auc:0.96634 validation-aucpr:0.96941
[26] validation-logloss:0.34005 validation-auc:0.96682 validation-aucpr:0.96981
[27] validation-logloss:0.33429 validation-auc:0.96689 validation-aucpr:0.97002
[28] validation-logloss:0.32918 validation-auc:0.96692 validation-aucpr:0.97000
[29] validation-logloss:0.32433 validation-auc:0.96702 validation-aucpr:0.97116
[30] validation-logloss:0.31817 validation-auc:0.96735 validation-aucpr:0.97145
[31] validation-logloss:0.31389 validation-auc:0.96757 validation-aucpr:0.97154
[32] validation-logloss:0.30980 validation-auc:0.96744 validation-aucpr:0.97142
[33] validation-logloss:0.30554 validation-auc:0.96755 validation-aucpr:0.97149
[34] validation-logloss:0.30047 validation-auc:0.96774 validation-aucpr:0.97166
[35] validation-logloss:0.29599 validation-auc:0.96783 validation-aucpr:0.97181
[36] validation-logloss:0.29258 validation-auc:0.96781 validation-aucpr:0.97175
[37] validation-logloss:0.28796 validation-auc:0.96815 validation-aucpr:0.97205
[38] validation-logloss:0.28344 validation-auc:0.96839 validation-aucpr:0.97225
[39] validation-logloss:0.27925 validation-auc:0.96858 validation-aucpr:0.97243
[40] validation-logloss:0.27545 validation-auc:0.96865 validation-aucpr:0.97249
[41] validation-logloss:0.27171 validation-auc:0.96880 validation-aucpr:0.97265
[42] validation-logloss:0.26805 validation-auc:0.96892 validation-aucpr:0.97276
[43] validation-logloss:0.26589 validation-auc:0.96875 validation-aucpr:0.97263
[44] validation-logloss:0.26353 validation-auc:0.96876 validation-aucpr:0.97262
[45] validation-logloss:0.26108 validation-auc:0.96885 validation-aucpr:0.97268
[46] validation-logloss:0.25851 validation-auc:0.96891 validation-aucpr:0.97274
[47] validation-logloss:0.25536 validation-auc:0.96917 validation-aucpr:0.97297
[48] validation-logloss:0.25252 validation-auc:0.96929 validation-aucpr:0.97306
[49] validation-logloss:0.24971 validation-auc:0.96947 validation-aucpr:0.97316
[50] validation-logloss:0.24772 validation-auc:0.96953 validation-aucpr:0.97317
[51] validation-logloss:0.24576 validation-auc:0.96963 validation-aucpr:0.97323
{'best_iteration': '51', 'best_score': '0.9732328723738558'}
Trial 6, Fold 5: Log loss = 0.24575918207555797, Average precision = 0.9732383941377283, ROC-AUC = 0.9696304553815285, Elapsed Time = 1.744874300000447 seconds
Optimization Progress: 7%|7 | 7/100 [03:29<36:20, 23.45s/it]
Trial 7, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 7, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.68016 validation-auc:0.89251 validation-aucpr:0.87031
[1] validation-logloss:0.66712 validation-auc:0.90934 validation-aucpr:0.90403
[2] validation-logloss:0.65493 validation-auc:0.91940 validation-aucpr:0.91914
[3] validation-logloss:0.64218 validation-auc:0.92620 validation-aucpr:0.93081
[4] validation-logloss:0.63544 validation-auc:0.92475 validation-aucpr:0.93087
[5] validation-logloss:0.62441 validation-auc:0.93475 validation-aucpr:0.94099
[6] validation-logloss:0.61526 validation-auc:0.93665 validation-aucpr:0.94386
[7] validation-logloss:0.60932 validation-auc:0.93241 validation-aucpr:0.94166
[8] validation-logloss:0.59916 validation-auc:0.93500 validation-aucpr:0.94389
[9] validation-logloss:0.58964 validation-auc:0.93711 validation-aucpr:0.94473
[10] validation-logloss:0.58046 validation-auc:0.93881 validation-aucpr:0.94572
[11] validation-logloss:0.57157 validation-auc:0.94023 validation-aucpr:0.94657
[12] validation-logloss:0.56514 validation-auc:0.94241 validation-aucpr:0.94895
[13] validation-logloss:0.55683 validation-auc:0.94293 validation-aucpr:0.94978
[14] validation-logloss:0.54973 validation-auc:0.94335 validation-aucpr:0.95017
[15] validation-logloss:0.54508 validation-auc:0.94246 validation-aucpr:0.94980
[16] validation-logloss:0.53825 validation-auc:0.94314 validation-aucpr:0.95041
[17] validation-logloss:0.53180 validation-auc:0.94337 validation-aucpr:0.95039
[18] validation-logloss:0.52467 validation-auc:0.94405 validation-aucpr:0.95080
[19] validation-logloss:0.51864 validation-auc:0.94388 validation-aucpr:0.95064
[20] validation-logloss:0.51293 validation-auc:0.94378 validation-aucpr:0.95054
[21] validation-logloss:0.50736 validation-auc:0.94372 validation-aucpr:0.95017
[22] validation-logloss:0.50184 validation-auc:0.94344 validation-aucpr:0.94997
[23] validation-logloss:0.49344 validation-auc:0.94956 validation-aucpr:0.95698
[24] validation-logloss:0.48983 validation-auc:0.94934 validation-aucpr:0.95664
[25] validation-logloss:0.48617 validation-auc:0.95008 validation-aucpr:0.95737
[26] validation-logloss:0.48121 validation-auc:0.94982 validation-aucpr:0.95701
[27] validation-logloss:0.47813 validation-auc:0.95035 validation-aucpr:0.95764
[28] validation-logloss:0.47385 validation-auc:0.95076 validation-aucpr:0.95807
[29] validation-logloss:0.46691 validation-auc:0.95212 validation-aucpr:0.95960
[30] validation-logloss:0.46231 validation-auc:0.95230 validation-aucpr:0.95991
[31] validation-logloss:0.45990 validation-auc:0.95229 validation-aucpr:0.96008
[32] validation-logloss:0.45711 validation-auc:0.95246 validation-aucpr:0.96015
[33] validation-logloss:0.45468 validation-auc:0.95230 validation-aucpr:0.95999
[34] validation-logloss:0.45072 validation-auc:0.95221 validation-aucpr:0.95995
[35] validation-logloss:0.44696 validation-auc:0.95222 validation-aucpr:0.95985
[36] validation-logloss:0.44315 validation-auc:0.95207 validation-aucpr:0.95965
[37] validation-logloss:0.44108 validation-auc:0.95219 validation-aucpr:0.95974
[38] validation-logloss:0.43709 validation-auc:0.95222 validation-aucpr:0.95972
[39] validation-logloss:0.43289 validation-auc:0.95231 validation-aucpr:0.95966
[40] validation-logloss:0.42983 validation-auc:0.95198 validation-aucpr:0.95931
[41] validation-logloss:0.42831 validation-auc:0.95212 validation-aucpr:0.95946
[42] validation-logloss:0.42538 validation-auc:0.95195 validation-aucpr:0.95925
[43] validation-logloss:0.42330 validation-auc:0.95198 validation-aucpr:0.95940
[44] validation-logloss:0.41957 validation-auc:0.95204 validation-aucpr:0.95948
[45] validation-logloss:0.41739 validation-auc:0.95186 validation-aucpr:0.95931
[46] validation-logloss:0.41432 validation-auc:0.95198 validation-aucpr:0.95948
[47] validation-logloss:0.41163 validation-auc:0.95179 validation-aucpr:0.95919
[48] validation-logloss:0.41000 validation-auc:0.95197 validation-aucpr:0.95943
[49] validation-logloss:0.40495 validation-auc:0.95289 validation-aucpr:0.96051
[50] validation-logloss:0.40270 validation-auc:0.95280 validation-aucpr:0.96035
[51] validation-logloss:0.40179 validation-auc:0.95277 validation-aucpr:0.96034
[52] validation-logloss:0.39913 validation-auc:0.95276 validation-aucpr:0.96032
[53] validation-logloss:0.39708 validation-auc:0.95267 validation-aucpr:0.96016
[54] validation-logloss:0.39336 validation-auc:0.95303 validation-aucpr:0.96045
{'best_iteration': '49', 'best_score': '0.9605111595823358'}
Trial 7, Fold 1: Log loss = 0.39335688845107086, Average precision = 0.9604510688194967, ROC-AUC = 0.9530320220530248, Elapsed Time = 0.7336209999994026 seconds
Trial 7, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 7, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.67933 validation-auc:0.90882 validation-aucpr:0.89431
[1] validation-logloss:0.66675 validation-auc:0.91501 validation-aucpr:0.89901
[2] validation-logloss:0.65206 validation-auc:0.93963 validation-aucpr:0.93515
[3] validation-logloss:0.63992 validation-auc:0.94320 validation-aucpr:0.94373
[4] validation-logloss:0.62894 validation-auc:0.94160 validation-aucpr:0.94018
[5] validation-logloss:0.61858 validation-auc:0.94205 validation-aucpr:0.94202
[6] validation-logloss:0.60899 validation-auc:0.94013 validation-aucpr:0.94055
[7] validation-logloss:0.59955 validation-auc:0.94683 validation-aucpr:0.95021
[8] validation-logloss:0.58795 validation-auc:0.95064 validation-aucpr:0.95505
[9] validation-logloss:0.57983 validation-auc:0.95104 validation-aucpr:0.95535
[10] validation-logloss:0.57015 validation-auc:0.95200 validation-aucpr:0.95618
[11] validation-logloss:0.56186 validation-auc:0.95137 validation-aucpr:0.95500
[12] validation-logloss:0.55357 validation-auc:0.95168 validation-aucpr:0.95523
[13] validation-logloss:0.54688 validation-auc:0.95224 validation-aucpr:0.95573
[14] validation-logloss:0.54133 validation-auc:0.95203 validation-aucpr:0.95582
[15] validation-logloss:0.53531 validation-auc:0.95226 validation-aucpr:0.95626
[16] validation-logloss:0.52876 validation-auc:0.95188 validation-aucpr:0.95567
[17] validation-logloss:0.52642 validation-auc:0.95163 validation-aucpr:0.95592
[18] validation-logloss:0.52025 validation-auc:0.95145 validation-aucpr:0.95577
[19] validation-logloss:0.51251 validation-auc:0.95244 validation-aucpr:0.95678
[20] validation-logloss:0.50681 validation-auc:0.95245 validation-aucpr:0.95640
[21] validation-logloss:0.49955 validation-auc:0.95320 validation-aucpr:0.95719
[22] validation-logloss:0.49401 validation-auc:0.95304 validation-aucpr:0.95700
[23] validation-logloss:0.49148 validation-auc:0.95294 validation-aucpr:0.95727
[24] validation-logloss:0.48907 validation-auc:0.95270 validation-aucpr:0.95711
[25] validation-logloss:0.48413 validation-auc:0.95268 validation-aucpr:0.95688
[26] validation-logloss:0.47927 validation-auc:0.95270 validation-aucpr:0.95669
[27] validation-logloss:0.47472 validation-auc:0.95235 validation-aucpr:0.95614
[28] validation-logloss:0.46771 validation-auc:0.95332 validation-aucpr:0.95745
[29] validation-logloss:0.46268 validation-auc:0.95332 validation-aucpr:0.95756
[30] validation-logloss:0.45885 validation-auc:0.95332 validation-aucpr:0.95740
[31] validation-logloss:0.45480 validation-auc:0.95336 validation-aucpr:0.95773
[32] validation-logloss:0.45011 validation-auc:0.95325 validation-aucpr:0.95745
[33] validation-logloss:0.44529 validation-auc:0.95321 validation-aucpr:0.95728
[34] validation-logloss:0.44151 validation-auc:0.95325 validation-aucpr:0.95725
[35] validation-logloss:0.43920 validation-auc:0.95310 validation-aucpr:0.95713
[36] validation-logloss:0.43571 validation-auc:0.95319 validation-aucpr:0.95722
[37] validation-logloss:0.43194 validation-auc:0.95306 validation-aucpr:0.95703
[38] validation-logloss:0.42812 validation-auc:0.95290 validation-aucpr:0.95683
[39] validation-logloss:0.42485 validation-auc:0.95285 validation-aucpr:0.95678
[40] validation-logloss:0.42238 validation-auc:0.95290 validation-aucpr:0.95677
[41] validation-logloss:0.41903 validation-auc:0.95285 validation-aucpr:0.95671
[42] validation-logloss:0.41634 validation-auc:0.95306 validation-aucpr:0.95691
[43] validation-logloss:0.41369 validation-auc:0.95307 validation-aucpr:0.95686
[44] validation-logloss:0.41114 validation-auc:0.95309 validation-aucpr:0.95680
[45] validation-logloss:0.40835 validation-auc:0.95294 validation-aucpr:0.95656
[46] validation-logloss:0.40635 validation-auc:0.95315 validation-aucpr:0.95684
[47] validation-logloss:0.40329 validation-auc:0.95309 validation-aucpr:0.95681
[48] validation-logloss:0.40126 validation-auc:0.95306 validation-aucpr:0.95681
[49] validation-logloss:0.39889 validation-auc:0.95332 validation-aucpr:0.95720
[50] validation-logloss:0.39657 validation-auc:0.95334 validation-aucpr:0.95723
[51] validation-logloss:0.39397 validation-auc:0.95336 validation-aucpr:0.95724
[52] validation-logloss:0.39086 validation-auc:0.95348 validation-aucpr:0.95742
[53] validation-logloss:0.38884 validation-auc:0.95351 validation-aucpr:0.95744
[54] validation-logloss:0.38677 validation-auc:0.95351 validation-aucpr:0.95742
{'best_iteration': '31', 'best_score': '0.9577280678288306'}
Trial 7, Fold 2: Log loss = 0.38676745688883635, Average precision = 0.9574293626158382, ROC-AUC = 0.9535117624260951, Elapsed Time = 1.0004017999999633 seconds
Trial 7, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 7, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.67935 validation-auc:0.90470 validation-aucpr:0.89052
[1] validation-logloss:0.66670 validation-auc:0.91107 validation-aucpr:0.89678
[2] validation-logloss:0.65748 validation-auc:0.91284 validation-aucpr:0.90760
[3] validation-logloss:0.64437 validation-auc:0.92801 validation-aucpr:0.92905
[4] validation-logloss:0.63204 validation-auc:0.93349 validation-aucpr:0.93578
[5] validation-logloss:0.62087 validation-auc:0.93476 validation-aucpr:0.93718
[6] validation-logloss:0.61630 validation-auc:0.93486 validation-aucpr:0.93826
[7] validation-logloss:0.61212 validation-auc:0.93174 validation-aucpr:0.93799
[8] validation-logloss:0.60159 validation-auc:0.93401 validation-aucpr:0.93934
[9] validation-logloss:0.59161 validation-auc:0.93668 validation-aucpr:0.94126
[10] validation-logloss:0.58263 validation-auc:0.93978 validation-aucpr:0.94483
[11] validation-logloss:0.57186 validation-auc:0.95108 validation-aucpr:0.95796
[12] validation-logloss:0.56858 validation-auc:0.94955 validation-aucpr:0.95721
[13] validation-logloss:0.56002 validation-auc:0.94974 validation-aucpr:0.95730
[14] validation-logloss:0.55234 validation-auc:0.94992 validation-aucpr:0.95716
[15] validation-logloss:0.54406 validation-auc:0.95155 validation-aucpr:0.95845
[16] validation-logloss:0.53695 validation-auc:0.95150 validation-aucpr:0.95833
[17] validation-logloss:0.52953 validation-auc:0.95245 validation-aucpr:0.95905
[18] validation-logloss:0.52296 validation-auc:0.95230 validation-aucpr:0.95877
[19] validation-logloss:0.51900 validation-auc:0.95238 validation-aucpr:0.95908
[20] validation-logloss:0.51275 validation-auc:0.95292 validation-aucpr:0.95949
[21] validation-logloss:0.50723 validation-auc:0.95304 validation-aucpr:0.95942
[22] validation-logloss:0.50095 validation-auc:0.95318 validation-aucpr:0.95929
[23] validation-logloss:0.49563 validation-auc:0.95320 validation-aucpr:0.95926
[24] validation-logloss:0.49035 validation-auc:0.95288 validation-aucpr:0.95887
[25] validation-logloss:0.48681 validation-auc:0.95267 validation-aucpr:0.95863
[26] validation-logloss:0.48187 validation-auc:0.95262 validation-aucpr:0.95845
[27] validation-logloss:0.47603 validation-auc:0.95327 validation-aucpr:0.95908
[28] validation-logloss:0.47167 validation-auc:0.95319 validation-aucpr:0.95900
[29] validation-logloss:0.46714 validation-auc:0.95359 validation-aucpr:0.95937
[30] validation-logloss:0.46427 validation-auc:0.95360 validation-aucpr:0.95939
[31] validation-logloss:0.46030 validation-auc:0.95307 validation-aucpr:0.95879
[32] validation-logloss:0.45621 validation-auc:0.95313 validation-aucpr:0.95888
[33] validation-logloss:0.45440 validation-auc:0.95291 validation-aucpr:0.95856
[34] validation-logloss:0.45190 validation-auc:0.95242 validation-aucpr:0.95793
[35] validation-logloss:0.44799 validation-auc:0.95217 validation-aucpr:0.95766
[36] validation-logloss:0.44177 validation-auc:0.95468 validation-aucpr:0.96061
[37] validation-logloss:0.43795 validation-auc:0.95492 validation-aucpr:0.96086
[38] validation-logloss:0.43421 validation-auc:0.95481 validation-aucpr:0.96068
[39] validation-logloss:0.43090 validation-auc:0.95464 validation-aucpr:0.96048
[40] validation-logloss:0.42521 validation-auc:0.95571 validation-aucpr:0.96178
[41] validation-logloss:0.42179 validation-auc:0.95555 validation-aucpr:0.96157
[42] validation-logloss:0.41843 validation-auc:0.95546 validation-aucpr:0.96149
[43] validation-logloss:0.41547 validation-auc:0.95558 validation-aucpr:0.96161
[44] validation-logloss:0.41066 validation-auc:0.95638 validation-aucpr:0.96243
[45] validation-logloss:0.40819 validation-auc:0.95618 validation-aucpr:0.96220
[46] validation-logloss:0.40649 validation-auc:0.95614 validation-aucpr:0.96203
[47] validation-logloss:0.40335 validation-auc:0.95655 validation-aucpr:0.96242
[48] validation-logloss:0.40051 validation-auc:0.95652 validation-aucpr:0.96234
[49] validation-logloss:0.39855 validation-auc:0.95658 validation-aucpr:0.96240
[50] validation-logloss:0.39698 validation-auc:0.95674 validation-aucpr:0.96260
[51] validation-logloss:0.39445 validation-auc:0.95666 validation-aucpr:0.96255
[52] validation-logloss:0.39174 validation-auc:0.95661 validation-aucpr:0.96253
[53] validation-logloss:0.38659 validation-auc:0.95728 validation-aucpr:0.96340
[54] validation-logloss:0.38234 validation-auc:0.95745 validation-aucpr:0.96364
{'best_iteration': '54', 'best_score': '0.9636424415346243'}
Trial 7, Fold 3: Log loss = 0.38234323236366546, Average precision = 0.9636474357242567, ROC-AUC = 0.9574512910402482, Elapsed Time = 1.0445481999995536 seconds
Trial 7, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 7, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.67951 validation-auc:0.90264 validation-aucpr:0.89429
[1] validation-logloss:0.66617 validation-auc:0.92506 validation-aucpr:0.92731
[2] validation-logloss:0.65515 validation-auc:0.92324 validation-aucpr:0.92434
[3] validation-logloss:0.64695 validation-auc:0.92645 validation-aucpr:0.92960
[4] validation-logloss:0.63759 validation-auc:0.92542 validation-aucpr:0.92953
[5] validation-logloss:0.62642 validation-auc:0.93200 validation-aucpr:0.93649
[6] validation-logloss:0.61601 validation-auc:0.93403 validation-aucpr:0.93867
[7] validation-logloss:0.60945 validation-auc:0.93368 validation-aucpr:0.93902
[8] validation-logloss:0.59785 validation-auc:0.94521 validation-aucpr:0.95217
[9] validation-logloss:0.58813 validation-auc:0.94676 validation-aucpr:0.95335
[10] validation-logloss:0.57837 validation-auc:0.94708 validation-aucpr:0.95317
[11] validation-logloss:0.57021 validation-auc:0.94647 validation-aucpr:0.95165
[12] validation-logloss:0.56092 validation-auc:0.94800 validation-aucpr:0.95338
[13] validation-logloss:0.55357 validation-auc:0.94763 validation-aucpr:0.95298
[14] validation-logloss:0.54679 validation-auc:0.94728 validation-aucpr:0.95234
[15] validation-logloss:0.54011 validation-auc:0.94694 validation-aucpr:0.95171
[16] validation-logloss:0.53152 validation-auc:0.94949 validation-aucpr:0.95486
[17] validation-logloss:0.52481 validation-auc:0.94942 validation-aucpr:0.95486
[18] validation-logloss:0.51691 validation-auc:0.95087 validation-aucpr:0.95629
[19] validation-logloss:0.51071 validation-auc:0.95076 validation-aucpr:0.95625
[20] validation-logloss:0.50306 validation-auc:0.95188 validation-aucpr:0.95777
[21] validation-logloss:0.49896 validation-auc:0.95225 validation-aucpr:0.95844
[22] validation-logloss:0.49325 validation-auc:0.95265 validation-aucpr:0.95920
[23] validation-logloss:0.48753 validation-auc:0.95283 validation-aucpr:0.95948
[24] validation-logloss:0.48229 validation-auc:0.95292 validation-aucpr:0.95950
[25] validation-logloss:0.47732 validation-auc:0.95277 validation-aucpr:0.95923
[26] validation-logloss:0.47215 validation-auc:0.95295 validation-aucpr:0.95949
[27] validation-logloss:0.46876 validation-auc:0.95282 validation-aucpr:0.95943
[28] validation-logloss:0.46398 validation-auc:0.95274 validation-aucpr:0.95929
[29] validation-logloss:0.46021 validation-auc:0.95252 validation-aucpr:0.95900
[30] validation-logloss:0.45739 validation-auc:0.95277 validation-aucpr:0.95928
[31] validation-logloss:0.45329 validation-auc:0.95253 validation-aucpr:0.95902
[32] validation-logloss:0.44984 validation-auc:0.95281 validation-aucpr:0.95929
[33] validation-logloss:0.44446 validation-auc:0.95361 validation-aucpr:0.96008
[34] validation-logloss:0.44069 validation-auc:0.95327 validation-aucpr:0.95968
[35] validation-logloss:0.43709 validation-auc:0.95317 validation-aucpr:0.95950
[36] validation-logloss:0.43309 validation-auc:0.95322 validation-aucpr:0.95956
[37] validation-logloss:0.42951 validation-auc:0.95313 validation-aucpr:0.95954
[38] validation-logloss:0.42588 validation-auc:0.95328 validation-aucpr:0.95963
[39] validation-logloss:0.42408 validation-auc:0.95332 validation-aucpr:0.95964
[40] validation-logloss:0.42050 validation-auc:0.95307 validation-aucpr:0.95938
[41] validation-logloss:0.41780 validation-auc:0.95288 validation-aucpr:0.95918
[42] validation-logloss:0.41334 validation-auc:0.95339 validation-aucpr:0.95982
[43] validation-logloss:0.41064 validation-auc:0.95358 validation-aucpr:0.96018
[44] validation-logloss:0.40806 validation-auc:0.95328 validation-aucpr:0.95988
[45] validation-logloss:0.40514 validation-auc:0.95336 validation-aucpr:0.95999
[46] validation-logloss:0.39938 validation-auc:0.95438 validation-aucpr:0.96106
[47] validation-logloss:0.39676 validation-auc:0.95454 validation-aucpr:0.96116
[48] validation-logloss:0.39263 validation-auc:0.95484 validation-aucpr:0.96146
[49] validation-logloss:0.39008 validation-auc:0.95483 validation-aucpr:0.96139
[50] validation-logloss:0.38875 validation-auc:0.95489 validation-aucpr:0.96144
[51] validation-logloss:0.38713 validation-auc:0.95470 validation-aucpr:0.96126
[52] validation-logloss:0.38499 validation-auc:0.95483 validation-aucpr:0.96144
[53] validation-logloss:0.38412 validation-auc:0.95494 validation-aucpr:0.96161
[54] validation-logloss:0.38293 validation-auc:0.95493 validation-aucpr:0.96159
{'best_iteration': '53', 'best_score': '0.9616095000793439'}
Trial 7, Fold 4: Log loss = 0.3829302825616808, Average precision = 0.9615912528910233, ROC-AUC = 0.9549325688915593, Elapsed Time = 1.014082800000324 seconds
Trial 7, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 7, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.68005 validation-auc:0.89139 validation-aucpr:0.87633
[1] validation-logloss:0.66708 validation-auc:0.91207 validation-aucpr:0.91324
[2] validation-logloss:0.65532 validation-auc:0.92129 validation-aucpr:0.92471
[3] validation-logloss:0.64280 validation-auc:0.92677 validation-aucpr:0.93153
[4] validation-logloss:0.63319 validation-auc:0.92862 validation-aucpr:0.93410
[5] validation-logloss:0.62247 validation-auc:0.93553 validation-aucpr:0.93941
[6] validation-logloss:0.61248 validation-auc:0.93679 validation-aucpr:0.94078
[7] validation-logloss:0.60286 validation-auc:0.93738 validation-aucpr:0.94164
[8] validation-logloss:0.59772 validation-auc:0.93772 validation-aucpr:0.94272
[9] validation-logloss:0.59146 validation-auc:0.93922 validation-aucpr:0.94386
[10] validation-logloss:0.58172 validation-auc:0.94040 validation-aucpr:0.94490
[11] validation-logloss:0.57129 validation-auc:0.94791 validation-aucpr:0.95330
[12] validation-logloss:0.56417 validation-auc:0.94962 validation-aucpr:0.95564
[13] validation-logloss:0.55580 validation-auc:0.95008 validation-aucpr:0.95600
[14] validation-logloss:0.54835 validation-auc:0.95015 validation-aucpr:0.95582
[15] validation-logloss:0.54062 validation-auc:0.95095 validation-aucpr:0.95681
[16] validation-logloss:0.53470 validation-auc:0.95068 validation-aucpr:0.95659
[17] validation-logloss:0.52802 validation-auc:0.95077 validation-aucpr:0.95659
[18] validation-logloss:0.52146 validation-auc:0.95062 validation-aucpr:0.95642
[19] validation-logloss:0.51544 validation-auc:0.95013 validation-aucpr:0.95571
[20] validation-logloss:0.50931 validation-auc:0.95011 validation-aucpr:0.95570
[21] validation-logloss:0.50313 validation-auc:0.94994 validation-aucpr:0.95555
[22] validation-logloss:0.49776 validation-auc:0.95177 validation-aucpr:0.95766
[23] validation-logloss:0.49294 validation-auc:0.95146 validation-aucpr:0.95734
[24] validation-logloss:0.48717 validation-auc:0.95138 validation-aucpr:0.95731
[25] validation-logloss:0.48162 validation-auc:0.95117 validation-aucpr:0.95692
[26] validation-logloss:0.47693 validation-auc:0.95116 validation-aucpr:0.95705
[27] validation-logloss:0.47241 validation-auc:0.95078 validation-aucpr:0.95657
[28] validation-logloss:0.46850 validation-auc:0.95059 validation-aucpr:0.95630
[29] validation-logloss:0.46486 validation-auc:0.95053 validation-aucpr:0.95640
[30] validation-logloss:0.46017 validation-auc:0.95115 validation-aucpr:0.95714
[31] validation-logloss:0.45638 validation-auc:0.95109 validation-aucpr:0.95702
[32] validation-logloss:0.44950 validation-auc:0.95194 validation-aucpr:0.95815
[33] validation-logloss:0.44490 validation-auc:0.95257 validation-aucpr:0.95875
[34] validation-logloss:0.44167 validation-auc:0.95259 validation-aucpr:0.95883
[35] validation-logloss:0.43810 validation-auc:0.95267 validation-aucpr:0.95885
[36] validation-logloss:0.43474 validation-auc:0.95252 validation-aucpr:0.95874
[37] validation-logloss:0.43075 validation-auc:0.95248 validation-aucpr:0.95859
[38] validation-logloss:0.42759 validation-auc:0.95219 validation-aucpr:0.95824
[39] validation-logloss:0.42523 validation-auc:0.95230 validation-aucpr:0.95827
[40] validation-logloss:0.42350 validation-auc:0.95242 validation-aucpr:0.95846
[41] validation-logloss:0.42053 validation-auc:0.95239 validation-aucpr:0.95845
[42] validation-logloss:0.41733 validation-auc:0.95235 validation-aucpr:0.95830
[43] validation-logloss:0.41358 validation-auc:0.95262 validation-aucpr:0.95879
[44] validation-logloss:0.41037 validation-auc:0.95255 validation-aucpr:0.95862
[45] validation-logloss:0.40747 validation-auc:0.95235 validation-aucpr:0.95837
[46] validation-logloss:0.40442 validation-auc:0.95244 validation-aucpr:0.95849
[47] validation-logloss:0.40226 validation-auc:0.95230 validation-aucpr:0.95827
[48] validation-logloss:0.39997 validation-auc:0.95230 validation-aucpr:0.95836
[49] validation-logloss:0.39775 validation-auc:0.95232 validation-aucpr:0.95838
[50] validation-logloss:0.39405 validation-auc:0.95272 validation-aucpr:0.95875
[51] validation-logloss:0.38953 validation-auc:0.95320 validation-aucpr:0.95935
[52] validation-logloss:0.38533 validation-auc:0.95370 validation-aucpr:0.95999
[53] validation-logloss:0.38300 validation-auc:0.95371 validation-aucpr:0.95999
[54] validation-logloss:0.38231 validation-auc:0.95377 validation-aucpr:0.96003
{'best_iteration': '54', 'best_score': '0.9600339440211474'}
Trial 7, Fold 5: Log loss = 0.38230679109129057, Average precision = 0.9600364880023089, ROC-AUC = 0.9537678687549933, Elapsed Time = 1.1196113000005425 seconds
Optimization Progress:   8% | 8/100 [03:43<31:07, 20.30s/it]
Trial 8, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 8, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.67257 validation-auc:0.92769 validation-aucpr:0.91050
[1] validation-logloss:0.65027 validation-auc:0.95658 validation-aucpr:0.95645
[2] validation-logloss:0.63157 validation-auc:0.95832 validation-aucpr:0.96167
[3] validation-logloss:0.61415 validation-auc:0.95866 validation-aucpr:0.96361
[4] validation-logloss:0.59720 validation-auc:0.95935 validation-aucpr:0.96394
[5] validation-logloss:0.58111 validation-auc:0.95989 validation-aucpr:0.96423
[6] validation-logloss:0.56754 validation-auc:0.95943 validation-aucpr:0.96397
[7] validation-logloss:0.55336 validation-auc:0.95981 validation-aucpr:0.96470
[8] validation-logloss:0.53679 validation-auc:0.96220 validation-aucpr:0.96740
[9] validation-logloss:0.52404 validation-auc:0.96238 validation-aucpr:0.96753
[10] validation-logloss:0.51220 validation-auc:0.96238 validation-aucpr:0.96753
[11] validation-logloss:0.50134 validation-auc:0.96252 validation-aucpr:0.96761
[12] validation-logloss:0.48980 validation-auc:0.96292 validation-aucpr:0.96809
[13] validation-logloss:0.47960 validation-auc:0.96277 validation-aucpr:0.96795
[14] validation-logloss:0.46999 validation-auc:0.96253 validation-aucpr:0.96735
[15] validation-logloss:0.46119 validation-auc:0.96221 validation-aucpr:0.96701
[16] validation-logloss:0.45209 validation-auc:0.96214 validation-aucpr:0.96691
[17] validation-logloss:0.44408 validation-auc:0.96217 validation-aucpr:0.96738
[18] validation-logloss:0.43600 validation-auc:0.96214 validation-aucpr:0.96728
{'best_iteration': '12', 'best_score': '0.968090333653044'}
Trial 8, Fold 1: Log loss = 0.43599817143202385, Average precision = 0.9672858086737529, ROC-AUC = 0.9621420685106321, Elapsed Time = 1.3593856999996206 seconds
Trial 8, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 8, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.67242 validation-auc:0.93086 validation-aucpr:0.90704
[1] validation-logloss:0.64977 validation-auc:0.95785 validation-aucpr:0.95825
[2] validation-logloss:0.63079 validation-auc:0.96015 validation-aucpr:0.96284
[3] validation-logloss:0.61379 validation-auc:0.96007 validation-aucpr:0.96319
[4] validation-logloss:0.59733 validation-auc:0.96125 validation-aucpr:0.96473
[5] validation-logloss:0.58147 validation-auc:0.96153 validation-aucpr:0.96471
[6] validation-logloss:0.56832 validation-auc:0.95991 validation-aucpr:0.96315
[7] validation-logloss:0.55353 validation-auc:0.96076 validation-aucpr:0.96411
[8] validation-logloss:0.53987 validation-auc:0.96089 validation-aucpr:0.96424
[9] validation-logloss:0.52758 validation-auc:0.96092 validation-aucpr:0.96421
[10] validation-logloss:0.51678 validation-auc:0.96033 validation-aucpr:0.96356
[11] validation-logloss:0.50381 validation-auc:0.96201 validation-aucpr:0.96547
[12] validation-logloss:0.49091 validation-auc:0.96305 validation-aucpr:0.96657
[13] validation-logloss:0.48081 validation-auc:0.96311 validation-aucpr:0.96657
[14] validation-logloss:0.47140 validation-auc:0.96313 validation-aucpr:0.96652
[15] validation-logloss:0.46246 validation-auc:0.96288 validation-aucpr:0.96627
[16] validation-logloss:0.45431 validation-auc:0.96259 validation-aucpr:0.96598
[17] validation-logloss:0.44543 validation-auc:0.96309 validation-aucpr:0.96644
[18] validation-logloss:0.43829 validation-auc:0.96283 validation-aucpr:0.96614
{'best_iteration': '12', 'best_score': '0.9665736485651384'}
Trial 8, Fold 2: Log loss = 0.43828518469363265, Average precision = 0.9661639868587157, ROC-AUC = 0.9628348428697207, Elapsed Time = 1.6450736000006145 seconds
Trial 8, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 8, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.67217 validation-auc:0.92719 validation-aucpr:0.90401
[1] validation-logloss:0.65099 validation-auc:0.95515 validation-aucpr:0.95623
[2] validation-logloss:0.63246 validation-auc:0.95664 validation-aucpr:0.95948
[3] validation-logloss:0.61522 validation-auc:0.95661 validation-aucpr:0.95932
[4] validation-logloss:0.59891 validation-auc:0.95667 validation-aucpr:0.96017
[5] validation-logloss:0.58324 validation-auc:0.95751 validation-aucpr:0.96076
[6] validation-logloss:0.56526 validation-auc:0.96327 validation-aucpr:0.96676
[7] validation-logloss:0.55152 validation-auc:0.96316 validation-aucpr:0.96697
[8] validation-logloss:0.53550 validation-auc:0.96447 validation-aucpr:0.96839
[9] validation-logloss:0.52326 validation-auc:0.96469 validation-aucpr:0.96847
[10] validation-logloss:0.51114 validation-auc:0.96436 validation-aucpr:0.96815
[11] validation-logloss:0.50023 validation-auc:0.96424 validation-aucpr:0.96799
[12] validation-logloss:0.48964 validation-auc:0.96447 validation-aucpr:0.96855
[13] validation-logloss:0.47940 validation-auc:0.96487 validation-aucpr:0.96880
[14] validation-logloss:0.46980 validation-auc:0.96490 validation-aucpr:0.96904
[15] validation-logloss:0.45893 validation-auc:0.96545 validation-aucpr:0.96963
[16] validation-logloss:0.44967 validation-auc:0.96560 validation-aucpr:0.96969
[17] validation-logloss:0.43885 validation-auc:0.96601 validation-aucpr:0.97015
[18] validation-logloss:0.43158 validation-auc:0.96593 validation-aucpr:0.97017
{'best_iteration': '18', 'best_score': '0.9701717611710687'}
Trial 8, Fold 3: Log loss = 0.431576279013927, Average precision = 0.9701762699819079, ROC-AUC = 0.9659336746732672, Elapsed Time = 1.5622930000008637 seconds
Trial 8, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 8, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.67264 validation-auc:0.92282 validation-aucpr:0.88926
[1] validation-logloss:0.64939 validation-auc:0.95910 validation-aucpr:0.96190
[2] validation-logloss:0.62990 validation-auc:0.96036 validation-aucpr:0.96612
[3] validation-logloss:0.61286 validation-auc:0.96062 validation-aucpr:0.96666
[4] validation-logloss:0.59632 validation-auc:0.96039 validation-aucpr:0.96651
[5] validation-logloss:0.58025 validation-auc:0.96145 validation-aucpr:0.96740
[6] validation-logloss:0.56505 validation-auc:0.96239 validation-aucpr:0.96798
[7] validation-logloss:0.54853 validation-auc:0.96344 validation-aucpr:0.96900
[8] validation-logloss:0.53268 validation-auc:0.96437 validation-aucpr:0.96997
[9] validation-logloss:0.51970 validation-auc:0.96438 validation-aucpr:0.96997
[10] validation-logloss:0.50779 validation-auc:0.96433 validation-aucpr:0.96981
[11] validation-logloss:0.49657 validation-auc:0.96465 validation-aucpr:0.96999
[12] validation-logloss:0.48608 validation-auc:0.96456 validation-aucpr:0.96987
[13] validation-logloss:0.47570 validation-auc:0.96473 validation-aucpr:0.97002
[14] validation-logloss:0.46410 validation-auc:0.96502 validation-aucpr:0.97034
[15] validation-logloss:0.45355 validation-auc:0.96522 validation-aucpr:0.97061
[16] validation-logloss:0.44299 validation-auc:0.96538 validation-aucpr:0.97085
[17] validation-logloss:0.43307 validation-auc:0.96547 validation-aucpr:0.97101
[18] validation-logloss:0.42449 validation-auc:0.96570 validation-aucpr:0.97117
{'best_iteration': '18', 'best_score': '0.9711688641698892'}
Trial 8, Fold 4: Log loss = 0.4244943068705463, Average precision = 0.9711685110879875, ROC-AUC = 0.965697731949422, Elapsed Time = 1.530232099999921 seconds
Trial 8, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 8, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.67267 validation-auc:0.91918 validation-aucpr:0.89168
[1] validation-logloss:0.65222 validation-auc:0.94691 validation-aucpr:0.94239
[2] validation-logloss:0.63352 validation-auc:0.95275 validation-aucpr:0.95577
[3] validation-logloss:0.61526 validation-auc:0.95581 validation-aucpr:0.95973
[4] validation-logloss:0.59935 validation-auc:0.95742 validation-aucpr:0.96176
[5] validation-logloss:0.58507 validation-auc:0.95652 validation-aucpr:0.96055
[6] validation-logloss:0.56745 validation-auc:0.96048 validation-aucpr:0.96483
[7] validation-logloss:0.55348 validation-auc:0.96071 validation-aucpr:0.96524
[8] validation-logloss:0.54116 validation-auc:0.95976 validation-aucpr:0.96425
[9] validation-logloss:0.52659 validation-auc:0.96123 validation-aucpr:0.96573
[10] validation-logloss:0.51486 validation-auc:0.96119 validation-aucpr:0.96572
[11] validation-logloss:0.50349 validation-auc:0.96180 validation-aucpr:0.96615
[12] validation-logloss:0.49053 validation-auc:0.96247 validation-aucpr:0.96682
[13] validation-logloss:0.48153 validation-auc:0.96217 validation-aucpr:0.96645
[14] validation-logloss:0.47185 validation-auc:0.96227 validation-aucpr:0.96528
[15] validation-logloss:0.46302 validation-auc:0.96222 validation-aucpr:0.96533
[16] validation-logloss:0.45214 validation-auc:0.96264 validation-aucpr:0.96645
[17] validation-logloss:0.44099 validation-auc:0.96332 validation-aucpr:0.96712
[18] validation-logloss:0.43070 validation-auc:0.96375 validation-aucpr:0.96754
{'best_iteration': '18', 'best_score': '0.9675360413317013'}
Trial 8, Fold 5: Log loss = 0.4307014960639307, Average precision = 0.9676823377355989, ROC-AUC = 0.9637460565185887, Elapsed Time = 1.5349166999985755 seconds
Optimization Progress:   9% | 9/100 [03:59<28:51, 19.02s/it]
Trial 9, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 9, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[18:02:53] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[0]	validation-logloss:0.68501	validation-auc:0.94432	validation-aucpr:0.94674
[1]	validation-logloss:0.67707	validation-auc:0.94745	validation-aucpr:0.94975
[2]	validation-logloss:0.66938	validation-auc:0.95070	validation-aucpr:0.95346
[3]	validation-logloss:0.66194	validation-auc:0.95101	validation-aucpr:0.95378
[4]	validation-logloss:0.65456	validation-auc:0.95241	validation-aucpr:0.95538
[5]	validation-logloss:0.64747	validation-auc:0.95343	validation-aucpr:0.95627
[6]	validation-logloss:0.64064	validation-auc:0.95419	validation-aucpr:0.95732
[7]	validation-logloss:0.63369	validation-auc:0.95621	validation-aucpr:0.95883
[8]	validation-logloss:0.62576	validation-auc:0.95991	validation-aucpr:0.96309
[9]	validation-logloss:0.61929	validation-auc:0.95976	validation-aucpr:0.96283
[10]	validation-logloss:0.61323	validation-auc:0.95899	validation-aucpr:0.96439
[11]	validation-logloss:0.60576	validation-auc:0.96066	validation-aucpr:0.96632
[12]	validation-logloss:0.59948	validation-auc:0.96075	validation-aucpr:0.96633
[13]	validation-logloss:0.59338	validation-auc:0.96074	validation-aucpr:0.96625
[14]	validation-logloss:0.58762	validation-auc:0.96072	validation-aucpr:0.96622
[15]	validation-logloss:0.58217	validation-auc:0.96056	validation-aucpr:0.96605
[16]	validation-logloss:0.57660	validation-auc:0.96031	validation-aucpr:0.96572
[17]	validation-logloss:0.57127	validation-auc:0.96009	validation-aucpr:0.96555
[18]	validation-logloss:0.56595	validation-auc:0.95984	validation-aucpr:0.96519
[19]	validation-logloss:0.56068	validation-auc:0.95990	validation-aucpr:0.96527
[20]	validation-logloss:0.55583	validation-auc:0.95964	validation-aucpr:0.96508
[21]	validation-logloss:0.54961	validation-auc:0.96057	validation-aucpr:0.96616
[22] validation-logloss:0.54489 validation-auc:0.96054 validation-aucpr:0.96619
[18:02:54] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[23] validation-logloss:0.54019 validation-auc:0.96063 validation-aucpr:0.96625
[18:02:54] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[24] validation-logloss:0.53530 validation-auc:0.96079 validation-aucpr:0.96633
[18:02:54] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[25] validation-logloss:0.53078 validation-auc:0.96078 validation-aucpr:0.96630
[18:02:54] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[26] validation-logloss:0.52624 validation-auc:0.96074 validation-aucpr:0.96624
[18:02:55] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[27] validation-logloss:0.52187 validation-auc:0.96077 validation-aucpr:0.96626
[18:02:55] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[28] validation-logloss:0.51647 validation-auc:0.96121 validation-aucpr:0.96682
[18:02:55] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[29] validation-logloss:0.51230 validation-auc:0.96126 validation-aucpr:0.96689
[18:02:55] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[30] validation-logloss:0.50817 validation-auc:0.96124 validation-aucpr:0.96683
[18:02:55] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[31] validation-logloss:0.50290 validation-auc:0.96195 validation-aucpr:0.96762
[18:02:55] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[32] validation-logloss:0.49898 validation-auc:0.96196 validation-aucpr:0.96761
[18:02:55] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[33] validation-logloss:0.49482 validation-auc:0.96199 validation-aucpr:0.96763
[18:02:55] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[34] validation-logloss:0.49060 validation-auc:0.96218 validation-aucpr:0.96777
[18:02:55] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[35] validation-logloss:0.48710 validation-auc:0.96207 validation-aucpr:0.96773
[18:02:56] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[36] validation-logloss:0.48225 validation-auc:0.96254 validation-aucpr:0.96822
[18:02:56] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[37] validation-logloss:0.47737 validation-auc:0.96288 validation-aucpr:0.96857
[18:02:56] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[38] validation-logloss:0.47302 validation-auc:0.96314 validation-aucpr:0.96887
[18:02:56] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[39] validation-logloss:0.46920 validation-auc:0.96312 validation-aucpr:0.96886
[18:02:56] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[40] validation-logloss:0.46567 validation-auc:0.96311 validation-aucpr:0.96884
[18:02:56] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[41] validation-logloss:0.46212 validation-auc:0.96317 validation-aucpr:0.96886
[18:02:56] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[42] validation-logloss:0.45886 validation-auc:0.96317 validation-aucpr:0.96885
[18:02:57] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[43] validation-logloss:0.45557 validation-auc:0.96314 validation-aucpr:0.96881
{'best_iteration': '38', 'best_score': '0.9688745689451286'}
Trial 9, Fold 1: Log loss = 0.45556608645567226, Average precision = 0.9688078257615644, ROC-AUC = 0.9631374926513816, Elapsed Time = 3.3502621999996336 seconds
Trial 9, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 9, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.68465 validation-auc:0.94008 validation-aucpr:0.93101
[1] validation-logloss:0.67570 validation-auc:0.95996 validation-aucpr:0.96218
[2] validation-logloss:0.66799 validation-auc:0.96088 validation-aucpr:0.96512
[3] validation-logloss:0.66046 validation-auc:0.96114 validation-aucpr:0.96514
[4] validation-logloss:0.65341 validation-auc:0.96049 validation-aucpr:0.96416
[5] validation-logloss:0.64647 validation-auc:0.95997 validation-aucpr:0.96386
[6] validation-logloss:0.63951 validation-auc:0.95993 validation-aucpr:0.96369
[7] validation-logloss:0.63254 validation-auc:0.95985 validation-aucpr:0.96364
[8] validation-logloss:0.62567 validation-auc:0.95979 validation-aucpr:0.96331
[9] validation-logloss:0.61934 validation-auc:0.95941 validation-aucpr:0.96278
[10] validation-logloss:0.61296 validation-auc:0.95898 validation-aucpr:0.96214
[11] validation-logloss:0.60568 validation-auc:0.96132 validation-aucpr:0.96500
[12] validation-logloss:0.59944 validation-auc:0.96086 validation-aucpr:0.96426
[13] validation-logloss:0.59360 validation-auc:0.96056 validation-aucpr:0.96395
[14] validation-logloss:0.58697 validation-auc:0.96170 validation-aucpr:0.96540
[15] validation-logloss:0.57996 validation-auc:0.96238 validation-aucpr:0.96532
[16] validation-logloss:0.57448 validation-auc:0.96241 validation-aucpr:0.96524
[17] validation-logloss:0.56913 validation-auc:0.96252 validation-aucpr:0.96523
[18] validation-logloss:0.56370 validation-auc:0.96231 validation-aucpr:0.96500
[19] validation-logloss:0.55846 validation-auc:0.96207 validation-aucpr:0.96474
[20] validation-logloss:0.55244 validation-auc:0.96242 validation-aucpr:0.96517
[21] validation-logloss:0.54729 validation-auc:0.96234 validation-aucpr:0.96510
[22] validation-logloss:0.54102 validation-auc:0.96296 validation-aucpr:0.96593
[23] validation-logloss:0.53594 validation-auc:0.96304 validation-aucpr:0.96594
[24] validation-logloss:0.53105 validation-auc:0.96306 validation-aucpr:0.96594
[25] validation-logloss:0.52511 validation-auc:0.96353 validation-aucpr:0.96644
[26] validation-logloss:0.52067 validation-auc:0.96357 validation-aucpr:0.96651
[27] validation-logloss:0.51522 validation-auc:0.96390 validation-aucpr:0.96686
[28] validation-logloss:0.51083 validation-auc:0.96392 validation-aucpr:0.96680
[29] validation-logloss:0.50692 validation-auc:0.96380 validation-aucpr:0.96657
[30] validation-logloss:0.50292 validation-auc:0.96373 validation-aucpr:0.96652
[31] validation-logloss:0.49876 validation-auc:0.96375 validation-aucpr:0.96651
[32] validation-logloss:0.49363 validation-auc:0.96398 validation-aucpr:0.96681
[33] validation-logloss:0.48974 validation-auc:0.96388 validation-aucpr:0.96789
[34] validation-logloss:0.48612 validation-auc:0.96376 validation-aucpr:0.96768
[35] validation-logloss:0.48118 validation-auc:0.96396 validation-aucpr:0.96792
[36] validation-logloss:0.47757 validation-auc:0.96384 validation-aucpr:0.96779
[37] validation-logloss:0.47382 validation-auc:0.96376 validation-aucpr:0.96771
[38] validation-logloss:0.47025 validation-auc:0.96382 validation-aucpr:0.96772
[39] validation-logloss:0.46678 validation-auc:0.96386 validation-aucpr:0.96775
[40] validation-logloss:0.46325 validation-auc:0.96390 validation-aucpr:0.96808
[41] validation-logloss:0.45910 validation-auc:0.96403 validation-aucpr:0.96820
[42] validation-logloss:0.45468 validation-auc:0.96429 validation-aucpr:0.96849
[43] validation-logloss:0.45118 validation-auc:0.96433 validation-aucpr:0.96849
{'best_iteration': '43', 'best_score': '0.968494347322956'}
Trial 9, Fold 2: Log loss = 0.4511816000390403, Average precision = 0.9684326083583713, ROC-AUC = 0.9643344144857222, Elapsed Time = 3.7610129000004235 seconds
Trial 9, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 9, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.68485 validation-auc:0.93746 validation-aucpr:0.93170
[1] validation-logloss:0.67589 validation-auc:0.96026 validation-aucpr:0.96209
[2] validation-logloss:0.66736 validation-auc:0.96233 validation-aucpr:0.96690
[3] validation-logloss:0.65984 validation-auc:0.96268 validation-aucpr:0.96592
[4] validation-logloss:0.65244 validation-auc:0.96265 validation-aucpr:0.96576
[5] validation-logloss:0.64516 validation-auc:0.96318 validation-aucpr:0.96805
[6] validation-logloss:0.63828 validation-auc:0.96350 validation-aucpr:0.96829
[7] validation-logloss:0.63102 validation-auc:0.96374 validation-aucpr:0.96851
[8] validation-logloss:0.62320 validation-auc:0.96444 validation-aucpr:0.96958
[9] validation-logloss:0.61684 validation-auc:0.96404 validation-aucpr:0.96919
[10] validation-logloss:0.61013 validation-auc:0.96431 validation-aucpr:0.96933
[11] validation-logloss:0.60364 validation-auc:0.96439 validation-aucpr:0.96935
[12] validation-logloss:0.59775 validation-auc:0.96442 validation-aucpr:0.96937
[13] validation-logloss:0.59040 validation-auc:0.96509 validation-aucpr:0.97008
[14] validation-logloss:0.58452 validation-auc:0.96505 validation-aucpr:0.96993
[15] validation-logloss:0.57766 validation-auc:0.96530 validation-aucpr:0.97024
[16] validation-logloss:0.57112 validation-auc:0.96545 validation-aucpr:0.97049
[17] validation-logloss:0.56568 validation-auc:0.96535 validation-aucpr:0.97046
[18] validation-logloss:0.55962 validation-auc:0.96550 validation-aucpr:0.97058
[19] validation-logloss:0.55413 validation-auc:0.96548 validation-aucpr:0.97052
[20] validation-logloss:0.54897 validation-auc:0.96533 validation-aucpr:0.97047
[21] validation-logloss:0.54385 validation-auc:0.96543 validation-aucpr:0.97051
[22] validation-logloss:0.53892 validation-auc:0.96548 validation-aucpr:0.97054
[23] validation-logloss:0.53407 validation-auc:0.96543 validation-aucpr:0.97046
[24] validation-logloss:0.52943 validation-auc:0.96538 validation-aucpr:0.97033
[25] validation-logloss:0.52382 validation-auc:0.96540 validation-aucpr:0.97040
[26] validation-logloss:0.51921 validation-auc:0.96546 validation-aucpr:0.97042
[27] validation-logloss:0.51482 validation-auc:0.96553 validation-aucpr:0.97044
[28] validation-logloss:0.51019 validation-auc:0.96557 validation-aucpr:0.97048
[29] validation-logloss:0.50575 validation-auc:0.96553 validation-aucpr:0.97041
[30] validation-logloss:0.50157 validation-auc:0.96546 validation-aucpr:0.97034
[31] validation-logloss:0.49742 validation-auc:0.96547 validation-aucpr:0.97030
[32] validation-logloss:0.49254 validation-auc:0.96563 validation-aucpr:0.97050
[33] validation-logloss:0.48860 validation-auc:0.96569 validation-aucpr:0.97050
[34] validation-logloss:0.48478 validation-auc:0.96560 validation-aucpr:0.97042
[35] validation-logloss:0.48097 validation-auc:0.96560 validation-aucpr:0.97037
[36] validation-logloss:0.47692 validation-auc:0.96568 validation-aucpr:0.97041
[37] validation-logloss:0.47330 validation-auc:0.96561 validation-aucpr:0.97035
[38] validation-logloss:0.46881 validation-auc:0.96583 validation-aucpr:0.97058
[39] validation-logloss:0.46512 validation-auc:0.96582 validation-aucpr:0.97057
[40] validation-logloss:0.46165 validation-auc:0.96572 validation-aucpr:0.97050
[41] validation-logloss:0.45812 validation-auc:0.96583 validation-aucpr:0.97057
[42] validation-logloss:0.45483 validation-auc:0.96578 validation-aucpr:0.97051
[43] validation-logloss:0.45114 validation-auc:0.96586 validation-aucpr:0.97059
{'best_iteration': '43', 'best_score': '0.9705907319091631'}
Trial 9, Fold 3: Log loss = 0.4511384331227501, Average precision = 0.9705902080277841, ROC-AUC = 0.9658637744267127, Elapsed Time = 3.7800318999998126 seconds
Trial 9, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 9, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.68470 validation-auc:0.93733 validation-aucpr:0.92477
[1] validation-logloss:0.67582 validation-auc:0.95888 validation-aucpr:0.96391
[2] validation-logloss:0.66734 validation-auc:0.96074 validation-aucpr:0.96742
[3] validation-logloss:0.66009 validation-auc:0.95988 validation-aucpr:0.96691
[4] validation-logloss:0.65142 validation-auc:0.96155 validation-aucpr:0.96835
[5] validation-logloss:0.64336 validation-auc:0.96232 validation-aucpr:0.96862
[6] validation-logloss:0.63653 validation-auc:0.96233 validation-aucpr:0.96885
[7] validation-logloss:0.62993 validation-auc:0.96176 validation-aucpr:0.96822
[8] validation-logloss:0.62347 validation-auc:0.96147 validation-aucpr:0.96783
[9] validation-logloss:0.61582 validation-auc:0.96180 validation-aucpr:0.96816
[10] validation-logloss:0.60928 validation-auc:0.96169 validation-aucpr:0.96808
[11] validation-logloss:0.60193 validation-auc:0.96192 validation-aucpr:0.96841
[12] validation-logloss:0.59594 validation-auc:0.96197 validation-aucpr:0.96842
[13] validation-logloss:0.58973 validation-auc:0.96194 validation-aucpr:0.96842
[14] validation-logloss:0.58288 validation-auc:0.96236 validation-aucpr:0.96885
[15] validation-logloss:0.57617 validation-auc:0.96261 validation-aucpr:0.96910
[16] validation-logloss:0.57105 validation-auc:0.96250 validation-aucpr:0.96894
[17] validation-logloss:0.56549 validation-auc:0.96264 validation-aucpr:0.96896
[18] validation-logloss:0.56007 validation-auc:0.96270 validation-aucpr:0.96891
[19] validation-logloss:0.55458 validation-auc:0.96291 validation-aucpr:0.96902
[20] validation-logloss:0.54950 validation-auc:0.96280 validation-aucpr:0.96888
[21] validation-logloss:0.54438 validation-auc:0.96281 validation-aucpr:0.96883
[22] validation-logloss:0.53832 validation-auc:0.96317 validation-aucpr:0.96919
[23] validation-logloss:0.53360 validation-auc:0.96311 validation-aucpr:0.96912
[24] validation-logloss:0.52847 validation-auc:0.96306 validation-aucpr:0.96912
[25] validation-logloss:0.52378 validation-auc:0.96298 validation-aucpr:0.96905
[26] validation-logloss:0.51926 validation-auc:0.96282 validation-aucpr:0.96895
[27] validation-logloss:0.51500 validation-auc:0.96270 validation-aucpr:0.96881
[28] validation-logloss:0.51068 validation-auc:0.96275 validation-aucpr:0.96880
[29] validation-logloss:0.50545 validation-auc:0.96291 validation-aucpr:0.96899
[30] validation-logloss:0.50115 validation-auc:0.96288 validation-aucpr:0.96891
[31] validation-logloss:0.49703 validation-auc:0.96291 validation-aucpr:0.96885
[32] validation-logloss:0.49225 validation-auc:0.96305 validation-aucpr:0.96902
[33] validation-logloss:0.48842 validation-auc:0.96293 validation-aucpr:0.96892
[34] validation-logloss:0.48453 validation-auc:0.96287 validation-aucpr:0.96884
[35] validation-logloss:0.48077 validation-auc:0.96280 validation-aucpr:0.96877
[36] validation-logloss:0.47676 validation-auc:0.96285 validation-aucpr:0.96879
[37] validation-logloss:0.47212 validation-auc:0.96302 validation-aucpr:0.96896
[38] validation-logloss:0.46775 validation-auc:0.96327 validation-aucpr:0.96921
[39] validation-logloss:0.46411 validation-auc:0.96327 validation-aucpr:0.96923
[40] validation-logloss:0.45991 validation-auc:0.96338 validation-aucpr:0.96935
[41] validation-logloss:0.45653 validation-auc:0.96335 validation-aucpr:0.96930
[42] validation-logloss:0.45237 validation-auc:0.96353 validation-aucpr:0.96947
[43] validation-logloss:0.44910 validation-auc:0.96354 validation-aucpr:0.96947
{'best_iteration': '43', 'best_score': '0.9694712773591828'}
Trial 9, Fold 4: Log loss = 0.44909951162714856, Average precision = 0.9694630808116731, ROC-AUC = 0.9635393940260997, Elapsed Time = 3.8774806000001263 seconds
Trial 9, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 9, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.68496 validation-auc:0.93507 validation-aucpr:0.91961
[1] validation-logloss:0.67608 validation-auc:0.95637 validation-aucpr:0.95470
[2] validation-logloss:0.66752 validation-auc:0.95770 validation-aucpr:0.96311
[3] validation-logloss:0.66012 validation-auc:0.95890 validation-aucpr:0.96433
[4] validation-logloss:0.65289 validation-auc:0.95894 validation-aucpr:0.96422
[5] validation-logloss:0.64577 validation-auc:0.95921 validation-aucpr:0.96419
[6] validation-logloss:0.63898 validation-auc:0.95908 validation-aucpr:0.96409
[7] validation-logloss:0.63252 validation-auc:0.95892 validation-aucpr:0.96428
[8] validation-logloss:0.62585 validation-auc:0.95907 validation-aucpr:0.96428
[9] validation-logloss:0.61939 validation-auc:0.95925 validation-aucpr:0.96438
[10] validation-logloss:0.61322 validation-auc:0.95923 validation-aucpr:0.96435
[11] validation-logloss:0.60581 validation-auc:0.95997 validation-aucpr:0.96513
[12] validation-logloss:0.59985 validation-auc:0.95981 validation-aucpr:0.96495
[13] validation-logloss:0.59410 validation-auc:0.95966 validation-aucpr:0.96477
[14] validation-logloss:0.58839 validation-auc:0.95953 validation-aucpr:0.96466
[15] validation-logloss:0.58279 validation-auc:0.95919 validation-aucpr:0.96439
[16] validation-logloss:0.57699 validation-auc:0.95944 validation-aucpr:0.96463
[17] validation-logloss:0.57138 validation-auc:0.95951 validation-aucpr:0.96466
[18] validation-logloss:0.56655 validation-auc:0.95930 validation-aucpr:0.96441
[19] validation-logloss:0.56020 validation-auc:0.95991 validation-aucpr:0.96499
[20] validation-logloss:0.55529 validation-auc:0.95978 validation-aucpr:0.96491
[21] validation-logloss:0.54914 validation-auc:0.96006 validation-aucpr:0.96524
[22] validation-logloss:0.54430 validation-auc:0.96008 validation-aucpr:0.96524
[23] validation-logloss:0.53950 validation-auc:0.96031 validation-aucpr:0.96534
[24] validation-logloss:0.53471 validation-auc:0.96036 validation-aucpr:0.96538
[25] validation-logloss:0.52985 validation-auc:0.96067 validation-aucpr:0.96561
[26] validation-logloss:0.52514 validation-auc:0.96069 validation-aucpr:0.96559
[27] validation-logloss:0.52092 validation-auc:0.96063 validation-aucpr:0.96551
[28] validation-logloss:0.51662 validation-auc:0.96059 validation-aucpr:0.96543
[29] validation-logloss:0.51124 validation-auc:0.96092 validation-aucpr:0.96580
[30] validation-logloss:0.50625 validation-auc:0.96117 validation-aucpr:0.96651
[31] validation-logloss:0.50111 validation-auc:0.96133 validation-aucpr:0.96665
[32] validation-logloss:0.49721 validation-auc:0.96123 validation-aucpr:0.96655
[33] validation-logloss:0.49206 validation-auc:0.96159 validation-aucpr:0.96689
[34] validation-logloss:0.48823 validation-auc:0.96150 validation-aucpr:0.96680
[35] validation-logloss:0.48387 validation-auc:0.96153 validation-aucpr:0.96689
[36] validation-logloss:0.47921 validation-auc:0.96168 validation-aucpr:0.96703
[37] validation-logloss:0.47458 validation-auc:0.96188 validation-aucpr:0.96722
[38] validation-logloss:0.47117 validation-auc:0.96183 validation-aucpr:0.96716
[39] validation-logloss:0.46784 validation-auc:0.96178 validation-aucpr:0.96709
[40] validation-logloss:0.46448 validation-auc:0.96165 validation-aucpr:0.96701
[41] validation-logloss:0.46101 validation-auc:0.96172 validation-aucpr:0.96705
[42] validation-logloss:0.45752 validation-auc:0.96177 validation-aucpr:0.96708
[43] validation-logloss:0.45425 validation-auc:0.96174 validation-aucpr:0.96706
{'best_iteration': '37', 'best_score': '0.9672193418676625'}
Trial 9, Fold 5: Log loss = 0.4542515284001841, Average precision = 0.9670490035762956, ROC-AUC = 0.96174490459469, Elapsed Time = 3.735684199999014 seconds
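The per-fold lines above ("Trial t, Fold k: Train size = … where 0 = …, 1 = …, 0/1 = …") report the class balance of each cross-validation split before training. A minimal sketch of how such a line can be produced from a fold's labels; the helper name `fold_balance_line` is illustrative, not a function from the notebook:

```python
from collections import Counter


def fold_balance_line(trial: int, fold: int, split: str, labels) -> str:
    """Format one class-balance log line, mirroring the fold reports above.

    `labels` is an iterable of 0/1 class labels for one split.
    """
    counts = Counter(labels)
    n0, n1 = counts.get(0, 0), counts.get(1, 0)
    return (f"Trial {trial}, Fold {fold}: {split} size = {n0 + n1} "
            f"where 0 = {n0}, 1 = {n1}, 0/1 = {n0 / n1}")


# Counts taken from the Trial 10, Fold 5 validation split logged below.
labels = [0] * 2625 + [1] * 2563
print(fold_balance_line(10, 5, "Validation", labels))
# Reproduces the "Validation size = 5188 ... 0/1 = 1.0241904018728054" line.
```

In the notebook these counts come from each `StratifiedGroupKFold` split, which is why the 0/1 ratio stays close to 1 across folds.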
Optimization Progress: 10%|# | 10/100 [04:26<32:11, 21.47s/it]
Trial 10, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 10, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.67854 validation-auc:0.91581 validation-aucpr:0.90795
[1] validation-logloss:0.66452 validation-auc:0.92989 validation-aucpr:0.92944
[2] validation-logloss:0.64978 validation-auc:0.94898 validation-aucpr:0.95368
[3] validation-logloss:0.63678 validation-auc:0.95179 validation-aucpr:0.95817
[4] validation-logloss:0.62092 validation-auc:0.95804 validation-aucpr:0.96407
[5] validation-logloss:0.60858 validation-auc:0.95759 validation-aucpr:0.96377
[6] validation-logloss:0.59778 validation-auc:0.95701 validation-aucpr:0.96330
[7] validation-logloss:0.58770 validation-auc:0.95669 validation-aucpr:0.96292
[8] validation-logloss:0.57914 validation-auc:0.95677 validation-aucpr:0.96342
[9] validation-logloss:0.57009 validation-auc:0.95599 validation-aucpr:0.96237
[10] validation-logloss:0.56079 validation-auc:0.95652 validation-aucpr:0.96275
[11] validation-logloss:0.55168 validation-auc:0.95649 validation-aucpr:0.96271
[12] validation-logloss:0.54383 validation-auc:0.95613 validation-aucpr:0.96226
[13] validation-logloss:0.54043 validation-auc:0.95609 validation-aucpr:0.96140
[14] validation-logloss:0.53295 validation-auc:0.95645 validation-aucpr:0.96288
[15] validation-logloss:0.52513 validation-auc:0.95649 validation-aucpr:0.96301
[16] validation-logloss:0.51814 validation-auc:0.95663 validation-aucpr:0.96321
[17] validation-logloss:0.51001 validation-auc:0.95686 validation-aucpr:0.96336
[18] validation-logloss:0.50350 validation-auc:0.95659 validation-aucpr:0.96309
[19] validation-logloss:0.49708 validation-auc:0.95698 validation-aucpr:0.96351
[20] validation-logloss:0.49048 validation-auc:0.95701 validation-aucpr:0.96349
[21] validation-logloss:0.48409 validation-auc:0.95708 validation-aucpr:0.96362
[22] validation-logloss:0.47753 validation-auc:0.95737 validation-aucpr:0.96386
[23] validation-logloss:0.47173 validation-auc:0.95735 validation-aucpr:0.96386
[24] validation-logloss:0.46604 validation-auc:0.95718 validation-aucpr:0.96367
[25] validation-logloss:0.46019 validation-auc:0.95728 validation-aucpr:0.96374
[26] validation-logloss:0.45554 validation-auc:0.95704 validation-aucpr:0.96347
{'best_iteration': '4', 'best_score': '0.9640740072621098'}
Trial 10, Fold 1: Log loss = 0.45554417321152957, Average precision = 0.9634563201492339, ROC-AUC = 0.9570427695568843, Elapsed Time = 1.0330285999989428 seconds
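The `{'best_iteration': …, 'best_score': …}` dict printed after each fold records the boosting round with the best validation aucpr: in the fold above, training ran to round 26 under the early-stopping window, but the best round stayed at 4, where aucpr peaked at 0.96407. A toy stand-in for that bookkeeping (the helper `best_by_aucpr` is hypothetical; the notebook reads these values from XGBoost's own early-stopping attributes):

```python
def best_by_aucpr(aucpr_per_iter):
    """Return the round with the highest validation aucpr, formatted like
    the {'best_iteration', 'best_score'} dicts printed after each fold.
    (Illustrative stand-in for XGBoost's early-stopping bookkeeping.)
    """
    best_iter = max(range(len(aucpr_per_iter)), key=aucpr_per_iter.__getitem__)
    return {"best_iteration": str(best_iter),
            "best_score": str(aucpr_per_iter[best_iter])}


# Toy series: aucpr peaks at round 1; later rounds do not move the best.
print(best_by_aucpr([0.90795, 0.96407, 0.96377, 0.96330]))
# → {'best_iteration': '1', 'best_score': '0.96407'}
```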
Trial 10, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 10, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.67846 validation-auc:0.91873 validation-aucpr:0.90311
[1] validation-logloss:0.66542 validation-auc:0.92942 validation-aucpr:0.92406
[2] validation-logloss:0.65286 validation-auc:0.93128 validation-aucpr:0.92687
[3] validation-logloss:0.63966 validation-auc:0.94142 validation-aucpr:0.94299
[4] validation-logloss:0.62541 validation-auc:0.95174 validation-aucpr:0.95526
[5] validation-logloss:0.61327 validation-auc:0.95274 validation-aucpr:0.95579
[6] validation-logloss:0.60389 validation-auc:0.95350 validation-aucpr:0.95655
[7] validation-logloss:0.59448 validation-auc:0.95282 validation-aucpr:0.95564
[8] validation-logloss:0.58427 validation-auc:0.95225 validation-aucpr:0.95570
[9] validation-logloss:0.57413 validation-auc:0.95236 validation-aucpr:0.95580
[10] validation-logloss:0.56522 validation-auc:0.95168 validation-aucpr:0.95499
[11] validation-logloss:0.55616 validation-auc:0.95163 validation-aucpr:0.95484
[12] validation-logloss:0.54743 validation-auc:0.95179 validation-aucpr:0.95475
[13] validation-logloss:0.53862 validation-auc:0.95147 validation-aucpr:0.95429
[14] validation-logloss:0.52693 validation-auc:0.95532 validation-aucpr:0.95882
[15] validation-logloss:0.51872 validation-auc:0.95564 validation-aucpr:0.95889
[16] validation-logloss:0.51264 validation-auc:0.95537 validation-aucpr:0.95863
[17] validation-logloss:0.50538 validation-auc:0.95548 validation-aucpr:0.95895
[18] validation-logloss:0.49840 validation-auc:0.95544 validation-aucpr:0.95900
[19] validation-logloss:0.49135 validation-auc:0.95560 validation-aucpr:0.95919
[20] validation-logloss:0.48570 validation-auc:0.95557 validation-aucpr:0.95908
[21] validation-logloss:0.47965 validation-auc:0.95575 validation-aucpr:0.95911
[22] validation-logloss:0.47324 validation-auc:0.95585 validation-aucpr:0.95914
[23] validation-logloss:0.46620 validation-auc:0.95592 validation-aucpr:0.95905
[24] validation-logloss:0.45986 validation-auc:0.95604 validation-aucpr:0.95914
[25] validation-logloss:0.45508 validation-auc:0.95576 validation-aucpr:0.95883
[26] validation-logloss:0.45213 validation-auc:0.95586 validation-aucpr:0.95888
{'best_iteration': '19', 'best_score': '0.9591881059143724'}
Trial 10, Fold 2: Log loss = 0.4521302113765439, Average precision = 0.9588842977089528, ROC-AUC = 0.9558649912092507, Elapsed Time = 1.3426713000008021 seconds
Trial 10, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 10, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.67796 validation-auc:0.92452 validation-aucpr:0.91870
[1] validation-logloss:0.66517 validation-auc:0.92814 validation-aucpr:0.92500
[2] validation-logloss:0.65165 validation-auc:0.93558 validation-aucpr:0.93512
[3] validation-logloss:0.63856 validation-auc:0.93893 validation-aucpr:0.93850
[4] validation-logloss:0.62334 validation-auc:0.95315 validation-aucpr:0.95637
[5] validation-logloss:0.61199 validation-auc:0.95358 validation-aucpr:0.95676
[6] validation-logloss:0.60083 validation-auc:0.95417 validation-aucpr:0.95926
[7] validation-logloss:0.59101 validation-auc:0.95335 validation-aucpr:0.95878
[8] validation-logloss:0.58087 validation-auc:0.95383 validation-aucpr:0.95922
[9] validation-logloss:0.57160 validation-auc:0.95411 validation-aucpr:0.95874
[10] validation-logloss:0.56290 validation-auc:0.95434 validation-aucpr:0.95876
[11] validation-logloss:0.55485 validation-auc:0.95375 validation-aucpr:0.95838
[12] validation-logloss:0.54498 validation-auc:0.95525 validation-aucpr:0.96000
[13] validation-logloss:0.53740 validation-auc:0.95480 validation-aucpr:0.95952
[14] validation-logloss:0.52926 validation-auc:0.95498 validation-aucpr:0.95972
[15] validation-logloss:0.52178 validation-auc:0.95594 validation-aucpr:0.96058
[16] validation-logloss:0.51499 validation-auc:0.95565 validation-aucpr:0.96027
[17] validation-logloss:0.50855 validation-auc:0.95539 validation-aucpr:0.96004
[18] validation-logloss:0.50149 validation-auc:0.95589 validation-aucpr:0.96079
[19] validation-logloss:0.49457 validation-auc:0.95565 validation-aucpr:0.96060
[20] validation-logloss:0.48845 validation-auc:0.95571 validation-aucpr:0.96059
[21] validation-logloss:0.48232 validation-auc:0.95581 validation-aucpr:0.96062
[22] validation-logloss:0.47589 validation-auc:0.95606 validation-aucpr:0.96076
[23] validation-logloss:0.46998 validation-auc:0.95658 validation-aucpr:0.96112
[24] validation-logloss:0.46504 validation-auc:0.95606 validation-aucpr:0.96074
[25] validation-logloss:0.45942 validation-auc:0.95658 validation-aucpr:0.96120
[26] validation-logloss:0.45406 validation-auc:0.95634 validation-aucpr:0.96099
{'best_iteration': '25', 'best_score': '0.9612033851979362'}
Trial 10, Fold 3: Log loss = 0.45405906982495015, Average precision = 0.96099287969551, ROC-AUC = 0.9563417657058105, Elapsed Time = 1.3540023999994446 seconds
Trial 10, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 10, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.67799 validation-auc:0.92305 validation-aucpr:0.91630
[1] validation-logloss:0.66305 validation-auc:0.94321 validation-aucpr:0.94644
[2] validation-logloss:0.64946 validation-auc:0.95125 validation-aucpr:0.95569
[3] validation-logloss:0.64038 validation-auc:0.94774 validation-aucpr:0.95305
[4] validation-logloss:0.62896 validation-auc:0.94848 validation-aucpr:0.95454
[5] validation-logloss:0.61711 validation-auc:0.94967 validation-aucpr:0.95597
[6] validation-logloss:0.60580 validation-auc:0.95091 validation-aucpr:0.95730
[7] validation-logloss:0.59591 validation-auc:0.95098 validation-aucpr:0.95725
[8] validation-logloss:0.58589 validation-auc:0.95133 validation-aucpr:0.95743
[9] validation-logloss:0.57581 validation-auc:0.95200 validation-aucpr:0.95819
[10] validation-logloss:0.56749 validation-auc:0.95131 validation-aucpr:0.95759
[11] validation-logloss:0.55838 validation-auc:0.95126 validation-aucpr:0.95739
[12] validation-logloss:0.55005 validation-auc:0.95090 validation-aucpr:0.95701
[13] validation-logloss:0.54130 validation-auc:0.95096 validation-aucpr:0.95711
[14] validation-logloss:0.53400 validation-auc:0.95069 validation-aucpr:0.95697
[15] validation-logloss:0.52682 validation-auc:0.95051 validation-aucpr:0.95688
[16] validation-logloss:0.52040 validation-auc:0.95115 validation-aucpr:0.95756
[17] validation-logloss:0.51365 validation-auc:0.95147 validation-aucpr:0.95781
[18] validation-logloss:0.50591 validation-auc:0.95208 validation-aucpr:0.95832
[19] validation-logloss:0.50168 validation-auc:0.95202 validation-aucpr:0.95812
[20] validation-logloss:0.49618 validation-auc:0.95213 validation-aucpr:0.95807
[21] validation-logloss:0.48888 validation-auc:0.95337 validation-aucpr:0.95958
[22] validation-logloss:0.48316 validation-auc:0.95353 validation-aucpr:0.95981
[23] validation-logloss:0.47711 validation-auc:0.95381 validation-aucpr:0.95994
[24] validation-logloss:0.47135 validation-auc:0.95404 validation-aucpr:0.96020
[25] validation-logloss:0.46577 validation-auc:0.95380 validation-aucpr:0.95987
[26] validation-logloss:0.46036 validation-auc:0.95368 validation-aucpr:0.95968
{'best_iteration': '24', 'best_score': '0.9601950676419457'}
Trial 10, Fold 4: Log loss = 0.46035901083769054, Average precision = 0.9596825375211024, ROC-AUC = 0.9536800086315633, Elapsed Time = 1.4137474999988626 seconds
Trial 10, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 10, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.67936 validation-auc:0.89905 validation-aucpr:0.87397
[1] validation-logloss:0.66630 validation-auc:0.91514 validation-aucpr:0.90211
[2] validation-logloss:0.65378 validation-auc:0.92234 validation-aucpr:0.91901
[3] validation-logloss:0.64099 validation-auc:0.92810 validation-aucpr:0.92915
[4] validation-logloss:0.63089 validation-auc:0.93158 validation-aucpr:0.93389
[5] validation-logloss:0.62011 validation-auc:0.93258 validation-aucpr:0.93401
[6] validation-logloss:0.60970 validation-auc:0.93295 validation-aucpr:0.93483
[7] validation-logloss:0.59977 validation-auc:0.93444 validation-aucpr:0.93635
[8] validation-logloss:0.59187 validation-auc:0.93689 validation-aucpr:0.93846
[9] validation-logloss:0.58184 validation-auc:0.93884 validation-aucpr:0.94017
[10] validation-logloss:0.57209 validation-auc:0.93993 validation-aucpr:0.94129
[11] validation-logloss:0.56341 validation-auc:0.94012 validation-aucpr:0.94165
[12] validation-logloss:0.55469 validation-auc:0.94069 validation-aucpr:0.94226
[13] validation-logloss:0.54692 validation-auc:0.94100 validation-aucpr:0.94318
[14] validation-logloss:0.53946 validation-auc:0.94182 validation-aucpr:0.94460
[15] validation-logloss:0.52867 validation-auc:0.94978 validation-aucpr:0.95396
[16] validation-logloss:0.51929 validation-auc:0.95258 validation-aucpr:0.95744
[17] validation-logloss:0.51175 validation-auc:0.95309 validation-aucpr:0.95803
[18] validation-logloss:0.50566 validation-auc:0.95257 validation-aucpr:0.95752
[19] validation-logloss:0.50021 validation-auc:0.95318 validation-aucpr:0.95823
[20] validation-logloss:0.49218 validation-auc:0.95354 validation-aucpr:0.95875
[21] validation-logloss:0.48564 validation-auc:0.95378 validation-aucpr:0.95901
[22] validation-logloss:0.47926 validation-auc:0.95465 validation-aucpr:0.95992
[23] validation-logloss:0.47286 validation-auc:0.95483 validation-aucpr:0.96016
[24] validation-logloss:0.46804 validation-auc:0.95459 validation-aucpr:0.95992
[25] validation-logloss:0.46235 validation-auc:0.95459 validation-aucpr:0.95995
[26] validation-logloss:0.45748 validation-auc:0.95450 validation-aucpr:0.95988
{'best_iteration': '23', 'best_score': '0.9601604895372757'}
Trial 10, Fold 5: Log loss = 0.4574784116641232, Average precision = 0.9598834535675733, ROC-AUC = 0.9544994519071773, Elapsed Time = 1.380747799999881 seconds
Optimization Progress: 11%|#1 | 11/100 [04:40<28:38, 19.30s/it]
Trial 11, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 11, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[18:03:37] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[0] validation-logloss:0.68589 validation-auc:0.95478 validation-aucpr:0.95734
[1] validation-logloss:0.67891 validation-auc:0.96195 validation-aucpr:0.96755
[2] validation-logloss:0.67204 validation-auc:0.96271 validation-aucpr:0.96876
[3] validation-logloss:0.66515 validation-auc:0.96429 validation-aucpr:0.97006
[4] validation-logloss:0.65841 validation-auc:0.96486 validation-aucpr:0.97045
[5] validation-logloss:0.65255 validation-auc:0.96552 validation-aucpr:0.97108
[6] validation-logloss:0.64623 validation-auc:0.96555 validation-aucpr:0.97143
[7] validation-logloss:0.63986 validation-auc:0.96620 validation-aucpr:0.97186
[8] validation-logloss:0.63365 validation-auc:0.96602 validation-aucpr:0.97177
[9] validation-logloss:0.62839 validation-auc:0.96572 validation-aucpr:0.97153
[10] validation-logloss:0.62228 validation-auc:0.96636 validation-aucpr:0.97200
[11] validation-logloss:0.61638 validation-auc:0.96624 validation-aucpr:0.97198
[12] validation-logloss:0.61120 validation-auc:0.96622 validation-aucpr:0.97190
[13] validation-logloss:0.60609 validation-auc:0.96657 validation-aucpr:0.97210
[14] validation-logloss:0.60036 validation-auc:0.96679 validation-aucpr:0.97231
[15] validation-logloss:0.59477 validation-auc:0.96682 validation-aucpr:0.97233
[16] validation-logloss:0.58931 validation-auc:0.96673 validation-aucpr:0.97262
[17] validation-logloss:0.58387 validation-auc:0.96676 validation-aucpr:0.97268
[18] validation-logloss:0.57862 validation-auc:0.96697 validation-aucpr:0.97287
[19] validation-logloss:0.57395 validation-auc:0.96687 validation-aucpr:0.97279
[20] validation-logloss:0.56873 validation-auc:0.96706 validation-aucpr:0.97295
[21] validation-logloss:0.56362 validation-auc:0.96719 validation-aucpr:0.97304
[22] validation-logloss:0.55870 validation-auc:0.96739 validation-aucpr:0.97318
[23] validation-logloss:0.55431 validation-auc:0.96745 validation-aucpr:0.97317
[24] validation-logloss:0.55003 validation-auc:0.96727 validation-aucpr:0.97303
[25] validation-logloss:0.54538 validation-auc:0.96734 validation-aucpr:0.97309
[26] validation-logloss:0.54060 validation-auc:0.96750 validation-aucpr:0.97322
[27] validation-logloss:0.53602 validation-auc:0.96753 validation-aucpr:0.97333
[28] validation-logloss:0.53154 validation-auc:0.96757 validation-aucpr:0.97338
[29] validation-logloss:0.52722 validation-auc:0.96760 validation-aucpr:0.97338
[30] validation-logloss:0.52286 validation-auc:0.96775 validation-aucpr:0.97351
[31] validation-logloss:0.51868 validation-auc:0.96766 validation-aucpr:0.97343
[32] validation-logloss:0.51432 validation-auc:0.96785 validation-aucpr:0.97357
[33] validation-logloss:0.51021 validation-auc:0.96781 validation-aucpr:0.97353
[34] validation-logloss:0.50624 validation-auc:0.96780 validation-aucpr:0.97355
[35] validation-logloss:0.50223 validation-auc:0.96782 validation-aucpr:0.97358
[36] validation-logloss:0.49878 validation-auc:0.96778 validation-aucpr:0.97353
[37] validation-logloss:0.49536 validation-auc:0.96780 validation-aucpr:0.97346
[38] validation-logloss:0.49167 validation-auc:0.96770 validation-aucpr:0.97345
[39] validation-logloss:0.48823 validation-auc:0.96780 validation-aucpr:0.97353
[40] validation-logloss:0.48452 validation-auc:0.96785 validation-aucpr:0.97357
[41] validation-logloss:0.48094 validation-auc:0.96788 validation-aucpr:0.97363
[42] validation-logloss:0.47732 validation-auc:0.96793 validation-aucpr:0.97368
[43] validation-logloss:0.47374 validation-auc:0.96801 validation-aucpr:0.97372
[44] validation-logloss:0.47023 validation-auc:0.96806 validation-aucpr:0.97375
[45] validation-logloss:0.46676 validation-auc:0.96804 validation-aucpr:0.97375
[46] validation-logloss:0.46385 validation-auc:0.96791 validation-aucpr:0.97365
[47] validation-logloss:0.46048 validation-auc:0.96796 validation-aucpr:0.97370
[48] validation-logloss:0.45756 validation-auc:0.96801 validation-aucpr:0.97371
[49] validation-logloss:0.45427 validation-auc:0.96808 validation-aucpr:0.97375
[50] validation-logloss:0.45105 validation-auc:0.96803 validation-aucpr:0.97372
[51] validation-logloss:0.44786 validation-auc:0.96808 validation-aucpr:0.97374
[52] validation-logloss:0.44473 validation-auc:0.96808 validation-aucpr:0.97376
[53] validation-logloss:0.44177 validation-auc:0.96808 validation-aucpr:0.97377
[54] validation-logloss:0.43878 validation-auc:0.96804 validation-aucpr:0.97374
[55] validation-logloss:0.43619 validation-auc:0.96797 validation-aucpr:0.97368
[56] validation-logloss:0.43320 validation-auc:0.96793 validation-aucpr:0.97365
[57] validation-logloss:0.43062 validation-auc:0.96795 validation-aucpr:0.97366
[58] validation-logloss:0.42764 validation-auc:0.96805 validation-aucpr:0.97373
[59] validation-logloss:0.42489 validation-auc:0.96799 validation-aucpr:0.97369
[60] validation-logloss:0.42205 validation-auc:0.96802 validation-aucpr:0.97370
[61] validation-logloss:0.41960 validation-auc:0.96799 validation-aucpr:0.97366
[62] validation-logloss:0.41683 validation-auc:0.96801 validation-aucpr:0.97367
[63] validation-logloss:0.41416 validation-auc:0.96809 validation-aucpr:0.97371
[64] validation-logloss:0.41149 validation-auc:0.96818 validation-aucpr:0.97376
[65] validation-logloss:0.40925 validation-auc:0.96812 validation-aucpr:0.97371
[66] validation-logloss:0.40670 validation-auc:0.96812 validation-aucpr:0.97371
[67] validation-logloss:0.40411 validation-auc:0.96812 validation-aucpr:0.97371
[68] validation-logloss:0.40163 validation-auc:0.96822 validation-aucpr:0.97379
[69] validation-logloss:0.39914 validation-auc:0.96816 validation-aucpr:0.97374
[70] validation-logloss:0.39700 validation-auc:0.96808 validation-aucpr:0.97368
[71] validation-logloss:0.39455 validation-auc:0.96812 validation-aucpr:0.97370
[72] validation-logloss:0.39219 validation-auc:0.96811 validation-aucpr:0.97369
[73] validation-logloss:0.38980 validation-auc:0.96818 validation-aucpr:0.97375
[74] validation-logloss:0.38783 validation-auc:0.96823 validation-aucpr:0.97376
[75] validation-logloss:0.38563 validation-auc:0.96821 validation-aucpr:0.97375
[76] validation-logloss:0.38363 validation-auc:0.96823 validation-aucpr:0.97376
[77] validation-logloss:0.38133 validation-auc:0.96831 validation-aucpr:0.97385
[78] validation-logloss:0.37900 validation-auc:0.96842 validation-aucpr:0.97394
[79] validation-logloss:0.37683 validation-auc:0.96847 validation-aucpr:0.97398
[80] validation-logloss:0.37474 validation-auc:0.96841 validation-aucpr:0.97394
[81] validation-logloss:0.37263 validation-auc:0.96851 validation-aucpr:0.97398
[82] validation-logloss:0.37055 validation-auc:0.96847 validation-aucpr:0.97395
[83] validation-logloss:0.36848 validation-auc:0.96849 validation-aucpr:0.97397
[84] validation-logloss:0.36677 validation-auc:0.96848 validation-aucpr:0.97398
[85] validation-logloss:0.36478 validation-auc:0.96849 validation-aucpr:0.97399
[86] validation-logloss:0.36281 validation-auc:0.96850 validation-aucpr:0.97399
[87] validation-logloss:0.36084 validation-auc:0.96849 validation-aucpr:0.97398
[88] validation-logloss:0.35887 validation-auc:0.96856 validation-aucpr:0.97403
{'best_iteration': '88', 'best_score': '0.9740344937804971'}
Trial 11, Fold 1: Log loss = 0.3588683607750186, Average precision = 0.9740402771464329, ROC-AUC = 0.968557994938415, Elapsed Time = 261.71760129999893 seconds
Trial 11, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 11, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.68605 validation-auc:0.95120 validation-aucpr:0.95669
[1] validation-logloss:0.67888 validation-auc:0.95967 validation-aucpr:0.96385
[2] validation-logloss:0.67196 validation-auc:0.96205 validation-aucpr:0.96544
[3] validation-logloss:0.66513 validation-auc:0.96373 validation-aucpr:0.96652
[4] validation-logloss:0.65840 validation-auc:0.96515 validation-aucpr:0.96802
[5] validation-logloss:0.65198 validation-auc:0.96542 validation-aucpr:0.96811
[6] validation-logloss:0.64553 validation-auc:0.96568 validation-aucpr:0.96830
[7] validation-logloss:0.63912 validation-auc:0.96595 validation-aucpr:0.96854
[8] validation-logloss:0.63350 validation-auc:0.96573 validation-aucpr:0.96839
[9] validation-logloss:0.62746 validation-auc:0.96573 validation-aucpr:0.96841
[10] validation-logloss:0.62149 validation-auc:0.96630 validation-aucpr:0.96980
[11] validation-logloss:0.61558 validation-auc:0.96682 validation-aucpr:0.97041
[12] validation-logloss:0.60987 validation-auc:0.96690 validation-aucpr:0.97063
[13] validation-logloss:0.60397 validation-auc:0.96736 validation-aucpr:0.97100
[14] validation-logloss:0.59839 validation-auc:0.96785 validation-aucpr:0.97142
[15] validation-logloss:0.59283 validation-auc:0.96819 validation-aucpr:0.97176
[16] validation-logloss:0.58795 validation-auc:0.96816 validation-aucpr:0.97165
[18:08:49] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[17] validation-logloss:0.58260 validation-auc:0.96815 validation-aucpr:0.97161
[18:08:52] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[18] validation-logloss:0.57792 validation-auc:0.96839 validation-aucpr:0.97182
[18:08:56] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[19] validation-logloss:0.57335 validation-auc:0.96844 validation-aucpr:0.97209
[18:08:59] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[20] validation-logloss:0.56832 validation-auc:0.96839 validation-aucpr:0.97209
[18:09:02] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[21] validation-logloss:0.56393 validation-auc:0.96827 validation-aucpr:0.97198
[18:09:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[22] validation-logloss:0.55896 validation-auc:0.96843 validation-aucpr:0.97209
[18:09:08] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[23] validation-logloss:0.55416 validation-auc:0.96851 validation-aucpr:0.97220
[18:09:10] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[24] validation-logloss:0.54926 validation-auc:0.96872 validation-aucpr:0.97236
[18:09:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[25] validation-logloss:0.54457 validation-auc:0.96877 validation-aucpr:0.97237
[18:09:15] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[26] validation-logloss:0.53980 validation-auc:0.96895 validation-aucpr:0.97251
[18:09:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[27] validation-logloss:0.53522 validation-auc:0.96895 validation-aucpr:0.97249
[18:09:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[28] validation-logloss:0.53128 validation-auc:0.96899 validation-aucpr:0.97253
[18:09:25] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[29] validation-logloss:0.52734 validation-auc:0.96907 validation-aucpr:0.97260
[18:09:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[30] validation-logloss:0.52304 validation-auc:0.96913 validation-aucpr:0.97267
[18:09:32] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[31] validation-logloss:0.51917 validation-auc:0.96914 validation-aucpr:0.97266
[18:09:36] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[32] validation-logloss:0.51536 validation-auc:0.96912 validation-aucpr:0.97260
[18:09:40] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[33] validation-logloss:0.51167 validation-auc:0.96909 validation-aucpr:0.97255
[18:09:43] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[34] validation-logloss:0.50755 validation-auc:0.96922 validation-aucpr:0.97269
[18:09:46] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[35] validation-logloss:0.50343 validation-auc:0.96927 validation-aucpr:0.97274
[18:09:48] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[36] validation-logloss:0.49936 validation-auc:0.96927 validation-aucpr:0.97274
[18:09:52] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[37] validation-logloss:0.49548 validation-auc:0.96940 validation-aucpr:0.97289
[18:09:55] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[38] validation-logloss:0.49206 validation-auc:0.96928 validation-aucpr:0.97279
[18:09:58] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[39] validation-logloss:0.48823 validation-auc:0.96940 validation-aucpr:0.97287
[18:10:01] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[40] validation-logloss:0.48445 validation-auc:0.96935 validation-aucpr:0.97284
[18:10:03] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[41] validation-logloss:0.48077 validation-auc:0.96935 validation-aucpr:0.97282
[18:10:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[42] validation-logloss:0.47710 validation-auc:0.96940 validation-aucpr:0.97286
[18:10:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[43] validation-logloss:0.47347 validation-auc:0.96933 validation-aucpr:0.97281
[18:10:10] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[44] validation-logloss:0.47000 validation-auc:0.96948 validation-aucpr:0.97292
[18:10:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[45] validation-logloss:0.46645 validation-auc:0.96960 validation-aucpr:0.97301
[18:10:17] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[46] validation-logloss:0.46303 validation-auc:0.96970 validation-aucpr:0.97308
[18:10:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[47] validation-logloss:0.45960 validation-auc:0.96966 validation-aucpr:0.97305
[18:10:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[48] validation-logloss:0.45628 validation-auc:0.96968 validation-aucpr:0.97304
[18:10:26] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[49] validation-logloss:0.45312 validation-auc:0.96964 validation-aucpr:0.97301
[18:10:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[50] validation-logloss:0.44995 validation-auc:0.96958 validation-aucpr:0.97298
[18:10:32] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[51] validation-logloss:0.44679 validation-auc:0.96962 validation-aucpr:0.97304
[18:10:34] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[52] validation-logloss:0.44367 validation-auc:0.96973 validation-aucpr:0.97311
[18:10:37] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[53] validation-logloss:0.44063 validation-auc:0.96960 validation-aucpr:0.97301
[18:10:40] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[54] validation-logloss:0.43754 validation-auc:0.96967 validation-aucpr:0.97307
[18:10:42] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[55] validation-logloss:0.43452 validation-auc:0.96971 validation-aucpr:0.97311
[18:10:45] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[56] validation-logloss:0.43162 validation-auc:0.96971 validation-aucpr:0.97310
[18:10:47] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[57] validation-logloss:0.42860 validation-auc:0.96975 validation-aucpr:0.97314
[18:10:50] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[58] validation-logloss:0.42565 validation-auc:0.96980 validation-aucpr:0.97318
[18:10:52] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[59] validation-logloss:0.42284 validation-auc:0.96977 validation-aucpr:0.97317
[18:10:55] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[60] validation-logloss:0.42045 validation-auc:0.96976 validation-aucpr:0.97313
[18:10:58] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[61] validation-logloss:0.41758 validation-auc:0.96981 validation-aucpr:0.97317
[18:11:01] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[62] validation-logloss:0.41484 validation-auc:0.96981 validation-aucpr:0.97318
[18:11:04] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[63] validation-logloss:0.41213 validation-auc:0.96985 validation-aucpr:0.97320
[18:11:08] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[64] validation-logloss:0.40939 validation-auc:0.96990 validation-aucpr:0.97326
[18:11:11] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[65] validation-logloss:0.40675 validation-auc:0.96998 validation-aucpr:0.97332
[18:11:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[66] validation-logloss:0.40416 validation-auc:0.97004 validation-aucpr:0.97337
[18:11:16] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[67] validation-logloss:0.40165 validation-auc:0.97004 validation-aucpr:0.97338
[18:11:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[68] validation-logloss:0.39909 validation-auc:0.97003 validation-aucpr:0.97336
[18:11:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[69] validation-logloss:0.39663 validation-auc:0.97011 validation-aucpr:0.97343
[18:11:25] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[70] validation-logloss:0.39414 validation-auc:0.97016 validation-aucpr:0.97346
[18:11:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[71] validation-logloss:0.39202 validation-auc:0.97019 validation-aucpr:0.97346
[18:11:32] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[72] validation-logloss:0.38962 validation-auc:0.97026 validation-aucpr:0.97351
[18:11:34] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[73] validation-logloss:0.38721 validation-auc:0.97030 validation-aucpr:0.97353
[18:11:37] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[74] validation-logloss:0.38483 validation-auc:0.97031 validation-aucpr:0.97352
[18:11:40] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[75] validation-logloss:0.38248 validation-auc:0.97033 validation-aucpr:0.97353
[18:11:42] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[76] validation-logloss:0.38017 validation-auc:0.97036 validation-aucpr:0.97355
[18:11:45] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[77] validation-logloss:0.37805 validation-auc:0.97027 validation-aucpr:0.97350
[18:11:48] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[78] validation-logloss:0.37585 validation-auc:0.97032 validation-aucpr:0.97353
[18:11:52] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[79] validation-logloss:0.37397 validation-auc:0.97035 validation-aucpr:0.97356
[18:11:54] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[80] validation-logloss:0.37186 validation-auc:0.97032 validation-aucpr:0.97354
[18:11:58] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[81] validation-logloss:0.36963 validation-auc:0.97033 validation-aucpr:0.97356
[18:12:01] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[82] validation-logloss:0.36759 validation-auc:0.97034 validation-aucpr:0.97357
[18:12:04] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[83] validation-logloss:0.36561 validation-auc:0.97030 validation-aucpr:0.97360
[18:12:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[84] validation-logloss:0.36353 validation-auc:0.97035 validation-aucpr:0.97364
[18:12:10] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[85] validation-logloss:0.36150 validation-auc:0.97038 validation-aucpr:0.97367
[18:12:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[86] validation-logloss:0.35960 validation-auc:0.97042 validation-aucpr:0.97402
[18:12:15] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[87] validation-logloss:0.35755 validation-auc:0.97044 validation-aucpr:0.97402
[18:12:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[88] validation-logloss:0.35560 validation-auc:0.97039 validation-aucpr:0.97400
{'best_iteration': '87', 'best_score': '0.9740248478874732'}
Trial 11, Fold 2: Log loss = 0.3556043571620501, Average precision = 0.9739869661255756, ROC-AUC = 0.970393172442238, Elapsed Time = 260.57359840000026 seconds
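The `{'best_iteration': '87', 'best_score': '0.9740248478874732'}` summary above picks the boosting round with the highest validation AUCPR. Since the notebook cell that produces it is not shown here, the sketch below is a hypothetical stdlib-only illustration of that selection rule, parsing log lines of the exact shape printed above (the function name `best_iteration_by_aucpr` and the tie-breaking choice are assumptions, not the notebook's code):

```python
import re

# Matches per-iteration lines such as:
#   "[87] validation-logloss:0.35755 validation-auc:0.97044 validation-aucpr:0.97402"
LINE_RE = re.compile(
    r"\[(\d+)\]\s+validation-logloss:([\d.]+)\s+"
    r"validation-auc:([\d.]+)\s+validation-aucpr:([\d.]+)"
)

def best_iteration_by_aucpr(log_lines):
    """Return (iteration, aucpr) for the round with the highest validation AUCPR."""
    best_iter, best_score = None, -1.0
    for line in log_lines:
        m = LINE_RE.search(line)
        if not m:
            continue  # skip INFO lines and other log noise
        it, aucpr = int(m.group(1)), float(m.group(4))
        if aucpr >= best_score:  # ties resolve to the later iteration
            best_iter, best_score = it, aucpr
    return best_iter, best_score

sample = [
    "[86] validation-logloss:0.35960 validation-auc:0.97042 validation-aucpr:0.97402",
    "[87] validation-logloss:0.35755 validation-auc:0.97044 validation-aucpr:0.97402",
    "[88] validation-logloss:0.35560 validation-auc:0.97039 validation-aucpr:0.97400",
]
print(best_iteration_by_aucpr(sample))  # (87, 0.97402)
```

On the tail of the Fold 2 log this reproduces best_iteration 87, consistent with the summary; in practice XGBoost tracks the best round itself when `early_stopping_rounds` is set, so a parser like this is only needed when working from captured logs.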
Trial 11, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 11, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.68585 validation-auc:0.95527 validation-aucpr:0.95682
[1] validation-logloss:0.67966 validation-auc:0.95976 validation-aucpr:0.96432
[2] validation-logloss:0.67349 validation-auc:0.96038 validation-aucpr:0.96523
[3] validation-logloss:0.66654 validation-auc:0.96394 validation-aucpr:0.96873
[4] validation-logloss:0.65969 validation-auc:0.96518 validation-aucpr:0.96994
[5] validation-logloss:0.65297 validation-auc:0.96646 validation-aucpr:0.97094
[6] validation-logloss:0.64634 validation-auc:0.96700 validation-aucpr:0.97141
[7] validation-logloss:0.64009 validation-auc:0.96719 validation-aucpr:0.97153
[8] validation-logloss:0.63443 validation-auc:0.96754 validation-aucpr:0.97205
[9] validation-logloss:0.62826 validation-auc:0.96759 validation-aucpr:0.97219
[10] validation-logloss:0.62217 validation-auc:0.96747 validation-aucpr:0.97208
[11] validation-logloss:0.61698 validation-auc:0.96731 validation-aucpr:0.97195
[12] validation-logloss:0.61169 validation-auc:0.96739 validation-aucpr:0.97202
[13] validation-logloss:0.60665 validation-auc:0.96744 validation-aucpr:0.97203
[14] validation-logloss:0.60168 validation-auc:0.96733 validation-aucpr:0.97192
[15] validation-logloss:0.59610 validation-auc:0.96735 validation-aucpr:0.97197
[16] validation-logloss:0.59069 validation-auc:0.96750 validation-aucpr:0.97210
[17] validation-logloss:0.58588 validation-auc:0.96730 validation-aucpr:0.97193
[18] validation-logloss:0.58053 validation-auc:0.96760 validation-aucpr:0.97213
[19] validation-logloss:0.57605 validation-auc:0.96754 validation-aucpr:0.97208
[20] validation-logloss:0.57083 validation-auc:0.96780 validation-aucpr:0.97226
[21] validation-logloss:0.56577 validation-auc:0.96805 validation-aucpr:0.97254
[22] validation-logloss:0.56074 validation-auc:0.96822 validation-aucpr:0.97267
[23] validation-logloss:0.55597 validation-auc:0.96849 validation-aucpr:0.97326
[24] validation-logloss:0.55108 validation-auc:0.96868 validation-aucpr:0.97340
[25] validation-logloss:0.54651 validation-auc:0.96870 validation-aucpr:0.97343
[26] validation-logloss:0.54245 validation-auc:0.96864 validation-aucpr:0.97333
[27] validation-logloss:0.53792 validation-auc:0.96863 validation-aucpr:0.97337
[28] validation-logloss:0.53334 validation-auc:0.96878 validation-aucpr:0.97349
[29] validation-logloss:0.52938 validation-auc:0.96878 validation-aucpr:0.97347
[30] validation-logloss:0.52496 validation-auc:0.96885 validation-aucpr:0.97353
[31] validation-logloss:0.52056 validation-auc:0.96896 validation-aucpr:0.97359
[32] validation-logloss:0.51635 validation-auc:0.96894 validation-aucpr:0.97357
[33] validation-logloss:0.51217 validation-auc:0.96890 validation-aucpr:0.97355
[34] validation-logloss:0.50795 validation-auc:0.96893 validation-aucpr:0.97359
[35] validation-logloss:0.50385 validation-auc:0.96893 validation-aucpr:0.97361
[36] validation-logloss:0.49988 validation-auc:0.96900 validation-aucpr:0.97366
[37] validation-logloss:0.49635 validation-auc:0.96907 validation-aucpr:0.97368
[38] validation-logloss:0.49252 validation-auc:0.96912 validation-aucpr:0.97372
[39] validation-logloss:0.48855 validation-auc:0.96924 validation-aucpr:0.97383
[40] validation-logloss:0.48472 validation-auc:0.96938 validation-aucpr:0.97394
[41] validation-logloss:0.48109 validation-auc:0.96939 validation-aucpr:0.97395
[42] validation-logloss:0.47732 validation-auc:0.96939 validation-aucpr:0.97395
[43] validation-logloss:0.47373 validation-auc:0.96943 validation-aucpr:0.97399
[44] validation-logloss:0.47018 validation-auc:0.96941 validation-aucpr:0.97398
[45] validation-logloss:0.46656 validation-auc:0.96955 validation-aucpr:0.97409
[46] validation-logloss:0.46312 validation-auc:0.96969 validation-aucpr:0.97420
[47] validation-logloss:0.45983 validation-auc:0.96959 validation-aucpr:0.97413
[48] validation-logloss:0.45645 validation-auc:0.96963 validation-aucpr:0.97418
[49] validation-logloss:0.45359 validation-auc:0.96963 validation-aucpr:0.97414
[50] validation-logloss:0.45036 validation-auc:0.96967 validation-aucpr:0.97418
[51] validation-logloss:0.44709 validation-auc:0.96969 validation-aucpr:0.97420
[52] validation-logloss:0.44437 validation-auc:0.96971 validation-aucpr:0.97421
[53] validation-logloss:0.44121 validation-auc:0.96974 validation-aucpr:0.97424
[54] validation-logloss:0.43816 validation-auc:0.96978 validation-aucpr:0.97431
[55] validation-logloss:0.43505 validation-auc:0.96987 validation-aucpr:0.97439
[56] validation-logloss:0.43199 validation-auc:0.96992 validation-aucpr:0.97442
[57] validation-logloss:0.42895 validation-auc:0.97004 validation-aucpr:0.97450
[18:15:12] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[58] validation-logloss:0.42626 validation-auc:0.96999 validation-aucpr:0.97445
[18:15:15] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[59] validation-logloss:0.42335 validation-auc:0.97000 validation-aucpr:0.97447
[18:15:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[60] validation-logloss:0.42094 validation-auc:0.97001 validation-aucpr:0.97445
[18:15:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[61] validation-logloss:0.41811 validation-auc:0.97004 validation-aucpr:0.97447
[18:15:25] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[62] validation-logloss:0.41569 validation-auc:0.97011 validation-aucpr:0.97450
[18:15:28] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[63] validation-logloss:0.41305 validation-auc:0.97012 validation-aucpr:0.97452
[18:15:31] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[64] validation-logloss:0.41034 validation-auc:0.97018 validation-aucpr:0.97466
[18:15:33] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[65] validation-logloss:0.40797 validation-auc:0.97022 validation-aucpr:0.97466
[18:15:36] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[66] validation-logloss:0.40528 validation-auc:0.97021 validation-aucpr:0.97466
[18:15:40] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[67] validation-logloss:0.40267 validation-auc:0.97019 validation-aucpr:0.97464
[18:15:43] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[68] validation-logloss:0.40052 validation-auc:0.97022 validation-aucpr:0.97464
[18:15:46] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[69] validation-logloss:0.39794 validation-auc:0.97026 validation-aucpr:0.97467
[18:15:48] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[70] validation-logloss:0.39541 validation-auc:0.97030 validation-aucpr:0.97469
[18:15:51] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[71] validation-logloss:0.39303 validation-auc:0.97031 validation-aucpr:0.97475
[18:15:54] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[72] validation-logloss:0.39093 validation-auc:0.97022 validation-aucpr:0.97467
[18:15:56] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[73] validation-logloss:0.38849 validation-auc:0.97022 validation-aucpr:0.97468
[18:15:58] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[74] validation-logloss:0.38595 validation-auc:0.97032 validation-aucpr:0.97476
[18:16:02] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[75] validation-logloss:0.38358 validation-auc:0.97035 validation-aucpr:0.97477
[18:16:04] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[76] validation-logloss:0.38128 validation-auc:0.97037 validation-aucpr:0.97479
[18:16:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[77] validation-logloss:0.37900 validation-auc:0.97037 validation-aucpr:0.97479
[18:16:09] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[78] validation-logloss:0.37675 validation-auc:0.97041 validation-aucpr:0.97481
[18:16:12] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[79] validation-logloss:0.37486 validation-auc:0.97044 validation-aucpr:0.97480
[18:16:16] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[80] validation-logloss:0.37270 validation-auc:0.97050 validation-aucpr:0.97485
[18:16:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[81] validation-logloss:0.37052 validation-auc:0.97054 validation-aucpr:0.97488
[18:16:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[82] validation-logloss:0.36838 validation-auc:0.97055 validation-aucpr:0.97487
[18:16:23] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[83] validation-logloss:0.36627 validation-auc:0.97057 validation-aucpr:0.97488
[18:16:26] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[84] validation-logloss:0.36419 validation-auc:0.97060 validation-aucpr:0.97490
[18:16:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[85] validation-logloss:0.36208 validation-auc:0.97062 validation-aucpr:0.97493
[18:16:32] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[86] validation-logloss:0.36020 validation-auc:0.97063 validation-aucpr:0.97492
[18:16:34] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[87] validation-logloss:0.35815 validation-auc:0.97064 validation-aucpr:0.97493
[18:16:38] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[88] validation-logloss:0.35644 validation-auc:0.97065 validation-aucpr:0.97494
{'best_iteration': '88', 'best_score': '0.9749374341359066'}
Trial 11, Fold 3: Log loss = 0.35644221608984444, Average precision = 0.974941946430545, ROC-AUC = 0.9706546124230722, Elapsed Time = 259.26046209999913 seconds
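Each fold summary above reports three validation metrics. A minimal sketch of how such per-fold metrics can be computed with the scikit-learn functions imported in this notebook (the label and probability arrays here are hypothetical stand-ins for the actual fold predictions):

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

# Hypothetical validation labels and predicted probabilities for one fold
y_val = np.array([0, 1, 1, 0, 1, 0, 1, 1])
p_val = np.array([0.1, 0.9, 0.8, 0.3, 0.7, 0.2, 0.95, 0.6])

metrics = {
    "log_loss": log_loss(y_val, p_val),                      # penalizes confident wrong probabilities
    "avg_precision": average_precision_score(y_val, p_val),  # area under the precision-recall curve
    "roc_auc": roc_auc_score(y_val, p_val),                  # ranking quality across all thresholds
}
print({k: round(v, 4) for k, v in metrics.items()})
```

With these toy arrays every positive is ranked above every negative, so both ranking metrics are perfect while log loss still reflects probability calibration.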
Trial 11, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 11, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[... per-round XGBoost training log truncated for Trial 11, Fold 4 (rounds 0–87); repeated "INFO ... gbtree.cc:887: drop 0 trees, weight = 1" lines removed. Over these rounds, validation-logloss improved from 0.68580 to 0.35923, validation-auc from 0.95624 to 0.96951, and validation-aucpr from 0.96239 to 0.97419, with the best validation-aucpr (0.97437) reached at round 38 ...]
{'best_iteration': '38', 'best_score': '0.9743741208236099'}
Trial 11, Fold 4: Log loss = 0.3572366804694858, Average precision = 0.9742509040078975, ROC-AUC = 0.9695805942426042, Elapsed Time = 257.6183670999999 seconds
Trial 11, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 11, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.68595 validation-auc:0.95108 validation-aucpr:0.95716
[10] validation-logloss:0.62190 validation-auc:0.96418 validation-aucpr:0.96849
[20] validation-logloss:0.56791 validation-auc:0.96553 validation-aucpr:0.96967
[30] validation-logloss:0.52297 validation-auc:0.96637 validation-aucpr:0.97060
[40] validation-logloss:0.48457 validation-auc:0.96640 validation-aucpr:0.97048
[50] validation-logloss:0.45118 validation-auc:0.96667 validation-aucpr:0.96915
[60] validation-logloss:0.42331 validation-auc:0.96681 validation-aucpr:0.96932
[70] validation-logloss:0.39791 validation-auc:0.96703 validation-aucpr:0.96982
[80] validation-logloss:0.37607 validation-auc:0.96724 validation-aucpr:0.96995
[83] validation-logloss:0.36998 validation-auc:0.96716 validation-aucpr:0.96952
{'best_iteration': '33', 'best_score': '0.9707520012467902'}
Trial 11, Fold 5: Log loss = 0.3699763639546245, Average precision = 0.9705531018910141, ROC-AUC = 0.9671560485294391, Elapsed Time = 228.57969969999976 seconds
Optimization Progress: 12/100 trials (12%) [25:57 elapsed < 9:49:25 remaining, 401.88 s/it]
Trial 12, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 12, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.66259 validation-auc:0.94294 validation-aucpr:0.93924
[10] validation-logloss:0.45930 validation-auc:0.96345 validation-aucpr:0.96876
[20] validation-logloss:0.35358 validation-auc:0.96593 validation-aucpr:0.97116
[28] validation-logloss:0.30603 validation-auc:0.96639 validation-aucpr:0.97156
{'best_iteration': '28', 'best_score': '0.9715565489584634'}
Trial 12, Fold 1: Log loss = 0.30603410223586613, Average precision = 0.9715614797671308, ROC-AUC = 0.9663893310008937, Elapsed Time = 53.693495299999995 seconds
Trial 12, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 12, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.66405 validation-auc:0.93662 validation-aucpr:0.93297
[1] validation-logloss:0.63717 validation-auc:0.94590 validation-aucpr:0.94476
[2] validation-logloss:0.60878 validation-auc:0.95973 validation-aucpr:0.96166
[3] validation-logloss:0.58238 validation-auc:0.96332 validation-aucpr:0.96699
[4] validation-logloss:0.56224 validation-auc:0.96292 validation-aucpr:0.96663
[5] validation-logloss:0.54181 validation-auc:0.96346 validation-aucpr:0.96689
[6] validation-logloss:0.52407 validation-auc:0.96363 validation-aucpr:0.96678
[7] validation-logloss:0.50769 validation-auc:0.96352 validation-aucpr:0.96665
[8] validation-logloss:0.49264 validation-auc:0.96327 validation-aucpr:0.96630
[9] validation-logloss:0.47488 validation-auc:0.96469 validation-aucpr:0.96778
[10] validation-logloss:0.46040 validation-auc:0.96491 validation-aucpr:0.96777
[11] validation-logloss:0.44749 validation-auc:0.96491 validation-aucpr:0.96812
[12] validation-logloss:0.43521 validation-auc:0.96518 validation-aucpr:0.96834
[13] validation-logloss:0.42396 validation-auc:0.96546 validation-aucpr:0.96847
[14] validation-logloss:0.41358 validation-auc:0.96565 validation-aucpr:0.96863
[15] validation-logloss:0.40227 validation-auc:0.96600 validation-aucpr:0.96902
[16] validation-logloss:0.39100 validation-auc:0.96636 validation-aucpr:0.96935
[17] validation-logloss:0.38242 validation-auc:0.96643 validation-aucpr:0.96933
[18] validation-logloss:0.37471 validation-auc:0.96632 validation-aucpr:0.96921
[19] validation-logloss:0.36568 validation-auc:0.96664 validation-aucpr:0.96946
[20] validation-logloss:0.35859 validation-auc:0.96659 validation-aucpr:0.96938
[21] validation-logloss:0.35133 validation-auc:0.96676 validation-aucpr:0.96950
[22] validation-logloss:0.34315 validation-auc:0.96710 validation-aucpr:0.96979
[23] validation-logloss:0.33454 validation-auc:0.96747 validation-aucpr:0.97014
[24] validation-logloss:0.32775 validation-auc:0.96769 validation-aucpr:0.97030
[25] validation-logloss:0.32063 validation-auc:0.96793 validation-aucpr:0.97049
[26] validation-logloss:0.31557 validation-auc:0.96796 validation-aucpr:0.97048
[27] validation-logloss:0.30904 validation-auc:0.96819 validation-aucpr:0.97073
[28] validation-logloss:0.30284 validation-auc:0.96846 validation-aucpr:0.97135
{'best_iteration': '28', 'best_score': '0.9713477996479326'}
Trial 12, Fold 2: Log loss = 0.3028431411702671, Average precision = 0.971361292061751, ROC-AUC = 0.9684592986642249, Elapsed Time = 53.507002000000284 seconds
Trial 12, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 12, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.66239 validation-auc:0.94767 validation-aucpr:0.95016
[1] validation-logloss:0.63434 validation-auc:0.95690 validation-aucpr:0.95905
[2] validation-logloss:0.61081 validation-auc:0.95657 validation-aucpr:0.96020
[3] validation-logloss:0.58871 validation-auc:0.95568 validation-aucpr:0.95868
[4] validation-logloss:0.56640 validation-auc:0.96000 validation-aucpr:0.96273
[5] validation-logloss:0.54629 validation-auc:0.96183 validation-aucpr:0.96466
[6] validation-logloss:0.52812 validation-auc:0.96205 validation-aucpr:0.96552
[7] validation-logloss:0.51158 validation-auc:0.96192 validation-aucpr:0.96532
[8] validation-logloss:0.49587 validation-auc:0.96204 validation-aucpr:0.96580
[9] validation-logloss:0.47702 validation-auc:0.96506 validation-aucpr:0.96934
[10] validation-logloss:0.46401 validation-auc:0.96487 validation-aucpr:0.96903
[11] validation-logloss:0.44886 validation-auc:0.96570 validation-aucpr:0.96999
[12] validation-logloss:0.43399 validation-auc:0.96645 validation-aucpr:0.97098
[13] validation-logloss:0.42069 validation-auc:0.96685 validation-aucpr:0.97142
[14] validation-logloss:0.40991 validation-auc:0.96716 validation-aucpr:0.97171
[15] validation-logloss:0.39989 validation-auc:0.96705 validation-aucpr:0.97163
[16] validation-logloss:0.39035 validation-auc:0.96719 validation-aucpr:0.97173
[17] validation-logloss:0.37903 validation-auc:0.96745 validation-aucpr:0.97202
[18] validation-logloss:0.37144 validation-auc:0.96731 validation-aucpr:0.97186
[19] validation-logloss:0.36378 validation-auc:0.96719 validation-aucpr:0.97176
[20] validation-logloss:0.35450 validation-auc:0.96738 validation-aucpr:0.97199
[21] validation-logloss:0.34755 validation-auc:0.96757 validation-aucpr:0.97209
[22] validation-logloss:0.33920 validation-auc:0.96771 validation-aucpr:0.97225
[23] validation-logloss:0.33335 validation-auc:0.96767 validation-aucpr:0.97225
[24] validation-logloss:0.32542 validation-auc:0.96801 validation-aucpr:0.97254
[25] validation-logloss:0.31965 validation-auc:0.96814 validation-aucpr:0.97263
[26] validation-logloss:0.31198 validation-auc:0.96867 validation-aucpr:0.97311
[27] validation-logloss:0.30669 validation-auc:0.96896 validation-aucpr:0.97333
[28] validation-logloss:0.30179 validation-auc:0.96896 validation-aucpr:0.97333
{'best_iteration': '27', 'best_score': '0.9733342783045276'}
Trial 12, Fold 3: Log loss = 0.3017870133683276, Average precision = 0.9733305895492752, ROC-AUC = 0.9689629361654997, Elapsed Time = 52.86153060000106 seconds
Trial 12, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 12, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.65952 validation-auc:0.94855 validation-aucpr:0.94885
[1] validation-logloss:0.63334 validation-auc:0.95400 validation-aucpr:0.96011
[2] validation-logloss:0.60796 validation-auc:0.95741 validation-aucpr:0.96390
[3] validation-logloss:0.58166 validation-auc:0.96230 validation-aucpr:0.96786
[4] validation-logloss:0.56142 validation-auc:0.96212 validation-aucpr:0.96781
[5] validation-logloss:0.54217 validation-auc:0.96172 validation-aucpr:0.96766
[6] validation-logloss:0.52369 validation-auc:0.96255 validation-aucpr:0.96829
[7] validation-logloss:0.50683 validation-auc:0.96240 validation-aucpr:0.96800
[8] validation-logloss:0.49166 validation-auc:0.96181 validation-aucpr:0.96748
[9] validation-logloss:0.47382 validation-auc:0.96299 validation-aucpr:0.96874
[10] validation-logloss:0.45801 validation-auc:0.96368 validation-aucpr:0.96947
[11] validation-logloss:0.44566 validation-auc:0.96360 validation-aucpr:0.96937
[12] validation-logloss:0.43217 validation-auc:0.96402 validation-aucpr:0.96983
[13] validation-logloss:0.42100 validation-auc:0.96409 validation-aucpr:0.96984
[14] validation-logloss:0.40792 validation-auc:0.96486 validation-aucpr:0.97057
[15] validation-logloss:0.39699 validation-auc:0.96519 validation-aucpr:0.97090
[16] validation-logloss:0.38804 validation-auc:0.96508 validation-aucpr:0.97077
[17] validation-logloss:0.37930 validation-auc:0.96500 validation-aucpr:0.97066
[18] validation-logloss:0.36885 validation-auc:0.96538 validation-aucpr:0.97105
[19] validation-logloss:0.36095 validation-auc:0.96549 validation-aucpr:0.97112
[20] validation-logloss:0.35323 validation-auc:0.96555 validation-aucpr:0.97118
[21] validation-logloss:0.34611 validation-auc:0.96571 validation-aucpr:0.97128
[22] validation-logloss:0.33928 validation-auc:0.96608 validation-aucpr:0.97152
[23] validation-logloss:0.33346 validation-auc:0.96604 validation-aucpr:0.97148
[24] validation-logloss:0.32800 validation-auc:0.96606 validation-aucpr:0.97149
[25] validation-logloss:0.32263 validation-auc:0.96615 validation-aucpr:0.97152
[26] validation-logloss:0.31779 validation-auc:0.96603 validation-aucpr:0.97137
[27] validation-logloss:0.31307 validation-auc:0.96601 validation-aucpr:0.97135
[28] validation-logloss:0.30850 validation-auc:0.96609 validation-aucpr:0.97139
{'best_iteration': '22', 'best_score': '0.9715216628471062'}
Trial 12, Fold 4: Log loss = 0.3085016825154917, Average precision = 0.9713912806763556, ROC-AUC = 0.9660949597153969, Elapsed Time = 53.205132799999774 seconds
Trial 12, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 12, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.66283 validation-auc:0.94372 validation-aucpr:0.94356
[1] validation-logloss:0.63360 validation-auc:0.95657 validation-aucpr:0.95950
[2] validation-logloss:0.60862 validation-auc:0.95788 validation-aucpr:0.96140
[3] validation-logloss:0.58644 validation-auc:0.95707 validation-aucpr:0.96084
[4] validation-logloss:0.56481 validation-auc:0.95857 validation-aucpr:0.96289
[5] validation-logloss:0.54563 validation-auc:0.95935 validation-aucpr:0.96323
[6] validation-logloss:0.52717 validation-auc:0.96000 validation-aucpr:0.96396
[7] validation-logloss:0.51130 validation-auc:0.95970 validation-aucpr:0.96365
[8] validation-logloss:0.49270 validation-auc:0.96179 validation-aucpr:0.96598
[9] validation-logloss:0.47617 validation-auc:0.96227 validation-aucpr:0.96667
[10] validation-logloss:0.46293 validation-auc:0.96210 validation-aucpr:0.96650
[11] validation-logloss:0.44688 validation-auc:0.96312 validation-aucpr:0.96744
[12] validation-logloss:0.43315 validation-auc:0.96336 validation-aucpr:0.96809
[13] validation-logloss:0.42298 validation-auc:0.96317 validation-aucpr:0.96784
[14] validation-logloss:0.41097 validation-auc:0.96328 validation-aucpr:0.96801
[15] validation-logloss:0.40214 validation-auc:0.96300 validation-aucpr:0.96767
[16] validation-logloss:0.39274 validation-auc:0.96279 validation-aucpr:0.96753
[17] validation-logloss:0.38429 validation-auc:0.96301 validation-aucpr:0.96765
[18] validation-logloss:0.37446 validation-auc:0.96312 validation-aucpr:0.96781
[19] validation-logloss:0.36749 validation-auc:0.96309 validation-aucpr:0.96781
[20] validation-logloss:0.35817 validation-auc:0.96325 validation-aucpr:0.96804
[21] validation-logloss:0.35118 validation-auc:0.96347 validation-aucpr:0.96821
[22] validation-logloss:0.34505 validation-auc:0.96344 validation-aucpr:0.96811
[23] validation-logloss:0.33953 validation-auc:0.96322 validation-aucpr:0.96794
[24] validation-logloss:0.33427 validation-auc:0.96332 validation-aucpr:0.96796
[25] validation-logloss:0.32729 validation-auc:0.96351 validation-aucpr:0.96826
[26] validation-logloss:0.32167 validation-auc:0.96378 validation-aucpr:0.96858
[27] validation-logloss:0.31664 validation-auc:0.96398 validation-aucpr:0.96874
[28] validation-logloss:0.31166 validation-auc:0.96412 validation-aucpr:0.96891
{'best_iteration': '28', 'best_score': '0.9689149458783484'}
Trial 12, Fold 5: Log loss = 0.31166072012417156, Average precision = 0.9689226422619246, ROC-AUC = 0.9641158612489085, Elapsed Time = 53.00868619999892 seconds
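The per-fold summary lines above report log loss, average precision, and ROC-AUC on each validation split. A minimal sketch of how such a line could be produced with scikit-learn's metrics (already imported in this notebook) is shown below; the `fold_summary` helper and its argument names are illustrative, not the notebook's actual implementation:

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

def fold_summary(trial, fold, y_val, proba, elapsed):
    """Format a per-fold line like 'Trial X, Fold Y: Log loss = ..., ...'.

    y_val  : binary ground-truth labels for the validation fold
    proba  : predicted probabilities for the positive class
    elapsed: wall-clock seconds for this fold (hypothetical timing value)
    """
    ll = log_loss(y_val, proba)
    ap = average_precision_score(y_val, proba)
    auc = roc_auc_score(y_val, proba)
    return (f"Trial {trial}, Fold {fold}: Log loss = {ll}, "
            f"Average precision = {ap}, ROC-AUC = {auc}, "
            f"Elapsed Time = {elapsed} seconds")

# Tiny synthetic check: a perfectly separating classifier on 4 samples
y = np.array([0, 0, 1, 1])
p = np.array([0.1, 0.2, 0.8, 0.9])
print(fold_summary(12, 5, y, p, 0.0))
```

Note that `average_precision_score` corresponds to the `validation-aucpr` column in the XGBoost log, and `roc_auc_score` to `validation-auc`.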
Optimization Progress: 13%|#3 | 13/100 [30:31<8:46:37, 363.19s/it]
Trial 13, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 13, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.68199 validation-auc:0.91934 validation-aucpr:0.89907
[1] validation-logloss:0.67118 validation-auc:0.94339 validation-aucpr:0.94700
[2] validation-logloss:0.65875 validation-auc:0.96044 validation-aucpr:0.96344
[3] validation-logloss:0.64739 validation-auc:0.96276 validation-aucpr:0.96840
[4] validation-logloss:0.63735 validation-auc:0.96386 validation-aucpr:0.96919
[5] validation-logloss:0.62614 validation-auc:0.96518 validation-aucpr:0.97051
[6] validation-logloss:0.61654 validation-auc:0.96538 validation-aucpr:0.97058
[7] validation-logloss:0.60583 validation-auc:0.96625 validation-aucpr:0.97137
[8] validation-logloss:0.59564 validation-auc:0.96707 validation-aucpr:0.97211
[9] validation-logloss:0.58711 validation-auc:0.96692 validation-aucpr:0.97190
[10] validation-logloss:0.57876 validation-auc:0.96711 validation-aucpr:0.97180
[11] validation-logloss:0.56969 validation-auc:0.96741 validation-aucpr:0.97215
[12] validation-logloss:0.56191 validation-auc:0.96713 validation-aucpr:0.97187
[13] validation-logloss:0.55418 validation-auc:0.96706 validation-aucpr:0.97146
{'best_iteration': '11', 'best_score': '0.9721533735935723'}
Trial 13, Fold 1: Log loss = 0.5541760171341904, Average precision = 0.9715588809569982, ROC-AUC = 0.9670561583812487, Elapsed Time = 8.191645800001425 seconds
Trial 13, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 13, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.68205 validation-auc:0.92486 validation-aucpr:0.89257
[1] validation-logloss:0.67062 validation-auc:0.95107 validation-aucpr:0.95183
[2] validation-logloss:0.65970 validation-auc:0.95674 validation-aucpr:0.95885
[3] validation-logloss:0.64923 validation-auc:0.95845 validation-aucpr:0.96106
[4] validation-logloss:0.63893 validation-auc:0.95881 validation-aucpr:0.96155
[5] validation-logloss:0.62746 validation-auc:0.96518 validation-aucpr:0.96826
[6] validation-logloss:0.61787 validation-auc:0.96484 validation-aucpr:0.96790
[7] validation-logloss:0.60742 validation-auc:0.96667 validation-aucpr:0.96988
[8] validation-logloss:0.59728 validation-auc:0.96769 validation-aucpr:0.97111
[9] validation-logloss:0.58740 validation-auc:0.96849 validation-aucpr:0.97196
[10] validation-logloss:0.57921 validation-auc:0.96842 validation-aucpr:0.97186
[11] validation-logloss:0.57127 validation-auc:0.96836 validation-aucpr:0.97166
[12] validation-logloss:0.56223 validation-auc:0.96892 validation-aucpr:0.97213
[13] validation-logloss:0.55369 validation-auc:0.96885 validation-aucpr:0.97222
{'best_iteration': '13', 'best_score': '0.9722168506448429'}
Trial 13, Fold 2: Log loss = 0.5536905013799721, Average precision = 0.9722019875011716, ROC-AUC = 0.9688539187803363, Elapsed Time = 8.690538099999685 seconds
Trial 13, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 13, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.68175 validation-auc:0.92494 validation-aucpr:0.89388
[1] validation-logloss:0.67063 validation-auc:0.94710 validation-aucpr:0.94695
[2] validation-logloss:0.66008 validation-auc:0.95246 validation-aucpr:0.95365
[3] validation-logloss:0.64965 validation-auc:0.95434 validation-aucpr:0.95691
[4] validation-logloss:0.63927 validation-auc:0.95700 validation-aucpr:0.95967
[5] validation-logloss:0.62977 validation-auc:0.95784 validation-aucpr:0.96078
[6] validation-logloss:0.62003 validation-auc:0.95862 validation-aucpr:0.96179
[7] validation-logloss:0.61077 validation-auc:0.95928 validation-aucpr:0.96292
[8] validation-logloss:0.60170 validation-auc:0.95979 validation-aucpr:0.96332
[9] validation-logloss:0.59295 validation-auc:0.95998 validation-aucpr:0.96332
[10] validation-logloss:0.58471 validation-auc:0.96018 validation-aucpr:0.96353
[11] validation-logloss:0.57661 validation-auc:0.96033 validation-aucpr:0.96365
[12] validation-logloss:0.56744 validation-auc:0.96288 validation-aucpr:0.96652
[13] validation-logloss:0.55985 validation-auc:0.96291 validation-aucpr:0.96644
{'best_iteration': '12', 'best_score': '0.966518950749605'}
Trial 13, Fold 3: Log loss = 0.5598497446494258, Average precision = 0.9669748866732013, ROC-AUC = 0.9629090029711743, Elapsed Time = 7.858257300000332 seconds
Trial 13, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 13, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.68161 validation-auc:0.92751 validation-aucpr:0.90009
[1] validation-logloss:0.67016 validation-auc:0.94928 validation-aucpr:0.94291
[2] validation-logloss:0.65784 validation-auc:0.96084 validation-aucpr:0.96151
[3] validation-logloss:0.64629 validation-auc:0.96445 validation-aucpr:0.96949
[4] validation-logloss:0.63616 validation-auc:0.96470 validation-aucpr:0.96962
[5] validation-logloss:0.62502 validation-auc:0.96526 validation-aucpr:0.97043
[6] validation-logloss:0.61541 validation-auc:0.96534 validation-aucpr:0.97049
[7] validation-logloss:0.60513 validation-auc:0.96583 validation-aucpr:0.97104
[8] validation-logloss:0.59581 validation-auc:0.96634 validation-aucpr:0.97149
[9] validation-logloss:0.58747 validation-auc:0.96614 validation-aucpr:0.97126
[10] validation-logloss:0.57793 validation-auc:0.96669 validation-aucpr:0.97190
[11] validation-logloss:0.56905 validation-auc:0.96678 validation-aucpr:0.97209
[12] validation-logloss:0.56007 validation-auc:0.96690 validation-aucpr:0.97232
[13] validation-logloss:0.55237 validation-auc:0.96658 validation-aucpr:0.97213
{'best_iteration': '12', 'best_score': '0.972317800443495'}
Trial 13, Fold 4: Log loss = 0.5523687742806942, Average precision = 0.972130547603141, ROC-AUC = 0.966584583360158, Elapsed Time = 8.161774599999262 seconds
Trial 13, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 13, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.68195 validation-auc:0.92479 validation-aucpr:0.90042
[1] validation-logloss:0.66939 validation-auc:0.95639 validation-aucpr:0.95941
[2] validation-logloss:0.65735 validation-auc:0.96104 validation-aucpr:0.96365
[3] validation-logloss:0.64698 validation-auc:0.96273 validation-aucpr:0.96809
[4] validation-logloss:0.63647 validation-auc:0.96369 validation-aucpr:0.96885
[5] validation-logloss:0.62690 validation-auc:0.96339 validation-aucpr:0.96840
[6] validation-logloss:0.61614 validation-auc:0.96454 validation-aucpr:0.96974
[7] validation-logloss:0.60590 validation-auc:0.96481 validation-aucpr:0.97010
[8] validation-logloss:0.59584 validation-auc:0.96517 validation-aucpr:0.97051
[9] validation-logloss:0.58610 validation-auc:0.96540 validation-aucpr:0.97072
[10] validation-logloss:0.57786 validation-auc:0.96531 validation-aucpr:0.97044
[11] validation-logloss:0.56890 validation-auc:0.96535 validation-aucpr:0.97056
[12] validation-logloss:0.56080 validation-auc:0.96556 validation-aucpr:0.97075
[13] validation-logloss:0.55342 validation-auc:0.96535 validation-aucpr:0.97050
{'best_iteration': '12', 'best_score': '0.9707524225232216'}
Trial 13, Fold 5: Log loss = 0.5534183540087733, Average precision = 0.9705031427069372, ROC-AUC = 0.9653473793731304, Elapsed Time = 8.009748399999808 seconds
Optimization Progress: 14%|#4 | 14/100 [31:20<6:24:32, 268.29s/it]
Trial 14, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 14, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.67300 validation-auc:0.93535 validation-aucpr:0.91414
[1] validation-logloss:0.65352 validation-auc:0.95997 validation-aucpr:0.94955
[2] validation-logloss:0.63501 validation-auc:0.96534 validation-aucpr:0.96582
[3] validation-logloss:0.61730 validation-auc:0.96661 validation-aucpr:0.96679
[4] validation-logloss:0.60074 validation-auc:0.96710 validation-aucpr:0.96875
{'best_iteration': '4', 'best_score': '0.9687460586423775'}
Trial 14, Fold 1: Log loss = 0.6007382625759794, Average precision = 0.9697246970219232, ROC-AUC = 0.9671048504466526, Elapsed Time = 9.356031400000575 seconds
Trial 14, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 14, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.67308 validation-auc:0.93724 validation-aucpr:0.90542
[1] validation-logloss:0.65404 validation-auc:0.95966 validation-aucpr:0.95575
[2] validation-logloss:0.63566 validation-auc:0.96497 validation-aucpr:0.96601
[3] validation-logloss:0.61960 validation-auc:0.96509 validation-aucpr:0.96872
[4] validation-logloss:0.60307 validation-auc:0.96593 validation-aucpr:0.96962
{'best_iteration': '4', 'best_score': '0.9696233259644926'}
Trial 14, Fold 2: Log loss = 0.6030703772186887, Average precision = 0.969516017292491, ROC-AUC = 0.9659294354587835, Elapsed Time = 9.748138599999947 seconds
Trial 14, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 14, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.67279 validation-auc:0.94268 validation-aucpr:0.92792
[1] validation-logloss:0.65354 validation-auc:0.96313 validation-aucpr:0.96026
[2] validation-logloss:0.63494 validation-auc:0.96576 validation-aucpr:0.96564
[3] validation-logloss:0.61735 validation-auc:0.96730 validation-aucpr:0.96911
[4] validation-logloss:0.60225 validation-auc:0.96770 validation-aucpr:0.96910
{'best_iteration': '3', 'best_score': '0.9691086296590194'}
Trial 14, Fold 3: Log loss = 0.6022545490642423, Average precision = 0.9701395087255767, ROC-AUC = 0.9677014963016826, Elapsed Time = 9.65295689999948 seconds
Trial 14, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 14, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.67389 validation-auc:0.93183 validation-aucpr:0.91325
[1] validation-logloss:0.65492 validation-auc:0.95639 validation-aucpr:0.95643
[2] validation-logloss:0.63656 validation-auc:0.96061 validation-aucpr:0.96473
[3] validation-logloss:0.61956 validation-auc:0.96282 validation-aucpr:0.96665
[4] validation-logloss:0.60402 validation-auc:0.96411 validation-aucpr:0.96995
{'best_iteration': '4', 'best_score': '0.9699524125024998'}
Trial 14, Fold 4: Log loss = 0.6040156415532489, Average precision = 0.969877524685255, ROC-AUC = 0.964110087603215, Elapsed Time = 9.464738699998634 seconds
Trial 14, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 14, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.67263 validation-auc:0.93698 validation-aucpr:0.92727
[1] validation-logloss:0.65439 validation-auc:0.95989 validation-aucpr:0.96219
[2] validation-logloss:0.63584 validation-auc:0.96387 validation-aucpr:0.96903
[3] validation-logloss:0.61880 validation-auc:0.96427 validation-aucpr:0.96953
[4] validation-logloss:0.60297 validation-auc:0.96474 validation-aucpr:0.96920
{'best_iteration': '3', 'best_score': '0.9695329455063676'}
Trial 14, Fold 5: Log loss = 0.6029668724141746, Average precision = 0.969375726079295, ROC-AUC = 0.964739832413652, Elapsed Time = 9.787574100000711 seconds
Optimization Progress: 15%|#5 | 15/100 [32:16<4:49:20, 204.24s/it]
Trial 15, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 15, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[18:31:11] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[0] validation-logloss:0.65835 validation-auc:0.95231 validation-aucpr:0.95569
[18:31:11] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[1] validation-logloss:0.62991 validation-auc:0.95311 validation-aucpr:0.95398
[18:31:11] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[2] validation-logloss:0.60171 validation-auc:0.95713 validation-aucpr:0.96099
[18:31:11] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[3] validation-logloss:0.57811 validation-auc:0.95772 validation-aucpr:0.96117
[18:31:11] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[4] validation-logloss:0.55292 validation-auc:0.96075 validation-aucpr:0.96445
[18:31:11] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[5] validation-logloss:0.53003 validation-auc:0.96176 validation-aucpr:0.96522
[18:31:12] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[6] validation-logloss:0.51238 validation-auc:0.96237 validation-aucpr:0.96548
[7] validation-logloss:0.49493 validation-auc:0.96314 validation-aucpr:0.96610
[8] validation-logloss:0.47646 validation-auc:0.96344 validation-aucpr:0.96656
[9] validation-logloss:0.46201 validation-auc:0.96369 validation-aucpr:0.96711
[10] validation-logloss:0.44873 validation-auc:0.96369 validation-aucpr:0.96715
[11] validation-logloss:0.43649 validation-auc:0.96322 validation-aucpr:0.96663
[12] validation-logloss:0.42217 validation-auc:0.96347 validation-aucpr:0.96688
[13] validation-logloss:0.40893 validation-auc:0.96388 validation-aucpr:0.96714
[14] validation-logloss:0.39879 validation-auc:0.96403 validation-aucpr:0.96723
[15] validation-logloss:0.38702 validation-auc:0.96432 validation-aucpr:0.96744
[16] validation-logloss:0.37732 validation-auc:0.96432 validation-aucpr:0.96751
[17] validation-logloss:0.36847 validation-auc:0.96437 validation-aucpr:0.96740
[18] validation-logloss:0.35939 validation-auc:0.96526 validation-aucpr:0.96962
[19] validation-logloss:0.35032 validation-auc:0.96539 validation-aucpr:0.96985
[20] validation-logloss:0.34340 validation-auc:0.96534 validation-aucpr:0.96979
[21] validation-logloss:0.33562 validation-auc:0.96536 validation-aucpr:0.96982
[22] validation-logloss:0.32751 validation-auc:0.96575 validation-aucpr:0.97020
[23] validation-logloss:0.32026 validation-auc:0.96570 validation-aucpr:0.97044
[24] validation-logloss:0.31355 validation-auc:0.96578 validation-aucpr:0.97053
[25] validation-logloss:0.30706 validation-auc:0.96605 validation-aucpr:0.97075
[26] validation-logloss:0.30049 validation-auc:0.96622 validation-aucpr:0.97095
[27] validation-logloss:0.29473 validation-auc:0.96671 validation-aucpr:0.97226
[28] validation-logloss:0.28924 validation-auc:0.96678 validation-aucpr:0.97238
[29] validation-logloss:0.28438 validation-auc:0.96677 validation-aucpr:0.97234
[30] validation-logloss:0.27997 validation-auc:0.96665 validation-aucpr:0.97225
[31] validation-logloss:0.27614 validation-auc:0.96680 validation-aucpr:0.97236
[32] validation-logloss:0.27256 validation-auc:0.96680 validation-aucpr:0.97229
[33] validation-logloss:0.26822 validation-auc:0.96689 validation-aucpr:0.97235
[34] validation-logloss:0.26427 validation-auc:0.96699 validation-aucpr:0.97243
[35] validation-logloss:0.26029 validation-auc:0.96722 validation-aucpr:0.97264
[36] validation-logloss:0.25666 validation-auc:0.96740 validation-aucpr:0.97314
[37] validation-logloss:0.25326 validation-auc:0.96755 validation-aucpr:0.97327
[38] validation-logloss:0.25075 validation-auc:0.96779 validation-aucpr:0.97341
{'best_iteration': '38', 'best_score': '0.9734073712231963'}
Trial 15, Fold 1: Log loss = 0.2507485350769062, Average precision = 0.9734117928534539, ROC-AUC = 0.9677870621298805, Elapsed Time = 5.148564700000861 seconds
Trial 15, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 15, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.65842 validation-auc:0.95149 validation-aucpr:0.95086
[1] validation-logloss:0.62647 validation-auc:0.95926 validation-aucpr:0.95947
[2] validation-logloss:0.59846 validation-auc:0.96257 validation-aucpr:0.96585
[3] validation-logloss:0.57212 validation-auc:0.96383 validation-aucpr:0.96792
[4] validation-logloss:0.54773 validation-auc:0.96486 validation-aucpr:0.96906
[5] validation-logloss:0.52494 validation-auc:0.96559 validation-aucpr:0.96983
[6] validation-logloss:0.50660 validation-auc:0.96684 validation-aucpr:0.97087
[7] validation-logloss:0.48925 validation-auc:0.96658 validation-aucpr:0.97100
[8] validation-logloss:0.47305 validation-auc:0.96670 validation-aucpr:0.97108
[9] validation-logloss:0.45704 validation-auc:0.96662 validation-aucpr:0.97116
[10] validation-logloss:0.44329 validation-auc:0.96684 validation-aucpr:0.97120
[11] validation-logloss:0.42896 validation-auc:0.96697 validation-aucpr:0.97131
[12] validation-logloss:0.41478 validation-auc:0.96729 validation-aucpr:0.97162
[13] validation-logloss:0.40140 validation-auc:0.96790 validation-aucpr:0.97209
[14] validation-logloss:0.38867 validation-auc:0.96862 validation-aucpr:0.97262
[15] validation-logloss:0.37852 validation-auc:0.96859 validation-aucpr:0.97257
[16] validation-logloss:0.36953 validation-auc:0.96876 validation-aucpr:0.97262
[17] validation-logloss:0.35897 validation-auc:0.96898 validation-aucpr:0.97284
[18] validation-logloss:0.35111 validation-auc:0.96894 validation-aucpr:0.97274
[19] validation-logloss:0.34180 validation-auc:0.96937 validation-aucpr:0.97309
[20] validation-logloss:0.33334 validation-auc:0.96950 validation-aucpr:0.97318
[21] validation-logloss:0.32668 validation-auc:0.96942 validation-aucpr:0.97278
[22] validation-logloss:0.31899 validation-auc:0.96946 validation-aucpr:0.97287
[23] validation-logloss:0.31209 validation-auc:0.96957 validation-aucpr:0.97299
[24] validation-logloss:0.30523 validation-auc:0.96952 validation-aucpr:0.97302
[25] validation-logloss:0.29872 validation-auc:0.96960 validation-aucpr:0.97310
[26] validation-logloss:0.29299 validation-auc:0.96954 validation-aucpr:0.97299
[27] validation-logloss:0.28731 validation-auc:0.96943 validation-aucpr:0.97293
[28] validation-logloss:0.28294 validation-auc:0.96945 validation-aucpr:0.97295
[29] validation-logloss:0.27739 validation-auc:0.96971 validation-aucpr:0.97320
[30] validation-logloss:0.27383 validation-auc:0.96964 validation-aucpr:0.97312
[31] validation-logloss:0.26917 validation-auc:0.96979 validation-aucpr:0.97328
[32] validation-logloss:0.26477 validation-auc:0.96995 validation-aucpr:0.97342
[33] validation-logloss:0.26067 validation-auc:0.97011 validation-aucpr:0.97350
[34] validation-logloss:0.25691 validation-auc:0.97002 validation-aucpr:0.97349
[35] validation-logloss:0.25396 validation-auc:0.96996 validation-aucpr:0.97344
[36] validation-logloss:0.25028 validation-auc:0.97007 validation-aucpr:0.97352
[37] validation-logloss:0.24758 validation-auc:0.97012 validation-aucpr:0.97351
[38] validation-logloss:0.24442 validation-auc:0.97013 validation-aucpr:0.97353
{'best_iteration': '38', 'best_score': '0.9735321248399427'}
Trial 15, Fold 2: Log loss = 0.24442360141281588, Average precision = 0.9735330060498169, ROC-AUC = 0.9701298141605572, Elapsed Time = 5.283658000000287 seconds
Trial 15, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 15, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.65832 validation-auc:0.94735 validation-aucpr:0.94594
[1] validation-logloss:0.62628 validation-auc:0.95822 validation-aucpr:0.95927
[2] validation-logloss:0.59758 validation-auc:0.96044 validation-aucpr:0.96377
[3] validation-logloss:0.57100 validation-auc:0.96076 validation-aucpr:0.96423
[4] validation-logloss:0.54634 validation-auc:0.96319 validation-aucpr:0.96671
[5] validation-logloss:0.52340 validation-auc:0.96474 validation-aucpr:0.96794
[6] validation-logloss:0.50501 validation-auc:0.96504 validation-aucpr:0.96844
[7] validation-logloss:0.48776 validation-auc:0.96524 validation-aucpr:0.96854
[8] validation-logloss:0.47153 validation-auc:0.96642 validation-aucpr:0.97066
[9] validation-logloss:0.45753 validation-auc:0.96637 validation-aucpr:0.97066
[10] validation-logloss:0.44218 validation-auc:0.96647 validation-aucpr:0.97062
[11] validation-logloss:0.42917 validation-auc:0.96658 validation-aucpr:0.97066
[12] validation-logloss:0.41731 validation-auc:0.96651 validation-aucpr:0.97079
[13] validation-logloss:0.40377 validation-auc:0.96742 validation-aucpr:0.97225
[14] validation-logloss:0.39155 validation-auc:0.96740 validation-aucpr:0.97226
[15] validation-logloss:0.37977 validation-auc:0.96761 validation-aucpr:0.97247
[16] validation-logloss:0.36887 validation-auc:0.96791 validation-aucpr:0.97275
[17] validation-logloss:0.36036 validation-auc:0.96783 validation-aucpr:0.97260
[18] validation-logloss:0.35255 validation-auc:0.96780 validation-aucpr:0.97270
[19] validation-logloss:0.34339 validation-auc:0.96796 validation-aucpr:0.97288
[20] validation-logloss:0.33573 validation-auc:0.96816 validation-aucpr:0.97303
[21] validation-logloss:0.32893 validation-auc:0.96816 validation-aucpr:0.97295
[22] validation-logloss:0.32092 validation-auc:0.96831 validation-aucpr:0.97308
[23] validation-logloss:0.31510 validation-auc:0.96818 validation-aucpr:0.97299
[24] validation-logloss:0.30834 validation-auc:0.96797 validation-aucpr:0.97177
[25] validation-logloss:0.30204 validation-auc:0.96786 validation-aucpr:0.97175
[26] validation-logloss:0.29599 validation-auc:0.96784 validation-aucpr:0.97112
[27] validation-logloss:0.29120 validation-auc:0.96772 validation-aucpr:0.97089
[28] validation-logloss:0.28621 validation-auc:0.96756 validation-aucpr:0.97044
[29] validation-logloss:0.28159 validation-auc:0.96804 validation-aucpr:0.97077
[30] validation-logloss:0.27739 validation-auc:0.96810 validation-aucpr:0.97078
[31] validation-logloss:0.27294 validation-auc:0.96838 validation-aucpr:0.97240
[32] validation-logloss:0.26865 validation-auc:0.96840 validation-aucpr:0.97250
[33] validation-logloss:0.26559 validation-auc:0.96834 validation-aucpr:0.97245
[34] validation-logloss:0.26225 validation-auc:0.96839 validation-aucpr:0.97248
[35] validation-logloss:0.25853 validation-auc:0.96839 validation-aucpr:0.97246
[36] validation-logloss:0.25559 validation-auc:0.96832 validation-aucpr:0.97241
[37] validation-logloss:0.25227 validation-auc:0.96829 validation-aucpr:0.97240
[38] validation-logloss:0.24930 validation-auc:0.96855 validation-aucpr:0.97255
{'best_iteration': '22', 'best_score': '0.9730788157534537'}
Trial 15, Fold 3: Log loss = 0.24930300686625795, Average precision = 0.9727403636924389, ROC-AUC = 0.9685533162061426, Elapsed Time = 5.271522000000914 seconds
Trial 15, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 15, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[18:31:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[0] validation-logloss:0.65897 validation-auc:0.94989 validation-aucpr:0.95155
[18:31:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[1] validation-logloss:0.63117 validation-auc:0.95327 validation-aucpr:0.95725
[18:31:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[2] validation-logloss:0.60142 validation-auc:0.95973 validation-aucpr:0.96495
[18:31:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[3] validation-logloss:0.57778 validation-auc:0.96069 validation-aucpr:0.96648
[18:31:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[4] validation-logloss:0.55610 validation-auc:0.96158 validation-aucpr:0.96718
[18:31:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[5] validation-logloss:0.53242 validation-auc:0.96334 validation-aucpr:0.96905
[18:31:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[6] validation-logloss:0.51115 validation-auc:0.96442 validation-aucpr:0.97006
[18:31:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[7] validation-logloss:0.49378 validation-auc:0.96480 validation-aucpr:0.97024
[18:31:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[8] validation-logloss:0.47537 validation-auc:0.96499 validation-aucpr:0.97057
[9]	validation-logloss:0.46020	validation-auc:0.96500	validation-aucpr:0.97046
[10]	validation-logloss:0.44416	validation-auc:0.96576	validation-aucpr:0.97118
[11]	validation-logloss:0.43254	validation-auc:0.96503	validation-aucpr:0.97053
[12]	validation-logloss:0.41816	validation-auc:0.96583	validation-aucpr:0.97121
[13]	validation-logloss:0.40543	validation-auc:0.96575	validation-aucpr:0.97129
[14]	validation-logloss:0.39301	validation-auc:0.96582	validation-aucpr:0.97145
[15]	validation-logloss:0.38135	validation-auc:0.96622	validation-aucpr:0.97176
[16]	validation-logloss:0.37215	validation-auc:0.96609	validation-aucpr:0.97171
[17]	validation-logloss:0.36297	validation-auc:0.96645	validation-aucpr:0.97197
[18]	validation-logloss:0.35502	validation-auc:0.96627	validation-aucpr:0.97178
[19]	validation-logloss:0.34777	validation-auc:0.96637	validation-aucpr:0.97183
[20]	validation-logloss:0.33934	validation-auc:0.96656	validation-aucpr:0.97198
[21]	validation-logloss:0.33112	validation-auc:0.96669	validation-aucpr:0.97211
[22]	validation-logloss:0.32464	validation-auc:0.96700	validation-aucpr:0.97231
[23]	validation-logloss:0.31879	validation-auc:0.96692	validation-aucpr:0.97226
[24]	validation-logloss:0.31187	validation-auc:0.96711	validation-aucpr:0.97238
[25]	validation-logloss:0.30665	validation-auc:0.96710	validation-aucpr:0.97238
[26]	validation-logloss:0.30039	validation-auc:0.96737	validation-aucpr:0.97256
[27]	validation-logloss:0.29538	validation-auc:0.96741	validation-aucpr:0.97257
[28]	validation-logloss:0.28964	validation-auc:0.96762	validation-aucpr:0.97277
[29]	validation-logloss:0.28405	validation-auc:0.96786	validation-aucpr:0.97299
[30]	validation-logloss:0.27887	validation-auc:0.96815	validation-aucpr:0.97320
[31]	validation-logloss:0.27356	validation-auc:0.96849	validation-aucpr:0.97349
[32]	validation-logloss:0.26936	validation-auc:0.96871	validation-aucpr:0.97368
[33]	validation-logloss:0.26520	validation-auc:0.96894	validation-aucpr:0.97386
[34]	validation-logloss:0.26242	validation-auc:0.96878	validation-aucpr:0.97373
[35]	validation-logloss:0.25923	validation-auc:0.96891	validation-aucpr:0.97381
[36]	validation-logloss:0.25575	validation-auc:0.96926	validation-aucpr:0.97406
[37]	validation-logloss:0.25193	validation-auc:0.96949	validation-aucpr:0.97423
[38] validation-logloss:0.24844 validation-auc:0.96965 validation-aucpr:0.97438
{'best_iteration': '38', 'best_score': '0.9743752728524171'}
Trial 15, Fold 4: Log loss = 0.24843866113024915, Average precision = 0.9743756236521505, ROC-AUC = 0.9696492950492499, Elapsed Time = 5.2661658999986685 seconds
Trial 15, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 15, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0]	validation-logloss:0.65847	validation-auc:0.95169	validation-aucpr:0.95688
[1]	validation-logloss:0.62713	validation-auc:0.95678	validation-aucpr:0.96259
[2]	validation-logloss:0.59869	validation-auc:0.95860	validation-aucpr:0.96373
[3]	validation-logloss:0.57244	validation-auc:0.95982	validation-aucpr:0.96521
[4]	validation-logloss:0.54877	validation-auc:0.96048	validation-aucpr:0.96587
[5]	validation-logloss:0.52637	validation-auc:0.96112	validation-aucpr:0.96637
[6]	validation-logloss:0.50830	validation-auc:0.96140	validation-aucpr:0.96640
[7]	validation-logloss:0.49137	validation-auc:0.96139	validation-aucpr:0.96649
[8]	validation-logloss:0.47365	validation-auc:0.96215	validation-aucpr:0.96633
[9]	validation-logloss:0.45713	validation-auc:0.96227	validation-aucpr:0.96673
[10]	validation-logloss:0.44398	validation-auc:0.96289	validation-aucpr:0.96704
[11]	validation-logloss:0.43102	validation-auc:0.96309	validation-aucpr:0.96726
[12]	validation-logloss:0.41859	validation-auc:0.96285	validation-aucpr:0.96713
[13]	validation-logloss:0.40581	validation-auc:0.96322	validation-aucpr:0.96747
[14]	validation-logloss:0.39400	validation-auc:0.96304	validation-aucpr:0.96752
[15]	validation-logloss:0.38263	validation-auc:0.96334	validation-aucpr:0.96759
[16]	validation-logloss:0.37368	validation-auc:0.96337	validation-aucpr:0.96750
[17]	validation-logloss:0.36339	validation-auc:0.96368	validation-aucpr:0.96773
[18]	validation-logloss:0.35640	validation-auc:0.96333	validation-aucpr:0.96855
[19]	validation-logloss:0.34906	validation-auc:0.96349	validation-aucpr:0.96862
[20]	validation-logloss:0.34069	validation-auc:0.96379	validation-aucpr:0.96817
[21]	validation-logloss:0.33412	validation-auc:0.96386	validation-aucpr:0.96819
[22]	validation-logloss:0.32766	validation-auc:0.96402	validation-aucpr:0.96839
[23]	validation-logloss:0.32099	validation-auc:0.96407	validation-aucpr:0.96847
[24]	validation-logloss:0.31375	validation-auc:0.96444	validation-aucpr:0.96873
[25]	validation-logloss:0.30850	validation-auc:0.96438	validation-aucpr:0.96865
[26]	validation-logloss:0.30351	validation-auc:0.96451	validation-aucpr:0.96873
[27]	validation-logloss:0.29911	validation-auc:0.96426	validation-aucpr:0.96856
[28]	validation-logloss:0.29369	validation-auc:0.96454	validation-aucpr:0.96886
[29]	validation-logloss:0.28841	validation-auc:0.96467	validation-aucpr:0.96901
[30]	validation-logloss:0.28476	validation-auc:0.96471	validation-aucpr:0.96910
[31]	validation-logloss:0.28098	validation-auc:0.96472	validation-aucpr:0.96904
[32]	validation-logloss:0.27654	validation-auc:0.96476	validation-aucpr:0.96908
[33]	validation-logloss:0.27246	validation-auc:0.96484	validation-aucpr:0.96914
[34]	validation-logloss:0.26870	validation-auc:0.96529	validation-aucpr:0.96950
[35]	validation-logloss:0.26591	validation-auc:0.96539	validation-aucpr:0.96955
[36]	validation-logloss:0.26305	validation-auc:0.96538	validation-aucpr:0.96951
[37]	validation-logloss:0.25959	validation-auc:0.96554	validation-aucpr:0.96980
[38] validation-logloss:0.25668 validation-auc:0.96559 validation-aucpr:0.96999
{'best_iteration': '38', 'best_score': '0.969985383677907'}
Trial 15, Fold 5: Log loss = 0.25668126798836727, Average precision = 0.9701177753994549, ROC-AUC = 0.9655906211099343, Elapsed Time = 6.848654200000965 seconds
Optimization Progress: 16%|#6 | 16/100 [32:52<3:35:03, 153.61s/it]
Trial 16, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 16, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.66463 validation-auc:0.95925 validation-aucpr:0.96434
[1] validation-logloss:0.64010 validation-auc:0.96240 validation-aucpr:0.96801
[2] validation-logloss:0.61513 validation-auc:0.96566 validation-aucpr:0.97118
[3] validation-logloss:0.59295 validation-auc:0.96626 validation-aucpr:0.97152
[4] validation-logloss:0.57390 validation-auc:0.96643 validation-aucpr:0.97140
[5] validation-logloss:0.55627 validation-auc:0.96664 validation-aucpr:0.97154
[6] validation-logloss:0.54065 validation-auc:0.96639 validation-aucpr:0.97129
[7] validation-logloss:0.52491 validation-auc:0.96629 validation-aucpr:0.97117
[8] validation-logloss:0.51049 validation-auc:0.96623 validation-aucpr:0.97103
[9] validation-logloss:0.49658 validation-auc:0.96627 validation-aucpr:0.97104
[10] validation-logloss:0.48341 validation-auc:0.96618 validation-aucpr:0.97090
[11] validation-logloss:0.47140 validation-auc:0.96601 validation-aucpr:0.97074
[12] validation-logloss:0.45757 validation-auc:0.96618 validation-aucpr:0.97104
[13] validation-logloss:0.44432 validation-auc:0.96663 validation-aucpr:0.97148
[14] validation-logloss:0.43193 validation-auc:0.96704 validation-aucpr:0.97187
[15] validation-logloss:0.42049 validation-auc:0.96731 validation-aucpr:0.97216
[16] validation-logloss:0.40985 validation-auc:0.96739 validation-aucpr:0.97221
[17] validation-logloss:0.39931 validation-auc:0.96771 validation-aucpr:0.97255
[18] validation-logloss:0.39106 validation-auc:0.96761 validation-aucpr:0.97246
[19] validation-logloss:0.38149 validation-auc:0.96782 validation-aucpr:0.97265
[20] validation-logloss:0.37406 validation-auc:0.96785 validation-aucpr:0.97269
[21] validation-logloss:0.36550 validation-auc:0.96811 validation-aucpr:0.97293
[22] validation-logloss:0.35742 validation-auc:0.96838 validation-aucpr:0.97314
[23] validation-logloss:0.35108 validation-auc:0.96834 validation-aucpr:0.97322
[24] validation-logloss:0.34373 validation-auc:0.96851 validation-aucpr:0.97339
[25] validation-logloss:0.33679 validation-auc:0.96875 validation-aucpr:0.97360
[26] validation-logloss:0.33135 validation-auc:0.96880 validation-aucpr:0.97366
[27] validation-logloss:0.32601 validation-auc:0.96882 validation-aucpr:0.97366
[28] validation-logloss:0.32013 validation-auc:0.96894 validation-aucpr:0.97375
[29] validation-logloss:0.31433 validation-auc:0.96902 validation-aucpr:0.97402
[30] validation-logloss:0.30993 validation-auc:0.96904 validation-aucpr:0.97402
[31] validation-logloss:0.30471 validation-auc:0.96920 validation-aucpr:0.97417
[32] validation-logloss:0.29986 validation-auc:0.96945 validation-aucpr:0.97437
[33] validation-logloss:0.29584 validation-auc:0.96948 validation-aucpr:0.97437
[34] validation-logloss:0.29097 validation-auc:0.96967 validation-aucpr:0.97451
[35] validation-logloss:0.28652 validation-auc:0.96988 validation-aucpr:0.97472
[36] validation-logloss:0.28229 validation-auc:0.97004 validation-aucpr:0.97488
[37] validation-logloss:0.27841 validation-auc:0.97008 validation-aucpr:0.97492
[38] validation-logloss:0.27517 validation-auc:0.97004 validation-aucpr:0.97489
[39] validation-logloss:0.27166 validation-auc:0.97003 validation-aucpr:0.97494
[40] validation-logloss:0.26821 validation-auc:0.97001 validation-aucpr:0.97493
[41] validation-logloss:0.26478 validation-auc:0.97009 validation-aucpr:0.97500
[42] validation-logloss:0.26168 validation-auc:0.97012 validation-aucpr:0.97503
[43] validation-logloss:0.25852 validation-auc:0.97033 validation-aucpr:0.97517
[44] validation-logloss:0.25582 validation-auc:0.97031 validation-aucpr:0.97521
[45] validation-logloss:0.25351 validation-auc:0.97036 validation-aucpr:0.97522
[46] validation-logloss:0.25072 validation-auc:0.97054 validation-aucpr:0.97533
[47] validation-logloss:0.24813 validation-auc:0.97061 validation-aucpr:0.97538
[48] validation-logloss:0.24560 validation-auc:0.97069 validation-aucpr:0.97552
[49] validation-logloss:0.24375 validation-auc:0.97077 validation-aucpr:0.97556
[50] validation-logloss:0.24169 validation-auc:0.97067 validation-aucpr:0.97547
[51] validation-logloss:0.23988 validation-auc:0.97073 validation-aucpr:0.97555
[52] validation-logloss:0.23763 validation-auc:0.97087 validation-aucpr:0.97565
[53] validation-logloss:0.23565 validation-auc:0.97099 validation-aucpr:0.97578
[54] validation-logloss:0.23405 validation-auc:0.97109 validation-aucpr:0.97584
[55] validation-logloss:0.23195 validation-auc:0.97126 validation-aucpr:0.97598
[56] validation-logloss:0.22980 validation-auc:0.97153 validation-aucpr:0.97620
{'best_iteration': '56', 'best_score': '0.9761988403450483'}
Trial 16, Fold 1: Log loss = 0.22979919630740303, Average precision = 0.9762028387089069, ROC-AUC = 0.9715299285929367, Elapsed Time = 9.99813069999982 seconds
Trial 16, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 16, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.66419 validation-auc:0.95790 validation-aucpr:0.96292
[1] validation-logloss:0.63789 validation-auc:0.96261 validation-aucpr:0.96685
[2] validation-logloss:0.61439 validation-auc:0.96572 validation-aucpr:0.96950
[3] validation-logloss:0.59173 validation-auc:0.96657 validation-aucpr:0.97030
[4] validation-logloss:0.57056 validation-auc:0.96751 validation-aucpr:0.97112
[5] validation-logloss:0.55304 validation-auc:0.96809 validation-aucpr:0.97158
[6] validation-logloss:0.53432 validation-auc:0.96892 validation-aucpr:0.97226
[7] validation-logloss:0.51646 validation-auc:0.96955 validation-aucpr:0.97297
[8] validation-logloss:0.50081 validation-auc:0.96990 validation-aucpr:0.97310
[9] validation-logloss:0.48717 validation-auc:0.96988 validation-aucpr:0.97308
[10] validation-logloss:0.47244 validation-auc:0.97001 validation-aucpr:0.97348
[11] validation-logloss:0.45849 validation-auc:0.97022 validation-aucpr:0.97385
[12] validation-logloss:0.44562 validation-auc:0.97023 validation-aucpr:0.97389
[13] validation-logloss:0.43322 validation-auc:0.97028 validation-aucpr:0.97402
[14] validation-logloss:0.42152 validation-auc:0.97040 validation-aucpr:0.97410
[15] validation-logloss:0.41179 validation-auc:0.97046 validation-aucpr:0.97411
[16] validation-logloss:0.40120 validation-auc:0.97069 validation-aucpr:0.97428
[17] validation-logloss:0.39138 validation-auc:0.97063 validation-aucpr:0.97421
[18] validation-logloss:0.38181 validation-auc:0.97084 validation-aucpr:0.97434
[19] validation-logloss:0.37394 validation-auc:0.97099 validation-aucpr:0.97436
[20] validation-logloss:0.36524 validation-auc:0.97122 validation-aucpr:0.97454
[21] validation-logloss:0.35732 validation-auc:0.97116 validation-aucpr:0.97447
[22] validation-logloss:0.34972 validation-auc:0.97118 validation-aucpr:0.97452
[23] validation-logloss:0.34235 validation-auc:0.97119 validation-aucpr:0.97442
[24] validation-logloss:0.33543 validation-auc:0.97119 validation-aucpr:0.97436
[25] validation-logloss:0.32968 validation-auc:0.97098 validation-aucpr:0.97420
[26] validation-logloss:0.32337 validation-auc:0.97095 validation-aucpr:0.97417
[27] validation-logloss:0.31815 validation-auc:0.97101 validation-aucpr:0.97422
[28] validation-logloss:0.31238 validation-auc:0.97104 validation-aucpr:0.97423
[29] validation-logloss:0.30744 validation-auc:0.97117 validation-aucpr:0.97430
[30] validation-logloss:0.30200 validation-auc:0.97137 validation-aucpr:0.97443
[31] validation-logloss:0.29760 validation-auc:0.97139 validation-aucpr:0.97446
[32] validation-logloss:0.29271 validation-auc:0.97146 validation-aucpr:0.97453
[33] validation-logloss:0.28795 validation-auc:0.97150 validation-aucpr:0.97458
[34] validation-logloss:0.28347 validation-auc:0.97161 validation-aucpr:0.97467
[35] validation-logloss:0.27992 validation-auc:0.97172 validation-aucpr:0.97472
[36] validation-logloss:0.27589 validation-auc:0.97160 validation-aucpr:0.97465
[37] validation-logloss:0.27179 validation-auc:0.97166 validation-aucpr:0.97471
[38] validation-logloss:0.26864 validation-auc:0.97177 validation-aucpr:0.97478
[39] validation-logloss:0.26511 validation-auc:0.97173 validation-aucpr:0.97475
[40] validation-logloss:0.26180 validation-auc:0.97169 validation-aucpr:0.97467
[41] validation-logloss:0.25870 validation-auc:0.97168 validation-aucpr:0.97464
[42] validation-logloss:0.25612 validation-auc:0.97170 validation-aucpr:0.97462
[43] validation-logloss:0.25370 validation-auc:0.97176 validation-aucpr:0.97465
[44] validation-logloss:0.25091 validation-auc:0.97166 validation-aucpr:0.97458
[45] validation-logloss:0.24821 validation-auc:0.97178 validation-aucpr:0.97467
[46] validation-logloss:0.24562 validation-auc:0.97183 validation-aucpr:0.97472
[47] validation-logloss:0.24340 validation-auc:0.97181 validation-aucpr:0.97451
[48] validation-logloss:0.24090 validation-auc:0.97195 validation-aucpr:0.97459
[49] validation-logloss:0.23853 validation-auc:0.97189 validation-aucpr:0.97448
[50] validation-logloss:0.23603 validation-auc:0.97200 validation-aucpr:0.97455
[51] validation-logloss:0.23419 validation-auc:0.97201 validation-aucpr:0.97456
[52] validation-logloss:0.23208 validation-auc:0.97208 validation-aucpr:0.97458
[53] validation-logloss:0.23013 validation-auc:0.97218 validation-aucpr:0.97495
[54] validation-logloss:0.22815 validation-auc:0.97217 validation-aucpr:0.97493
[55] validation-logloss:0.22616 validation-auc:0.97229 validation-aucpr:0.97501
[56] validation-logloss:0.22446 validation-auc:0.97222 validation-aucpr:0.97496
{'best_iteration': '55', 'best_score': '0.9750061463455643'}
Trial 16, Fold 2: Log loss = 0.22445777337154255, Average precision = 0.9749343967660432, ROC-AUC = 0.9722240094739177, Elapsed Time = 10.321142300001156 seconds
Trial 16, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 16, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.66447 validation-auc:0.95985 validation-aucpr:0.96127
[1] validation-logloss:0.63915 validation-auc:0.96178 validation-aucpr:0.96616
[2] validation-logloss:0.61443 validation-auc:0.96493 validation-aucpr:0.96916
[3] validation-logloss:0.59137 validation-auc:0.96618 validation-aucpr:0.97034
[4] validation-logloss:0.56991 validation-auc:0.96786 validation-aucpr:0.97140
[5] validation-logloss:0.55269 validation-auc:0.96819 validation-aucpr:0.97203
[6] validation-logloss:0.53402 validation-auc:0.96886 validation-aucpr:0.97213
[7] validation-logloss:0.51946 validation-auc:0.96807 validation-aucpr:0.97146
[8] validation-logloss:0.50278 validation-auc:0.96827 validation-aucpr:0.97156
[9] validation-logloss:0.48702 validation-auc:0.96851 validation-aucpr:0.97163
[10] validation-logloss:0.47267 validation-auc:0.96812 validation-aucpr:0.97122
[11] validation-logloss:0.46006 validation-auc:0.96846 validation-aucpr:0.97149
[12] validation-logloss:0.44725 validation-auc:0.96851 validation-aucpr:0.97154
[13] validation-logloss:0.43562 validation-auc:0.96812 validation-aucpr:0.97101
[14] validation-logloss:0.42499 validation-auc:0.96814 validation-aucpr:0.97125
[15] validation-logloss:0.41379 validation-auc:0.96825 validation-aucpr:0.97191
[16] validation-logloss:0.40429 validation-auc:0.96851 validation-aucpr:0.97209
[17] validation-logloss:0.39424 validation-auc:0.96858 validation-aucpr:0.97211
[18] validation-logloss:0.38442 validation-auc:0.96930 validation-aucpr:0.97270
[19] validation-logloss:0.37713 validation-auc:0.96914 validation-aucpr:0.97340
[20] validation-logloss:0.36878 validation-auc:0.96926 validation-aucpr:0.97347
[21] validation-logloss:0.36044 validation-auc:0.96933 validation-aucpr:0.97355
[22] validation-logloss:0.35362 validation-auc:0.96937 validation-aucpr:0.97299
[23] validation-logloss:0.34595 validation-auc:0.96949 validation-aucpr:0.97303
[24] validation-logloss:0.33970 validation-auc:0.96955 validation-aucpr:0.97307
[25] validation-logloss:0.33371 validation-auc:0.96962 validation-aucpr:0.97309
[26] validation-logloss:0.32707 validation-auc:0.96997 validation-aucpr:0.97437
[27] validation-logloss:0.32064 validation-auc:0.97014 validation-aucpr:0.97449
[28] validation-logloss:0.31453 validation-auc:0.97027 validation-aucpr:0.97456
[29] validation-logloss:0.30974 validation-auc:0.97025 validation-aucpr:0.97454
[30] validation-logloss:0.30411 validation-auc:0.97055 validation-aucpr:0.97480
[31] validation-logloss:0.29880 validation-auc:0.97075 validation-aucpr:0.97496
[32] validation-logloss:0.29386 validation-auc:0.97082 validation-aucpr:0.97503
[33] validation-logloss:0.28936 validation-auc:0.97073 validation-aucpr:0.97497
[34] validation-logloss:0.28503 validation-auc:0.97063 validation-aucpr:0.97488
[35] validation-logloss:0.28132 validation-auc:0.97075 validation-aucpr:0.97502
[36] validation-logloss:0.27758 validation-auc:0.97083 validation-aucpr:0.97508
[37] validation-logloss:0.27393 validation-auc:0.97087 validation-aucpr:0.97513
[38] validation-logloss:0.26992 validation-auc:0.97103 validation-aucpr:0.97529
[39] validation-logloss:0.26630 validation-auc:0.97099 validation-aucpr:0.97526
[40] validation-logloss:0.26290 validation-auc:0.97116 validation-aucpr:0.97535
[41] validation-logloss:0.25957 validation-auc:0.97125 validation-aucpr:0.97539
[42] validation-logloss:0.25693 validation-auc:0.97123 validation-aucpr:0.97533
[43] validation-logloss:0.25399 validation-auc:0.97126 validation-aucpr:0.97538
[44] validation-logloss:0.25100 validation-auc:0.97127 validation-aucpr:0.97540
[45] validation-logloss:0.24815 validation-auc:0.97127 validation-aucpr:0.97540
[46] validation-logloss:0.24530 validation-auc:0.97138 validation-aucpr:0.97551
[47] validation-logloss:0.24273 validation-auc:0.97148 validation-aucpr:0.97557
[48] validation-logloss:0.24018 validation-auc:0.97158 validation-aucpr:0.97571
[49] validation-logloss:0.23780 validation-auc:0.97166 validation-aucpr:0.97574
[50] validation-logloss:0.23546 validation-auc:0.97173 validation-aucpr:0.97578
[51] validation-logloss:0.23339 validation-auc:0.97169 validation-aucpr:0.97576
[52] validation-logloss:0.23175 validation-auc:0.97174 validation-aucpr:0.97576
[53] validation-logloss:0.22975 validation-auc:0.97176 validation-aucpr:0.97578
[54] validation-logloss:0.22805 validation-auc:0.97162 validation-aucpr:0.97565
[55] validation-logloss:0.22612 validation-auc:0.97168 validation-aucpr:0.97566
[56] validation-logloss:0.22442 validation-auc:0.97168 validation-aucpr:0.97564
{'best_iteration': '53', 'best_score': '0.975780860731931'}
Trial 16, Fold 3: Log loss = 0.2244200578116086, Average precision = 0.9756433442739769, ROC-AUC = 0.971681897747301, Elapsed Time = 10.35664270000052 seconds
Trial 16, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 16, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[per-iteration eval log, rounds 0-56: validation-logloss 0.66455 → 0.23063, validation-auc 0.95339 → 0.97041, validation-aucpr 0.95766 → 0.97505]
{'best_iteration': '55', 'best_score': '0.9751346047427522'}
Trial 16, Fold 4: Log loss = 0.23063088965856088, Average precision = 0.9750531428006062, ROC-AUC = 0.9704149141254818, Elapsed Time = 10.488736199999039 seconds
Trial 16, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 16, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[per-iteration eval log, rounds 0-56: validation-logloss 0.66464 → 0.23209, validation-auc 0.95664 → 0.97098, validation-aucpr 0.96172 → 0.97403]
{'best_iteration': '56', 'best_score': '0.9740272707349765'}
Trial 16, Fold 5: Log loss = 0.23209462109948828, Average precision = 0.9740309172331549, ROC-AUC = 0.970983482897646, Elapsed Time = 10.367234399998779 seconds
Optimization Progress: 17% | 17/100 [33:52<2:53:29, 125.41s/it]
Trial 17, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 17, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[per-iteration eval log, rounds 0-41: validation-logloss 0.68070 → 0.39648, validation-auc 0.94898 → 0.97007, validation-aucpr 0.92672 → 0.97473]
{'best_iteration': '40', 'best_score': '0.9747813685044032'}
Trial 17, Fold 1: Log loss = 0.39647990137081396, Average precision = 0.9747384831870218, ROC-AUC = 0.9700692413119016, Elapsed Time = 1.3540717999985645 seconds
Trial 17, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 17, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[per-iteration eval log, rounds 0-41: validation-logloss 0.68110 → 0.39626, validation-auc 0.94313 → 0.97022, validation-aucpr 0.91708 → 0.97344]
{'best_iteration': '40', 'best_score': '0.9735317036055212'}
Trial 17, Fold 2: Log loss = 0.3962613455256332, Average precision = 0.9732858310430789, ROC-AUC = 0.9702157034071321, Elapsed Time = 1.8218703000002279 seconds
Trial 17, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 17, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[per-iteration eval log, rounds 0-41: validation-logloss 0.68088 → 0.39307, validation-auc 0.94306 → 0.97211, validation-aucpr 0.92744 → 0.97601]
{'best_iteration': '23', 'best_score': '0.9760153538535357'}
Trial 17, Fold 3: Log loss = 0.3930701404115779, Average precision = 0.9760085592503406, ROC-AUC = 0.9721139399601276, Elapsed Time = 1.6774910000003729 seconds
Trial 17, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 17, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[per-iteration eval log, rounds 0-41: validation-logloss 0.68092 → 0.39658, validation-auc 0.93976 → 0.97035, validation-aucpr 0.92820 → 0.97509]
{'best_iteration': '33', 'best_score': '0.9753345323573815'}
Trial 17, Fold 4: Log loss = 0.3965803346159523, Average precision = 0.9750878709584755, ROC-AUC = 0.9703468839340855, Elapsed Time = 1.805470499999501 seconds
Trial 17, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 17, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[per-iteration eval log, rounds 0-41: validation-logloss 0.68100 → 0.39983, validation-auc 0.93722 → 0.96884, validation-aucpr 0.91276 → 0.97300]
{'best_iteration': '39', 'best_score': '0.9730560035632158'}
Trial 17, Fold 5: Log loss = 0.3998339284718829, Average precision = 0.9729822569312114, ROC-AUC = 0.9688389721866116, Elapsed Time = 1.7565878000004886 seconds
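Each trial above yields one (log loss, average precision, ROC-AUC) triple per fold, which the Optuna objective must reduce to a single value (or a tuple, for multi-objective Pareto search) before the sampler can compare trials. A minimal aggregation sketch using the Trial 17 fold metrics printed above (the exact quantity returned by this notebook's objective function is an assumption here):

```python
import numpy as np

# Per-fold validation metrics transcribed from the Trial 17 summary lines.
log_losses = [0.39648, 0.39626, 0.39307, 0.39658, 0.39983]
roc_aucs = [0.97007, 0.97022, 0.97211, 0.97035, 0.96884]

# The sampler compares trials on these aggregates: mean CV log loss
# (to minimize) and mean CV ROC-AUC (to maximize).
mean_lloss = float(np.mean(log_losses))
mean_auc = float(np.mean(roc_aucs))
print(f"Trial 17: mean log loss = {mean_lloss:.5f}, "
      f"mean ROC-AUC = {mean_auc:.5f}")
```

Averaging across folds damps the fold-to-fold noise visible above (e.g. Fold 3's AUC of 0.9721 vs. Fold 5's 0.9688), so the sampler ranks trials on a more stable estimate.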
Optimization Progress: 18% | 18/100 [34:09<2:06:46, 92.76s/it]
Trial 18, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 18, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[per-iteration eval log, rounds 0-25: validation-logloss 0.66415 → 0.34727, validation-auc 0.90117 → 0.95183, validation-aucpr 0.89825 → 0.95980]
{'best_iteration': '25', 'best_score': '0.9597999490308322'}
Trial 18, Fold 1: Log loss = 0.34726644783133376, Average precision = 0.9578838551170721, ROC-AUC = 0.9518273415207698, Elapsed Time = 1.8354823999998189 seconds
Trial 18, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 18, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[per-iteration eval log, rounds 0-25: validation-logloss 0.66396 → 0.34717, validation-auc 0.90215 → 0.94943, validation-aucpr 0.89116 → 0.95398]
{'best_iteration': '25', 'best_score': '0.9539815995939754'}
Trial 18, Fold 2: Log loss = 0.34717386262269745, Average precision = 0.9518878146326276, ROC-AUC = 0.9494315679704128, Elapsed Time = 2.105443700000251 seconds
Trial 18, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 18, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.66343 validation-auc:0.90742 validation-aucpr:0.90215
[1] validation-logloss:0.63726 validation-auc:0.92023 validation-aucpr:0.91689
[2] validation-logloss:0.61301 validation-auc:0.92445 validation-aucpr:0.92579
[3] validation-logloss:0.58562 validation-auc:0.94587 validation-aucpr:0.95187
[4] validation-logloss:0.56645 validation-auc:0.94688 validation-aucpr:0.95475
[5] validation-logloss:0.54810 validation-auc:0.94739 validation-aucpr:0.95548
[6] validation-logloss:0.52958 validation-auc:0.94960 validation-aucpr:0.95656
[7] validation-logloss:0.51453 validation-auc:0.95045 validation-aucpr:0.95695
[8] validation-logloss:0.49960 validation-auc:0.94924 validation-aucpr:0.95620
[9] validation-logloss:0.48648 validation-auc:0.94946 validation-aucpr:0.95648
[10] validation-logloss:0.47340 validation-auc:0.94926 validation-aucpr:0.95610
[11] validation-logloss:0.45838 validation-auc:0.95025 validation-aucpr:0.95725
[12] validation-logloss:0.44830 validation-auc:0.95108 validation-aucpr:0.95767
[13] validation-logloss:0.43866 validation-auc:0.95080 validation-aucpr:0.95763
[14] validation-logloss:0.42867 validation-auc:0.95312 validation-aucpr:0.95869
[15] validation-logloss:0.41645 validation-auc:0.95364 validation-aucpr:0.95942
[16] validation-logloss:0.40939 validation-auc:0.95233 validation-aucpr:0.95869
[17] validation-logloss:0.40296 validation-auc:0.95232 validation-aucpr:0.95871
[18] validation-logloss:0.39653 validation-auc:0.95205 validation-aucpr:0.95858
[19] validation-logloss:0.38636 validation-auc:0.95208 validation-aucpr:0.95869
[20] validation-logloss:0.37795 validation-auc:0.95269 validation-aucpr:0.95915
[21] validation-logloss:0.37281 validation-auc:0.95225 validation-aucpr:0.95900
[22] validation-logloss:0.36516 validation-auc:0.95242 validation-aucpr:0.95941
[23] validation-logloss:0.35774 validation-auc:0.95275 validation-aucpr:0.95993
[24] validation-logloss:0.35191 validation-auc:0.95430 validation-aucpr:0.96066
[25] validation-logloss:0.34656 validation-auc:0.95432 validation-aucpr:0.96064
{'best_iteration': '24', 'best_score': '0.9606556106886179'}
Trial 18, Fold 3: Log loss = 0.346561476570241, Average precision = 0.9575752170927161, ROC-AUC = 0.9543186464061794, Elapsed Time = 2.0811799000002793 seconds
Trial 18, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 18, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.66376 validation-auc:0.90244 validation-aucpr:0.89434
[1] validation-logloss:0.63805 validation-auc:0.91080 validation-aucpr:0.90586
[2] validation-logloss:0.60809 validation-auc:0.93520 validation-aucpr:0.94036
[3] validation-logloss:0.58689 validation-auc:0.93533 validation-aucpr:0.94122
[4] validation-logloss:0.56326 validation-auc:0.93942 validation-aucpr:0.94666
[5] validation-logloss:0.54330 validation-auc:0.94206 validation-aucpr:0.94668
[6] validation-logloss:0.52710 validation-auc:0.94376 validation-aucpr:0.94962
[7] validation-logloss:0.51238 validation-auc:0.94520 validation-aucpr:0.95045
[8] validation-logloss:0.49729 validation-auc:0.94402 validation-aucpr:0.94893
[9] validation-logloss:0.48228 validation-auc:0.94413 validation-aucpr:0.94771
[10] validation-logloss:0.46944 validation-auc:0.94460 validation-aucpr:0.94825
[11] validation-logloss:0.45466 validation-auc:0.94543 validation-aucpr:0.94882
[12] validation-logloss:0.44470 validation-auc:0.94546 validation-aucpr:0.94878
[13] validation-logloss:0.43541 validation-auc:0.94536 validation-aucpr:0.95016
[14] validation-logloss:0.42562 validation-auc:0.94677 validation-aucpr:0.95065
[15] validation-logloss:0.41795 validation-auc:0.94678 validation-aucpr:0.95062
[16] validation-logloss:0.40750 validation-auc:0.94818 validation-aucpr:0.95163
[17] validation-logloss:0.39838 validation-auc:0.95058 validation-aucpr:0.95601
[18] validation-logloss:0.39167 validation-auc:0.95084 validation-aucpr:0.95647
[19] validation-logloss:0.38258 validation-auc:0.95178 validation-aucpr:0.95771
[20] validation-logloss:0.37439 validation-auc:0.95193 validation-aucpr:0.95781
[21] validation-logloss:0.36659 validation-auc:0.95239 validation-aucpr:0.95863
[22] validation-logloss:0.35969 validation-auc:0.95268 validation-aucpr:0.95894
[23] validation-logloss:0.35511 validation-auc:0.95094 validation-aucpr:0.95747
[24] validation-logloss:0.34843 validation-auc:0.95201 validation-aucpr:0.95902
[25] validation-logloss:0.34473 validation-auc:0.95119 validation-aucpr:0.95824
{'best_iteration': '24', 'best_score': '0.9590165483824703'}
Trial 18, Fold 4: Log loss = 0.34472849653115334, Average precision = 0.9551434753025292, ROC-AUC = 0.9511935908257451, Elapsed Time = 2.024505199999112 seconds
Trial 18, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 18, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.66397 validation-auc:0.90168 validation-aucpr:0.89628
[1] validation-logloss:0.63881 validation-auc:0.90722 validation-aucpr:0.90642
[2] validation-logloss:0.60858 validation-auc:0.93393 validation-aucpr:0.94129
[3] validation-logloss:0.58734 validation-auc:0.93425 validation-aucpr:0.94134
[4] validation-logloss:0.56411 validation-auc:0.93753 validation-aucpr:0.94571
[5] validation-logloss:0.54450 validation-auc:0.94128 validation-aucpr:0.94725
[6] validation-logloss:0.52910 validation-auc:0.94241 validation-aucpr:0.94920
[7] validation-logloss:0.51425 validation-auc:0.94460 validation-aucpr:0.95063
[8] validation-logloss:0.49960 validation-auc:0.94427 validation-aucpr:0.95206
[9] validation-logloss:0.48696 validation-auc:0.94469 validation-aucpr:0.95235
[10] validation-logloss:0.47403 validation-auc:0.94451 validation-aucpr:0.95195
[11] validation-logloss:0.46377 validation-auc:0.94506 validation-aucpr:0.95277
[12] validation-logloss:0.45471 validation-auc:0.94534 validation-aucpr:0.95304
[13] validation-logloss:0.44440 validation-auc:0.94527 validation-aucpr:0.95292
[14] validation-logloss:0.43473 validation-auc:0.94494 validation-aucpr:0.95256
[15] validation-logloss:0.42731 validation-auc:0.94605 validation-aucpr:0.95298
[16] validation-logloss:0.41653 validation-auc:0.94699 validation-aucpr:0.95416
[17] validation-logloss:0.41046 validation-auc:0.94688 validation-aucpr:0.95400
[18] validation-logloss:0.40393 validation-auc:0.94721 validation-aucpr:0.95437
[19] validation-logloss:0.39421 validation-auc:0.94785 validation-aucpr:0.95517
[20] validation-logloss:0.38551 validation-auc:0.94795 validation-aucpr:0.95517
[21] validation-logloss:0.37731 validation-auc:0.94846 validation-aucpr:0.95576
[22] validation-logloss:0.37014 validation-auc:0.94856 validation-aucpr:0.95583
[23] validation-logloss:0.36571 validation-auc:0.94839 validation-aucpr:0.95603
[24] validation-logloss:0.35931 validation-auc:0.94862 validation-aucpr:0.95636
[25] validation-logloss:0.35366 validation-auc:0.94931 validation-aucpr:0.95701
{'best_iteration': '25', 'best_score': '0.9570113734803808'}
Trial 18, Fold 5: Log loss = 0.35366369208012155, Average precision = 0.9551741061597449, ROC-AUC = 0.9493108893967263, Elapsed Time = 2.0203742000012426 seconds
Optimization Progress: 19%|#9 | 19/100 [34:27<1:35:04, 70.43s/it]
Trial 19, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 19, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.66687 validation-auc:0.95022 validation-aucpr:0.94356
[1] validation-logloss:0.64264 validation-auc:0.96143 validation-aucpr:0.96320
[2] validation-logloss:0.62001 validation-auc:0.96411 validation-aucpr:0.96722
[3] validation-logloss:0.60135 validation-auc:0.96370 validation-aucpr:0.96627
[4] validation-logloss:0.58193 validation-auc:0.96485 validation-aucpr:0.96795
[5] validation-logloss:0.56536 validation-auc:0.96431 validation-aucpr:0.97006
[6] validation-logloss:0.55004 validation-auc:0.96422 validation-aucpr:0.96988
[7] validation-logloss:0.53504 validation-auc:0.96411 validation-aucpr:0.96973
[8] validation-logloss:0.52199 validation-auc:0.96384 validation-aucpr:0.96953
[9] validation-logloss:0.50660 validation-auc:0.96426 validation-aucpr:0.96991
[10] validation-logloss:0.49394 validation-auc:0.96475 validation-aucpr:0.97020
[11] validation-logloss:0.48017 validation-auc:0.96500 validation-aucpr:0.97073
[12] validation-logloss:0.46709 validation-auc:0.96545 validation-aucpr:0.97107
{'best_iteration': '12', 'best_score': '0.9710723987838434'}
Trial 19, Fold 1: Log loss = 0.4670901924666118, Average precision = 0.9709924434484682, ROC-AUC = 0.9654467810661351, Elapsed Time = 0.4070884999982809 seconds
Trial 19, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 19, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.66707 validation-auc:0.95571 validation-aucpr:0.94803
[1] validation-logloss:0.64327 validation-auc:0.96257 validation-aucpr:0.96563
[2] validation-logloss:0.62058 validation-auc:0.96486 validation-aucpr:0.96956
[3] validation-logloss:0.59939 validation-auc:0.96540 validation-aucpr:0.97003
[4] validation-logloss:0.58172 validation-auc:0.96602 validation-aucpr:0.97049
[5] validation-logloss:0.56536 validation-auc:0.96716 validation-aucpr:0.97131
[6] validation-logloss:0.54724 validation-auc:0.96808 validation-aucpr:0.97207
[7] validation-logloss:0.53238 validation-auc:0.96789 validation-aucpr:0.97190
[8] validation-logloss:0.51629 validation-auc:0.96784 validation-aucpr:0.97193
[9] validation-logloss:0.50115 validation-auc:0.96809 validation-aucpr:0.97212
[10] validation-logloss:0.48679 validation-auc:0.96844 validation-aucpr:0.97242
[11] validation-logloss:0.47526 validation-auc:0.96839 validation-aucpr:0.97235
[12] validation-logloss:0.46397 validation-auc:0.96831 validation-aucpr:0.97227
{'best_iteration': '10', 'best_score': '0.9724188374268375'}
Trial 19, Fold 2: Log loss = 0.46397375509965866, Average precision = 0.9720977182252877, ROC-AUC = 0.9683108134561439, Elapsed Time = 0.4283958000014536 seconds
Trial 19, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 19, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.66689 validation-auc:0.95270 validation-aucpr:0.93948
[1] validation-logloss:0.64245 validation-auc:0.96317 validation-aucpr:0.96668
[2] validation-logloss:0.62016 validation-auc:0.96576 validation-aucpr:0.97077
[3] validation-logloss:0.60098 validation-auc:0.96729 validation-aucpr:0.97206
[4] validation-logloss:0.58192 validation-auc:0.96756 validation-aucpr:0.97243
[5] validation-logloss:0.56555 validation-auc:0.96766 validation-aucpr:0.97257
[6] validation-logloss:0.54955 validation-auc:0.96798 validation-aucpr:0.97310
[7] validation-logloss:0.53546 validation-auc:0.96799 validation-aucpr:0.97279
[8] validation-logloss:0.52131 validation-auc:0.96808 validation-aucpr:0.97284
[9] validation-logloss:0.50854 validation-auc:0.96782 validation-aucpr:0.97257
[10] validation-logloss:0.49354 validation-auc:0.96826 validation-aucpr:0.97303
[11] validation-logloss:0.47939 validation-auc:0.96866 validation-aucpr:0.97336
[12] validation-logloss:0.46681 validation-auc:0.96887 validation-aucpr:0.97360
{'best_iteration': '12', 'best_score': '0.9735955391456544'}
Trial 19, Fold 3: Log loss = 0.4668126306623334, Average precision = 0.9733190556635031, ROC-AUC = 0.9688660991177973, Elapsed Time = 0.5241953000004287 seconds
Trial 19, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 19, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.66686 validation-auc:0.95712 validation-aucpr:0.95982
[1] validation-logloss:0.64275 validation-auc:0.96260 validation-aucpr:0.96550
[2] validation-logloss:0.61980 validation-auc:0.96674 validation-aucpr:0.97187
[3] validation-logloss:0.60129 validation-auc:0.96496 validation-aucpr:0.97056
[4] validation-logloss:0.58343 validation-auc:0.96491 validation-aucpr:0.97047
[5] validation-logloss:0.56449 validation-auc:0.96523 validation-aucpr:0.97089
[6] validation-logloss:0.54837 validation-auc:0.96564 validation-aucpr:0.97117
[7] validation-logloss:0.53171 validation-auc:0.96572 validation-aucpr:0.97123
[8] validation-logloss:0.51810 validation-auc:0.96609 validation-aucpr:0.97154
[9] validation-logloss:0.50495 validation-auc:0.96592 validation-aucpr:0.97137
[10] validation-logloss:0.49034 validation-auc:0.96640 validation-aucpr:0.97185
[11] validation-logloss:0.47681 validation-auc:0.96642 validation-aucpr:0.97190
[12] validation-logloss:0.46528 validation-auc:0.96672 validation-aucpr:0.97205
{'best_iteration': '12', 'best_score': '0.9720525847713024'}
Trial 19, Fold 4: Log loss = 0.46527833454358125, Average precision = 0.9719904271170401, ROC-AUC = 0.9667181848203704, Elapsed Time = 0.5389599999998609 seconds
Trial 19, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 19, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.66701 validation-auc:0.95208 validation-aucpr:0.94348
[1] validation-logloss:0.64323 validation-auc:0.96138 validation-aucpr:0.96223
[2] validation-logloss:0.62083 validation-auc:0.96242 validation-aucpr:0.96299
[3] validation-logloss:0.60224 validation-auc:0.96229 validation-aucpr:0.96498
[4] validation-logloss:0.58241 validation-auc:0.96338 validation-aucpr:0.96563
[5] validation-logloss:0.56594 validation-auc:0.96345 validation-aucpr:0.96828
[6] validation-logloss:0.54808 validation-auc:0.96453 validation-aucpr:0.96897
[7] validation-logloss:0.53136 validation-auc:0.96520 validation-aucpr:0.96955
[8] validation-logloss:0.51735 validation-auc:0.96536 validation-aucpr:0.96962
[9] validation-logloss:0.50206 validation-auc:0.96621 validation-aucpr:0.97033
[10] validation-logloss:0.48779 validation-auc:0.96686 validation-aucpr:0.97107
[11] validation-logloss:0.47446 validation-auc:0.96715 validation-aucpr:0.97126
[12] validation-logloss:0.46176 validation-auc:0.96744 validation-aucpr:0.97144
{'best_iteration': '12', 'best_score': '0.9714434035775554'}
Trial 19, Fold 5: Log loss = 0.46175643038908776, Average precision = 0.9713472304358971, ROC-AUC = 0.9674423945153559, Elapsed Time = 0.4952616999999009 seconds
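Each fold summary reports the same three scores on the held-out fold. A minimal sketch of how they can be computed with the scikit-learn metrics this notebook imports — `y_val` and `p_val` below are synthetic stand-ins for the fold's true labels and the booster's predicted probabilities at its best iteration:

```python
import numpy as np
from sklearn.metrics import average_precision_score, log_loss, roc_auc_score

rng = np.random.default_rng(1)
y_val = rng.integers(0, 2, size=500)
# Stand-in for model probabilities, e.g. predict_proba(X_val)[:, 1]:
# positives centered near 0.85, negatives near 0.15, clipped to (0, 1)
p_val = np.clip(y_val * 0.7 + rng.normal(0.15, 0.2, size=500), 0.01, 0.99)

lloss = log_loss(y_val, p_val)                 # validation-logloss
ap = average_precision_score(y_val, p_val)     # validation-aucpr analogue
auc = roc_auc_score(y_val, p_val)              # validation-auc
print(f"Log loss = {lloss}, Average precision = {ap}, ROC-AUC = {auc}")
```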
Optimization Progress: 20%|## | 20/100 [34:37<1:09:46, 52.33s/it]
Trial 20, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 20, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[18:33:33] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[0] validation-logloss:0.66078 validation-auc:0.93642 validation-aucpr:0.93490
[1] validation-logloss:0.63086 validation-auc:0.94541 validation-aucpr:0.94612
[2] validation-logloss:0.60533 validation-auc:0.94667 validation-aucpr:0.94864
[3] validation-logloss:0.57494 validation-auc:0.96064 validation-aucpr:0.96468
[4] validation-logloss:0.55402 validation-auc:0.95961 validation-aucpr:0.96464
[5] validation-logloss:0.53391 validation-auc:0.95974 validation-aucpr:0.96486
[6] validation-logloss:0.51407 validation-auc:0.96001 validation-aucpr:0.96500
[7] validation-logloss:0.49586 validation-auc:0.96079 validation-aucpr:0.96561
[8] validation-logloss:0.48017 validation-auc:0.96084 validation-aucpr:0.96560
[9] validation-logloss:0.46509 validation-auc:0.96108 validation-aucpr:0.96617
[10] validation-logloss:0.45107 validation-auc:0.96107 validation-aucpr:0.96614
[11] validation-logloss:0.43329 validation-auc:0.96356 validation-aucpr:0.96881
[12] validation-logloss:0.42068 validation-auc:0.96355 validation-aucpr:0.96875
[13] validation-logloss:0.41054 validation-auc:0.96313 validation-aucpr:0.96844
[14] validation-logloss:0.40002 validation-auc:0.96317 validation-aucpr:0.96835
[15] validation-logloss:0.39087 validation-auc:0.96302 validation-aucpr:0.96811
[16] validation-logloss:0.38129 validation-auc:0.96306 validation-aucpr:0.96809
[17] validation-logloss:0.37278 validation-auc:0.96305 validation-aucpr:0.96805
[18] validation-logloss:0.36113 validation-auc:0.96437 validation-aucpr:0.96946
[19] validation-logloss:0.35383 validation-auc:0.96429 validation-aucpr:0.96925
[20] validation-logloss:0.34689 validation-auc:0.96440 validation-aucpr:0.96929
[21] validation-logloss:0.34052 validation-auc:0.96455 validation-aucpr:0.96938
[22] validation-logloss:0.33421 validation-auc:0.96457 validation-aucpr:0.96940
[23] validation-logloss:0.32819 validation-auc:0.96479 validation-aucpr:0.96961
[24] validation-logloss:0.32325 validation-auc:0.96472 validation-aucpr:0.96951
[25] validation-logloss:0.31785 validation-auc:0.96483 validation-aucpr:0.96957
[26] validation-logloss:0.31041 validation-auc:0.96546 validation-aucpr:0.97028
[27] validation-logloss:0.30611 validation-auc:0.96549 validation-aucpr:0.97029
[28] validation-logloss:0.30186 validation-auc:0.96545 validation-aucpr:0.97023
[29] validation-logloss:0.29810 validation-auc:0.96541 validation-aucpr:0.97024
[30] validation-logloss:0.29178 validation-auc:0.96594 validation-aucpr:0.97077
[31] validation-logloss:0.28543 validation-auc:0.96636 validation-aucpr:0.97122
[32] validation-logloss:0.28206 validation-auc:0.96644 validation-aucpr:0.97130
[33] validation-logloss:0.27896 validation-auc:0.96642 validation-aucpr:0.97130
[34] validation-logloss:0.27338 validation-auc:0.96681 validation-aucpr:0.97168
[35] validation-logloss:0.27077 validation-auc:0.96678 validation-aucpr:0.97166
[36] validation-logloss:0.26814 validation-auc:0.96688 validation-aucpr:0.97174
[37] validation-logloss:0.26585 validation-auc:0.96692 validation-aucpr:0.97175
[38] validation-logloss:0.26370 validation-auc:0.96694 validation-aucpr:0.97174
[39] validation-logloss:0.26158 validation-auc:0.96692 validation-aucpr:0.97173
[40] validation-logloss:0.25936 validation-auc:0.96701 validation-aucpr:0.97177
[41] validation-logloss:0.25734 validation-auc:0.96702 validation-aucpr:0.97172
[42] validation-logloss:0.25586 validation-auc:0.96699 validation-aucpr:0.97169
[43] validation-logloss:0.25194 validation-auc:0.96735 validation-aucpr:0.97211
[44] validation-logloss:0.25013 validation-auc:0.96746 validation-aucpr:0.97234
[45] validation-logloss:0.24669 validation-auc:0.96770 validation-aucpr:0.97263
[46] validation-logloss:0.24553 validation-auc:0.96762 validation-aucpr:0.97256
{'best_iteration': '45', 'best_score': '0.9726338681129421'}
Trial 20, Fold 1: Log loss = 0.24552598115549665, Average precision = 0.9725694283267494, ROC-AUC = 0.9676181335226051, Elapsed Time = 6.571905600001628 seconds
Trial 20, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 20, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.66057 validation-auc:0.93451 validation-aucpr:0.93073
[1] validation-logloss:0.63085 validation-auc:0.94389 validation-aucpr:0.94285
[2] validation-logloss:0.60271 validation-auc:0.95107 validation-aucpr:0.95059
[3] validation-logloss:0.57773 validation-auc:0.95352 validation-aucpr:0.95378
[4] validation-logloss:0.55508 validation-auc:0.95537 validation-aucpr:0.95697
[5] validation-logloss:0.53442 validation-auc:0.95650 validation-aucpr:0.95855
[6] validation-logloss:0.51557 validation-auc:0.95651 validation-aucpr:0.95845
[7] validation-logloss:0.49883 validation-auc:0.95678 validation-aucpr:0.95940
[8] validation-logloss:0.47874 validation-auc:0.96128 validation-aucpr:0.96481
[9] validation-logloss:0.45925 validation-auc:0.96369 validation-aucpr:0.96760
[10] validation-logloss:0.44154 validation-auc:0.96490 validation-aucpr:0.96880
[11] validation-logloss:0.42830 validation-auc:0.96492 validation-aucpr:0.96885
[12] validation-logloss:0.41591 validation-auc:0.96504 validation-aucpr:0.96889
[13] validation-logloss:0.40452 validation-auc:0.96509 validation-aucpr:0.96886
[14] validation-logloss:0.39408 validation-auc:0.96521 validation-aucpr:0.96893
[15] validation-logloss:0.38451 validation-auc:0.96535 validation-aucpr:0.96893
[16] validation-logloss:0.37482 validation-auc:0.96550 validation-aucpr:0.96910
[17] validation-logloss:0.36309 validation-auc:0.96622 validation-aucpr:0.96992
[18] validation-logloss:0.35479 validation-auc:0.96626 validation-aucpr:0.96988
[19] validation-logloss:0.34711 validation-auc:0.96620 validation-aucpr:0.96991
[18:33:42] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[20] validation-logloss:0.33973 validation-auc:0.96637 validation-aucpr:0.97002
[18:33:43] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[21] validation-logloss:0.33404 validation-auc:0.96625 validation-aucpr:0.96975
[18:33:43] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[22] validation-logloss:0.32785 validation-auc:0.96626 validation-aucpr:0.96971
[18:33:43] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[23] validation-logloss:0.31884 validation-auc:0.96690 validation-aucpr:0.97035
[18:33:43] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[24] validation-logloss:0.31320 validation-auc:0.96702 validation-aucpr:0.97050
[18:33:43] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[25] validation-logloss:0.30801 validation-auc:0.96707 validation-aucpr:0.97050
[18:33:43] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[26] validation-logloss:0.30336 validation-auc:0.96715 validation-aucpr:0.97077
[18:33:44] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[27] validation-logloss:0.29657 validation-auc:0.96759 validation-aucpr:0.97115
[18:33:44] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[28] validation-logloss:0.29018 validation-auc:0.96785 validation-aucpr:0.97144
[18:33:44] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[29] validation-logloss:0.28390 validation-auc:0.96818 validation-aucpr:0.97173
[18:33:44] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[30] validation-logloss:0.27846 validation-auc:0.96848 validation-aucpr:0.97198
[18:33:44] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[31] validation-logloss:0.27462 validation-auc:0.96870 validation-aucpr:0.97214
[18:33:44] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[32] validation-logloss:0.27158 validation-auc:0.96864 validation-aucpr:0.97203
[18:33:45] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[33] validation-logloss:0.26840 validation-auc:0.96888 validation-aucpr:0.97220
[18:33:45] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[34] validation-logloss:0.26537 validation-auc:0.96899 validation-aucpr:0.97226
[18:33:45] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[35] validation-logloss:0.26253 validation-auc:0.96903 validation-aucpr:0.97228
[18:33:45] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[36] validation-logloss:0.26018 validation-auc:0.96891 validation-aucpr:0.97220
[18:33:45] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[37] validation-logloss:0.25560 validation-auc:0.96918 validation-aucpr:0.97246
[18:33:46] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[38] validation-logloss:0.25108 validation-auc:0.96952 validation-aucpr:0.97278
[18:33:46] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[39] validation-logloss:0.24915 validation-auc:0.96949 validation-aucpr:0.97273
[18:33:46] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[40] validation-logloss:0.24581 validation-auc:0.96956 validation-aucpr:0.97279
[18:33:46] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[41] validation-logloss:0.24229 validation-auc:0.96974 validation-aucpr:0.97296
[18:33:47] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[42] validation-logloss:0.23889 validation-auc:0.96996 validation-aucpr:0.97312
[18:33:47] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[43] validation-logloss:0.23603 validation-auc:0.97014 validation-aucpr:0.97337
[18:33:47] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[44] validation-logloss:0.23436 validation-auc:0.97013 validation-aucpr:0.97335
[18:33:47] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[45] validation-logloss:0.23255 validation-auc:0.97022 validation-aucpr:0.97340
[18:33:48] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[46] validation-logloss:0.23098 validation-auc:0.97024 validation-aucpr:0.97340
{'best_iteration': '45', 'best_score': '0.9734037572532945'}
Trial 20, Fold 2: Log loss = 0.2309817435022436, Average precision = 0.973403500713181, ROC-AUC = 0.9702374792143471, Elapsed Time = 8.763767099999313 seconds
Trial 20, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 20, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0]	validation-logloss:0.66043	validation-auc:0.93767	validation-aucpr:0.93605
[1]	validation-logloss:0.63148	validation-auc:0.94698	validation-aucpr:0.94963
[2]	validation-logloss:0.59849	validation-auc:0.96370	validation-aucpr:0.96786
[3]	validation-logloss:0.57263	validation-auc:0.96510	validation-aucpr:0.96889
[4]	validation-logloss:0.55136	validation-auc:0.96347	validation-aucpr:0.96780
[5]	validation-logloss:0.53029	validation-auc:0.96312	validation-aucpr:0.96742
[6]	validation-logloss:0.50595	validation-auc:0.96597	validation-aucpr:0.97052
[7]	validation-logloss:0.48783	validation-auc:0.96639	validation-aucpr:0.97070
[8]	validation-logloss:0.47152	validation-auc:0.96622	validation-aucpr:0.97036
[9]	validation-logloss:0.45396	validation-auc:0.96672	validation-aucpr:0.97116
[10]	validation-logloss:0.43662	validation-auc:0.96727	validation-aucpr:0.97180
[11]	validation-logloss:0.42007	validation-auc:0.96788	validation-aucpr:0.97239
[12]	validation-logloss:0.40847	validation-auc:0.96765	validation-aucpr:0.97220
[13]	validation-logloss:0.39838	validation-auc:0.96727	validation-aucpr:0.97187
[14]	validation-logloss:0.38811	validation-auc:0.96719	validation-aucpr:0.97180
[15]	validation-logloss:0.37846	validation-auc:0.96699	validation-aucpr:0.97162
[16]	validation-logloss:0.36716	validation-auc:0.96707	validation-aucpr:0.97180
[17]	validation-logloss:0.35929	validation-auc:0.96701	validation-aucpr:0.97173
[18]	validation-logloss:0.34863	validation-auc:0.96734	validation-aucpr:0.97201
[19]	validation-logloss:0.34035	validation-auc:0.96759	validation-aucpr:0.97219
[20]	validation-logloss:0.33364	validation-auc:0.96764	validation-aucpr:0.97218
[21]	validation-logloss:0.32737	validation-auc:0.96765	validation-aucpr:0.97230
[22]	validation-logloss:0.31893	validation-auc:0.96801	validation-aucpr:0.97264
[23]	validation-logloss:0.31297	validation-auc:0.96805	validation-aucpr:0.97266
[24]	validation-logloss:0.30825	validation-auc:0.96797	validation-aucpr:0.97262
[25]	validation-logloss:0.30167	validation-auc:0.96795	validation-aucpr:0.97262
[26]	validation-logloss:0.29690	validation-auc:0.96803	validation-aucpr:0.97268
[27]	validation-logloss:0.29274	validation-auc:0.96804	validation-aucpr:0.97270
[28]	validation-logloss:0.28832	validation-auc:0.96809	validation-aucpr:0.97272
[29]	validation-logloss:0.28454	validation-auc:0.96806	validation-aucpr:0.97268
[30]	validation-logloss:0.27905	validation-auc:0.96818	validation-aucpr:0.97282
[31]	validation-logloss:0.27555	validation-auc:0.96825	validation-aucpr:0.97283
[32]	validation-logloss:0.27025	validation-auc:0.96855	validation-aucpr:0.97309
[33]	validation-logloss:0.26758	validation-auc:0.96851	validation-aucpr:0.97301
[34]	validation-logloss:0.26318	validation-auc:0.96859	validation-aucpr:0.97307
[35]	validation-logloss:0.25998	validation-auc:0.96870	validation-aucpr:0.97314
[36]	validation-logloss:0.25737	validation-auc:0.96874	validation-aucpr:0.97318
[37]	validation-logloss:0.25359	validation-auc:0.96880	validation-aucpr:0.97327
[38]	validation-logloss:0.25041	validation-auc:0.96878	validation-aucpr:0.97328
[39]	validation-logloss:0.24807	validation-auc:0.96887	validation-aucpr:0.97334
[40]	validation-logloss:0.24425	validation-auc:0.96910	validation-aucpr:0.97353
[41]	validation-logloss:0.24156	validation-auc:0.96916	validation-aucpr:0.97360
[42]	validation-logloss:0.23941	validation-auc:0.96928	validation-aucpr:0.97367
[43]	validation-logloss:0.23749	validation-auc:0.96935	validation-aucpr:0.97370
[44]	validation-logloss:0.23571	validation-auc:0.96933	validation-aucpr:0.97367
[45]	validation-logloss:0.23298	validation-auc:0.96947	validation-aucpr:0.97382
[46]	validation-logloss:0.23158	validation-auc:0.96949	validation-aucpr:0.97380
{'best_iteration': '45', 'best_score': '0.9738152691478276'}
Trial 20, Fold 3: Log loss = 0.23158353629015746, Average precision = 0.9738055826676788, ROC-AUC = 0.9694872256358891, Elapsed Time = 7.577756599999702 seconds
Trial 20, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 20, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0]	validation-logloss:0.66059	validation-auc:0.93175	validation-aucpr:0.93407
[1]	validation-logloss:0.62531	validation-auc:0.95875	validation-aucpr:0.96446
[2]	validation-logloss:0.59299	validation-auc:0.96314	validation-aucpr:0.96873
[3]	validation-logloss:0.56817	validation-auc:0.96340	validation-aucpr:0.96892
[4]	validation-logloss:0.54561	validation-auc:0.96322	validation-aucpr:0.96887
[5]	validation-logloss:0.52424	validation-auc:0.96346	validation-aucpr:0.96901
[6]	validation-logloss:0.50568	validation-auc:0.96356	validation-aucpr:0.96921
[7]	validation-logloss:0.48917	validation-auc:0.96354	validation-aucpr:0.96910
[8]	validation-logloss:0.47320	validation-auc:0.96330	validation-aucpr:0.96888
[9]	validation-logloss:0.45895	validation-auc:0.96316	validation-aucpr:0.96866
[10]	validation-logloss:0.44456	validation-auc:0.96317	validation-aucpr:0.96864
[11]	validation-logloss:0.43174	validation-auc:0.96308	validation-aucpr:0.96849
[12]	validation-logloss:0.42149	validation-auc:0.96246	validation-aucpr:0.96796
[13]	validation-logloss:0.41152	validation-auc:0.96204	validation-aucpr:0.96767
[14]	validation-logloss:0.40075	validation-auc:0.96221	validation-aucpr:0.96784
[15]	validation-logloss:0.39058	validation-auc:0.96221	validation-aucpr:0.96788
[16]	validation-logloss:0.38161	validation-auc:0.96208	validation-aucpr:0.96775
[17]	validation-logloss:0.36959	validation-auc:0.96324	validation-aucpr:0.96900
[18]	validation-logloss:0.36153	validation-auc:0.96324	validation-aucpr:0.96899
[19]	validation-logloss:0.35090	validation-auc:0.96381	validation-aucpr:0.96958
[20]	validation-logloss:0.34386	validation-auc:0.96377	validation-aucpr:0.96956
[21]	validation-logloss:0.33707	validation-auc:0.96389	validation-aucpr:0.96965
[22]	validation-logloss:0.33066	validation-auc:0.96409	validation-aucpr:0.96982
[23]	validation-logloss:0.32469	validation-auc:0.96423	validation-aucpr:0.96990
[24]	validation-logloss:0.31629	validation-auc:0.96472	validation-aucpr:0.97041
[25]	validation-logloss:0.30867	validation-auc:0.96517	validation-aucpr:0.97084
[26]	validation-logloss:0.30353	validation-auc:0.96524	validation-aucpr:0.97085
[27]	validation-logloss:0.29901	validation-auc:0.96515	validation-aucpr:0.97078
[28]	validation-logloss:0.29504	validation-auc:0.96524	validation-aucpr:0.97082
[29]	validation-logloss:0.29110	validation-auc:0.96534	validation-aucpr:0.97088
[30]	validation-logloss:0.28529	validation-auc:0.96578	validation-aucpr:0.97128
[31]	validation-logloss:0.28177	validation-auc:0.96574	validation-aucpr:0.97123
[32]	validation-logloss:0.27887	validation-auc:0.96558	validation-aucpr:0.97111
[33]	validation-logloss:0.27570	validation-auc:0.96552	validation-aucpr:0.97107
[34]	validation-logloss:0.27340	validation-auc:0.96548	validation-aucpr:0.97102
[35]	validation-logloss:0.26824	validation-auc:0.96583	validation-aucpr:0.97134
[36]	validation-logloss:0.26525	validation-auc:0.96590	validation-aucpr:0.97140
[37]	validation-logloss:0.26079	validation-auc:0.96616	validation-aucpr:0.97165
[38]	validation-logloss:0.25639	validation-auc:0.96644	validation-aucpr:0.97192
[39]	validation-logloss:0.25242	validation-auc:0.96668	validation-aucpr:0.97213
[40]	validation-logloss:0.25020	validation-auc:0.96679	validation-aucpr:0.97220
[41]	validation-logloss:0.24687	validation-auc:0.96694	validation-aucpr:0.97234
[42]	validation-logloss:0.24522	validation-auc:0.96700	validation-aucpr:0.97237
[43]	validation-logloss:0.24170	validation-auc:0.96721	validation-aucpr:0.97256
[44]	validation-logloss:0.23974	validation-auc:0.96736	validation-aucpr:0.97267
[45]	validation-logloss:0.23700	validation-auc:0.96738	validation-aucpr:0.97271
[46] validation-logloss:0.23554 validation-auc:0.96747 validation-aucpr:0.97278
{'best_iteration': '46', 'best_score': '0.9727819744692342'}
Trial 20, Fold 4: Log loss = 0.23554353868065628, Average precision = 0.9727860480148764, ROC-AUC = 0.9674674110793984, Elapsed Time = 6.664979299999686 seconds
Trial 20, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 20, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0]	validation-logloss:0.66165	validation-auc:0.92788	validation-aucpr:0.92757
[1]	validation-logloss:0.63181	validation-auc:0.94335	validation-aucpr:0.94601
[2]	validation-logloss:0.60371	validation-auc:0.95083	validation-aucpr:0.95442
[3]	validation-logloss:0.57927	validation-auc:0.95115	validation-aucpr:0.95461
[4]	validation-logloss:0.55855	validation-auc:0.95127	validation-aucpr:0.95499
[5]	validation-logloss:0.53778	validation-auc:0.95283	validation-aucpr:0.95735
[6]	validation-logloss:0.51329	validation-auc:0.95969	validation-aucpr:0.96498
[7]	validation-logloss:0.49186	validation-auc:0.96111	validation-aucpr:0.96654
[8]	validation-logloss:0.47589	validation-auc:0.96102	validation-aucpr:0.96644
[9]	validation-logloss:0.45758	validation-auc:0.96195	validation-aucpr:0.96740
[10]	validation-logloss:0.44409	validation-auc:0.96202	validation-aucpr:0.96750
[11]	validation-logloss:0.43087	validation-auc:0.96250	validation-aucpr:0.96791
[12]	validation-logloss:0.41892	validation-auc:0.96251	validation-aucpr:0.96787
[13]	validation-logloss:0.40452	validation-auc:0.96316	validation-aucpr:0.96852
[14]	validation-logloss:0.39440	validation-auc:0.96327	validation-aucpr:0.96863
[15]	validation-logloss:0.38193	validation-auc:0.96360	validation-aucpr:0.96896
[16]	validation-logloss:0.37133	validation-auc:0.96382	validation-aucpr:0.96920
[17]	validation-logloss:0.36336	validation-auc:0.96378	validation-aucpr:0.96918
[18]	validation-logloss:0.35554	validation-auc:0.96403	validation-aucpr:0.96933
[19]	validation-logloss:0.34756	validation-auc:0.96426	validation-aucpr:0.96949
[20]	validation-logloss:0.34072	validation-auc:0.96419	validation-aucpr:0.96946
[21]	validation-logloss:0.33458	validation-auc:0.96436	validation-aucpr:0.96956
[22]	validation-logloss:0.32850	validation-auc:0.96446	validation-aucpr:0.96964
[23]	validation-logloss:0.32289	validation-auc:0.96463	validation-aucpr:0.96972
[24]	validation-logloss:0.31751	validation-auc:0.96462	validation-aucpr:0.96967
[25]	validation-logloss:0.31247	validation-auc:0.96470	validation-aucpr:0.96967
[26]	validation-logloss:0.30759	validation-auc:0.96487	validation-aucpr:0.96979
[27]	validation-logloss:0.30344	validation-auc:0.96486	validation-aucpr:0.96973
[28]	validation-logloss:0.29648	validation-auc:0.96529	validation-aucpr:0.97014
[29]	validation-logloss:0.29227	validation-auc:0.96542	validation-aucpr:0.97023
[30]	validation-logloss:0.28858	validation-auc:0.96542	validation-aucpr:0.97022
[31]	validation-logloss:0.28453	validation-auc:0.96559	validation-aucpr:0.97033
[32]	validation-logloss:0.28165	validation-auc:0.96552	validation-aucpr:0.97025
[33]	validation-logloss:0.27877	validation-auc:0.96554	validation-aucpr:0.97021
[34]	validation-logloss:0.27603	validation-auc:0.96544	validation-aucpr:0.97016
[35]	validation-logloss:0.27345	validation-auc:0.96547	validation-aucpr:0.97021
[36]	validation-logloss:0.27088	validation-auc:0.96554	validation-aucpr:0.97023
[37]	validation-logloss:0.26844	validation-auc:0.96558	validation-aucpr:0.97026
[38]	validation-logloss:0.26578	validation-auc:0.96571	validation-aucpr:0.97035
[39]	validation-logloss:0.26146	validation-auc:0.96602	validation-aucpr:0.97064
[40]	validation-logloss:0.25919	validation-auc:0.96609	validation-aucpr:0.97069
[41]	validation-logloss:0.25703	validation-auc:0.96620	validation-aucpr:0.97075
[42]	validation-logloss:0.25508	validation-auc:0.96626	validation-aucpr:0.97081
[43]	validation-logloss:0.25343	validation-auc:0.96620	validation-aucpr:0.97075
[44]	validation-logloss:0.25158	validation-auc:0.96636	validation-aucpr:0.97083
[45]	validation-logloss:0.24978	validation-auc:0.96649	validation-aucpr:0.97099
[46]	validation-logloss:0.24816	validation-auc:0.96653	validation-aucpr:0.97102
{'best_iteration': '46', 'best_score': '0.971019754688506'}
Trial 20, Fold 5: Log loss = 0.248159576737433, Average precision = 0.9710249427718667, ROC-AUC = 0.9665340096241384, Elapsed Time = 6.658546999999089 seconds
Optimization Progress: 21%|##1 | 21/100 [35:22<1:05:56, 50.08s/it]
Trial 21, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 21, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[18:34:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[0] validation-logloss:0.68447 validation-auc:0.93569 validation-aucpr:0.89227
[18:34:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[1] validation-logloss:0.67583 validation-auc:0.96004 validation-aucpr:0.95021
[18:34:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[2] validation-logloss:0.66753 validation-auc:0.96484 validation-aucpr:0.96664
[18:34:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[3] validation-logloss:0.65918 validation-auc:0.96775 validation-aucpr:0.97129
[18:34:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[4] validation-logloss:0.65118 validation-auc:0.96849 validation-aucpr:0.97242
[18:34:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[5] validation-logloss:0.64333 validation-auc:0.96874 validation-aucpr:0.97260
[18:34:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[6] validation-logloss:0.63576 validation-auc:0.96896 validation-aucpr:0.97379
[18:34:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[7] validation-logloss:0.62881 validation-auc:0.96941 validation-aucpr:0.97434
[18:34:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[8] validation-logloss:0.62216 validation-auc:0.96939 validation-aucpr:0.97411
[18:34:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[9] validation-logloss:0.61564 validation-auc:0.96902 validation-aucpr:0.97375
[18:34:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[10] validation-logloss:0.60868 validation-auc:0.96906 validation-aucpr:0.97392
[18:34:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[11] validation-logloss:0.60235 validation-auc:0.96916 validation-aucpr:0.97391
[18:34:23] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[12] validation-logloss:0.59550 validation-auc:0.96952 validation-aucpr:0.97423
[18:34:23] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[13] validation-logloss:0.58884 validation-auc:0.96978 validation-aucpr:0.97450
[18:34:23] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[14] validation-logloss:0.58253 validation-auc:0.96952 validation-aucpr:0.97434
[18:34:23] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[15] validation-logloss:0.57612 validation-auc:0.96983 validation-aucpr:0.97456
[18:34:23] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[16] validation-logloss:0.56985 validation-auc:0.96983 validation-aucpr:0.97459
[18:34:23] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[17] validation-logloss:0.56425 validation-auc:0.96978 validation-aucpr:0.97452
[18:34:23] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[18] validation-logloss:0.55814 validation-auc:0.96992 validation-aucpr:0.97464
[18:34:23] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[19] validation-logloss:0.55218 validation-auc:0.97001 validation-aucpr:0.97472
[18:34:23] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[20] validation-logloss:0.54643 validation-auc:0.97003 validation-aucpr:0.97392
[18:34:24] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[21] validation-logloss:0.54079 validation-auc:0.96997 validation-aucpr:0.97389
[18:34:24] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[22] validation-logloss:0.53520 validation-auc:0.97004 validation-aucpr:0.97397
[18:34:24] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[23] validation-logloss:0.52974 validation-auc:0.97006 validation-aucpr:0.97399
[18:34:24] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[24] validation-logloss:0.52433 validation-auc:0.97013 validation-aucpr:0.97403
[18:34:24] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[25] validation-logloss:0.51911 validation-auc:0.97011 validation-aucpr:0.97404
[18:34:24] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[26] validation-logloss:0.51389 validation-auc:0.97023 validation-aucpr:0.97413
[18:34:24] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[27] validation-logloss:0.50880 validation-auc:0.97035 validation-aucpr:0.97422
[18:34:25] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[28] validation-logloss:0.50380 validation-auc:0.97037 validation-aucpr:0.97424
[18:34:25] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[29] validation-logloss:0.49944 validation-auc:0.97027 validation-aucpr:0.97411
[18:34:25] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[30] validation-logloss:0.49464 validation-auc:0.97032 validation-aucpr:0.97412
[18:34:25] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[31] validation-logloss:0.49041 validation-auc:0.97018 validation-aucpr:0.97397
[18:34:25] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[32] validation-logloss:0.48578 validation-auc:0.97031 validation-aucpr:0.97405
[18:34:25] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[33] validation-logloss:0.48130 validation-auc:0.97036 validation-aucpr:0.97411
[18:34:26] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[34] validation-logloss:0.47681 validation-auc:0.97037 validation-aucpr:0.97411
[18:34:26] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[35] validation-logloss:0.47235 validation-auc:0.97051 validation-aucpr:0.97422
[18:34:26] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[36] validation-logloss:0.46809 validation-auc:0.97049 validation-aucpr:0.97422
[18:34:26] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[37] validation-logloss:0.46380 validation-auc:0.97063 validation-aucpr:0.97431
[18:34:26] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[38] validation-logloss:0.45963 validation-auc:0.97066 validation-aucpr:0.97433
[18:34:26] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[39] validation-logloss:0.45590 validation-auc:0.97063 validation-aucpr:0.97429
[18:34:27] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[40] validation-logloss:0.45189 validation-auc:0.97077 validation-aucpr:0.97438
[18:34:27] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[41] validation-logloss:0.44832 validation-auc:0.97081 validation-aucpr:0.97437
[18:34:27] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[42] validation-logloss:0.44441 validation-auc:0.97084 validation-aucpr:0.97436
[18:34:27] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[43] validation-logloss:0.44054 validation-auc:0.97106 validation-aucpr:0.97451
[18:34:27] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[44] validation-logloss:0.43721 validation-auc:0.97099 validation-aucpr:0.97444
[18:34:28] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[45] validation-logloss:0.43346 validation-auc:0.97117 validation-aucpr:0.97454
[18:34:28] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[46] validation-logloss:0.42980 validation-auc:0.97119 validation-aucpr:0.97453
[18:34:28] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[47] validation-logloss:0.42626 validation-auc:0.97116 validation-aucpr:0.97444
[18:34:28] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[48] validation-logloss:0.42313 validation-auc:0.97103 validation-aucpr:0.97430
[18:34:28] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[49] validation-logloss:0.41966 validation-auc:0.97098 validation-aucpr:0.97417
[18:34:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[50] validation-logloss:0.41624 validation-auc:0.97106 validation-aucpr:0.97426
[18:34:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[51] validation-logloss:0.41332 validation-auc:0.97087 validation-aucpr:0.97407
[18:34:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[52] validation-logloss:0.41005 validation-auc:0.97081 validation-aucpr:0.97403
[18:34:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[53] validation-logloss:0.40678 validation-auc:0.97095 validation-aucpr:0.97419
[18:34:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[54] validation-logloss:0.40399 validation-auc:0.97110 validation-aucpr:0.97560
[18:34:30] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[55] validation-logloss:0.40091 validation-auc:0.97108 validation-aucpr:0.97558
[18:34:30] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[56] validation-logloss:0.39777 validation-auc:0.97117 validation-aucpr:0.97564
[18:34:30] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[57] validation-logloss:0.39504 validation-auc:0.97116 validation-aucpr:0.97563
[58] validation-logloss:0.39201 validation-auc:0.97124 validation-aucpr:0.97568
[59] validation-logloss:0.38909 validation-auc:0.97125 validation-aucpr:0.97569
[60] validation-logloss:0.38626 validation-auc:0.97125 validation-aucpr:0.97568
[61] validation-logloss:0.38344 validation-auc:0.97123 validation-aucpr:0.97567
[62] validation-logloss:0.38094 validation-auc:0.97122 validation-aucpr:0.97562
[63] validation-logloss:0.37848 validation-auc:0.97118 validation-aucpr:0.97559
[64] validation-logloss:0.37578 validation-auc:0.97120 validation-aucpr:0.97561
[65] validation-logloss:0.37306 validation-auc:0.97122 validation-aucpr:0.97562
[66] validation-logloss:0.37039 validation-auc:0.97122 validation-aucpr:0.97563
[67] validation-logloss:0.36808 validation-auc:0.97121 validation-aucpr:0.97561
[68] validation-logloss:0.36556 validation-auc:0.97126 validation-aucpr:0.97568
[69] validation-logloss:0.36310 validation-auc:0.97125 validation-aucpr:0.97566
[70] validation-logloss:0.36093 validation-auc:0.97119 validation-aucpr:0.97561
[71] validation-logloss:0.35874 validation-auc:0.97115 validation-aucpr:0.97559
[72] validation-logloss:0.35659 validation-auc:0.97114 validation-aucpr:0.97557
[73] validation-logloss:0.35413 validation-auc:0.97124 validation-aucpr:0.97564
[74] validation-logloss:0.35180 validation-auc:0.97129 validation-aucpr:0.97567
[75] validation-logloss:0.34980 validation-auc:0.97123 validation-aucpr:0.97560
[76] validation-logloss:0.34749 validation-auc:0.97127 validation-aucpr:0.97563
[77] validation-logloss:0.34521 validation-auc:0.97132 validation-aucpr:0.97568
[78] validation-logloss:0.34302 validation-auc:0.97137 validation-aucpr:0.97577
[79] validation-logloss:0.34084 validation-auc:0.97142 validation-aucpr:0.97581
[80] validation-logloss:0.33860 validation-auc:0.97153 validation-aucpr:0.97589
[81] validation-logloss:0.33656 validation-auc:0.97153 validation-aucpr:0.97589
[82] validation-logloss:0.33451 validation-auc:0.97151 validation-aucpr:0.97588
{'best_iteration': '81', 'best_score': '0.9758924405030649'}
Trial 21, Fold 1: Log loss = 0.3345103914095373, Average precision = 0.9758799882601042, ROC-AUC = 0.9715121544954426, Elapsed Time = 20.817922200001703 seconds
Trial 21, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 21, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.68394 validation-auc:0.94208 validation-aucpr:0.90150
[1] validation-logloss:0.67606 validation-auc:0.96278 validation-aucpr:0.95869
[2] validation-logloss:0.66762 validation-auc:0.96656 validation-aucpr:0.96660
[3] validation-logloss:0.65946 validation-auc:0.96733 validation-aucpr:0.96997
[4] validation-logloss:0.65137 validation-auc:0.96827 validation-aucpr:0.96890
[5] validation-logloss:0.64416 validation-auc:0.96852 validation-aucpr:0.97179
[6] validation-logloss:0.63640 validation-auc:0.96892 validation-aucpr:0.97223
[7] validation-logloss:0.62881 validation-auc:0.96970 validation-aucpr:0.97305
[8] validation-logloss:0.62141 validation-auc:0.96982 validation-aucpr:0.97320
[9] validation-logloss:0.61492 validation-auc:0.96983 validation-aucpr:0.97316
[10] validation-logloss:0.60849 validation-auc:0.96991 validation-aucpr:0.97323
[11] validation-logloss:0.60151 validation-auc:0.97008 validation-aucpr:0.97344
[12] validation-logloss:0.59474 validation-auc:0.97022 validation-aucpr:0.97362
[13] validation-logloss:0.58863 validation-auc:0.97044 validation-aucpr:0.97388
[14] validation-logloss:0.58269 validation-auc:0.97054 validation-aucpr:0.97384
[15] validation-logloss:0.57633 validation-auc:0.97070 validation-aucpr:0.97401
[16] validation-logloss:0.57004 validation-auc:0.97084 validation-aucpr:0.97414
[17] validation-logloss:0.56461 validation-auc:0.97048 validation-aucpr:0.97384
[18] validation-logloss:0.55849 validation-auc:0.97070 validation-aucpr:0.97402
[19] validation-logloss:0.55252 validation-auc:0.97074 validation-aucpr:0.97405
[20] validation-logloss:0.54668 validation-auc:0.97077 validation-aucpr:0.97401
[21] validation-logloss:0.54098 validation-auc:0.97078 validation-aucpr:0.97400
[22] validation-logloss:0.53592 validation-auc:0.97083 validation-aucpr:0.97405
[23] validation-logloss:0.53042 validation-auc:0.97095 validation-aucpr:0.97416
[24] validation-logloss:0.52549 validation-auc:0.97092 validation-aucpr:0.97417
[25] validation-logloss:0.52009 validation-auc:0.97104 validation-aucpr:0.97424
[26] validation-logloss:0.51538 validation-auc:0.97117 validation-aucpr:0.97431
[27] validation-logloss:0.51025 validation-auc:0.97115 validation-aucpr:0.97427
[28] validation-logloss:0.50525 validation-auc:0.97110 validation-aucpr:0.97424
[29] validation-logloss:0.50021 validation-auc:0.97111 validation-aucpr:0.97429
[30] validation-logloss:0.49532 validation-auc:0.97131 validation-aucpr:0.97445
[31] validation-logloss:0.49062 validation-auc:0.97137 validation-aucpr:0.97449
[32] validation-logloss:0.48591 validation-auc:0.97155 validation-aucpr:0.97461
[33] validation-logloss:0.48126 validation-auc:0.97171 validation-aucpr:0.97474
[34] validation-logloss:0.47681 validation-auc:0.97171 validation-aucpr:0.97468
[35] validation-logloss:0.47233 validation-auc:0.97175 validation-aucpr:0.97472
[36] validation-logloss:0.46848 validation-auc:0.97165 validation-aucpr:0.97463
[37] validation-logloss:0.46416 validation-auc:0.97188 validation-aucpr:0.97480
[38] validation-logloss:0.45987 validation-auc:0.97187 validation-aucpr:0.97481
[39] validation-logloss:0.45577 validation-auc:0.97186 validation-aucpr:0.97484
[40] validation-logloss:0.45162 validation-auc:0.97200 validation-aucpr:0.97494
[41] validation-logloss:0.44806 validation-auc:0.97199 validation-aucpr:0.97496
[42] validation-logloss:0.44410 validation-auc:0.97212 validation-aucpr:0.97501
[43] validation-logloss:0.44033 validation-auc:0.97214 validation-aucpr:0.97498
[44] validation-logloss:0.43662 validation-auc:0.97213 validation-aucpr:0.97497
[45] validation-logloss:0.43288 validation-auc:0.97216 validation-aucpr:0.97500
[46] validation-logloss:0.42916 validation-auc:0.97227 validation-aucpr:0.97509
[47] validation-logloss:0.42550 validation-auc:0.97233 validation-aucpr:0.97514
[48] validation-logloss:0.42187 validation-auc:0.97243 validation-aucpr:0.97520
[49] validation-logloss:0.41841 validation-auc:0.97234 validation-aucpr:0.97514
[50] validation-logloss:0.41496 validation-auc:0.97239 validation-aucpr:0.97516
[51] validation-logloss:0.41202 validation-auc:0.97236 validation-aucpr:0.97513
[52] validation-logloss:0.40883 validation-auc:0.97234 validation-aucpr:0.97511
[53] validation-logloss:0.40555 validation-auc:0.97237 validation-aucpr:0.97512
[54] validation-logloss:0.40232 validation-auc:0.97238 validation-aucpr:0.97512
[55] validation-logloss:0.39918 validation-auc:0.97233 validation-aucpr:0.97509
[56] validation-logloss:0.39605 validation-auc:0.97240 validation-aucpr:0.97514
[57] validation-logloss:0.39331 validation-auc:0.97237 validation-aucpr:0.97512
[58] validation-logloss:0.39032 validation-auc:0.97236 validation-aucpr:0.97510
[59] validation-logloss:0.38765 validation-auc:0.97228 validation-aucpr:0.97509
[60] validation-logloss:0.38461 validation-auc:0.97232 validation-aucpr:0.97512
[61] validation-logloss:0.38171 validation-auc:0.97238 validation-aucpr:0.97518
[62] validation-logloss:0.37916 validation-auc:0.97260 validation-aucpr:0.97529
[63] validation-logloss:0.37660 validation-auc:0.97261 validation-aucpr:0.97530
[64] validation-logloss:0.37384 validation-auc:0.97260 validation-aucpr:0.97530
[65] validation-logloss:0.37116 validation-auc:0.97254 validation-aucpr:0.97525
[66] validation-logloss:0.36843 validation-auc:0.97259 validation-aucpr:0.97529
[67] validation-logloss:0.36576 validation-auc:0.97265 validation-aucpr:0.97533
[68] validation-logloss:0.36308 validation-auc:0.97270 validation-aucpr:0.97538
[69] validation-logloss:0.36044 validation-auc:0.97271 validation-aucpr:0.97539
[70] validation-logloss:0.35792 validation-auc:0.97277 validation-aucpr:0.97540
[71] validation-logloss:0.35550 validation-auc:0.97279 validation-aucpr:0.97540
[72] validation-logloss:0.35310 validation-auc:0.97274 validation-aucpr:0.97536
[73] validation-logloss:0.35071 validation-auc:0.97271 validation-aucpr:0.97533
[74] validation-logloss:0.34860 validation-auc:0.97271 validation-aucpr:0.97533
[75] validation-logloss:0.34624 validation-auc:0.97275 validation-aucpr:0.97537
[76] validation-logloss:0.34393 validation-auc:0.97272 validation-aucpr:0.97536
[77] validation-logloss:0.34165 validation-auc:0.97278 validation-aucpr:0.97538
[78] validation-logloss:0.33966 validation-auc:0.97277 validation-aucpr:0.97538
[79] validation-logloss:0.33742 validation-auc:0.97282 validation-aucpr:0.97542
[80] validation-logloss:0.33518 validation-auc:0.97288 validation-aucpr:0.97547
[81] validation-logloss:0.33304 validation-auc:0.97289 validation-aucpr:0.97549
[82] validation-logloss:0.33097 validation-auc:0.97285 validation-aucpr:0.97546
{'best_iteration': '81', 'best_score': '0.9754875814965706'}
Trial 21, Fold 2: Log loss = 0.33096566923356896, Average precision = 0.9754165941955653, ROC-AUC = 0.9728503484584398, Elapsed Time = 20.76533600000039 seconds
Trial 21, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 21, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.68430 validation-auc:0.94102 validation-aucpr:0.90218
[1] validation-logloss:0.67563 validation-auc:0.96315 validation-aucpr:0.95918
[2] validation-logloss:0.66714 validation-auc:0.96681 validation-aucpr:0.97083
[3] validation-logloss:0.65887 validation-auc:0.96832 validation-aucpr:0.97210
[4] validation-logloss:0.65096 validation-auc:0.96840 validation-aucpr:0.97242
[5] validation-logloss:0.64380 validation-auc:0.96784 validation-aucpr:0.97210
[6] validation-logloss:0.63613 validation-auc:0.96834 validation-aucpr:0.97243
[7] validation-logloss:0.62850 validation-auc:0.96869 validation-aucpr:0.97160
[8] validation-logloss:0.62184 validation-auc:0.96856 validation-aucpr:0.97116
[9] validation-logloss:0.61458 validation-auc:0.96856 validation-aucpr:0.97116
[10] validation-logloss:0.60746 validation-auc:0.96889 validation-aucpr:0.97046
[11] validation-logloss:0.60071 validation-auc:0.96879 validation-aucpr:0.97022
[12] validation-logloss:0.59390 validation-auc:0.96895 validation-aucpr:0.97031
[13] validation-logloss:0.58719 validation-auc:0.96929 validation-aucpr:0.97153
[14] validation-logloss:0.58066 validation-auc:0.96955 validation-aucpr:0.97105
[15] validation-logloss:0.57427 validation-auc:0.96955 validation-aucpr:0.97112
[16] validation-logloss:0.56804 validation-auc:0.96953 validation-aucpr:0.97139
[17] validation-logloss:0.56194 validation-auc:0.96952 validation-aucpr:0.97144
[18] validation-logloss:0.55637 validation-auc:0.96974 validation-aucpr:0.97158
[19] validation-logloss:0.55046 validation-auc:0.96977 validation-aucpr:0.97160
[20] validation-logloss:0.54453 validation-auc:0.97014 validation-aucpr:0.97054
[21] validation-logloss:0.53874 validation-auc:0.97037 validation-aucpr:0.97073
[22] validation-logloss:0.53317 validation-auc:0.97074 validation-aucpr:0.97212
[23] validation-logloss:0.52771 validation-auc:0.97069 validation-aucpr:0.97211
[24] validation-logloss:0.52231 validation-auc:0.97083 validation-aucpr:0.97221
[25] validation-logloss:0.51742 validation-auc:0.97108 validation-aucpr:0.97324
[26] validation-logloss:0.51269 validation-auc:0.97112 validation-aucpr:0.97326
[18:35:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[27] validation-logloss:0.50772 validation-auc:0.97110 validation-aucpr:0.97326
[18:35:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[28] validation-logloss:0.50273 validation-auc:0.97116 validation-aucpr:0.97328
[18:35:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[29] validation-logloss:0.49823 validation-auc:0.97117 validation-aucpr:0.97329
[18:35:08] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[30] validation-logloss:0.49333 validation-auc:0.97126 validation-aucpr:0.97336
[18:35:08] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[31] validation-logloss:0.48849 validation-auc:0.97146 validation-aucpr:0.97351
[18:35:08] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[32] validation-logloss:0.48392 validation-auc:0.97127 validation-aucpr:0.97461
[18:35:08] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[33] validation-logloss:0.47930 validation-auc:0.97130 validation-aucpr:0.97463
[18:35:08] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[34] validation-logloss:0.47483 validation-auc:0.97155 validation-aucpr:0.97474
[18:35:08] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[35] validation-logloss:0.47039 validation-auc:0.97163 validation-aucpr:0.97480
[18:35:09] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[36] validation-logloss:0.46603 validation-auc:0.97178 validation-aucpr:0.97489
[18:35:09] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[37] validation-logloss:0.46183 validation-auc:0.97176 validation-aucpr:0.97488
[18:35:09] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[38] validation-logloss:0.45805 validation-auc:0.97168 validation-aucpr:0.97487
[18:35:09] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[39] validation-logloss:0.45388 validation-auc:0.97169 validation-aucpr:0.97487
[18:35:09] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[40] validation-logloss:0.44983 validation-auc:0.97177 validation-aucpr:0.97493
[18:35:09] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[41] validation-logloss:0.44592 validation-auc:0.97171 validation-aucpr:0.97477
[18:35:10] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[42] validation-logloss:0.44212 validation-auc:0.97169 validation-aucpr:0.97474
[18:35:10] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[43] validation-logloss:0.43839 validation-auc:0.97181 validation-aucpr:0.97415
[18:35:10] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[44] validation-logloss:0.43465 validation-auc:0.97174 validation-aucpr:0.97411
[18:35:10] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[45] validation-logloss:0.43133 validation-auc:0.97174 validation-aucpr:0.97410
[18:35:10] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[46] validation-logloss:0.42770 validation-auc:0.97175 validation-aucpr:0.97409
[18:35:11] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[47] validation-logloss:0.42413 validation-auc:0.97181 validation-aucpr:0.97415
[18:35:11] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[48] validation-logloss:0.42054 validation-auc:0.97185 validation-aucpr:0.97418
[18:35:11] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[49] validation-logloss:0.41732 validation-auc:0.97190 validation-aucpr:0.97420
[18:35:11] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[50] validation-logloss:0.41397 validation-auc:0.97192 validation-aucpr:0.97420
[18:35:12] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[51] validation-logloss:0.41057 validation-auc:0.97202 validation-aucpr:0.97426
[18:35:12] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[52] validation-logloss:0.40757 validation-auc:0.97228 validation-aucpr:0.97606
[18:35:12] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[53] validation-logloss:0.40459 validation-auc:0.97231 validation-aucpr:0.97608
[18:35:12] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[54] validation-logloss:0.40134 validation-auc:0.97233 validation-aucpr:0.97610
[18:35:12] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[55] validation-logloss:0.39842 validation-auc:0.97240 validation-aucpr:0.97616
[18:35:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[56] validation-logloss:0.39537 validation-auc:0.97235 validation-aucpr:0.97613
[18:35:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[57] validation-logloss:0.39235 validation-auc:0.97234 validation-aucpr:0.97612
[18:35:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[58] validation-logloss:0.38932 validation-auc:0.97237 validation-aucpr:0.97615
[18:35:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[59] validation-logloss:0.38632 validation-auc:0.97242 validation-aucpr:0.97618
[18:35:14] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[60] validation-logloss:0.38338 validation-auc:0.97246 validation-aucpr:0.97621
[18:35:14] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[61] validation-logloss:0.38080 validation-auc:0.97244 validation-aucpr:0.97620
[18:35:14] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[62] validation-logloss:0.37805 validation-auc:0.97234 validation-aucpr:0.97611
[18:35:14] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[63] validation-logloss:0.37553 validation-auc:0.97243 validation-aucpr:0.97617
[18:35:15] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[64] validation-logloss:0.37273 validation-auc:0.97247 validation-aucpr:0.97620
[18:35:15] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[65] validation-logloss:0.36993 validation-auc:0.97255 validation-aucpr:0.97627
[18:35:15] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[66] validation-logloss:0.36762 validation-auc:0.97258 validation-aucpr:0.97636
[18:35:15] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[67] validation-logloss:0.36521 validation-auc:0.97261 validation-aucpr:0.97640
[18:35:16] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[68] validation-logloss:0.36268 validation-auc:0.97254 validation-aucpr:0.97636
[18:35:16] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[69] validation-logloss:0.36022 validation-auc:0.97257 validation-aucpr:0.97637
[18:35:16] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[70] validation-logloss:0.35800 validation-auc:0.97254 validation-aucpr:0.97633
[18:35:16] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[71] validation-logloss:0.35559 validation-auc:0.97249 validation-aucpr:0.97630
[18:35:17] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[72] validation-logloss:0.35345 validation-auc:0.97248 validation-aucpr:0.97631
[18:35:17] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[73] validation-logloss:0.35130 validation-auc:0.97249 validation-aucpr:0.97630
[18:35:17] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[74] validation-logloss:0.34893 validation-auc:0.97254 validation-aucpr:0.97633
[18:35:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[75] validation-logloss:0.34663 validation-auc:0.97254 validation-aucpr:0.97633
[18:35:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[76] validation-logloss:0.34437 validation-auc:0.97258 validation-aucpr:0.97645
[18:35:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[77] validation-logloss:0.34242 validation-auc:0.97250 validation-aucpr:0.97637
[18:35:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[78] validation-logloss:0.34014 validation-auc:0.97250 validation-aucpr:0.97637
[18:35:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[79] validation-logloss:0.33798 validation-auc:0.97248 validation-aucpr:0.97636
[18:35:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[80] validation-logloss:0.33602 validation-auc:0.97250 validation-aucpr:0.97638
[18:35:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[81] validation-logloss:0.33396 validation-auc:0.97248 validation-aucpr:0.97635
[18:35:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[82] validation-logloss:0.33180 validation-auc:0.97250 validation-aucpr:0.97637
{'best_iteration': '76', 'best_score': '0.9764509129277438'}
Trial 21, Fold 3: Log loss = 0.3318, Average precision = 0.9764, ROC-AUC = 0.9725, Elapsed Time = 20.84 seconds
Trial 21, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.030
Trial 21, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.043
[18:35:26] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[0] validation-logloss:0.68401 validation-auc:0.94302 validation-aucpr:0.91190
[18:35:26] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[1] validation-logloss:0.67538 validation-auc:0.96397 validation-aucpr:0.97035
[18:35:26] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[2] validation-logloss:0.66763 validation-auc:0.96476 validation-aucpr:0.96867
[18:35:26] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[3] validation-logloss:0.65936 validation-auc:0.96725 validation-aucpr:0.97103
[18:35:26] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[4] validation-logloss:0.65188 validation-auc:0.96816 validation-aucpr:0.97317
[18:35:26] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[5] validation-logloss:0.64394 validation-auc:0.96913 validation-aucpr:0.97399
[18:35:26] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[6] validation-logloss:0.63623 validation-auc:0.96976 validation-aucpr:0.97445
[18:35:26] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[7] validation-logloss:0.62873 validation-auc:0.97004 validation-aucpr:0.97469
[18:35:26] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[8] validation-logloss:0.62138 validation-auc:0.97019 validation-aucpr:0.97478
[18:35:26] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[9] validation-logloss:0.61477 validation-auc:0.97024 validation-aucpr:0.97478
[18:35:26] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[10] validation-logloss:0.60773 validation-auc:0.97016 validation-aucpr:0.97478
[18:35:27] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[11] validation-logloss:0.60144 validation-auc:0.96995 validation-aucpr:0.97459
[18:35:27] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[12] validation-logloss:0.59474 validation-auc:0.97002 validation-aucpr:0.97464
[18:35:27] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[13] validation-logloss:0.58800 validation-auc:0.97028 validation-aucpr:0.97489
[18:35:27] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[14] validation-logloss:0.58163 validation-auc:0.97026 validation-aucpr:0.97492
[18:35:27] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[15] validation-logloss:0.57525 validation-auc:0.97031 validation-aucpr:0.97499
[18:35:27] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[16] validation-logloss:0.56893 validation-auc:0.97055 validation-aucpr:0.97517
[18:35:27] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[17] validation-logloss:0.56329 validation-auc:0.97047 validation-aucpr:0.97504
[18:35:27] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[18] validation-logloss:0.55719 validation-auc:0.97069 validation-aucpr:0.97525
[18:35:27] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[19] validation-logloss:0.55137 validation-auc:0.97063 validation-aucpr:0.97523
[18:35:28] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[20] validation-logloss:0.54614 validation-auc:0.97055 validation-aucpr:0.97517
[18:35:28] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[21] validation-logloss:0.54051 validation-auc:0.97048 validation-aucpr:0.97513
[18:35:28] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[22] validation-logloss:0.53494 validation-auc:0.97065 validation-aucpr:0.97524
[18:35:28] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[23] validation-logloss:0.52948 validation-auc:0.97067 validation-aucpr:0.97525
[18:35:28] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[24] validation-logloss:0.52458 validation-auc:0.97065 validation-aucpr:0.97525
[18:35:28] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[25] validation-logloss:0.51921 validation-auc:0.97093 validation-aucpr:0.97545
[18:35:28] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[26] validation-logloss:0.51449 validation-auc:0.97098 validation-aucpr:0.97547
[18:35:28] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[27] validation-logloss:0.50941 validation-auc:0.97097 validation-aucpr:0.97549
[18:35:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[28] validation-logloss:0.50491 validation-auc:0.97082 validation-aucpr:0.97536
[18:35:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[29] validation-logloss:0.50007 validation-auc:0.97078 validation-aucpr:0.97536
[18:35:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[30] validation-logloss:0.49537 validation-auc:0.97065 validation-aucpr:0.97526
[18:35:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[31] validation-logloss:0.49069 validation-auc:0.97064 validation-aucpr:0.97527
[18:35:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[32] validation-logloss:0.48613 validation-auc:0.97056 validation-aucpr:0.97523
[18:35:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[33] validation-logloss:0.48142 validation-auc:0.97080 validation-aucpr:0.97542
[18:35:30] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[34] validation-logloss:0.47684 validation-auc:0.97095 validation-aucpr:0.97554
[18:35:30] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[35] validation-logloss:0.47241 validation-auc:0.97100 validation-aucpr:0.97558
[18:35:30] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[36] validation-logloss:0.46807 validation-auc:0.97109 validation-aucpr:0.97565
[18:35:30] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[37] validation-logloss:0.46389 validation-auc:0.97108 validation-aucpr:0.97562
[18:35:30] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[38] validation-logloss:0.46010 validation-auc:0.97102 validation-aucpr:0.97557
[18:35:30] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[39] validation-logloss:0.45598 validation-auc:0.97109 validation-aucpr:0.97563
[18:35:31] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[40] validation-logloss:0.45198 validation-auc:0.97100 validation-aucpr:0.97559
[18:35:31] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[41] validation-logloss:0.44804 validation-auc:0.97092 validation-aucpr:0.97553
[18:35:31] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[42] validation-logloss:0.44415 validation-auc:0.97091 validation-aucpr:0.97552
[18:35:31] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[43] validation-logloss:0.44026 validation-auc:0.97101 validation-aucpr:0.97559
[18:35:31] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[44] validation-logloss:0.43692 validation-auc:0.97092 validation-aucpr:0.97553
[18:35:32] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[45] validation-logloss:0.43322 validation-auc:0.97094 validation-aucpr:0.97557
[18:35:32] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[46] validation-logloss:0.42954 validation-auc:0.97103 validation-aucpr:0.97564
[18:35:32] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[47] validation-logloss:0.42597 validation-auc:0.97112 validation-aucpr:0.97569
[18:35:32] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[48] validation-logloss:0.42274 validation-auc:0.97117 validation-aucpr:0.97572
[18:35:33] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[49] validation-logloss:0.41921 validation-auc:0.97116 validation-aucpr:0.97573
[18:35:33] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[50] validation-logloss:0.41577 validation-auc:0.97119 validation-aucpr:0.97578
[51]	validation-logloss:0.41243	validation-auc:0.97125	validation-aucpr:0.97583
[52]	validation-logloss:0.40941	validation-auc:0.97120	validation-aucpr:0.97578
[53]	validation-logloss:0.40643	validation-auc:0.97129	validation-aucpr:0.97584
[54]	validation-logloss:0.40325	validation-auc:0.97121	validation-aucpr:0.97580
[55]	validation-logloss:0.40018	validation-auc:0.97122	validation-aucpr:0.97581
[56]	validation-logloss:0.39746	validation-auc:0.97106	validation-aucpr:0.97570
[57]	validation-logloss:0.39470	validation-auc:0.97108	validation-aucpr:0.97571
[58]	validation-logloss:0.39168	validation-auc:0.97114	validation-aucpr:0.97574
[59]	validation-logloss:0.38911	validation-auc:0.97105	validation-aucpr:0.97569
[60]	validation-logloss:0.38625	validation-auc:0.97108	validation-aucpr:0.97572
[61]	validation-logloss:0.38333	validation-auc:0.97109	validation-aucpr:0.97574
[62]	validation-logloss:0.38052	validation-auc:0.97110	validation-aucpr:0.97576
[63]	validation-logloss:0.37775	validation-auc:0.97111	validation-aucpr:0.97576
[64]	validation-logloss:0.37492	validation-auc:0.97116	validation-aucpr:0.97579
[65]	validation-logloss:0.37221	validation-auc:0.97123	validation-aucpr:0.97585
[66]	validation-logloss:0.36953	validation-auc:0.97124	validation-aucpr:0.97585
[67]	validation-logloss:0.36694	validation-auc:0.97127	validation-aucpr:0.97588
[68]	validation-logloss:0.36434	validation-auc:0.97133	validation-aucpr:0.97593
[69]	validation-logloss:0.36181	validation-auc:0.97140	validation-aucpr:0.97599
[70]	validation-logloss:0.35930	validation-auc:0.97140	validation-aucpr:0.97598
[71]	validation-logloss:0.35676	validation-auc:0.97146	validation-aucpr:0.97602
[72]	validation-logloss:0.35438	validation-auc:0.97140	validation-aucpr:0.97598
[73]	validation-logloss:0.35235	validation-auc:0.97130	validation-aucpr:0.97591
[74]	validation-logloss:0.35002	validation-auc:0.97129	validation-aucpr:0.97592
[75]	validation-logloss:0.34775	validation-auc:0.97130	validation-aucpr:0.97592
[76]	validation-logloss:0.34539	validation-auc:0.97141	validation-aucpr:0.97601
[77]	validation-logloss:0.34339	validation-auc:0.97138	validation-aucpr:0.97598
[78]	validation-logloss:0.34112	validation-auc:0.97142	validation-aucpr:0.97602
[79]	validation-logloss:0.33922	validation-auc:0.97139	validation-aucpr:0.97600
[80]	validation-logloss:0.33704	validation-auc:0.97142	validation-aucpr:0.97604
[81]	validation-logloss:0.33512	validation-auc:0.97138	validation-aucpr:0.97600
[82]	validation-logloss:0.33327	validation-auc:0.97135	validation-aucpr:0.97596
{'best_iteration': '80', 'best_score': '0.976037216196178'}
Trial 21, Fold 4: Log loss = 0.33326755809500186, Average precision = 0.9759524422770561, ROC-AUC = 0.9713480379884164, Elapsed Time = 20.94706589999987 seconds
Trial 21, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 21, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0]	validation-logloss:0.68434	validation-auc:0.93097	validation-aucpr:0.87197
[1]	validation-logloss:0.67656	validation-auc:0.95917	validation-aucpr:0.95191
[2]	validation-logloss:0.66816	validation-auc:0.96334	validation-aucpr:0.96412
[3]	validation-logloss:0.66002	validation-auc:0.96502	validation-aucpr:0.97028
[4]	validation-logloss:0.65268	validation-auc:0.96527	validation-aucpr:0.97029
[5]	validation-logloss:0.64495	validation-auc:0.96578	validation-aucpr:0.97074
[6]	validation-logloss:0.63785	validation-auc:0.96613	validation-aucpr:0.97096
[7]	validation-logloss:0.63045	validation-auc:0.96691	validation-aucpr:0.97153
[8]	validation-logloss:0.62302	validation-auc:0.96766	validation-aucpr:0.97203
[9]	validation-logloss:0.61574	validation-auc:0.96799	validation-aucpr:0.97229
[10]	validation-logloss:0.60869	validation-auc:0.96824	validation-aucpr:0.97244
[11]	validation-logloss:0.60172	validation-auc:0.96851	validation-aucpr:0.97266
[12]	validation-logloss:0.59497	validation-auc:0.96852	validation-aucpr:0.97276
[13]	validation-logloss:0.58847	validation-auc:0.96822	validation-aucpr:0.97264
[14]	validation-logloss:0.58202	validation-auc:0.96847	validation-aucpr:0.97283
[15]	validation-logloss:0.57573	validation-auc:0.96874	validation-aucpr:0.97309
[16]	validation-logloss:0.56960	validation-auc:0.96891	validation-aucpr:0.97318
[17]	validation-logloss:0.56358	validation-auc:0.96899	validation-aucpr:0.97328
[18]	validation-logloss:0.55769	validation-auc:0.96880	validation-aucpr:0.97310
[19]	validation-logloss:0.55193	validation-auc:0.96892	validation-aucpr:0.97317
[20]	validation-logloss:0.54612	validation-auc:0.96895	validation-aucpr:0.97320
[21]	validation-logloss:0.54043	validation-auc:0.96923	validation-aucpr:0.97339
[22]	validation-logloss:0.53557	validation-auc:0.96918	validation-aucpr:0.97326
[23]	validation-logloss:0.53008	validation-auc:0.96932	validation-aucpr:0.97337
[24]	validation-logloss:0.52523	validation-auc:0.96923	validation-aucpr:0.97331
[25]	validation-logloss:0.52036	validation-auc:0.96928	validation-aucpr:0.97332
[26]	validation-logloss:0.51517	validation-auc:0.96940	validation-aucpr:0.97344
[27]	validation-logloss:0.51029	validation-auc:0.96944	validation-aucpr:0.97350
[28]	validation-logloss:0.50529	validation-auc:0.96955	validation-aucpr:0.97359
[29]	validation-logloss:0.50040	validation-auc:0.96962	validation-aucpr:0.97358
[30]	validation-logloss:0.49574	validation-auc:0.96958	validation-aucpr:0.97360
[31]	validation-logloss:0.49100	validation-auc:0.96961	validation-aucpr:0.97361
[32]	validation-logloss:0.48692	validation-auc:0.96962	validation-aucpr:0.97364
[33]	validation-logloss:0.48234	validation-auc:0.96970	validation-aucpr:0.97370
[34]	validation-logloss:0.47788	validation-auc:0.96973	validation-aucpr:0.97374
[35]	validation-logloss:0.47356	validation-auc:0.96974	validation-aucpr:0.97368
[36]	validation-logloss:0.46928	validation-auc:0.96973	validation-aucpr:0.97362
[37]	validation-logloss:0.46503	validation-auc:0.96981	validation-aucpr:0.97373
[38]	validation-logloss:0.46093	validation-auc:0.96991	validation-aucpr:0.97376
[39]	validation-logloss:0.45739	validation-auc:0.96983	validation-aucpr:0.97370
[40]	validation-logloss:0.45344	validation-auc:0.96972	validation-aucpr:0.97366
[41]	validation-logloss:0.44946	validation-auc:0.96980	validation-aucpr:0.97373
[42]	validation-logloss:0.44564	validation-auc:0.96980	validation-aucpr:0.97374
[43]	validation-logloss:0.44177	validation-auc:0.96992	validation-aucpr:0.97383
[44]	validation-logloss:0.43836	validation-auc:0.96992	validation-aucpr:0.97382
[45]	validation-logloss:0.43469	validation-auc:0.96993	validation-aucpr:0.97383
[46]	validation-logloss:0.43142	validation-auc:0.96994	validation-aucpr:0.97383
[47]	validation-logloss:0.42792	validation-auc:0.97010	validation-aucpr:0.97388
[48]	validation-logloss:0.42446	validation-auc:0.97005	validation-aucpr:0.97386
[49]	validation-logloss:0.42105	validation-auc:0.97003	validation-aucpr:0.97386
[50]	validation-logloss:0.41782	validation-auc:0.96991	validation-aucpr:0.97378
[51]	validation-logloss:0.41486	validation-auc:0.96999	validation-aucpr:0.97382
[52]	validation-logloss:0.41170	validation-auc:0.96997	validation-aucpr:0.97380
[53]	validation-logloss:0.40846	validation-auc:0.97005	validation-aucpr:0.97384
[54]	validation-logloss:0.40520	validation-auc:0.97017	validation-aucpr:0.97396
[55]	validation-logloss:0.40202	validation-auc:0.97030	validation-aucpr:0.97406
[56]	validation-logloss:0.39904	validation-auc:0.97031	validation-aucpr:0.97406
[57]	validation-logloss:0.39636	validation-auc:0.97025	validation-aucpr:0.97398
[58]	validation-logloss:0.39333	validation-auc:0.97029	validation-aucpr:0.97401
[59]	validation-logloss:0.39036	validation-auc:0.97039	validation-aucpr:0.97409
[60]	validation-logloss:0.38775	validation-auc:0.97028	validation-aucpr:0.97274
[61]	validation-logloss:0.38493	validation-auc:0.97023	validation-aucpr:0.97274
[62]	validation-logloss:0.38216	validation-auc:0.97027	validation-aucpr:0.97277
[63]	validation-logloss:0.37940	validation-auc:0.97025	validation-aucpr:0.97276
[64]	validation-logloss:0.37695	validation-auc:0.97023	validation-aucpr:0.97266
[65]	validation-logloss:0.37424	validation-auc:0.97030	validation-aucpr:0.97267
[66]	validation-logloss:0.37192	validation-auc:0.97034	validation-aucpr:0.97270
[67]	validation-logloss:0.36929	validation-auc:0.97043	validation-aucpr:0.97274
[68]	validation-logloss:0.36670	validation-auc:0.97048	validation-aucpr:0.97280
[69]	validation-logloss:0.36455	validation-auc:0.97044	validation-aucpr:0.97296
[70]	validation-logloss:0.36233	validation-auc:0.97038	validation-aucpr:0.97290
[71]	validation-logloss:0.35991	validation-auc:0.97044	validation-aucpr:0.97295
[72]	validation-logloss:0.35777	validation-auc:0.97038	validation-aucpr:0.97290
[73]	validation-logloss:0.35537	validation-auc:0.97045	validation-aucpr:0.97295
[74]	validation-logloss:0.35309	validation-auc:0.97051	validation-aucpr:0.97304
[75]	validation-logloss:0.35070	validation-auc:0.97061	validation-aucpr:0.97315
[76]	validation-logloss:0.34845	validation-auc:0.97058	validation-aucpr:0.97313
[77]	validation-logloss:0.34644	validation-auc:0.97058	validation-aucpr:0.97312
[78]	validation-logloss:0.34419	validation-auc:0.97066	validation-aucpr:0.97317
[79]	validation-logloss:0.34202	validation-auc:0.97070	validation-aucpr:0.97318
[80]	validation-logloss:0.34013	validation-auc:0.97066	validation-aucpr:0.97295
[81]	validation-logloss:0.33804	validation-auc:0.97068	validation-aucpr:0.97309
[82]	validation-logloss:0.33600	validation-auc:0.97074	validation-aucpr:0.97312
{'best_iteration': '59', 'best_score': '0.9740903400979094'}
Trial 21, Fold 5: Log loss = 0.33599993079141377, Average precision = 0.974024744362116, ROC-AUC = 0.9707432881853483, Elapsed Time = 22.729992100001255 seconds
Optimization Progress: 22%|##2 | 22/100 [37:17<1:30:15, 69.43s/it]
Trial 22, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 22, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0]	validation-logloss:0.66534	validation-auc:0.82859	validation-aucpr:0.81740
[1]	validation-logloss:0.63069	validation-auc:0.84888	validation-aucpr:0.82950
[2]	validation-logloss:0.61584	validation-auc:0.86316	validation-aucpr:0.85073
[3]	validation-logloss:0.61219	validation-auc:0.85958	validation-aucpr:0.85603
[4]	validation-logloss:0.61037	validation-auc:0.85655	validation-aucpr:0.84991
[5]	validation-logloss:0.60284	validation-auc:0.85753	validation-aucpr:0.83697
[6]	validation-logloss:0.59989	validation-auc:0.84902	validation-aucpr:0.83168
[7]	validation-logloss:0.58183	validation-auc:0.85358	validation-aucpr:0.84506
[8]	validation-logloss:0.58032	validation-auc:0.84760	validation-aucpr:0.83470
[9]	validation-logloss:0.55682	validation-auc:0.86457	validation-aucpr:0.85791
[10]	validation-logloss:0.53023	validation-auc:0.90063	validation-aucpr:0.90279
[11]	validation-logloss:0.51509	validation-auc:0.91001	validation-aucpr:0.91155
[12]	validation-logloss:0.51445	validation-auc:0.90667	validation-aucpr:0.90656
{'best_iteration': '11', 'best_score': '0.9115515191694321'}
Trial 22, Fold 1: Log loss = 0.5144471949959329, Average precision = 0.9039888838431559, ROC-AUC = 0.9066697076803218, Elapsed Time = 0.4393142999997508 seconds
Trial 22, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 22, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.66534 validation-auc:0.82738 validation-aucpr:0.82150
[1] validation-logloss:0.65950 validation-auc:0.82413 validation-aucpr:0.82233
[2] validation-logloss:0.64338 validation-auc:0.82202 validation-aucpr:0.78106
[3] validation-logloss:0.64025 validation-auc:0.80404 validation-aucpr:0.75031
[4] validation-logloss:0.61070 validation-auc:0.84332 validation-aucpr:0.79836
[5] validation-logloss:0.60712 validation-auc:0.83594 validation-aucpr:0.79079
[6] validation-logloss:0.60595 validation-auc:0.82719 validation-aucpr:0.77069
[7] validation-logloss:0.60434 validation-auc:0.82479 validation-aucpr:0.76080
[8] validation-logloss:0.60269 validation-auc:0.82752 validation-aucpr:0.76604
[9] validation-logloss:0.60190 validation-auc:0.82646 validation-aucpr:0.76179
[10] validation-logloss:0.59018 validation-auc:0.83292 validation-aucpr:0.76908
[11] validation-logloss:0.55934 validation-auc:0.86031 validation-aucpr:0.82476
[12] validation-logloss:0.55321 validation-auc:0.85851 validation-aucpr:0.82720
{'best_iteration': '12', 'best_score': '0.82720031891643'}
Trial 22, Fold 2: Log loss = 0.553212866445822, Average precision = 0.8235345231859007, ROC-AUC = 0.858507178125496, Elapsed Time = 0.4681982000001881 seconds
Trial 22, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 22, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.66482 validation-auc:0.82983 validation-aucpr:0.81626
[1] validation-logloss:0.66482 validation-auc:0.82983 validation-aucpr:0.81626
[2] validation-logloss:0.65871 validation-auc:0.81927 validation-aucpr:0.80524
[3] validation-logloss:0.65577 validation-auc:0.79766 validation-aucpr:0.73470
[4] validation-logloss:0.65339 validation-auc:0.79757 validation-aucpr:0.73619
[5] validation-logloss:0.64140 validation-auc:0.79833 validation-aucpr:0.73805
[6] validation-logloss:0.64140 validation-auc:0.79833 validation-aucpr:0.73805
[7] validation-logloss:0.63474 validation-auc:0.80063 validation-aucpr:0.75471
[8] validation-logloss:0.62018 validation-auc:0.81373 validation-aucpr:0.76437
[9] validation-logloss:0.60200 validation-auc:0.84022 validation-aucpr:0.79799
[10] validation-logloss:0.57422 validation-auc:0.86227 validation-aucpr:0.83985
[11] validation-logloss:0.55015 validation-auc:0.87818 validation-aucpr:0.87099
[12] validation-logloss:0.54954 validation-auc:0.87116 validation-aucpr:0.85570
{'best_iteration': '11', 'best_score': '0.8709905104598027'}
Trial 22, Fold 3: Log loss = 0.5495449233834574, Average precision = 0.8494831472255703, ROC-AUC = 0.8711569912887287, Elapsed Time = 0.5590210000009392 seconds
Trial 22, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 22, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.66513 validation-auc:0.82575 validation-aucpr:0.80961
[1] validation-logloss:0.65613 validation-auc:0.81019 validation-aucpr:0.79149
[2] validation-logloss:0.62651 validation-auc:0.85436 validation-aucpr:0.85019
[3] validation-logloss:0.61667 validation-auc:0.85176 validation-aucpr:0.84246
[4] validation-logloss:0.60990 validation-auc:0.84522 validation-aucpr:0.83211
[5] validation-logloss:0.58277 validation-auc:0.86419 validation-aucpr:0.85499
[6] validation-logloss:0.58013 validation-auc:0.86915 validation-aucpr:0.86664
[7] validation-logloss:0.57805 validation-auc:0.85937 validation-aucpr:0.85414
[8] validation-logloss:0.55521 validation-auc:0.87155 validation-aucpr:0.86867
[9] validation-logloss:0.55454 validation-auc:0.86622 validation-aucpr:0.86019
[10] validation-logloss:0.55367 validation-auc:0.86482 validation-aucpr:0.85700
[11] validation-logloss:0.55322 validation-auc:0.85839 validation-aucpr:0.84371
[12] validation-logloss:0.55183 validation-auc:0.85330 validation-aucpr:0.83146
{'best_iteration': '8', 'best_score': '0.8686678124859214'}
Trial 22, Fold 4: Log loss = 0.5518270399325328, Average precision = 0.8193038375097094, ROC-AUC = 0.8533011259182958, Elapsed Time = 0.5348350000003848 seconds
Trial 22, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 22, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.66578 validation-auc:0.81850 validation-aucpr:0.82146
[1] validation-logloss:0.65403 validation-auc:0.81988 validation-aucpr:0.82139
[2] validation-logloss:0.64183 validation-auc:0.85871 validation-aucpr:0.85673
[3] validation-logloss:0.63956 validation-auc:0.84010 validation-aucpr:0.80902
[4] validation-logloss:0.62942 validation-auc:0.82334 validation-aucpr:0.78522
[5] validation-logloss:0.62667 validation-auc:0.81153 validation-aucpr:0.74412
[6] validation-logloss:0.61833 validation-auc:0.80627 validation-aucpr:0.74120
[7] validation-logloss:0.59032 validation-auc:0.84637 validation-aucpr:0.83656
[8] validation-logloss:0.56555 validation-auc:0.86916 validation-aucpr:0.87463
[9] validation-logloss:0.54567 validation-auc:0.87825 validation-aucpr:0.88727
[10] validation-logloss:0.52925 validation-auc:0.88512 validation-aucpr:0.89739
[11] validation-logloss:0.51573 validation-auc:0.90352 validation-aucpr:0.92329
[12] validation-logloss:0.51248 validation-auc:0.90250 validation-aucpr:0.92071
{'best_iteration': '11', 'best_score': '0.9232917725587191'}
Trial 22, Fold 5: Log loss = 0.5124814976954526, Average precision = 0.9202401278312367, ROC-AUC = 0.902498931683481, Elapsed Time = 0.5430703000001813 seconds
Optimization Progress: 23% | 23/100 [37:27<1:06:26, 51.77s/it]
Trial 23, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 23, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.65142 validation-auc:0.94745 validation-aucpr:0.95243
[1] validation-logloss:0.62137 validation-auc:0.94927 validation-aucpr:0.95089
[2] validation-logloss:0.59166 validation-auc:0.95627 validation-aucpr:0.95819
[3] validation-logloss:0.56790 validation-auc:0.95636 validation-aucpr:0.96088
[4] validation-logloss:0.54506 validation-auc:0.95734 validation-aucpr:0.96296
[5] validation-logloss:0.52566 validation-auc:0.95732 validation-aucpr:0.96323
[6] validation-logloss:0.50700 validation-auc:0.95756 validation-aucpr:0.96351
[7] validation-logloss:0.49095 validation-auc:0.95730 validation-aucpr:0.96324
[8] validation-logloss:0.47611 validation-auc:0.95725 validation-aucpr:0.96301
[9] validation-logloss:0.46227 validation-auc:0.95769 validation-aucpr:0.96355
[10] validation-logloss:0.44962 validation-auc:0.95778 validation-aucpr:0.96382
[11] validation-logloss:0.43869 validation-auc:0.95727 validation-aucpr:0.96319
[12] validation-logloss:0.42806 validation-auc:0.95693 validation-aucpr:0.96267
[13] validation-logloss:0.41717 validation-auc:0.95729 validation-aucpr:0.96305
[14] validation-logloss:0.40813 validation-auc:0.95732 validation-aucpr:0.96313
[15] validation-logloss:0.39866 validation-auc:0.95750 validation-aucpr:0.96324
[16] validation-logloss:0.39102 validation-auc:0.95731 validation-aucpr:0.96300
[17] validation-logloss:0.38266 validation-auc:0.95769 validation-aucpr:0.96357
[18] validation-logloss:0.37561 validation-auc:0.95784 validation-aucpr:0.96363
[19] validation-logloss:0.36781 validation-auc:0.95813 validation-aucpr:0.96379
[20] validation-logloss:0.35548 validation-auc:0.95962 validation-aucpr:0.96542
[21] validation-logloss:0.34964 validation-auc:0.95965 validation-aucpr:0.96545
[22] validation-logloss:0.34032 validation-auc:0.96010 validation-aucpr:0.96593
[23] validation-logloss:0.33570 validation-auc:0.96010 validation-aucpr:0.96595
[24] validation-logloss:0.33073 validation-auc:0.96006 validation-aucpr:0.96597
[25] validation-logloss:0.32643 validation-auc:0.96002 validation-aucpr:0.96592
[26] validation-logloss:0.32250 validation-auc:0.96016 validation-aucpr:0.96612
[27] validation-logloss:0.31781 validation-auc:0.96024 validation-aucpr:0.96620
[28] validation-logloss:0.31404 validation-auc:0.96027 validation-aucpr:0.96624
[29] validation-logloss:0.31084 validation-auc:0.96014 validation-aucpr:0.96610
[30] validation-logloss:0.30620 validation-auc:0.96034 validation-aucpr:0.96620
[31] validation-logloss:0.30307 validation-auc:0.96037 validation-aucpr:0.96618
[32] validation-logloss:0.30012 validation-auc:0.96037 validation-aucpr:0.96621
[33] validation-logloss:0.29406 validation-auc:0.96078 validation-aucpr:0.96672
[34] validation-logloss:0.29074 validation-auc:0.96098 validation-aucpr:0.96685
[35] validation-logloss:0.28843 validation-auc:0.96086 validation-aucpr:0.96678
[36] validation-logloss:0.28256 validation-auc:0.96131 validation-aucpr:0.96724
[37] validation-logloss:0.28047 validation-auc:0.96140 validation-aucpr:0.96734
[38] validation-logloss:0.27881 validation-auc:0.96137 validation-aucpr:0.96733
[39] validation-logloss:0.27380 validation-auc:0.96173 validation-aucpr:0.96775
[40] validation-logloss:0.27198 validation-auc:0.96188 validation-aucpr:0.96784
[41] validation-logloss:0.26990 validation-auc:0.96197 validation-aucpr:0.96797
[42] validation-logloss:0.26509 validation-auc:0.96240 validation-aucpr:0.96838
[43] validation-logloss:0.26108 validation-auc:0.96266 validation-aucpr:0.96871
[44] validation-logloss:0.25987 validation-auc:0.96274 validation-aucpr:0.96882
[45] validation-logloss:0.25749 validation-auc:0.96296 validation-aucpr:0.96896
[46] validation-logloss:0.25402 validation-auc:0.96326 validation-aucpr:0.96924
[47] validation-logloss:0.25303 validation-auc:0.96322 validation-aucpr:0.96920
[48] validation-logloss:0.24918 validation-auc:0.96356 validation-aucpr:0.96953
[49] validation-logloss:0.24767 validation-auc:0.96370 validation-aucpr:0.96968
{'best_iteration': '49', 'best_score': '0.9696847725863038'}
Trial 23, Fold 1: Log loss = 0.24766681338758925, Average precision = 0.9696768625925263, ROC-AUC = 0.9637044714252259, Elapsed Time = 0.7275594000002457 seconds
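The per-fold summary metrics (Log loss, Average precision, ROC-AUC) can be computed from a fold's validation labels and predicted probabilities with the scikit-learn functions imported at the top of the notebook. A toy sketch with made-up labels and probabilities, not actual fold predictions:

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

# Hypothetical validation labels and predicted positive-class probabilities;
# in the notebook these come from the trained booster on the fold's hold-out.
y_val = np.array([0, 1, 1, 0, 1, 0, 1, 1])
p_val = np.array([0.2, 0.8, 0.7, 0.3, 0.9, 0.4, 0.6, 0.55])

ll = log_loss(y_val, p_val)                 # penalizes confident wrong probabilities
ap = average_precision_score(y_val, p_val)  # area under the precision-recall curve
auc = roc_auc_score(y_val, p_val)           # ranking quality across thresholds

print(f"Log loss = {ll}, Average precision = {ap}, ROC-AUC = {auc}")
```

Note that AP here is the sklearn step-wise estimate of PR-AUC, so it can differ slightly from XGBoost's `aucpr` on the same predictions, which is visible in the logs (e.g. Fold 1's aucpr 0.96968 vs AP 0.96967).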
Trial 23, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 23, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.65076 validation-auc:0.95547 validation-aucpr:0.95878
[1] validation-logloss:0.62079 validation-auc:0.95520 validation-aucpr:0.95925
[2] validation-logloss:0.59094 validation-auc:0.96035 validation-aucpr:0.96300
[3] validation-logloss:0.56574 validation-auc:0.95943 validation-aucpr:0.96253
[4] validation-logloss:0.54324 validation-auc:0.95905 validation-aucpr:0.96279
[5] validation-logloss:0.52416 validation-auc:0.95799 validation-aucpr:0.96168
[6] validation-logloss:0.50594 validation-auc:0.95773 validation-aucpr:0.96156
[7] validation-logloss:0.48921 validation-auc:0.95809 validation-aucpr:0.96176
[8] validation-logloss:0.47411 validation-auc:0.95814 validation-aucpr:0.96165
[9] validation-logloss:0.46001 validation-auc:0.95822 validation-aucpr:0.96180
[10] validation-logloss:0.44764 validation-auc:0.95859 validation-aucpr:0.96215
[11] validation-logloss:0.43637 validation-auc:0.95820 validation-aucpr:0.96164
[12] validation-logloss:0.42511 validation-auc:0.95776 validation-aucpr:0.96100
[13] validation-logloss:0.41542 validation-auc:0.95809 validation-aucpr:0.96139
[14] validation-logloss:0.40547 validation-auc:0.95825 validation-aucpr:0.96143
[15] validation-logloss:0.39649 validation-auc:0.95792 validation-aucpr:0.96112
[16] validation-logloss:0.38883 validation-auc:0.95767 validation-aucpr:0.96081
[17] validation-logloss:0.38098 validation-auc:0.95788 validation-aucpr:0.96105
[18] validation-logloss:0.37365 validation-auc:0.95801 validation-aucpr:0.96114
[19] validation-logloss:0.36631 validation-auc:0.95787 validation-aucpr:0.96084
[20] validation-logloss:0.35361 validation-auc:0.96002 validation-aucpr:0.96324
[21] validation-logloss:0.34797 validation-auc:0.96005 validation-aucpr:0.96396
[22] validation-logloss:0.33817 validation-auc:0.96124 validation-aucpr:0.96533
[23] validation-logloss:0.33317 validation-auc:0.96124 validation-aucpr:0.96524
[24] validation-logloss:0.32859 validation-auc:0.96103 validation-aucpr:0.96492
[25] validation-logloss:0.32413 validation-auc:0.96115 validation-aucpr:0.96500
[26] validation-logloss:0.32020 validation-auc:0.96120 validation-aucpr:0.96508
[27] validation-logloss:0.31554 validation-auc:0.96137 validation-aucpr:0.96520
[28] validation-logloss:0.31202 validation-auc:0.96132 validation-aucpr:0.96507
[29] validation-logloss:0.30877 validation-auc:0.96123 validation-aucpr:0.96495
[30] validation-logloss:0.30487 validation-auc:0.96133 validation-aucpr:0.96507
[31] validation-logloss:0.30191 validation-auc:0.96120 validation-aucpr:0.96481
[32] validation-logloss:0.29842 validation-auc:0.96149 validation-aucpr:0.96495
[33] validation-logloss:0.29569 validation-auc:0.96156 validation-aucpr:0.96495
[34] validation-logloss:0.29351 validation-auc:0.96147 validation-aucpr:0.96481
[35] validation-logloss:0.29049 validation-auc:0.96158 validation-aucpr:0.96485
[36] validation-logloss:0.28788 validation-auc:0.96178 validation-aucpr:0.96504
[37] validation-logloss:0.28599 validation-auc:0.96173 validation-aucpr:0.96522
[38] validation-logloss:0.28160 validation-auc:0.96221 validation-aucpr:0.96574
[39] validation-logloss:0.27990 validation-auc:0.96224 validation-aucpr:0.96577
[40] validation-logloss:0.27426 validation-auc:0.96297 validation-aucpr:0.96660
[41] validation-logloss:0.27208 validation-auc:0.96303 validation-aucpr:0.96658
[42] validation-logloss:0.27032 validation-auc:0.96320 validation-aucpr:0.96666
[43] validation-logloss:0.26775 validation-auc:0.96356 validation-aucpr:0.96697
[44] validation-logloss:0.26599 validation-auc:0.96380 validation-aucpr:0.96717
[45] validation-logloss:0.26078 validation-auc:0.96424 validation-aucpr:0.96769
[46] validation-logloss:0.25623 validation-auc:0.96467 validation-aucpr:0.96822
[47] validation-logloss:0.25439 validation-auc:0.96492 validation-aucpr:0.96840
[48] validation-logloss:0.25253 validation-auc:0.96509 validation-aucpr:0.96856
[49] validation-logloss:0.25123 validation-auc:0.96519 validation-aucpr:0.96855
{'best_iteration': '48', 'best_score': '0.9685554995351819'}
Trial 23, Fold 2: Log loss = 0.2512295796579628, Average precision = 0.9685552808996968, ROC-AUC = 0.9651874646617346, Elapsed Time = 0.9321792999999161 seconds
Trial 23, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 23, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.65094 validation-auc:0.95011 validation-aucpr:0.95493
[1] validation-logloss:0.62074 validation-auc:0.95393 validation-aucpr:0.95924
[2] validation-logloss:0.59084 validation-auc:0.95915 validation-aucpr:0.96202
[3] validation-logloss:0.56659 validation-auc:0.95978 validation-aucpr:0.96225
[4] validation-logloss:0.54377 validation-auc:0.96088 validation-aucpr:0.96407
[5] validation-logloss:0.52402 validation-auc:0.96068 validation-aucpr:0.96381
[6] validation-logloss:0.50510 validation-auc:0.96063 validation-aucpr:0.96433
[7] validation-logloss:0.48890 validation-auc:0.96084 validation-aucpr:0.96447
[8] validation-logloss:0.47362 validation-auc:0.96101 validation-aucpr:0.96516
[9] validation-logloss:0.45971 validation-auc:0.96097 validation-aucpr:0.96512
[10] validation-logloss:0.44767 validation-auc:0.96091 validation-aucpr:0.96561
[11] validation-logloss:0.43616 validation-auc:0.96086 validation-aucpr:0.96555
[12] validation-logloss:0.42504 validation-auc:0.96080 validation-aucpr:0.96553
[13] validation-logloss:0.41525 validation-auc:0.96058 validation-aucpr:0.96536
[14] validation-logloss:0.40553 validation-auc:0.96068 validation-aucpr:0.96557
[15] validation-logloss:0.39607 validation-auc:0.96051 validation-aucpr:0.96532
[16] validation-logloss:0.38830 validation-auc:0.96043 validation-aucpr:0.96513
[17] validation-logloss:0.38025 validation-auc:0.96041 validation-aucpr:0.96517
[18] validation-logloss:0.37246 validation-auc:0.96107 validation-aucpr:0.96578
[19] validation-logloss:0.36432 validation-auc:0.96102 validation-aucpr:0.96567
[20] validation-logloss:0.35163 validation-auc:0.96277 validation-aucpr:0.96762
[21] validation-logloss:0.34558 validation-auc:0.96282 validation-aucpr:0.96765
[22] validation-logloss:0.34035 validation-auc:0.96282 validation-aucpr:0.96757
[23] validation-logloss:0.33545 validation-auc:0.96275 validation-aucpr:0.96748
[24] validation-logloss:0.33083 validation-auc:0.96262 validation-aucpr:0.96742
[25] validation-logloss:0.32681 validation-auc:0.96253 validation-aucpr:0.96728
[26] validation-logloss:0.32234 validation-auc:0.96271 validation-aucpr:0.96742
[27] validation-logloss:0.31705 validation-auc:0.96295 validation-aucpr:0.96766
[28] validation-logloss:0.30917 validation-auc:0.96358 validation-aucpr:0.96839
[29] validation-logloss:0.30557 validation-auc:0.96361 validation-aucpr:0.96839
[30] validation-logloss:0.30173 validation-auc:0.96373 validation-aucpr:0.96850
[31] validation-logloss:0.29417 validation-auc:0.96419 validation-aucpr:0.96906
[32] validation-logloss:0.29089 validation-auc:0.96424 validation-aucpr:0.96905
[33] validation-logloss:0.28750 validation-auc:0.96454 validation-aucpr:0.96928
[34] validation-logloss:0.28470 validation-auc:0.96464 validation-aucpr:0.96931
[35] validation-logloss:0.28159 validation-auc:0.96483 validation-aucpr:0.96945
[36] validation-logloss:0.27894 validation-auc:0.96503 validation-aucpr:0.96960
[37] validation-logloss:0.27702 validation-auc:0.96496 validation-aucpr:0.96950
[38] validation-logloss:0.27312 validation-auc:0.96528 validation-aucpr:0.96985
[39] validation-logloss:0.27116 validation-auc:0.96537 validation-aucpr:0.96993
[40] validation-logloss:0.26910 validation-auc:0.96543 validation-aucpr:0.96994
[41] validation-logloss:0.26757 validation-auc:0.96546 validation-aucpr:0.96990
[42] validation-logloss:0.26571 validation-auc:0.96558 validation-aucpr:0.96999
[43] validation-logloss:0.26388 validation-auc:0.96563 validation-aucpr:0.97009
[44] validation-logloss:0.26222 validation-auc:0.96565 validation-aucpr:0.97012
[45] validation-logloss:0.26045 validation-auc:0.96573 validation-aucpr:0.97019
[46] validation-logloss:0.25509 validation-auc:0.96613 validation-aucpr:0.97062
[47] validation-logloss:0.25374 validation-auc:0.96619 validation-aucpr:0.97069
[48] validation-logloss:0.24965 validation-auc:0.96642 validation-aucpr:0.97098
[49] validation-logloss:0.24819 validation-auc:0.96651 validation-aucpr:0.97104
{'best_iteration': '49', 'best_score': '0.9710446138361724'}
Trial 23, Fold 3: Log loss = 0.2481948596089837, Average precision = 0.9710479627303537, ROC-AUC = 0.9665101071693425, Elapsed Time = 0.9626014999994368 seconds
Trial 23, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 23, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.65127 validation-auc:0.95231 validation-aucpr:0.95618
[1] validation-logloss:0.62100 validation-auc:0.95361 validation-aucpr:0.95967
[2] validation-logloss:0.59108 validation-auc:0.95902 validation-aucpr:0.96362
[3] validation-logloss:0.56568 validation-auc:0.95876 validation-aucpr:0.96329
[4] validation-logloss:0.54216 validation-auc:0.95913 validation-aucpr:0.96411
[5] validation-logloss:0.52275 validation-auc:0.95806 validation-aucpr:0.96311
[6] validation-logloss:0.50436 validation-auc:0.95771 validation-aucpr:0.96312
[7] validation-logloss:0.48816 validation-auc:0.95754 validation-aucpr:0.96301
[8] validation-logloss:0.46874 validation-auc:0.95865 validation-aucpr:0.96445
[9] validation-logloss:0.45442 validation-auc:0.95822 validation-aucpr:0.96400
[10] validation-logloss:0.44039 validation-auc:0.95868 validation-aucpr:0.96439
[11] validation-logloss:0.42853 validation-auc:0.95842 validation-aucpr:0.96421
[12] validation-logloss:0.41335 validation-auc:0.95939 validation-aucpr:0.96542
[13] validation-logloss:0.40389 validation-auc:0.95910 validation-aucpr:0.96502
[14] validation-logloss:0.39455 validation-auc:0.95928 validation-aucpr:0.96531
[15] validation-logloss:0.38641 validation-auc:0.95885 validation-aucpr:0.96474
[16] validation-logloss:0.37758 validation-auc:0.95947 validation-aucpr:0.96527
[17] validation-logloss:0.37064 validation-auc:0.95915 validation-aucpr:0.96510
[18] validation-logloss:0.36283 validation-auc:0.95939 validation-aucpr:0.96515
[19] validation-logloss:0.35623 validation-auc:0.95959 validation-aucpr:0.96535
[20] validation-logloss:0.34765 validation-auc:0.95990 validation-aucpr:0.96570
[21] validation-logloss:0.33788 validation-auc:0.96041 validation-aucpr:0.96618
[22] validation-logloss:0.33286 validation-auc:0.96047 validation-aucpr:0.96629
[23] validation-logloss:0.32806 validation-auc:0.96056 validation-aucpr:0.96638
[24] validation-logloss:0.32353 validation-auc:0.96055 validation-aucpr:0.96643
[25] validation-logloss:0.31975 validation-auc:0.96031 validation-aucpr:0.96618
[26] validation-logloss:0.31514 validation-auc:0.96067 validation-aucpr:0.96655
[27] validation-logloss:0.31032 validation-auc:0.96126 validation-aucpr:0.96711
[28] validation-logloss:0.30632 validation-auc:0.96135 validation-aucpr:0.96723
[29] validation-logloss:0.30231 validation-auc:0.96148 validation-aucpr:0.96732
[30] validation-logloss:0.29866 validation-auc:0.96169 validation-aucpr:0.96746
[31] validation-logloss:0.29123 validation-auc:0.96245 validation-aucpr:0.96823
[32] validation-logloss:0.28793 validation-auc:0.96271 validation-aucpr:0.96844
[33] validation-logloss:0.28510 validation-auc:0.96289 validation-aucpr:0.96852
[34] validation-logloss:0.28273 validation-auc:0.96278 validation-aucpr:0.96849
[35] validation-logloss:0.27988 validation-auc:0.96291 validation-aucpr:0.96864
[36] validation-logloss:0.27747 validation-auc:0.96300 validation-aucpr:0.96868
[37] validation-logloss:0.27322 validation-auc:0.96308 validation-aucpr:0.96878
[38] validation-logloss:0.27006 validation-auc:0.96338 validation-aucpr:0.96904
[39] validation-logloss:0.26814 validation-auc:0.96341 validation-aucpr:0.96907
[40] validation-logloss:0.26349 validation-auc:0.96377 validation-aucpr:0.96947
[41] validation-logloss:0.26134 validation-auc:0.96377 validation-aucpr:0.96947
[42] validation-logloss:0.25983 validation-auc:0.96387 validation-aucpr:0.96953
[43] validation-logloss:0.25777 validation-auc:0.96407 validation-aucpr:0.96974
[44] validation-logloss:0.25627 validation-auc:0.96414 validation-aucpr:0.96980
[45] validation-logloss:0.25499 validation-auc:0.96418 validation-aucpr:0.96982
[46] validation-logloss:0.25342 validation-auc:0.96421 validation-aucpr:0.96983
[47] validation-logloss:0.25134 validation-auc:0.96452 validation-aucpr:0.97008
[48] validation-logloss:0.24962 validation-auc:0.96460 validation-aucpr:0.97014
[49] validation-logloss:0.24831 validation-auc:0.96464 validation-aucpr:0.97017
{'best_iteration': '49', 'best_score': '0.9701724688367994'}
Trial 23, Fold 4: Log loss = 0.24831006230194794, Average precision = 0.9701685420356017, ROC-AUC = 0.9646393520604877, Elapsed Time = 0.9430483999985881 seconds
Trial 23, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 23, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.65341 validation-auc:0.94036 validation-aucpr:0.94665
[1] validation-logloss:0.62349 validation-auc:0.94719 validation-aucpr:0.95075
[2] validation-logloss:0.59392 validation-auc:0.95151 validation-aucpr:0.95509
[3] validation-logloss:0.56876 validation-auc:0.95169 validation-aucpr:0.95491
[4] validation-logloss:0.54627 validation-auc:0.95179 validation-aucpr:0.95553
[5] validation-logloss:0.52703 validation-auc:0.95119 validation-aucpr:0.95475
[6] validation-logloss:0.50902 validation-auc:0.95145 validation-aucpr:0.95485
[7] validation-logloss:0.49313 validation-auc:0.95188 validation-aucpr:0.95539
[8] validation-logloss:0.47810 validation-auc:0.95206 validation-aucpr:0.95544
[9] validation-logloss:0.46450 validation-auc:0.95245 validation-aucpr:0.95590
[10] validation-logloss:0.45314 validation-auc:0.95285 validation-aucpr:0.95732
[11] validation-logloss:0.44223 validation-auc:0.95201 validation-aucpr:0.95620
[12] validation-logloss:0.43136 validation-auc:0.95245 validation-aucpr:0.95666
[13] validation-logloss:0.42132 validation-auc:0.95300 validation-aucpr:0.95781
[14] validation-logloss:0.41218 validation-auc:0.95364 validation-aucpr:0.95852
[15] validation-logloss:0.40326 validation-auc:0.95349 validation-aucpr:0.95822
[16] validation-logloss:0.39550 validation-auc:0.95354 validation-aucpr:0.95821
[17] validation-logloss:0.38717 validation-auc:0.95365 validation-aucpr:0.95878
[18] validation-logloss:0.37973 validation-auc:0.95397 validation-aucpr:0.95907
[19] validation-logloss:0.37334 validation-auc:0.95384 validation-aucpr:0.95888
[20] validation-logloss:0.36369 validation-auc:0.95533 validation-aucpr:0.96088
[21] validation-logloss:0.35262 validation-auc:0.95690 validation-aucpr:0.96268
[22] validation-logloss:0.34747 validation-auc:0.95699 validation-aucpr:0.96275
[23] validation-logloss:0.34284 validation-auc:0.95703 validation-aucpr:0.96275
[24] validation-logloss:0.33856 validation-auc:0.95686 validation-aucpr:0.96256
[25] validation-logloss:0.33481 validation-auc:0.95681 validation-aucpr:0.96247
[26] validation-logloss:0.33103 validation-auc:0.95679 validation-aucpr:0.96240
[27] validation-logloss:0.32584 validation-auc:0.95730 validation-aucpr:0.96285
[28] validation-logloss:0.32212 validation-auc:0.95731 validation-aucpr:0.96282
[29] validation-logloss:0.31790 validation-auc:0.95754 validation-aucpr:0.96303
[30] validation-logloss:0.31422 validation-auc:0.95769 validation-aucpr:0.96313
[31] validation-logloss:0.30608 validation-auc:0.95884 validation-aucpr:0.96436
[32] validation-logloss:0.30309 validation-auc:0.95884 validation-aucpr:0.96434
[33] validation-logloss:0.30032 validation-auc:0.95909 validation-aucpr:0.96449
[34] validation-logloss:0.29777 validation-auc:0.95917 validation-aucpr:0.96453
[35] validation-logloss:0.29496 validation-auc:0.95929 validation-aucpr:0.96457
[36] validation-logloss:0.29258 validation-auc:0.95940 validation-aucpr:0.96467
[37] validation-logloss:0.28761 validation-auc:0.95986 validation-aucpr:0.96524
[38] validation-logloss:0.28418 validation-auc:0.96020 validation-aucpr:0.96553
[39] validation-logloss:0.28254 validation-auc:0.96024 validation-aucpr:0.96552
[40] validation-logloss:0.27701 validation-auc:0.96075 validation-aucpr:0.96604
[41] validation-logloss:0.27492 validation-auc:0.96083 validation-aucpr:0.96608
[42] validation-logloss:0.27313 validation-auc:0.96102 validation-aucpr:0.96621
[43] validation-logloss:0.27094 validation-auc:0.96121 validation-aucpr:0.96640
[44] validation-logloss:0.26934 validation-auc:0.96140 validation-aucpr:0.96654
[45] validation-logloss:0.26473 validation-auc:0.96188 validation-aucpr:0.96703
[46] validation-logloss:0.26296 validation-auc:0.96204 validation-aucpr:0.96714
[47] validation-logloss:0.26193 validation-auc:0.96202 validation-aucpr:0.96708
[48] validation-logloss:0.25814 validation-auc:0.96221 validation-aucpr:0.96730
[49] validation-logloss:0.25625 validation-auc:0.96249 validation-aucpr:0.96753
{'best_iteration': '49', 'best_score': '0.9675298542241902'}
Trial 23, Fold 5: Log loss = 0.2562486586560138, Average precision = 0.9675302529511657, ROC-AUC = 0.9624879326681902, Elapsed Time = 0.9584814000008919 seconds
Optimization Progress: 24% | 24/100 [37:40<50:37, 39.97s/it]
Trial 24, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 24, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[18:36:34] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[0] validation-logloss:0.67754 validation-auc:0.94648 validation-aucpr:0.95313
[1]	validation-logloss:0.66446	validation-auc:0.95291	validation-aucpr:0.94587
[2]	validation-logloss:0.65146	validation-auc:0.95806	validation-aucpr:0.95520
[3]	validation-logloss:0.63909	validation-auc:0.96182	validation-aucpr:0.96495
[4]	validation-logloss:0.62699	validation-auc:0.96247	validation-aucpr:0.96495
[5]	validation-logloss:0.61423	validation-auc:0.96446	validation-aucpr:0.96677
[6]	validation-logloss:0.60175	validation-auc:0.96548	validation-aucpr:0.96685
[7]	validation-logloss:0.58963	validation-auc:0.96624	validation-aucpr:0.96807
[8]	validation-logloss:0.57821	validation-auc:0.96716	validation-aucpr:0.97022
[9]	validation-logloss:0.56892	validation-auc:0.96625	validation-aucpr:0.96933
[10]	validation-logloss:0.55941	validation-auc:0.96601	validation-aucpr:0.96871
[11]	validation-logloss:0.55067	validation-auc:0.96578	validation-aucpr:0.96819
[12]	validation-logloss:0.54168	validation-auc:0.96579	validation-aucpr:0.96820
[13]	validation-logloss:0.53233	validation-auc:0.96591	validation-aucpr:0.96825
[14]	validation-logloss:0.52405	validation-auc:0.96611	validation-aucpr:0.96853
[15]	validation-logloss:0.51486	validation-auc:0.96676	validation-aucpr:0.97094
[16]	validation-logloss:0.50584	validation-auc:0.96697	validation-aucpr:0.97092
[17]	validation-logloss:0.49724	validation-auc:0.96733	validation-aucpr:0.97127
[18]	validation-logloss:0.48909	validation-auc:0.96750	validation-aucpr:0.97149
[19]	validation-logloss:0.48088	validation-auc:0.96805	validation-aucpr:0.97265
[20]	validation-logloss:0.47322	validation-auc:0.96806	validation-aucpr:0.97261
[21]	validation-logloss:0.46678	validation-auc:0.96827	validation-aucpr:0.97284
[22]	validation-logloss:0.45953	validation-auc:0.96831	validation-aucpr:0.97290
[23]	validation-logloss:0.45338	validation-auc:0.96823	validation-aucpr:0.97294
[24]	validation-logloss:0.44648	validation-auc:0.96838	validation-aucpr:0.97315
[25]	validation-logloss:0.44021	validation-auc:0.96859	validation-aucpr:0.97336
[26]	validation-logloss:0.43359	validation-auc:0.96883	validation-aucpr:0.97356
[27]	validation-logloss:0.42834	validation-auc:0.96874	validation-aucpr:0.97337
[28]	validation-logloss:0.42314	validation-auc:0.96859	validation-aucpr:0.97323
[29]	validation-logloss:0.41808	validation-auc:0.96843	validation-aucpr:0.97308
[30]	validation-logloss:0.41306	validation-auc:0.96843	validation-aucpr:0.97304
[31]	validation-logloss:0.40727	validation-auc:0.96855	validation-aucpr:0.97313
[32]	validation-logloss:0.40193	validation-auc:0.96857	validation-aucpr:0.97325
[33]	validation-logloss:0.39664	validation-auc:0.96855	validation-aucpr:0.97326
[34]	validation-logloss:0.39140	validation-auc:0.96859	validation-aucpr:0.97331
[35]	validation-logloss:0.38723	validation-auc:0.96852	validation-aucpr:0.97318
[36]	validation-logloss:0.38311	validation-auc:0.96843	validation-aucpr:0.97305
[37]	validation-logloss:0.37909	validation-auc:0.96848	validation-aucpr:0.97308
[38]	validation-logloss:0.37509	validation-auc:0.96848	validation-aucpr:0.97306
[39]	validation-logloss:0.37125	validation-auc:0.96835	validation-aucpr:0.97298
[40]	validation-logloss:0.36685	validation-auc:0.96846	validation-aucpr:0.97302
[41]	validation-logloss:0.36304	validation-auc:0.96848	validation-aucpr:0.97304
[42]	validation-logloss:0.35967	validation-auc:0.96834	validation-aucpr:0.97288
[43]	validation-logloss:0.35625	validation-auc:0.96830	validation-aucpr:0.97281
{'best_iteration': '26', 'best_score': '0.9735624594295109'}
Trial 24, Fold 1: Log loss = 0.3562467255542872, Average precision = 0.9728185954572157, ROC-AUC = 0.9682985528598671, Elapsed Time = 4.798161999999138 seconds
Trial 24, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 24, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0]	validation-logloss:0.67795	validation-auc:0.95003	validation-aucpr:0.95522
[1]	validation-logloss:0.66306	validation-auc:0.95893	validation-aucpr:0.96415
[2]	validation-logloss:0.64936	validation-auc:0.96179	validation-aucpr:0.96638
[3]	validation-logloss:0.63579	validation-auc:0.96512	validation-aucpr:0.96811
[4]	validation-logloss:0.62415	validation-auc:0.96673	validation-aucpr:0.97068
[5]	validation-logloss:0.61271	validation-auc:0.96744	validation-aucpr:0.97100
[6]	validation-logloss:0.60174	validation-auc:0.96816	validation-aucpr:0.97156
[7]	validation-logloss:0.59079	validation-auc:0.96877	validation-aucpr:0.97198
[8]	validation-logloss:0.57919	validation-auc:0.96929	validation-aucpr:0.97251
[9]	validation-logloss:0.56807	validation-auc:0.96969	validation-aucpr:0.97285
[10]	validation-logloss:0.55900	validation-auc:0.96921	validation-aucpr:0.97237
[11]	validation-logloss:0.54858	validation-auc:0.96939	validation-aucpr:0.97284
[12]	validation-logloss:0.53997	validation-auc:0.96886	validation-aucpr:0.97237
[13]	validation-logloss:0.53013	validation-auc:0.96942	validation-aucpr:0.97277
[14]	validation-logloss:0.52183	validation-auc:0.96954	validation-aucpr:0.97283
[15]	validation-logloss:0.51258	validation-auc:0.96997	validation-aucpr:0.97327
[16]	validation-logloss:0.50491	validation-auc:0.96963	validation-aucpr:0.97299
[17]	validation-logloss:0.49735	validation-auc:0.96973	validation-aucpr:0.97301
[18]	validation-logloss:0.48889	validation-auc:0.97011	validation-aucpr:0.97329
[19]	validation-logloss:0.48145	validation-auc:0.97019	validation-aucpr:0.97343
[20]	validation-logloss:0.47475	validation-auc:0.97019	validation-aucpr:0.97342
[21]	validation-logloss:0.46701	validation-auc:0.97026	validation-aucpr:0.97344
[22]	validation-logloss:0.46030	validation-auc:0.97013	validation-aucpr:0.97329
[23]	validation-logloss:0.45406	validation-auc:0.97007	validation-aucpr:0.97314
[24]	validation-logloss:0.44778	validation-auc:0.97021	validation-aucpr:0.97322
[25]	validation-logloss:0.44202	validation-auc:0.97008	validation-aucpr:0.97314
[26]	validation-logloss:0.43536	validation-auc:0.97028	validation-aucpr:0.97352
[27]	validation-logloss:0.42986	validation-auc:0.97028	validation-aucpr:0.97349
[28]	validation-logloss:0.42455	validation-auc:0.97021	validation-aucpr:0.97343
[29]	validation-logloss:0.41942	validation-auc:0.97009	validation-aucpr:0.97327
[30]	validation-logloss:0.41425	validation-auc:0.97005	validation-aucpr:0.97326
[31]	validation-logloss:0.40943	validation-auc:0.97011	validation-aucpr:0.97340
[32]	validation-logloss:0.40441	validation-auc:0.97025	validation-aucpr:0.97342
[33]	validation-logloss:0.39989	validation-auc:0.97015	validation-aucpr:0.97330
[34]	validation-logloss:0.39579	validation-auc:0.96989	validation-aucpr:0.97309
[35]	validation-logloss:0.39141	validation-auc:0.96986	validation-aucpr:0.97305
[36]	validation-logloss:0.38702	validation-auc:0.96988	validation-aucpr:0.97308
[37]	validation-logloss:0.38202	validation-auc:0.97007	validation-aucpr:0.97317
[38]	validation-logloss:0.37813	validation-auc:0.96996	validation-aucpr:0.97311
[39]	validation-logloss:0.37420	validation-auc:0.96994	validation-aucpr:0.97310
[40]	validation-logloss:0.36930	validation-auc:0.97010	validation-aucpr:0.97325
[41]	validation-logloss:0.36546	validation-auc:0.97011	validation-aucpr:0.97321
[42]	validation-logloss:0.36082	validation-auc:0.97021	validation-aucpr:0.97330
[43]	validation-logloss:0.35630	validation-auc:0.97040	validation-aucpr:0.97347
{'best_iteration': '26', 'best_score': '0.9735159251236497'}
Trial 24, Fold 2: Log loss = 0.3563022562597658, Average precision = 0.9734075077131078, ROC-AUC = 0.9704006080837259, Elapsed Time = 4.943635200001154 seconds
Trial 24, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 24, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[18:36:45] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[0] validation-logloss:0.67755 validation-auc:0.95357 validation-aucpr:0.95829
[18:36:45] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[1] validation-logloss:0.66279 validation-auc:0.95870 validation-aucpr:0.95148
[18:36:45] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[2] validation-logloss:0.65024 validation-auc:0.96146 validation-aucpr:0.96566
[3]	validation-logloss:0.63640	validation-auc:0.96460	validation-aucpr:0.96668
[4]	validation-logloss:0.62300	validation-auc:0.96692	validation-aucpr:0.97016
[5]	validation-logloss:0.61010	validation-auc:0.96815	validation-aucpr:0.97122
[6]	validation-logloss:0.59754	validation-auc:0.96919	validation-aucpr:0.97155
[7]	validation-logloss:0.58724	validation-auc:0.96904	validation-aucpr:0.97005
[8]	validation-logloss:0.57719	validation-auc:0.96887	validation-aucpr:0.97008
[9]	validation-logloss:0.56767	validation-auc:0.96880	validation-aucpr:0.97016
[10]	validation-logloss:0.55707	validation-auc:0.96861	validation-aucpr:0.96992
[11]	validation-logloss:0.54784	validation-auc:0.96922	validation-aucpr:0.97178
[12]	validation-logloss:0.53779	validation-auc:0.96940	validation-aucpr:0.97197
[13]	validation-logloss:0.52927	validation-auc:0.96929	validation-aucpr:0.97188
[14]	validation-logloss:0.52018	validation-auc:0.96919	validation-aucpr:0.97178
[15]	validation-logloss:0.51237	validation-auc:0.96890	validation-aucpr:0.97051
[16]	validation-logloss:0.50398	validation-auc:0.96914	validation-aucpr:0.97070
[17]	validation-logloss:0.49556	validation-auc:0.96931	validation-aucpr:0.97286
[18]	validation-logloss:0.48843	validation-auc:0.96912	validation-aucpr:0.97268
[19]	validation-logloss:0.48123	validation-auc:0.96913	validation-aucpr:0.97277
[20]	validation-logloss:0.47312	validation-auc:0.96936	validation-aucpr:0.97300
[21]	validation-logloss:0.46636	validation-auc:0.96941	validation-aucpr:0.97295
[22]	validation-logloss:0.45988	validation-auc:0.96946	validation-aucpr:0.97285
[23]	validation-logloss:0.45264	validation-auc:0.96969	validation-aucpr:0.97298
[24]	validation-logloss:0.44636	validation-auc:0.96988	validation-aucpr:0.97317
[25]	validation-logloss:0.44054	validation-auc:0.96984	validation-aucpr:0.97388
[26]	validation-logloss:0.43450	validation-auc:0.96997	validation-aucpr:0.97396
[27]	validation-logloss:0.42909	validation-auc:0.96992	validation-aucpr:0.97319
[28]	validation-logloss:0.42290	validation-auc:0.96994	validation-aucpr:0.97301
[29]	validation-logloss:0.41751	validation-auc:0.97006	validation-aucpr:0.97400
[30]	validation-logloss:0.41235	validation-auc:0.97015	validation-aucpr:0.97403
[31]	validation-logloss:0.40740	validation-auc:0.97010	validation-aucpr:0.97387
[32]	validation-logloss:0.40163	validation-auc:0.97019	validation-aucpr:0.97399
[33]	validation-logloss:0.39602	validation-auc:0.97028	validation-aucpr:0.97405
[34]	validation-logloss:0.39162	validation-auc:0.97026	validation-aucpr:0.97399
[35]	validation-logloss:0.38718	validation-auc:0.97029	validation-aucpr:0.97348
[36]	validation-logloss:0.38219	validation-auc:0.97034	validation-aucpr:0.97349
[37]	validation-logloss:0.37710	validation-auc:0.97052	validation-aucpr:0.97364
[38]	validation-logloss:0.37232	validation-auc:0.97062	validation-aucpr:0.97377
[39]	validation-logloss:0.36849	validation-auc:0.97060	validation-aucpr:0.97372
[40]	validation-logloss:0.36458	validation-auc:0.97061	validation-aucpr:0.97373
[41]	validation-logloss:0.36018	validation-auc:0.97062	validation-aucpr:0.97376
[42]	validation-logloss:0.35658	validation-auc:0.97057	validation-aucpr:0.97367
[43]	validation-logloss:0.35295	validation-auc:0.97065	validation-aucpr:0.97374
{'best_iteration': '33', 'best_score': '0.9740539892541278'}
Trial 24, Fold 3: Log loss = 0.3529549688708266, Average precision = 0.9737998160380797, ROC-AUC = 0.9706531828163073, Elapsed Time = 5.05997539999953 seconds
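The per-fold summary lines above report log loss, average precision, and ROC-AUC on the held-out fold. A minimal sketch of how those three numbers are computed with scikit-learn (already in this notebook's imports); `y_val` and `p_val` are hypothetical stand-ins for one fold's labels and predicted probabilities, not the actual data:

```python
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

# Hypothetical stand-ins for one validation fold's labels and predicted probabilities
y_val = [0, 0, 1, 1]
p_val = [0.1, 0.4, 0.35, 0.8]

ll = log_loss(y_val, p_val)                 # cross-entropy of the predicted probabilities
ap = average_precision_score(y_val, p_val)  # area under the precision-recall curve (aucpr)
auc = roc_auc_score(y_val, p_val)           # area under the ROC curve
print(f"Log loss = {ll}, Average precision = {ap}, ROC-AUC = {auc}")
```

All three take raw probabilities, so no threshold is applied at this stage; thresholding only matters later for the confusion matrices.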
Trial 24, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 24, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0]	validation-logloss:0.67756	validation-auc:0.94669	validation-aucpr:0.95494
[1]	validation-logloss:0.66441	validation-auc:0.95335	validation-aucpr:0.95834
[2]	validation-logloss:0.65023	validation-auc:0.95958	validation-aucpr:0.96440
[3]	validation-logloss:0.63669	validation-auc:0.96288	validation-aucpr:0.96940
[4]	validation-logloss:0.62349	validation-auc:0.96425	validation-aucpr:0.97044
[5]	validation-logloss:0.61083	validation-auc:0.96541	validation-aucpr:0.97136
[6]	validation-logloss:0.59887	validation-auc:0.96571	validation-aucpr:0.97077
[7]	validation-logloss:0.58734	validation-auc:0.96569	validation-aucpr:0.96967
[8]	validation-logloss:0.57621	validation-auc:0.96647	validation-aucpr:0.97228
[9]	validation-logloss:0.56590	validation-auc:0.96718	validation-aucpr:0.97268
[10]	validation-logloss:0.55677	validation-auc:0.96684	validation-aucpr:0.97236
[11]	validation-logloss:0.54668	validation-auc:0.96687	validation-aucpr:0.97244
[12]	validation-logloss:0.53704	validation-auc:0.96701	validation-aucpr:0.97253
[13]	validation-logloss:0.52735	validation-auc:0.96733	validation-aucpr:0.97280
[14]	validation-logloss:0.51940	validation-auc:0.96715	validation-aucpr:0.97261
[15]	validation-logloss:0.51013	validation-auc:0.96780	validation-aucpr:0.97311
[16]	validation-logloss:0.50230	validation-auc:0.96802	validation-aucpr:0.97324
[17]	validation-logloss:0.49392	validation-auc:0.96819	validation-aucpr:0.97338
[18]	validation-logloss:0.48580	validation-auc:0.96831	validation-aucpr:0.97351
[19]	validation-logloss:0.47778	validation-auc:0.96863	validation-aucpr:0.97379
[20]	validation-logloss:0.47021	validation-auc:0.96845	validation-aucpr:0.97370
[21]	validation-logloss:0.46338	validation-auc:0.96858	validation-aucpr:0.97382
[22]	validation-logloss:0.45681	validation-auc:0.96876	validation-aucpr:0.97393
[23]	validation-logloss:0.45086	validation-auc:0.96878	validation-aucpr:0.97392
[24]	validation-logloss:0.44483	validation-auc:0.96885	validation-aucpr:0.97394
[25]	validation-logloss:0.43821	validation-auc:0.96900	validation-aucpr:0.97407
[26]	validation-logloss:0.43285	validation-auc:0.96875	validation-aucpr:0.97388
[27]	validation-logloss:0.42632	validation-auc:0.96907	validation-aucpr:0.97414
[28]	validation-logloss:0.41998	validation-auc:0.96926	validation-aucpr:0.97433
[29]	validation-logloss:0.41465	validation-auc:0.96925	validation-aucpr:0.97431
[30]	validation-logloss:0.40952	validation-auc:0.96923	validation-aucpr:0.97429
[31]	validation-logloss:0.40380	validation-auc:0.96929	validation-aucpr:0.97438
[32]	validation-logloss:0.39826	validation-auc:0.96935	validation-aucpr:0.97446
[33]	validation-logloss:0.39361	validation-auc:0.96944	validation-aucpr:0.97452
[34]	validation-logloss:0.38847	validation-auc:0.96949	validation-aucpr:0.97457
[35]	validation-logloss:0.38346	validation-auc:0.96936	validation-aucpr:0.97448
[36]	validation-logloss:0.37917	validation-auc:0.96933	validation-aucpr:0.97443
[37]	validation-logloss:0.37502	validation-auc:0.96928	validation-aucpr:0.97437
[38]	validation-logloss:0.37039	validation-auc:0.96931	validation-aucpr:0.97439
[39]	validation-logloss:0.36575	validation-auc:0.96944	validation-aucpr:0.97449
[40]	validation-logloss:0.36123	validation-auc:0.96952	validation-aucpr:0.97457
[41]	validation-logloss:0.35742	validation-auc:0.96958	validation-aucpr:0.97458
[42]	validation-logloss:0.35381	validation-auc:0.96965	validation-aucpr:0.97463
[43]	validation-logloss:0.34956	validation-auc:0.96986	validation-aucpr:0.97478
{'best_iteration': '43', 'best_score': '0.9747828782362944'}
Trial 24, Fold 4: Log loss = 0.3495584092387853, Average precision = 0.9747869080590805, ROC-AUC = 0.9698566641868804, Elapsed Time = 5.095767300001171 seconds
Trial 24, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 24, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
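The fold-size lines above (train/validation counts and the 0/1 ratio) are a straightforward tally of the fold's label array; a sketch using `collections.Counter` (in this notebook's imports) with hypothetical labels that reproduce the Fold 5 train counts:

```python
from collections import Counter

# Hypothetical fold labels matching the Fold 5 train counts printed above
y_train = [0] * 10500 + [1] * 10150
counts = Counter(y_train)
ratio = counts[0] / counts[1]
print(f"Train size = {len(y_train)} where 0 = {counts[0]}, 1 = {counts[1]}, 0/1 = {ratio}")
```

A ratio close to 1 in every fold confirms the stratified splitter is preserving the (already near-balanced) class mix.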
[0]	validation-logloss:0.67775	validation-auc:0.94775	validation-aucpr:0.94986
[1]	validation-logloss:0.66501	validation-auc:0.95325	validation-aucpr:0.95732
[2]	validation-logloss:0.65082	validation-auc:0.95775	validation-aucpr:0.96329
[3]	validation-logloss:0.63701	validation-auc:0.96022	validation-aucpr:0.96463
[4]	validation-logloss:0.62379	validation-auc:0.96338	validation-aucpr:0.96490
[5]	validation-logloss:0.61170	validation-auc:0.96486	validation-aucpr:0.96984
[6]	validation-logloss:0.60074	validation-auc:0.96507	validation-aucpr:0.96969
[7]	validation-logloss:0.58885	validation-auc:0.96609	validation-aucpr:0.97045
[8]	validation-logloss:0.57880	validation-auc:0.96583	validation-aucpr:0.97030
[9]	validation-logloss:0.56903	validation-auc:0.96595	validation-aucpr:0.96921
[10]	validation-logloss:0.55854	validation-auc:0.96601	validation-aucpr:0.96933
[11]	validation-logloss:0.54847	validation-auc:0.96635	validation-aucpr:0.96978
[12]	validation-logloss:0.53900	validation-auc:0.96689	validation-aucpr:0.97000
[13]	validation-logloss:0.52926	validation-auc:0.96753	validation-aucpr:0.97040
[14]	validation-logloss:0.51993	validation-auc:0.96773	validation-aucpr:0.97061
[15]	validation-logloss:0.51115	validation-auc:0.96794	validation-aucpr:0.97066
[16]	validation-logloss:0.50249	validation-auc:0.96813	validation-aucpr:0.97088
[17]	validation-logloss:0.49523	validation-auc:0.96791	validation-aucpr:0.97068
[18]	validation-logloss:0.48731	validation-auc:0.96792	validation-aucpr:0.97040
[19]	validation-logloss:0.47950	validation-auc:0.96803	validation-aucpr:0.97050
[20]	validation-logloss:0.47286	validation-auc:0.96786	validation-aucpr:0.97029
[21]	validation-logloss:0.46656	validation-auc:0.96778	validation-aucpr:0.97237
[22]	validation-logloss:0.46020	validation-auc:0.96780	validation-aucpr:0.97238
[23]	validation-logloss:0.45411	validation-auc:0.96763	validation-aucpr:0.97172
[24]	validation-logloss:0.44855	validation-auc:0.96737	validation-aucpr:0.97138
[25]	validation-logloss:0.44177	validation-auc:0.96758	validation-aucpr:0.97156
[26]	validation-logloss:0.43605	validation-auc:0.96754	validation-aucpr:0.97149
[27]	validation-logloss:0.42990	validation-auc:0.96774	validation-aucpr:0.97165
[28]	validation-logloss:0.42369	validation-auc:0.96795	validation-aucpr:0.97186
[29]	validation-logloss:0.41855	validation-auc:0.96788	validation-aucpr:0.97178
[30]	validation-logloss:0.41256	validation-auc:0.96815	validation-aucpr:0.97200
[31]	validation-logloss:0.40683	validation-auc:0.96837	validation-aucpr:0.97216
[32]	validation-logloss:0.40206	validation-auc:0.96829	validation-aucpr:0.97135
[33]	validation-logloss:0.39745	validation-auc:0.96819	validation-aucpr:0.97127
[34]	validation-logloss:0.39293	validation-auc:0.96818	validation-aucpr:0.97126
[35]	validation-logloss:0.38786	validation-auc:0.96840	validation-aucpr:0.97142
[36]	validation-logloss:0.38283	validation-auc:0.96864	validation-aucpr:0.97164
[37]	validation-logloss:0.37795	validation-auc:0.96886	validation-aucpr:0.97183
[38]	validation-logloss:0.37389	validation-auc:0.96879	validation-aucpr:0.97177
[39]	validation-logloss:0.36923	validation-auc:0.96898	validation-aucpr:0.97186
[40]	validation-logloss:0.36563	validation-auc:0.96889	validation-aucpr:0.97170
[41]	validation-logloss:0.36143	validation-auc:0.96905	validation-aucpr:0.97340
[42]	validation-logloss:0.35713	validation-auc:0.96925	validation-aucpr:0.97368
[43]	validation-logloss:0.35370	validation-auc:0.96913	validation-aucpr:0.97357
{'best_iteration': '42', 'best_score': '0.973677385463933'}
Trial 24, Fold 5: Log loss = 0.3536975849183317, Average precision = 0.9735737443153737, ROC-AUC = 0.969126804525946, Elapsed Time = 5.088518399999884 seconds
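Each fold summary above reports three held-out scores. A minimal sketch of how such per-fold numbers can be computed with the scikit-learn metric functions imported at the top of the notebook (the toy labels and probabilities here are illustrative, not the notebook's data):

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

# Toy validation labels and predicted positive-class probabilities (illustrative only).
y_val = np.array([0, 0, 1, 1, 0, 1, 1, 0])
proba = np.array([0.1, 0.3, 0.8, 0.7, 0.4, 0.9, 0.6, 0.2])

lloss = log_loss(y_val, proba)              # penalizes confident mistakes
ap = average_precision_score(y_val, proba)  # area under the precision-recall curve
auc = roc_auc_score(y_val, proba)           # area under the ROC curve

print(f"Log loss = {lloss}, Average precision = {ap}, ROC-AUC = {auc}")
```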
Optimization Progress: 25%|##5 | 25/100 [38:13<47:30, 38.01s/it]
Trial 25, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 25, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.67136 validation-auc:0.92645 validation-aucpr:0.92797
[1] validation-logloss:0.65045 validation-auc:0.93927 validation-aucpr:0.94577
[2] validation-logloss:0.62847 validation-auc:0.95043 validation-aucpr:0.95863
[3] validation-logloss:0.60879 validation-auc:0.95524 validation-aucpr:0.96105
[4] validation-logloss:0.59011 validation-auc:0.95747 validation-aucpr:0.96265
[5] validation-logloss:0.57531 validation-auc:0.95568 validation-aucpr:0.96092
[6] validation-logloss:0.55744 validation-auc:0.95778 validation-aucpr:0.96344
[7] validation-logloss:0.54321 validation-auc:0.95790 validation-aucpr:0.96430
[8] validation-logloss:0.53138 validation-auc:0.95740 validation-aucpr:0.96381
[9] validation-logloss:0.51916 validation-auc:0.95736 validation-aucpr:0.96369
[10] validation-logloss:0.50745 validation-auc:0.95717 validation-aucpr:0.96353
[11] validation-logloss:0.49565 validation-auc:0.95745 validation-aucpr:0.96393
[12] validation-logloss:0.48441 validation-auc:0.95766 validation-aucpr:0.96421
[13] validation-logloss:0.47120 validation-auc:0.95912 validation-aucpr:0.96579
[14] validation-logloss:0.46180 validation-auc:0.95924 validation-aucpr:0.96585
[15] validation-logloss:0.45253 validation-auc:0.95952 validation-aucpr:0.96603
[16] validation-logloss:0.44349 validation-auc:0.95957 validation-aucpr:0.96604
[17] validation-logloss:0.43490 validation-auc:0.95954 validation-aucpr:0.96600
[18] validation-logloss:0.42815 validation-auc:0.95919 validation-aucpr:0.96572
[19] validation-logloss:0.41788 validation-auc:0.96025 validation-aucpr:0.96671
[20] validation-logloss:0.41213 validation-auc:0.95990 validation-aucpr:0.96630
[21] validation-logloss:0.40480 validation-auc:0.96029 validation-aucpr:0.96658
[22] validation-logloss:0.39603 validation-auc:0.96066 validation-aucpr:0.96701
[23] validation-logloss:0.38666 validation-auc:0.96145 validation-aucpr:0.96774
[24] validation-logloss:0.38092 validation-auc:0.96136 validation-aucpr:0.96762
[25] validation-logloss:0.37635 validation-auc:0.96106 validation-aucpr:0.96740
[26] validation-logloss:0.37070 validation-auc:0.96120 validation-aucpr:0.96743
[27] validation-logloss:0.36546 validation-auc:0.96135 validation-aucpr:0.96751
[28] validation-logloss:0.35802 validation-auc:0.96189 validation-aucpr:0.96804
[29] validation-logloss:0.35351 validation-auc:0.96196 validation-aucpr:0.96805
[30] validation-logloss:0.34984 validation-auc:0.96183 validation-aucpr:0.96795
[31] validation-logloss:0.34655 validation-auc:0.96146 validation-aucpr:0.96765
[32] validation-logloss:0.34241 validation-auc:0.96152 validation-aucpr:0.96768
[33] validation-logloss:0.33647 validation-auc:0.96179 validation-aucpr:0.96796
[34] validation-logloss:0.33313 validation-auc:0.96163 validation-aucpr:0.96780
[35] validation-logloss:0.32905 validation-auc:0.96190 validation-aucpr:0.96800
[36] validation-logloss:0.32565 validation-auc:0.96200 validation-aucpr:0.96806
[37] validation-logloss:0.32215 validation-auc:0.96225 validation-aucpr:0.96826
[38] validation-logloss:0.31689 validation-auc:0.96252 validation-aucpr:0.96858
[39] validation-logloss:0.31322 validation-auc:0.96275 validation-aucpr:0.96874
[40] validation-logloss:0.31011 validation-auc:0.96282 validation-aucpr:0.96879
[41] validation-logloss:0.30612 validation-auc:0.96292 validation-aucpr:0.96890
[42] validation-logloss:0.30268 validation-auc:0.96329 validation-aucpr:0.96918
[43] validation-logloss:0.29816 validation-auc:0.96382 validation-aucpr:0.96962
[44] validation-logloss:0.29539 validation-auc:0.96402 validation-aucpr:0.96977
[45] validation-logloss:0.29267 validation-auc:0.96414 validation-aucpr:0.96985
[46] validation-logloss:0.28872 validation-auc:0.96439 validation-aucpr:0.97012
[47] validation-logloss:0.28674 validation-auc:0.96434 validation-aucpr:0.97007
[48] validation-logloss:0.28442 validation-auc:0.96440 validation-aucpr:0.97011
[49] validation-logloss:0.28251 validation-auc:0.96447 validation-aucpr:0.97017
[50] validation-logloss:0.27902 validation-auc:0.96474 validation-aucpr:0.97045
[51] validation-logloss:0.27705 validation-auc:0.96482 validation-aucpr:0.97051
[52] validation-logloss:0.27542 validation-auc:0.96481 validation-aucpr:0.97048
[53] validation-logloss:0.27344 validation-auc:0.96483 validation-aucpr:0.97048
[54] validation-logloss:0.27018 validation-auc:0.96504 validation-aucpr:0.97068
[55] validation-logloss:0.26907 validation-auc:0.96497 validation-aucpr:0.97063
{'best_iteration': '54', 'best_score': '0.9706754928377431'}
Trial 25, Fold 1: Log loss = 0.2690666545589348, Average precision = 0.9706323993273999, ROC-AUC = 0.9649702410824813, Elapsed Time = 1.1947266000006493 seconds
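The `{'best_iteration': ..., 'best_score': ...}` dicts printed after each fold carry string values (XGBoost stores booster attributes as strings), so they need casting before reuse. A small sketch, assuming the dict was captured from the trained booster's `attributes()`:

```python
# Attributes dict as printed in the log above; XGBoost returns attribute values as strings.
attrs = {"best_iteration": "54", "best_score": "0.9706754928377431"}

best_iteration = int(attrs["best_iteration"])  # boosting round selected by early stopping
best_score = float(attrs["best_score"])        # value of the monitored metric (aucpr here)

print(best_iteration, best_score)
```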
Trial 25, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 25, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.67061 validation-auc:0.93136 validation-aucpr:0.92698
[1] validation-logloss:0.65091 validation-auc:0.93686 validation-aucpr:0.93478
[2] validation-logloss:0.63111 validation-auc:0.94713 validation-aucpr:0.94825
[3] validation-logloss:0.61210 validation-auc:0.95005 validation-aucpr:0.95175
[4] validation-logloss:0.59208 validation-auc:0.95885 validation-aucpr:0.96273
[5] validation-logloss:0.57508 validation-auc:0.95946 validation-aucpr:0.96335
[6] validation-logloss:0.56160 validation-auc:0.95778 validation-aucpr:0.96170
[7] validation-logloss:0.54381 validation-auc:0.96128 validation-aucpr:0.96557
[8] validation-logloss:0.52942 validation-auc:0.96143 validation-aucpr:0.96536
[9] validation-logloss:0.51655 validation-auc:0.96130 validation-aucpr:0.96531
[10] validation-logloss:0.50556 validation-auc:0.96053 validation-aucpr:0.96458
[11] validation-logloss:0.49411 validation-auc:0.96059 validation-aucpr:0.96455
[12] validation-logloss:0.48347 validation-auc:0.96034 validation-aucpr:0.96424
[13] validation-logloss:0.47336 validation-auc:0.96002 validation-aucpr:0.96377
[14] validation-logloss:0.46312 validation-auc:0.96023 validation-aucpr:0.96376
[15] validation-logloss:0.45392 validation-auc:0.95998 validation-aucpr:0.96359
[16] validation-logloss:0.44160 validation-auc:0.96168 validation-aucpr:0.96537
[17] validation-logloss:0.43318 validation-auc:0.96156 validation-aucpr:0.96514
[18] validation-logloss:0.42520 validation-auc:0.96141 validation-aucpr:0.96508
[19] validation-logloss:0.41774 validation-auc:0.96124 validation-aucpr:0.96498
[20] validation-logloss:0.41024 validation-auc:0.96158 validation-aucpr:0.96511
[21] validation-logloss:0.40411 validation-auc:0.96119 validation-aucpr:0.96459
[22] validation-logloss:0.39732 validation-auc:0.96121 validation-aucpr:0.96456
[23] validation-logloss:0.39068 validation-auc:0.96113 validation-aucpr:0.96447
[24] validation-logloss:0.38439 validation-auc:0.96126 validation-aucpr:0.96462
[25] validation-logloss:0.37872 validation-auc:0.96112 validation-aucpr:0.96450
[26] validation-logloss:0.37376 validation-auc:0.96101 validation-aucpr:0.96437
[27] validation-logloss:0.36871 validation-auc:0.96090 validation-aucpr:0.96400
[28] validation-logloss:0.36333 validation-auc:0.96102 validation-aucpr:0.96405
[29] validation-logloss:0.35622 validation-auc:0.96191 validation-aucpr:0.96506
[30] validation-logloss:0.35142 validation-auc:0.96203 validation-aucpr:0.96515
[31] validation-logloss:0.34752 validation-auc:0.96186 validation-aucpr:0.96499
[32] validation-logloss:0.34069 validation-auc:0.96271 validation-aucpr:0.96592
[33] validation-logloss:0.33605 validation-auc:0.96297 validation-aucpr:0.96608
[34] validation-logloss:0.32954 validation-auc:0.96360 validation-aucpr:0.96679
[35] validation-logloss:0.32570 validation-auc:0.96368 validation-aucpr:0.96686
[36] validation-logloss:0.32188 validation-auc:0.96382 validation-aucpr:0.96696
[37] validation-logloss:0.31872 validation-auc:0.96378 validation-aucpr:0.96688
[38] validation-logloss:0.31593 validation-auc:0.96377 validation-aucpr:0.96681
[39] validation-logloss:0.31042 validation-auc:0.96424 validation-aucpr:0.96734
[40] validation-logloss:0.30741 validation-auc:0.96427 validation-aucpr:0.96769
[41] validation-logloss:0.30497 validation-auc:0.96420 validation-aucpr:0.96758
[42] validation-logloss:0.30223 validation-auc:0.96427 validation-aucpr:0.96762
[43] validation-logloss:0.29758 validation-auc:0.96469 validation-aucpr:0.96810
[44] validation-logloss:0.29477 validation-auc:0.96480 validation-aucpr:0.96814
[45] validation-logloss:0.29080 validation-auc:0.96493 validation-aucpr:0.96825
[46] validation-logloss:0.28693 validation-auc:0.96519 validation-aucpr:0.96852
[47] validation-logloss:0.28472 validation-auc:0.96517 validation-aucpr:0.96851
[48] validation-logloss:0.28120 validation-auc:0.96550 validation-aucpr:0.96883
[49] validation-logloss:0.27714 validation-auc:0.96588 validation-aucpr:0.96919
[50] validation-logloss:0.27505 validation-auc:0.96593 validation-aucpr:0.96919
[51] validation-logloss:0.27112 validation-auc:0.96628 validation-aucpr:0.96950
[52] validation-logloss:0.26929 validation-auc:0.96635 validation-aucpr:0.96956
[53] validation-logloss:0.26728 validation-auc:0.96646 validation-aucpr:0.96964
[54] validation-logloss:0.26555 validation-auc:0.96646 validation-aucpr:0.96961
[55] validation-logloss:0.26345 validation-auc:0.96671 validation-aucpr:0.96979
{'best_iteration': '55', 'best_score': '0.9697896161344084'}
Trial 25, Fold 2: Log loss = 0.2634486405045629, Average precision = 0.9697925827319714, ROC-AUC = 0.9667127575273731, Elapsed Time = 1.5920014000002993 seconds
Trial 25, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 25, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.67096 validation-auc:0.92747 validation-aucpr:0.92769
[1] validation-logloss:0.64721 validation-auc:0.95369 validation-aucpr:0.95988
[2] validation-logloss:0.62642 validation-auc:0.95662 validation-aucpr:0.96262
[3] validation-logloss:0.60830 validation-auc:0.96139 validation-aucpr:0.96630
[4] validation-logloss:0.58983 validation-auc:0.96198 validation-aucpr:0.96652
[5] validation-logloss:0.57351 validation-auc:0.96141 validation-aucpr:0.96608
[6] validation-logloss:0.55846 validation-auc:0.96103 validation-aucpr:0.96570
[7] validation-logloss:0.54134 validation-auc:0.96358 validation-aucpr:0.96832
[8] validation-logloss:0.52711 validation-auc:0.96378 validation-aucpr:0.96846
[9] validation-logloss:0.51435 validation-auc:0.96332 validation-aucpr:0.96793
[10] validation-logloss:0.50233 validation-auc:0.96290 validation-aucpr:0.96740
[11] validation-logloss:0.49082 validation-auc:0.96263 validation-aucpr:0.96711
[12] validation-logloss:0.47988 validation-auc:0.96267 validation-aucpr:0.96708
[13] validation-logloss:0.46958 validation-auc:0.96244 validation-aucpr:0.96687
[14] validation-logloss:0.45934 validation-auc:0.96216 validation-aucpr:0.96717
[15] validation-logloss:0.44964 validation-auc:0.96201 validation-aucpr:0.96697
[16] validation-logloss:0.43784 validation-auc:0.96338 validation-aucpr:0.96841
[17] validation-logloss:0.42886 validation-auc:0.96393 validation-aucpr:0.96876
[18] validation-logloss:0.42107 validation-auc:0.96422 validation-aucpr:0.96890
[19] validation-logloss:0.41281 validation-auc:0.96424 validation-aucpr:0.96888
[20] validation-logloss:0.40561 validation-auc:0.96420 validation-aucpr:0.96878
[21] validation-logloss:0.39898 validation-auc:0.96382 validation-aucpr:0.96849
[22] validation-logloss:0.39189 validation-auc:0.96414 validation-aucpr:0.96870
[23] validation-logloss:0.38533 validation-auc:0.96435 validation-aucpr:0.96885
[24] validation-logloss:0.37768 validation-auc:0.96487 validation-aucpr:0.96951
[25] validation-logloss:0.37250 validation-auc:0.96474 validation-aucpr:0.96954
[26] validation-logloss:0.36790 validation-auc:0.96441 validation-aucpr:0.96929
[27] validation-logloss:0.36108 validation-auc:0.96471 validation-aucpr:0.96965
[28] validation-logloss:0.35680 validation-auc:0.96436 validation-aucpr:0.96934
[29] validation-logloss:0.34959 validation-auc:0.96493 validation-aucpr:0.96989
[30] validation-logloss:0.34517 validation-auc:0.96481 validation-aucpr:0.96977
[31] validation-logloss:0.34129 validation-auc:0.96485 validation-aucpr:0.96984
[32] validation-logloss:0.33800 validation-auc:0.96458 validation-aucpr:0.96962
[33] validation-logloss:0.33220 validation-auc:0.96485 validation-aucpr:0.96994
[34] validation-logloss:0.32840 validation-auc:0.96484 validation-aucpr:0.96992
[35] validation-logloss:0.32387 validation-auc:0.96506 validation-aucpr:0.97009
[36] validation-logloss:0.32003 validation-auc:0.96517 validation-aucpr:0.97012
[37] validation-logloss:0.31686 validation-auc:0.96522 validation-aucpr:0.97014
[38] validation-logloss:0.31350 validation-auc:0.96523 validation-aucpr:0.97016
[39] validation-logloss:0.31083 validation-auc:0.96506 validation-aucpr:0.96999
[40] validation-logloss:0.30749 validation-auc:0.96522 validation-aucpr:0.97009
[41] validation-logloss:0.30229 validation-auc:0.96560 validation-aucpr:0.97045
[42] validation-logloss:0.29765 validation-auc:0.96576 validation-aucpr:0.97062
[43] validation-logloss:0.29495 validation-auc:0.96580 validation-aucpr:0.97063
[44] validation-logloss:0.29224 validation-auc:0.96583 validation-aucpr:0.97072
[45] validation-logloss:0.28999 validation-auc:0.96581 validation-aucpr:0.97069
[46] validation-logloss:0.28784 validation-auc:0.96577 validation-aucpr:0.97067
[47] validation-logloss:0.28551 validation-auc:0.96595 validation-aucpr:0.97078
[48] validation-logloss:0.28290 validation-auc:0.96608 validation-aucpr:0.97087
[49] validation-logloss:0.28053 validation-auc:0.96615 validation-aucpr:0.97090
[50] validation-logloss:0.27852 validation-auc:0.96617 validation-aucpr:0.97093
[51] validation-logloss:0.27661 validation-auc:0.96618 validation-aucpr:0.97092
[52] validation-logloss:0.27287 validation-auc:0.96649 validation-aucpr:0.97123
[53] validation-logloss:0.27084 validation-auc:0.96663 validation-aucpr:0.97132
[54] validation-logloss:0.26952 validation-auc:0.96652 validation-aucpr:0.97122
[55] validation-logloss:0.26764 validation-auc:0.96661 validation-aucpr:0.97131
{'best_iteration': '53', 'best_score': '0.9713172950178219'}
Trial 25, Fold 3: Log loss = 0.267635464730996, Average precision = 0.9713168943918258, ROC-AUC = 0.966612060704414, Elapsed Time = 1.624384500000815 seconds
Trial 25, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 25, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.67115 validation-auc:0.91979 validation-aucpr:0.91803
[1] validation-logloss:0.64973 validation-auc:0.94236 validation-aucpr:0.94599
[2] validation-logloss:0.62750 validation-auc:0.95848 validation-aucpr:0.96492
[3] validation-logloss:0.60959 validation-auc:0.95865 validation-aucpr:0.96487
[4] validation-logloss:0.59150 validation-auc:0.95936 validation-aucpr:0.96547
[5] validation-logloss:0.57424 validation-auc:0.96145 validation-aucpr:0.96776
[6] validation-logloss:0.55902 validation-auc:0.96092 validation-aucpr:0.96714
[7] validation-logloss:0.54229 validation-auc:0.96169 validation-aucpr:0.96796
[8] validation-logloss:0.52964 validation-auc:0.96073 validation-aucpr:0.96701
[9] validation-logloss:0.51694 validation-auc:0.96048 validation-aucpr:0.96675
[10] validation-logloss:0.50214 validation-auc:0.96103 validation-aucpr:0.96738
[11] validation-logloss:0.48849 validation-auc:0.96105 validation-aucpr:0.96757
[12] validation-logloss:0.47751 validation-auc:0.96054 validation-aucpr:0.96709
[13] validation-logloss:0.46514 validation-auc:0.96040 validation-aucpr:0.96720
[14] validation-logloss:0.45366 validation-auc:0.96050 validation-aucpr:0.96739
[15] validation-logloss:0.44469 validation-auc:0.96029 validation-aucpr:0.96710
[16] validation-logloss:0.43588 validation-auc:0.96056 validation-aucpr:0.96721
[17] validation-logloss:0.42686 validation-auc:0.96145 validation-aucpr:0.96782
[18] validation-logloss:0.41644 validation-auc:0.96199 validation-aucpr:0.96829
[19] validation-logloss:0.40871 validation-auc:0.96197 validation-aucpr:0.96831
[20] validation-logloss:0.39951 validation-auc:0.96221 validation-aucpr:0.96857
[21] validation-logloss:0.39327 validation-auc:0.96203 validation-aucpr:0.96836
[22] validation-logloss:0.38503 validation-auc:0.96215 validation-aucpr:0.96855
[23] validation-logloss:0.37917 validation-auc:0.96189 validation-aucpr:0.96831
[24] validation-logloss:0.37403 validation-auc:0.96182 validation-aucpr:0.96832
[25] validation-logloss:0.36834 validation-auc:0.96178 validation-aucpr:0.96824
[26] validation-logloss:0.36365 validation-auc:0.96169 validation-aucpr:0.96808
[27] validation-logloss:0.35854 validation-auc:0.96175 validation-aucpr:0.96809
[28] validation-logloss:0.35411 validation-auc:0.96161 validation-aucpr:0.96802
[29] validation-logloss:0.34907 validation-auc:0.96160 validation-aucpr:0.96804
[30] validation-logloss:0.34378 validation-auc:0.96202 validation-aucpr:0.96837
[31] validation-logloss:0.34033 validation-auc:0.96183 validation-aucpr:0.96821
[32] validation-logloss:0.33436 validation-auc:0.96221 validation-aucpr:0.96856
[33] validation-logloss:0.32862 validation-auc:0.96257 validation-aucpr:0.96887
[34] validation-logloss:0.32490 validation-auc:0.96244 validation-aucpr:0.96875
[35] validation-logloss:0.31972 validation-auc:0.96266 validation-aucpr:0.96895
[36] validation-logloss:0.31625 validation-auc:0.96255 validation-aucpr:0.96888
[37] validation-logloss:0.31301 validation-auc:0.96248 validation-aucpr:0.96885
[38] validation-logloss:0.30935 validation-auc:0.96266 validation-aucpr:0.96901
[39] validation-logloss:0.30444 validation-auc:0.96302 validation-aucpr:0.96932
[40] validation-logloss:0.30201 validation-auc:0.96302 validation-aucpr:0.96930
[41] validation-logloss:0.29797 validation-auc:0.96305 validation-aucpr:0.96935
[42] validation-logloss:0.29481 validation-auc:0.96324 validation-aucpr:0.96947
[43] validation-logloss:0.29174 validation-auc:0.96337 validation-aucpr:0.96953
[44] validation-logloss:0.28847 validation-auc:0.96369 validation-aucpr:0.96977
[45] validation-logloss:0.28598 validation-auc:0.96377 validation-aucpr:0.96979
[46] validation-logloss:0.28323 validation-auc:0.96394 validation-aucpr:0.96997
[47] validation-logloss:0.28073 validation-auc:0.96399 validation-aucpr:0.96999
[48] validation-logloss:0.27912 validation-auc:0.96382 validation-aucpr:0.96987
[49] validation-logloss:0.27658 validation-auc:0.96396 validation-aucpr:0.96997
[50] validation-logloss:0.27350 validation-auc:0.96415 validation-aucpr:0.97018
[51] validation-logloss:0.27128 validation-auc:0.96431 validation-aucpr:0.97030
[52] validation-logloss:0.26785 validation-auc:0.96457 validation-aucpr:0.97053
[53] validation-logloss:0.26509 validation-auc:0.96475 validation-aucpr:0.97072
[54] validation-logloss:0.26379 validation-auc:0.96470 validation-aucpr:0.97068
[55] validation-logloss:0.26189 validation-auc:0.96481 validation-aucpr:0.97079
{'best_iteration': '55', 'best_score': '0.9707935271000794'}
Trial 25, Fold 4: Log loss = 0.2618905262310492, Average precision = 0.9707982250800647, ROC-AUC = 0.9648084216161052, Elapsed Time = 1.4710076999999728 seconds
Trial 25, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 25, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.67141 validation-auc:0.92058 validation-aucpr:0.91798
[1] validation-logloss:0.65048 validation-auc:0.93603 validation-aucpr:0.94189
[2] validation-logloss:0.63149 validation-auc:0.93869 validation-aucpr:0.94458
[3] validation-logloss:0.61234 validation-auc:0.94543 validation-aucpr:0.95018
[4] validation-logloss:0.59490 validation-auc:0.94715 validation-aucpr:0.95168
[5] validation-logloss:0.57953 validation-auc:0.94706 validation-aucpr:0.95158
[6] validation-logloss:0.56517 validation-auc:0.94737 validation-aucpr:0.95219
[7] validation-logloss:0.55121 validation-auc:0.94698 validation-aucpr:0.95191
[8] validation-logloss:0.53763 validation-auc:0.94742 validation-aucpr:0.95207
[9] validation-logloss:0.52568 validation-auc:0.94794 validation-aucpr:0.95275
[10] validation-logloss:0.51307 validation-auc:0.94890 validation-aucpr:0.95359
[11] validation-logloss:0.49836 validation-auc:0.95373 validation-aucpr:0.95902
[12] validation-logloss:0.48738 validation-auc:0.95364 validation-aucpr:0.95888
[13] validation-logloss:0.47660 validation-auc:0.95439 validation-aucpr:0.95938
[14] validation-logloss:0.46781 validation-auc:0.95397 validation-aucpr:0.95900
[15] validation-logloss:0.45839 validation-auc:0.95450 validation-aucpr:0.95940
[16] validation-logloss:0.44985 validation-auc:0.95474 validation-aucpr:0.95986
[17] validation-logloss:0.44229 validation-auc:0.95460 validation-aucpr:0.95956
[18] validation-logloss:0.43107 validation-auc:0.95681 validation-aucpr:0.96190
[19] validation-logloss:0.42367 validation-auc:0.95680 validation-aucpr:0.96189
[20] validation-logloss:0.41378 validation-auc:0.95810 validation-aucpr:0.96328
[21] validation-logloss:0.40640 validation-auc:0.95824 validation-aucpr:0.96338
[22] validation-logloss:0.39977 validation-auc:0.95841 validation-aucpr:0.96354
[23] validation-logloss:0.39354 validation-auc:0.95841 validation-aucpr:0.96362
[24] validation-logloss:0.38856 validation-auc:0.95824 validation-aucpr:0.96368
[25] validation-logloss:0.38284 validation-auc:0.95849 validation-aucpr:0.96379
[26] validation-logloss:0.37762 validation-auc:0.95856 validation-aucpr:0.96386
[27] validation-logloss:0.37176 validation-auc:0.95869 validation-aucpr:0.96393
[28] validation-logloss:0.36403 validation-auc:0.95952 validation-aucpr:0.96484
[29] validation-logloss:0.35744 validation-auc:0.96003 validation-aucpr:0.96544
[30] validation-logloss:0.35060 validation-auc:0.96051 validation-aucpr:0.96594
[31] validation-logloss:0.34648 validation-auc:0.96054 validation-aucpr:0.96592
[32] validation-logloss:0.34298 validation-auc:0.96039 validation-aucpr:0.96584
[33] validation-logloss:0.33881 validation-auc:0.96044 validation-aucpr:0.96587
[34] validation-logloss:0.33522 validation-auc:0.96040 validation-aucpr:0.96587
[35] validation-logloss:0.33207 validation-auc:0.96034 validation-aucpr:0.96579
[36] validation-logloss:0.32875 validation-auc:0.96039 validation-aucpr:0.96580
[37] validation-logloss:0.32504 validation-auc:0.96064 validation-aucpr:0.96604
[38] validation-logloss:0.32146 validation-auc:0.96109 validation-aucpr:0.96638
[39] validation-logloss:0.31822 validation-auc:0.96111 validation-aucpr:0.96641
[40] validation-logloss:0.31327 validation-auc:0.96150 validation-aucpr:0.96678
[41] validation-logloss:0.30825 validation-auc:0.96191 validation-aucpr:0.96721
[42] validation-logloss:0.30370 validation-auc:0.96225 validation-aucpr:0.96760
[43] validation-logloss:0.30148 validation-auc:0.96214 validation-aucpr:0.96748
[44] validation-logloss:0.29898 validation-auc:0.96219 validation-aucpr:0.96748
[45] validation-logloss:0.29511 validation-auc:0.96242 validation-aucpr:0.96770
[46] validation-logloss:0.29080 validation-auc:0.96270 validation-aucpr:0.96800
[47] validation-logloss:0.28917 validation-auc:0.96262 validation-aucpr:0.96791
[48] validation-logloss:0.28535 validation-auc:0.96288 validation-aucpr:0.96819
[49] validation-logloss:0.28317 validation-auc:0.96302 validation-aucpr:0.96834
[50] validation-logloss:0.28201 validation-auc:0.96284 validation-aucpr:0.96822
[51] validation-logloss:0.28024 validation-auc:0.96289 validation-aucpr:0.96825
[52] validation-logloss:0.27846 validation-auc:0.96286 validation-aucpr:0.96822
[53] validation-logloss:0.27703 validation-auc:0.96293 validation-aucpr:0.96831
[54] validation-logloss:0.27502 validation-auc:0.96304 validation-aucpr:0.96838
[55] validation-logloss:0.27291 validation-auc:0.96330 validation-aucpr:0.96859
{'best_iteration': '55', 'best_score': '0.968587545474199'}
Trial 25, Fold 5: Log loss = 0.2729147602555443, Average precision = 0.9685929768466469, ROC-AUC = 0.9633032718354607, Elapsed Time = 1.555640200000198 seconds
Optimization Progress: 26%|##6 | 26/100 [38:28<38:27, 31.18s/it]
Trial 26, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 26, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[18:37:23] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[0] validation-logloss:0.68542 validation-auc:0.95577 validation-aucpr:0.96213
[18:37:24] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[1] validation-logloss:0.67867 validation-auc:0.96068 validation-aucpr:0.96692
[18:37:26] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[2] validation-logloss:0.67120 validation-auc:0.96338 validation-aucpr:0.96919
[18:37:27] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[3] validation-logloss:0.66384 validation-auc:0.96580 validation-aucpr:0.97120
[18:37:28] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[4] validation-logloss:0.65661 validation-auc:0.96626 validation-aucpr:0.97158
[18:37:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[5] validation-logloss:0.64959 validation-auc:0.96711 validation-aucpr:0.97239
[18:37:31] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[6] validation-logloss:0.64273 validation-auc:0.96780 validation-aucpr:0.97295
[18:37:32] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[7] validation-logloss:0.63604 validation-auc:0.96813 validation-aucpr:0.97354
[18:37:33] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[8] validation-logloss:0.62933 validation-auc:0.96837 validation-aucpr:0.97372
[18:37:35] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[9] validation-logloss:0.62285 validation-auc:0.96863 validation-aucpr:0.97386
[18:37:36] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[10] validation-logloss:0.61637 validation-auc:0.96883 validation-aucpr:0.97397
[18:37:37] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[11] validation-logloss:0.61024 validation-auc:0.96871 validation-aucpr:0.97390
[18:37:38] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[12] validation-logloss:0.60476 validation-auc:0.96863 validation-aucpr:0.97377
[18:37:40] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[13] validation-logloss:0.59939 validation-auc:0.96873 validation-aucpr:0.97378
[18:37:41] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[14] validation-logloss:0.59348 validation-auc:0.96873 validation-aucpr:0.97380
[18:37:42] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[15] validation-logloss:0.58817 validation-auc:0.96866 validation-aucpr:0.97373
[18:37:43] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[16] validation-logloss:0.58251 validation-auc:0.96856 validation-aucpr:0.97369
[18:37:45] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[17] validation-logloss:0.57693 validation-auc:0.96862 validation-aucpr:0.97373
[18:37:46] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[18]	validation-logloss:0.57188	validation-auc:0.96873	validation-aucpr:0.97378
[19]	validation-logloss:0.56651	validation-auc:0.96867	validation-aucpr:0.97374
[20]	validation-logloss:0.56116	validation-auc:0.96893	validation-aucpr:0.97395
[21]	validation-logloss:0.55587	validation-auc:0.96895	validation-aucpr:0.97398
[22]	validation-logloss:0.55064	validation-auc:0.96917	validation-aucpr:0.97416
[23]	validation-logloss:0.54581	validation-auc:0.96929	validation-aucpr:0.97421
[24]	validation-logloss:0.54080	validation-auc:0.96946	validation-aucpr:0.97436
[25]	validation-logloss:0.53591	validation-auc:0.96953	validation-aucpr:0.97440
[26]	validation-logloss:0.53108	validation-auc:0.96968	validation-aucpr:0.97451
[27]	validation-logloss:0.52627	validation-auc:0.96982	validation-aucpr:0.97475
[28]	validation-logloss:0.52151	validation-auc:0.97000	validation-aucpr:0.97491
[29]	validation-logloss:0.51692	validation-auc:0.97016	validation-aucpr:0.97500
[30]	validation-logloss:0.51234	validation-auc:0.97020	validation-aucpr:0.97510
[31]	validation-logloss:0.50786	validation-auc:0.97023	validation-aucpr:0.97514
[32]	validation-logloss:0.50350	validation-auc:0.97024	validation-aucpr:0.97514
[33]	validation-logloss:0.49918	validation-auc:0.97031	validation-aucpr:0.97518
[34]	validation-logloss:0.49495	validation-auc:0.97030	validation-aucpr:0.97520
[35]	validation-logloss:0.49079	validation-auc:0.97036	validation-aucpr:0.97530
[36]	validation-logloss:0.48670	validation-auc:0.97038	validation-aucpr:0.97533
[37]	validation-logloss:0.48266	validation-auc:0.97049	validation-aucpr:0.97546
[38]	validation-logloss:0.47908	validation-auc:0.97044	validation-aucpr:0.97542
[39]	validation-logloss:0.47512	validation-auc:0.97053	validation-aucpr:0.97548
[40]	validation-logloss:0.47126	validation-auc:0.97055	validation-aucpr:0.97550
[41]	validation-logloss:0.46742	validation-auc:0.97074	validation-aucpr:0.97563
[42]	validation-logloss:0.46389	validation-auc:0.97079	validation-aucpr:0.97565
[43]	validation-logloss:0.46026	validation-auc:0.97071	validation-aucpr:0.97560
[44]	validation-logloss:0.45700	validation-auc:0.97078	validation-aucpr:0.97562
[45]	validation-logloss:0.45345	validation-auc:0.97081	validation-aucpr:0.97564
[46]	validation-logloss:0.44989	validation-auc:0.97086	validation-aucpr:0.97569
[47]	validation-logloss:0.44645	validation-auc:0.97091	validation-aucpr:0.97572
[48]	validation-logloss:0.44304	validation-auc:0.97100	validation-aucpr:0.97579
[49]	validation-logloss:0.43971	validation-auc:0.97098	validation-aucpr:0.97577
[50]	validation-logloss:0.43650	validation-auc:0.97090	validation-aucpr:0.97571
[51]	validation-logloss:0.43330	validation-auc:0.97078	validation-aucpr:0.97562
[52]	validation-logloss:0.43045	validation-auc:0.97074	validation-aucpr:0.97558
[53]	validation-logloss:0.42735	validation-auc:0.97078	validation-aucpr:0.97562
[54]	validation-logloss:0.42425	validation-auc:0.97079	validation-aucpr:0.97564
[55]	validation-logloss:0.42154	validation-auc:0.97069	validation-aucpr:0.97556
[56]	validation-logloss:0.41848	validation-auc:0.97075	validation-aucpr:0.97560
[57]	validation-logloss:0.41589	validation-auc:0.97070	validation-aucpr:0.97554
[58]	validation-logloss:0.41295	validation-auc:0.97078	validation-aucpr:0.97560
[59]	validation-logloss:0.41013	validation-auc:0.97081	validation-aucpr:0.97562
[60]	validation-logloss:0.40731	validation-auc:0.97080	validation-aucpr:0.97561
[61]	validation-logloss:0.40452	validation-auc:0.97075	validation-aucpr:0.97558
[62]	validation-logloss:0.40178	validation-auc:0.97081	validation-aucpr:0.97561
[63]	validation-logloss:0.39898	validation-auc:0.97087	validation-aucpr:0.97565
[64]	validation-logloss:0.39632	validation-auc:0.97086	validation-aucpr:0.97564
[65]	validation-logloss:0.39370	validation-auc:0.97080	validation-aucpr:0.97561
[66]	validation-logloss:0.39130	validation-auc:0.97077	validation-aucpr:0.97558
[67]	validation-logloss:0.38866	validation-auc:0.97083	validation-aucpr:0.97563
[68]	validation-logloss:0.38611	validation-auc:0.97099	validation-aucpr:0.97579
[69]	validation-logloss:0.38363	validation-auc:0.97100	validation-aucpr:0.97580
[70]	validation-logloss:0.38118	validation-auc:0.97102	validation-aucpr:0.97582
[71]	validation-logloss:0.37876	validation-auc:0.97108	validation-aucpr:0.97582
[72]	validation-logloss:0.37638	validation-auc:0.97119	validation-aucpr:0.97590
[73]	validation-logloss:0.37400	validation-auc:0.97125	validation-aucpr:0.97593
[74]	validation-logloss:0.37194	validation-auc:0.97117	validation-aucpr:0.97585
[75]	validation-logloss:0.36966	validation-auc:0.97121	validation-aucpr:0.97587
[76]	validation-logloss:0.36740	validation-auc:0.97124	validation-aucpr:0.97589
[77]	validation-logloss:0.36525	validation-auc:0.97126	validation-aucpr:0.97590
[78]	validation-logloss:0.36308	validation-auc:0.97137	validation-aucpr:0.97594
{'best_iteration': '78', 'best_score': '0.9759377173694493'}
Trial 26, Fold 1: Log loss = 0.3630798792810747, Average precision = 0.9759361280515932, ROC-AUC = 0.9713674972397873, Elapsed Time = 108.3564650999997 seconds
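The per-fold summary line above (log loss, average precision, ROC-AUC) can be reproduced with the `sklearn.metrics` functions imported at the top of the notebook. This is a minimal sketch on synthetic labels and probabilities; `y_val` and `proba` stand in for the fold's validation labels and the model's predicted positive-class probabilities, and the values are illustrative, not taken from the notebook's data.

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

# Synthetic stand-ins for a fold's validation labels and predicted probabilities.
rng = np.random.default_rng(0)
y_val = np.tile([0, 1], 100)  # balanced toy labels (assumed, not the real fold)
proba = np.clip(0.25 + 0.5 * y_val + rng.normal(0.0, 0.1, size=200), 0.001, 0.999)

lloss = log_loss(y_val, proba)              # "Log loss" in the summary line
ap = average_precision_score(y_val, proba)  # "Average precision" (area under PR)
auc = roc_auc_score(y_val, proba)           # "ROC-AUC"
print(f"Log loss = {lloss}, Average precision = {ap}, ROC-AUC = {auc}")
```

Because training here maximizes `validation-aucpr`, the `best_score` reported at the end of a fold corresponds to the average-precision-style metric at `best_iteration`, while log loss and ROC-AUC are logged alongside it.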
Trial 26, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 26, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0]	validation-logloss:0.68565	validation-auc:0.95899	validation-aucpr:0.96175
[1]	validation-logloss:0.67799	validation-auc:0.96605	validation-aucpr:0.96878
[2]	validation-logloss:0.67063	validation-auc:0.96761	validation-aucpr:0.97093
[3]	validation-logloss:0.66335	validation-auc:0.96813	validation-aucpr:0.97149
[4]	validation-logloss:0.65622	validation-auc:0.96847	validation-aucpr:0.97178
[5]	validation-logloss:0.64919	validation-auc:0.96838	validation-aucpr:0.97186
[6]	validation-logloss:0.64298	validation-auc:0.96890	validation-aucpr:0.97236
[7]	validation-logloss:0.63628	validation-auc:0.96883	validation-aucpr:0.97227
[8]	validation-logloss:0.62956	validation-auc:0.96938	validation-aucpr:0.97272
[9]	validation-logloss:0.62302	validation-auc:0.96977	validation-aucpr:0.97307
[10]	validation-logloss:0.61658	validation-auc:0.96992	validation-aucpr:0.97313
[11]	validation-logloss:0.61043	validation-auc:0.96967	validation-aucpr:0.97291
[12]	validation-logloss:0.60430	validation-auc:0.96955	validation-aucpr:0.97286
[13]	validation-logloss:0.59878	validation-auc:0.96985	validation-aucpr:0.97309
[14]	validation-logloss:0.59294	validation-auc:0.97001	validation-aucpr:0.97322
[15]	validation-logloss:0.58713	validation-auc:0.97023	validation-aucpr:0.97346
[16]	validation-logloss:0.58188	validation-auc:0.97024	validation-aucpr:0.97348
[17]	validation-logloss:0.57628	validation-auc:0.97036	validation-aucpr:0.97357
[18]	validation-logloss:0.57078	validation-auc:0.97024	validation-aucpr:0.97347
[19]	validation-logloss:0.56535	validation-auc:0.97032	validation-aucpr:0.97353
[20]	validation-logloss:0.56005	validation-auc:0.97046	validation-aucpr:0.97359
[21]	validation-logloss:0.55473	validation-auc:0.97073	validation-aucpr:0.97378
[22]	validation-logloss:0.54972	validation-auc:0.97071	validation-aucpr:0.97379
[23]	validation-logloss:0.54465	validation-auc:0.97079	validation-aucpr:0.97383
[24]	validation-logloss:0.53975	validation-auc:0.97067	validation-aucpr:0.97375
[25]	validation-logloss:0.53487	validation-auc:0.97086	validation-aucpr:0.97391
[26]	validation-logloss:0.52995	validation-auc:0.97095	validation-aucpr:0.97398
[27]	validation-logloss:0.52513	validation-auc:0.97103	validation-aucpr:0.97406
[28]	validation-logloss:0.52056	validation-auc:0.97107	validation-aucpr:0.97410
[29]	validation-logloss:0.51591	validation-auc:0.97119	validation-aucpr:0.97418
[30]	validation-logloss:0.51190	validation-auc:0.97112	validation-aucpr:0.97412
[31]	validation-logloss:0.50740	validation-auc:0.97136	validation-aucpr:0.97430
[32]	validation-logloss:0.50296	validation-auc:0.97148	validation-aucpr:0.97438
[33]	validation-logloss:0.49867	validation-auc:0.97148	validation-aucpr:0.97439
[34]	validation-logloss:0.49439	validation-auc:0.97167	validation-aucpr:0.97461
[35]	validation-logloss:0.49023	validation-auc:0.97165	validation-aucpr:0.97458
[36]	validation-logloss:0.48605	validation-auc:0.97172	validation-aucpr:0.97463
[37]	validation-logloss:0.48204	validation-auc:0.97177	validation-aucpr:0.97471
[38]	validation-logloss:0.47833	validation-auc:0.97185	validation-aucpr:0.97478
[39]	validation-logloss:0.47440	validation-auc:0.97202	validation-aucpr:0.97491
[40]	validation-logloss:0.47053	validation-auc:0.97202	validation-aucpr:0.97492
[41]	validation-logloss:0.46671	validation-auc:0.97199	validation-aucpr:0.97489
[42]	validation-logloss:0.46314	validation-auc:0.97191	validation-aucpr:0.97483
[43]	validation-logloss:0.45947	validation-auc:0.97188	validation-aucpr:0.97480
[44]	validation-logloss:0.45579	validation-auc:0.97189	validation-aucpr:0.97478
[45]	validation-logloss:0.45221	validation-auc:0.97188	validation-aucpr:0.97476
[46]	validation-logloss:0.44871	validation-auc:0.97182	validation-aucpr:0.97470
[47]	validation-logloss:0.44527	validation-auc:0.97177	validation-aucpr:0.97466
[48]	validation-logloss:0.44188	validation-auc:0.97171	validation-aucpr:0.97461
[49]	validation-logloss:0.43856	validation-auc:0.97177	validation-aucpr:0.97465
[50]	validation-logloss:0.43520	validation-auc:0.97184	validation-aucpr:0.97468
[51]	validation-logloss:0.43198	validation-auc:0.97180	validation-aucpr:0.97467
[52]	validation-logloss:0.42871	validation-auc:0.97185	validation-aucpr:0.97470
[53]	validation-logloss:0.42569	validation-auc:0.97195	validation-aucpr:0.97476
[54]	validation-logloss:0.42256	validation-auc:0.97192	validation-aucpr:0.97474
[55]	validation-logloss:0.41984	validation-auc:0.97196	validation-aucpr:0.97478
[56]	validation-logloss:0.41678	validation-auc:0.97201	validation-aucpr:0.97484
[57]	validation-logloss:0.41372	validation-auc:0.97205	validation-aucpr:0.97486
[58]	validation-logloss:0.41072	validation-auc:0.97204	validation-aucpr:0.97484
[59]	validation-logloss:0.40776	validation-auc:0.97211	validation-aucpr:0.97491
[60]	validation-logloss:0.40493	validation-auc:0.97212	validation-aucpr:0.97488
[61]	validation-logloss:0.40216	validation-auc:0.97206	validation-aucpr:0.97485
[62]	validation-logloss:0.39939	validation-auc:0.97210	validation-aucpr:0.97488
[63]	validation-logloss:0.39663	validation-auc:0.97214	validation-aucpr:0.97489
[64]	validation-logloss:0.39414	validation-auc:0.97212	validation-aucpr:0.97486
[65]	validation-logloss:0.39150	validation-auc:0.97218	validation-aucpr:0.97489
[66]	validation-logloss:0.38886	validation-auc:0.97220	validation-aucpr:0.97490
[67]	validation-logloss:0.38616	validation-auc:0.97234	validation-aucpr:0.97500
[68]	validation-logloss:0.38368	validation-auc:0.97228	validation-aucpr:0.97495
[69]	validation-logloss:0.38117	validation-auc:0.97227	validation-aucpr:0.97498
[70]	validation-logloss:0.37881	validation-auc:0.97220	validation-aucpr:0.97493
[71]	validation-logloss:0.37636	validation-auc:0.97219	validation-aucpr:0.97492
[72]	validation-logloss:0.37391	validation-auc:0.97222	validation-aucpr:0.97495
[73]	validation-logloss:0.37175	validation-auc:0.97224	validation-aucpr:0.97493
[74]	validation-logloss:0.36945	validation-auc:0.97225	validation-aucpr:0.97494
[75]	validation-logloss:0.36742	validation-auc:0.97218	validation-aucpr:0.97486
[76] validation-logloss:0.36514 validation-auc:0.97216 validation-aucpr:0.97485
[18:40:57] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[77] validation-logloss:0.36302 validation-auc:0.97213 validation-aucpr:0.97481
[18:40:59] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[78] validation-logloss:0.36082 validation-auc:0.97213 validation-aucpr:0.97482
{'best_iteration': '67', 'best_score': '0.975003136856076'}
Trial 26, Fold 2: Log loss = 0.3608162098978818, Average precision = 0.9747282694988629, ROC-AUC = 0.97213250555928, Elapsed Time = 107.27267639999991 seconds
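Each fold summary line reports log loss, average precision, and ROC-AUC computed from the fold's validation probabilities. A minimal sketch with the sklearn metrics imported above — `y_val` and `proba_val` are illustrative toy arrays, not the notebook's data:

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

# Toy validation labels and predicted P(y=1), standing in for one fold's output
y_val = np.array([0, 0, 1, 1, 1, 0, 1, 0])
proba_val = np.array([0.1, 0.4, 0.8, 0.7, 0.9, 0.3, 0.6, 0.2])

lloss = log_loss(y_val, proba_val)           # penalizes confident wrong probabilities
ap = average_precision_score(y_val, proba_val)  # area under the precision-recall curve
auc = roc_auc_score(y_val, proba_val)           # ranking quality of the scores
print(f"Log loss = {lloss}, Average precision = {ap}, ROC-AUC = {auc}")
```

On these toy arrays the positives all score above the negatives, so AP and ROC-AUC are both 1.0 while log loss stays positive.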
Trial 26, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 26, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Iterations [0]–[78]: validation-logloss 0.68550 → 0.36119, validation-auc 0.95309 → 0.97194, best validation-aucpr 0.97609 at iteration 78. (Per-iteration log and repeated `drop 0 trees` INFO lines omitted.)
{'best_iteration': '78', 'best_score': '0.9760926971288632'}
Trial 26, Fold 3: Log loss = 0.3611888171930082, Average precision = 0.9760961494961098, ROC-AUC = 0.9719414089963498, Elapsed Time = 108.23821640000097 seconds
Trial 26, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 26, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
Iterations [0]–[37]: validation-logloss 0.68536 → 0.48219, validation-auc 0.95745 → 0.97068, best validation-aucpr so far 0.97543 at iteration 36. (Per-iteration log and repeated `drop 0 trees` INFO lines omitted.)
[38] validation-logloss:0.47821 validation-auc:0.97066 validation-aucpr:0.97542
[18:43:40] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[39] validation-logloss:0.47442 validation-auc:0.97054 validation-aucpr:0.97531
[18:43:42] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[40] validation-logloss:0.47059 validation-auc:0.97067 validation-aucpr:0.97539
[18:43:43] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[41] validation-logloss:0.46678 validation-auc:0.97077 validation-aucpr:0.97546
[18:43:44] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[42] validation-logloss:0.46341 validation-auc:0.97077 validation-aucpr:0.97547
[18:43:46] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[43] validation-logloss:0.45974 validation-auc:0.97080 validation-aucpr:0.97550
[18:43:47] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[44] validation-logloss:0.45612 validation-auc:0.97087 validation-aucpr:0.97552
[18:43:48] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[45] validation-logloss:0.45254 validation-auc:0.97090 validation-aucpr:0.97556
[18:43:50] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[46] validation-logloss:0.44907 validation-auc:0.97091 validation-aucpr:0.97556
[18:43:51] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[47] validation-logloss:0.44595 validation-auc:0.97088 validation-aucpr:0.97555
[18:43:52] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[48] validation-logloss:0.44254 validation-auc:0.97089 validation-aucpr:0.97555
[18:43:54] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[49] validation-logloss:0.43918 validation-auc:0.97096 validation-aucpr:0.97561
[18:43:55] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[50] validation-logloss:0.43591 validation-auc:0.97093 validation-aucpr:0.97559
[18:43:56] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[51] validation-logloss:0.43276 validation-auc:0.97086 validation-aucpr:0.97552
[18:43:58] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[52] validation-logloss:0.42987 validation-auc:0.97084 validation-aucpr:0.97552
[18:43:59] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[53] validation-logloss:0.42669 validation-auc:0.97088 validation-aucpr:0.97556
[18:44:00] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[54] validation-logloss:0.42356 validation-auc:0.97083 validation-aucpr:0.97554
[18:44:02] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[55] validation-logloss:0.42076 validation-auc:0.97093 validation-aucpr:0.97562
[18:44:03] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[56] validation-logloss:0.41774 validation-auc:0.97097 validation-aucpr:0.97565
[18:44:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[57] validation-logloss:0.41474 validation-auc:0.97106 validation-aucpr:0.97570
[18:44:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[58] validation-logloss:0.41179 validation-auc:0.97116 validation-aucpr:0.97576
[18:44:08] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[59] validation-logloss:0.40919 validation-auc:0.97110 validation-aucpr:0.97573
[18:44:09] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[60] validation-logloss:0.40635 validation-auc:0.97112 validation-aucpr:0.97575
[18:44:10] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[61] validation-logloss:0.40354 validation-auc:0.97111 validation-aucpr:0.97575
[18:44:12] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[62] validation-logloss:0.40077 validation-auc:0.97113 validation-aucpr:0.97575
[18:44:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[63] validation-logloss:0.39801 validation-auc:0.97116 validation-aucpr:0.97577
[18:44:15] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[64] validation-logloss:0.39527 validation-auc:0.97128 validation-aucpr:0.97585
[18:44:16] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[65] validation-logloss:0.39259 validation-auc:0.97132 validation-aucpr:0.97587
[18:44:17] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[66] validation-logloss:0.38997 validation-auc:0.97133 validation-aucpr:0.97587
[18:44:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[67] validation-logloss:0.38740 validation-auc:0.97134 validation-aucpr:0.97587
[18:44:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[68] validation-logloss:0.38504 validation-auc:0.97136 validation-aucpr:0.97589
[18:44:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[69] validation-logloss:0.38248 validation-auc:0.97144 validation-aucpr:0.97595
[18:44:23] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[70] validation-logloss:0.38005 validation-auc:0.97137 validation-aucpr:0.97589
[18:44:25] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[71] validation-logloss:0.37764 validation-auc:0.97133 validation-aucpr:0.97587
[18:44:26] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[72] validation-logloss:0.37520 validation-auc:0.97137 validation-aucpr:0.97589
[18:44:28] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[73] validation-logloss:0.37287 validation-auc:0.97136 validation-aucpr:0.97588
[18:44:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[74] validation-logloss:0.37051 validation-auc:0.97134 validation-aucpr:0.97587
[18:44:30] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[75] validation-logloss:0.36822 validation-auc:0.97135 validation-aucpr:0.97587
[18:44:32] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[76] validation-logloss:0.36598 validation-auc:0.97128 validation-aucpr:0.97582
[18:44:33] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[77] validation-logloss:0.36374 validation-auc:0.97130 validation-aucpr:0.97582
[18:44:35] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[78] validation-logloss:0.36154 validation-auc:0.97130 validation-aucpr:0.97581
{'best_iteration': '69', 'best_score': '0.975951096302995'}
Trial 26, Fold 4: Log loss = 0.3615393712953612, Average precision = 0.9757995200171711, ROC-AUC = 0.9712955064605582, Elapsed Time = 106.7030864999997 seconds
Trial 26, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 26, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[18:44:36] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[0] validation-logloss:0.68543 validation-auc:0.95258 validation-aucpr:0.95746
[18:44:37] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[1] validation-logloss:0.67787 validation-auc:0.96072 validation-aucpr:0.96581
[18:44:39] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[2] validation-logloss:0.67039 validation-auc:0.96449 validation-aucpr:0.96857
[18:44:40] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[3] validation-logloss:0.66323 validation-auc:0.96455 validation-aucpr:0.96869
[18:44:41] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[4] validation-logloss:0.65615 validation-auc:0.96421 validation-aucpr:0.96853
[18:44:43] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[5] validation-logloss:0.64925 validation-auc:0.96447 validation-aucpr:0.96876
[18:44:44] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[6] validation-logloss:0.64256 validation-auc:0.96455 validation-aucpr:0.96888
[18:44:45] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[7] validation-logloss:0.63584 validation-auc:0.96545 validation-aucpr:0.96957
[18:44:46] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[8] validation-logloss:0.62918 validation-auc:0.96607 validation-aucpr:0.97059
[18:44:47] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[9] validation-logloss:0.62271 validation-auc:0.96637 validation-aucpr:0.97081
[18:44:49] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[10] validation-logloss:0.61636 validation-auc:0.96662 validation-aucpr:0.97096
[18:44:50] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[11] validation-logloss:0.61018 validation-auc:0.96715 validation-aucpr:0.97132
[18:44:51] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[12] validation-logloss:0.60411 validation-auc:0.96724 validation-aucpr:0.97144
[18:44:53] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[13] validation-logloss:0.59812 validation-auc:0.96736 validation-aucpr:0.97155
[18:44:54] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[14] validation-logloss:0.59224 validation-auc:0.96766 validation-aucpr:0.97171
[18:44:55] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[15] validation-logloss:0.58642 validation-auc:0.96799 validation-aucpr:0.97197
[18:44:56] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[16] validation-logloss:0.58076 validation-auc:0.96798 validation-aucpr:0.97198
[18:44:58] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[17] validation-logloss:0.57584 validation-auc:0.96763 validation-aucpr:0.97198
[18:44:59] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[18] validation-logloss:0.57035 validation-auc:0.96790 validation-aucpr:0.97225
[18:45:00] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[19] validation-logloss:0.56497 validation-auc:0.96780 validation-aucpr:0.97217
[18:45:01] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[20] validation-logloss:0.55974 validation-auc:0.96792 validation-aucpr:0.97227
[18:45:03] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[21] validation-logloss:0.55452 validation-auc:0.96822 validation-aucpr:0.97251
[18:45:04] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[22] validation-logloss:0.54943 validation-auc:0.96838 validation-aucpr:0.97259
[18:45:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[23] validation-logloss:0.54438 validation-auc:0.96831 validation-aucpr:0.97259
[18:45:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[24] validation-logloss:0.53941 validation-auc:0.96838 validation-aucpr:0.97258
[18:45:08] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[25] validation-logloss:0.53458 validation-auc:0.96833 validation-aucpr:0.97255
[18:45:09] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[26] validation-logloss:0.52980 validation-auc:0.96848 validation-aucpr:0.97264
[18:45:11] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[27] validation-logloss:0.52507 validation-auc:0.96876 validation-aucpr:0.97286
[18:45:12] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[28] validation-logloss:0.52047 validation-auc:0.96887 validation-aucpr:0.97292
[18:45:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[29] validation-logloss:0.51591 validation-auc:0.96893 validation-aucpr:0.97294
[18:45:15] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[30] validation-logloss:0.51145 validation-auc:0.96893 validation-aucpr:0.97294
[18:45:16] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[31] validation-logloss:0.50705 validation-auc:0.96899 validation-aucpr:0.97302
[18:45:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[32] validation-logloss:0.50263 validation-auc:0.96919 validation-aucpr:0.97315
[18:45:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[33] validation-logloss:0.49843 validation-auc:0.96911 validation-aucpr:0.97316
[18:45:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[34] validation-logloss:0.49427 validation-auc:0.96909 validation-aucpr:0.97314
[18:45:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[35] validation-logloss:0.49015 validation-auc:0.96903 validation-aucpr:0.97310
[18:45:23] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[36] validation-logloss:0.48611 validation-auc:0.96889 validation-aucpr:0.97302
[18:45:24] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[37] validation-logloss:0.48250 validation-auc:0.96891 validation-aucpr:0.97306
[18:45:26] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[38] validation-logloss:0.47857 validation-auc:0.96905 validation-aucpr:0.97316
[18:45:28] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[39] validation-logloss:0.47471 validation-auc:0.96915 validation-aucpr:0.97322
[18:45:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[40] validation-logloss:0.47101 validation-auc:0.96915 validation-aucpr:0.97320
[18:45:30] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[41] validation-logloss:0.46727 validation-auc:0.96910 validation-aucpr:0.97318
[18:45:32] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[42] validation-logloss:0.46361 validation-auc:0.96910 validation-aucpr:0.97319
[18:45:33] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[43] validation-logloss:0.46007 validation-auc:0.96901 validation-aucpr:0.97309
[18:45:34] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[44] validation-logloss:0.45655 validation-auc:0.96902 validation-aucpr:0.97310
[18:45:36] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[45] validation-logloss:0.45303 validation-auc:0.96903 validation-aucpr:0.97311
[18:45:37] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[46] validation-logloss:0.44954 validation-auc:0.96921 validation-aucpr:0.97321
[18:45:39] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[47] validation-logloss:0.44622 validation-auc:0.96946 validation-aucpr:0.97331
[18:45:40] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[48] validation-logloss:0.44288 validation-auc:0.96947 validation-aucpr:0.97333
[18:45:42] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[49] validation-logloss:0.43968 validation-auc:0.96942 validation-aucpr:0.97329
[18:45:43] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[50] validation-logloss:0.43647 validation-auc:0.96934 validation-aucpr:0.97323
[18:45:45] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[51] validation-logloss:0.43355 validation-auc:0.96944 validation-aucpr:0.97334
[18:45:46] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[52] validation-logloss:0.43046 validation-auc:0.96952 validation-aucpr:0.97338
[18:45:48] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[53] validation-logloss:0.42742 validation-auc:0.96945 validation-aucpr:0.97332
[18:45:49] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[54] validation-logloss:0.42434 validation-auc:0.96938 validation-aucpr:0.97328
[18:45:50] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[55] validation-logloss:0.42145 validation-auc:0.96935 validation-aucpr:0.97330
[18:45:52] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[56] validation-logloss:0.41844 validation-auc:0.96941 validation-aucpr:0.97335
[18:45:53] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[57] validation-logloss:0.41583 validation-auc:0.96940 validation-aucpr:0.97334
[18:45:55] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[58] validation-logloss:0.41295 validation-auc:0.96936 validation-aucpr:0.97322
[18:45:56] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[59] validation-logloss:0.41020 validation-auc:0.96929 validation-aucpr:0.97317
[18:45:57] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[60] validation-logloss:0.40739 validation-auc:0.96935 validation-aucpr:0.97322
[18:45:59] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[61] validation-logloss:0.40456 validation-auc:0.96946 validation-aucpr:0.97329
[18:46:00] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[62] validation-logloss:0.40182 validation-auc:0.96947 validation-aucpr:0.97329
[18:46:02] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[63] validation-logloss:0.39915 validation-auc:0.96950 validation-aucpr:0.97332
[18:46:03] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[64] validation-logloss:0.39648 validation-auc:0.96963 validation-aucpr:0.97340
[18:46:04] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[65] validation-logloss:0.39391 validation-auc:0.96963 validation-aucpr:0.97345
[18:46:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[66] validation-logloss:0.39134 validation-auc:0.96965 validation-aucpr:0.97345
[18:46:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[67] validation-logloss:0.38885 validation-auc:0.96965 validation-aucpr:0.97345
[18:46:08] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[68] validation-logloss:0.38633 validation-auc:0.96970 validation-aucpr:0.97349
[18:46:10] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[69] validation-logloss:0.38384 validation-auc:0.96980 validation-aucpr:0.97357
[18:46:11] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[70] validation-logloss:0.38163 validation-auc:0.96983 validation-aucpr:0.97357
[18:46:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[71] validation-logloss:0.37923 validation-auc:0.96986 validation-aucpr:0.97359
[18:46:14] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[72] validation-logloss:0.37683 validation-auc:0.96994 validation-aucpr:0.97365
[18:46:16] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[73] validation-logloss:0.37483 validation-auc:0.96993 validation-aucpr:0.97369
[74] validation-logloss:0.37254 validation-auc:0.96997 validation-aucpr:0.97371
[75] validation-logloss:0.37030 validation-auc:0.96994 validation-aucpr:0.97369
[76] validation-logloss:0.36812 validation-auc:0.96992 validation-aucpr:0.97367
[77] validation-logloss:0.36581 validation-auc:0.97007 validation-aucpr:0.97379
[78] validation-logloss:0.36361 validation-auc:0.97009 validation-aucpr:0.97379
{'best_iteration': '78', 'best_score': '0.9737867965658782'}
Trial 26, Fold 5: Log loss = 0.36361108693323485, Average precision = 0.9737929935568028, ROC-AUC = 0.9700917451647066, Elapsed Time = 107.7331622999991 seconds
Optimization Progress: 27%|##7 | 27/100 [47:35<3:45:59, 185.75s/it]
Trial 27, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 27, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.67710 validation-auc:0.81462 validation-aucpr:0.74736
[1] validation-logloss:0.66225 validation-auc:0.89137 validation-aucpr:0.83793
[2] validation-logloss:0.64776 validation-auc:0.93149 validation-aucpr:0.91845
[3] validation-logloss:0.63378 validation-auc:0.94311 validation-aucpr:0.93988
[4] validation-logloss:0.62062 validation-auc:0.94563 validation-aucpr:0.94713
[5] validation-logloss:0.60765 validation-auc:0.95090 validation-aucpr:0.95911
[6] validation-logloss:0.59496 validation-auc:0.95299 validation-aucpr:0.96198
[7] validation-logloss:0.58297 validation-auc:0.95449 validation-aucpr:0.96269
[8] validation-logloss:0.57165 validation-auc:0.95677 validation-aucpr:0.96310
[9] validation-logloss:0.56193 validation-auc:0.95746 validation-aucpr:0.96273
[10] validation-logloss:0.55161 validation-auc:0.95838 validation-aucpr:0.96322
[11] validation-logloss:0.54112 validation-auc:0.95946 validation-aucpr:0.96455
[12] validation-logloss:0.53139 validation-auc:0.95968 validation-aucpr:0.96460
[13] validation-logloss:0.52203 validation-auc:0.95959 validation-aucpr:0.96460
[14] validation-logloss:0.51310 validation-auc:0.95997 validation-aucpr:0.96617
[15] validation-logloss:0.50506 validation-auc:0.96112 validation-aucpr:0.96647
[16] validation-logloss:0.49662 validation-auc:0.96170 validation-aucpr:0.96664
[17] validation-logloss:0.48829 validation-auc:0.96233 validation-aucpr:0.96854
[18] validation-logloss:0.48007 validation-auc:0.96307 validation-aucpr:0.96891
[19] validation-logloss:0.47225 validation-auc:0.96365 validation-aucpr:0.96924
[20] validation-logloss:0.46451 validation-auc:0.96405 validation-aucpr:0.96949
[21] validation-logloss:0.45703 validation-auc:0.96409 validation-aucpr:0.96953
[22] validation-logloss:0.45006 validation-auc:0.96443 validation-aucpr:0.96987
[23] validation-logloss:0.44294 validation-auc:0.96507 validation-aucpr:0.97022
[24] validation-logloss:0.43665 validation-auc:0.96504 validation-aucpr:0.97031
[25] validation-logloss:0.43005 validation-auc:0.96519 validation-aucpr:0.97046
[26] validation-logloss:0.42423 validation-auc:0.96519 validation-aucpr:0.97050
[27] validation-logloss:0.41800 validation-auc:0.96516 validation-aucpr:0.97058
[28] validation-logloss:0.41203 validation-auc:0.96521 validation-aucpr:0.97053
[29] validation-logloss:0.40632 validation-auc:0.96535 validation-aucpr:0.97093
[30] validation-logloss:0.40140 validation-auc:0.96548 validation-aucpr:0.97140
[31] validation-logloss:0.39606 validation-auc:0.96556 validation-aucpr:0.97154
[32] validation-logloss:0.39086 validation-auc:0.96572 validation-aucpr:0.97159
[33] validation-logloss:0.38553 validation-auc:0.96590 validation-aucpr:0.97172
[34] validation-logloss:0.38032 validation-auc:0.96626 validation-aucpr:0.97195
[35] validation-logloss:0.37585 validation-auc:0.96632 validation-aucpr:0.97193
[36] validation-logloss:0.37079 validation-auc:0.96668 validation-aucpr:0.97225
[37] validation-logloss:0.36636 validation-auc:0.96657 validation-aucpr:0.97215
[38] validation-logloss:0.36203 validation-auc:0.96658 validation-aucpr:0.97219
[39] validation-logloss:0.35778 validation-auc:0.96654 validation-aucpr:0.97216
[40] validation-logloss:0.35345 validation-auc:0.96657 validation-aucpr:0.97225
[41] validation-logloss:0.34948 validation-auc:0.96656 validation-aucpr:0.97228
[42] validation-logloss:0.34556 validation-auc:0.96673 validation-aucpr:0.97238
[43] validation-logloss:0.34156 validation-auc:0.96685 validation-aucpr:0.97245
[44] validation-logloss:0.33776 validation-auc:0.96694 validation-aucpr:0.97253
[45] validation-logloss:0.33408 validation-auc:0.96711 validation-aucpr:0.97261
[46] validation-logloss:0.33041 validation-auc:0.96730 validation-aucpr:0.97277
[47] validation-logloss:0.32697 validation-auc:0.96738 validation-aucpr:0.97283
{'best_iteration': '47', 'best_score': '0.9728271886681858'}
Trial 27, Fold 1: Log loss = 0.32696823726928415, Average precision = 0.9729045792862018, ROC-AUC = 0.9673756440496504, Elapsed Time = 14.147009500000422 seconds
Trial 27, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 27, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.67726 validation-auc:0.80978 validation-aucpr:0.71893
[1] validation-logloss:0.66200 validation-auc:0.91014 validation-aucpr:0.84861
[2] validation-logloss:0.64775 validation-auc:0.93518 validation-aucpr:0.90438
[3] validation-logloss:0.63508 validation-auc:0.94685 validation-aucpr:0.92445
[4] validation-logloss:0.62172 validation-auc:0.95331 validation-aucpr:0.94448
[5] validation-logloss:0.60906 validation-auc:0.95586 validation-aucpr:0.95279
[6] validation-logloss:0.59662 validation-auc:0.95748 validation-aucpr:0.95485
[7] validation-logloss:0.58487 validation-auc:0.95926 validation-aucpr:0.95898
[8] validation-logloss:0.57492 validation-auc:0.96014 validation-aucpr:0.95980
[9] validation-logloss:0.56374 validation-auc:0.96172 validation-aucpr:0.96498
[10] validation-logloss:0.55299 validation-auc:0.96246 validation-aucpr:0.96516
[11] validation-logloss:0.54246 validation-auc:0.96412 validation-aucpr:0.96612
[12] validation-logloss:0.53330 validation-auc:0.96480 validation-aucpr:0.96605
[13] validation-logloss:0.52388 validation-auc:0.96465 validation-aucpr:0.96558
[14] validation-logloss:0.51450 validation-auc:0.96550 validation-aucpr:0.96785
[15] validation-logloss:0.50569 validation-auc:0.96554 validation-aucpr:0.96792
[16] validation-logloss:0.49715 validation-auc:0.96630 validation-aucpr:0.97021
[17] validation-logloss:0.48826 validation-auc:0.96708 validation-aucpr:0.97074
[18] validation-logloss:0.47989 validation-auc:0.96695 validation-aucpr:0.97086
[19] validation-logloss:0.47190 validation-auc:0.96713 validation-aucpr:0.97106
[20] validation-logloss:0.46442 validation-auc:0.96726 validation-aucpr:0.97105
[21] validation-logloss:0.45706 validation-auc:0.96753 validation-aucpr:0.97114
[22] validation-logloss:0.44998 validation-auc:0.96764 validation-aucpr:0.97132
[23] validation-logloss:0.44282 validation-auc:0.96802 validation-aucpr:0.97113
[24] validation-logloss:0.43649 validation-auc:0.96814 validation-aucpr:0.97107
[25] validation-logloss:0.43020 validation-auc:0.96796 validation-aucpr:0.97091
[26] validation-logloss:0.42387 validation-auc:0.96802 validation-aucpr:0.97103
[27] validation-logloss:0.41766 validation-auc:0.96795 validation-aucpr:0.97100
[28] validation-logloss:0.41159 validation-auc:0.96798 validation-aucpr:0.97106
[29] validation-logloss:0.40563 validation-auc:0.96810 validation-aucpr:0.97115
[30] validation-logloss:0.39992 validation-auc:0.96815 validation-aucpr:0.97115
[31] validation-logloss:0.39418 validation-auc:0.96835 validation-aucpr:0.97128
[32] validation-logloss:0.38869 validation-auc:0.96843 validation-aucpr:0.97141
[33] validation-logloss:0.38334 validation-auc:0.96862 validation-aucpr:0.97151
[34] validation-logloss:0.37817 validation-auc:0.96869 validation-aucpr:0.97163
[35] validation-logloss:0.37329 validation-auc:0.96860 validation-aucpr:0.97176
[36] validation-logloss:0.36878 validation-auc:0.96867 validation-aucpr:0.97194
[37] validation-logloss:0.36402 validation-auc:0.96870 validation-aucpr:0.97254
[38] validation-logloss:0.35944 validation-auc:0.96894 validation-aucpr:0.97267
[39] validation-logloss:0.35510 validation-auc:0.96898 validation-aucpr:0.97272
[40] validation-logloss:0.35111 validation-auc:0.96912 validation-aucpr:0.97285
[41] validation-logloss:0.34704 validation-auc:0.96928 validation-aucpr:0.97290
[42] validation-logloss:0.34361 validation-auc:0.96923 validation-aucpr:0.97276
[43] validation-logloss:0.33974 validation-auc:0.96917 validation-aucpr:0.97271
[44] validation-logloss:0.33586 validation-auc:0.96932 validation-aucpr:0.97283
[45] validation-logloss:0.33198 validation-auc:0.96959 validation-aucpr:0.97309
[46] validation-logloss:0.32831 validation-auc:0.96960 validation-aucpr:0.97319
[47] validation-logloss:0.32496 validation-auc:0.96970 validation-aucpr:0.97324
{'best_iteration': '47', 'best_score': '0.9732447278228306'}
Trial 27, Fold 2: Log loss = 0.3249584929680322, Average precision = 0.9733388508949556, ROC-AUC = 0.9697001403060024, Elapsed Time = 13.547955099998944 seconds
Trial 27, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 27, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.67718 validation-auc:0.87853 validation-aucpr:0.90062
[1] validation-logloss:0.66183 validation-auc:0.91449 validation-aucpr:0.86609
[2] validation-logloss:0.64756 validation-auc:0.94079 validation-aucpr:0.92209
[3] validation-logloss:0.63387 validation-auc:0.94707 validation-aucpr:0.93535
[4] validation-logloss:0.62057 validation-auc:0.95179 validation-aucpr:0.94525
[5] validation-logloss:0.60746 validation-auc:0.95617 validation-aucpr:0.95559
[6] validation-logloss:0.59525 validation-auc:0.95750 validation-aucpr:0.96035
[7] validation-logloss:0.58461 validation-auc:0.95938 validation-aucpr:0.96129
[8] validation-logloss:0.57274 validation-auc:0.96218 validation-aucpr:0.96318
[9] validation-logloss:0.56155 validation-auc:0.96345 validation-aucpr:0.96621
[10] validation-logloss:0.55080 validation-auc:0.96390 validation-aucpr:0.96731
[11] validation-logloss:0.54138 validation-auc:0.96404 validation-aucpr:0.96578
[12] validation-logloss:0.53124 validation-auc:0.96421 validation-aucpr:0.96611
[13] validation-logloss:0.52160 validation-auc:0.96478 validation-aucpr:0.96638
[14] validation-logloss:0.51210 validation-auc:0.96517 validation-aucpr:0.96665
[15] validation-logloss:0.50331 validation-auc:0.96502 validation-aucpr:0.96572
[16] validation-logloss:0.49436 validation-auc:0.96576 validation-aucpr:0.96754
[17] validation-logloss:0.48630 validation-auc:0.96633 validation-aucpr:0.96846
[18] validation-logloss:0.47781 validation-auc:0.96683 validation-aucpr:0.96874
[19] validation-logloss:0.47007 validation-auc:0.96665 validation-aucpr:0.96851
[20] validation-logloss:0.46244 validation-auc:0.96713 validation-aucpr:0.96862
[21] validation-logloss:0.45500 validation-auc:0.96730 validation-aucpr:0.96869
[22] validation-logloss:0.44776 validation-auc:0.96763 validation-aucpr:0.96879
[23] validation-logloss:0.44062 validation-auc:0.96761 validation-aucpr:0.96838
[24] validation-logloss:0.43370 validation-auc:0.96753 validation-aucpr:0.96834
[25] validation-logloss:0.42684 validation-auc:0.96787 validation-aucpr:0.96810
[26] validation-logloss:0.42062 validation-auc:0.96746 validation-aucpr:0.96747
[27] validation-logloss:0.41452 validation-auc:0.96760 validation-aucpr:0.96795
[28] validation-logloss:0.40846 validation-auc:0.96793 validation-aucpr:0.96826
[29] validation-logloss:0.40279 validation-auc:0.96821 validation-aucpr:0.96937
[30] validation-logloss:0.39739 validation-auc:0.96808 validation-aucpr:0.96911
[31] validation-logloss:0.39189 validation-auc:0.96822 validation-aucpr:0.96921
[32] validation-logloss:0.38682 validation-auc:0.96814 validation-aucpr:0.96926
[33] validation-logloss:0.38192 validation-auc:0.96790 validation-aucpr:0.96913
[34] validation-logloss:0.37662 validation-auc:0.96807 validation-aucpr:0.96910
[35] validation-logloss:0.37144 validation-auc:0.96820 validation-aucpr:0.96921
[36] validation-logloss:0.36705 validation-auc:0.96808 validation-aucpr:0.96938
[37] validation-logloss:0.36291 validation-auc:0.96775 validation-aucpr:0.96821
[38] validation-logloss:0.35819 validation-auc:0.96787 validation-aucpr:0.96800
[39] validation-logloss:0.35408 validation-auc:0.96774 validation-aucpr:0.96779
[40] validation-logloss:0.34975 validation-auc:0.96796 validation-aucpr:0.96775
[41] validation-logloss:0.34598 validation-auc:0.96781 validation-aucpr:0.96721
[42] validation-logloss:0.34184 validation-auc:0.96805 validation-aucpr:0.96921
[43] validation-logloss:0.33816 validation-auc:0.96820 validation-aucpr:0.97004
[44] validation-logloss:0.33485 validation-auc:0.96802 validation-aucpr:0.96986
[45] validation-logloss:0.33116 validation-auc:0.96802 validation-aucpr:0.96992
[46] validation-logloss:0.32810 validation-auc:0.96807 validation-aucpr:0.96954
[47] validation-logloss:0.32490 validation-auc:0.96811 validation-aucpr:0.96906
{'best_iteration': '43', 'best_score': '0.9700397161982565'}
Trial 27, Fold 3: Log loss = 0.32490451688075633, Average precision = 0.9693429881943123, ROC-AUC = 0.9681149536265663, Elapsed Time = 13.689646200000425 seconds
Trial 27, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 27, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.67743 validation-auc:0.81076 validation-aucpr:0.71757
[1] validation-logloss:0.66216 validation-auc:0.89878 validation-aucpr:0.83078
[2] validation-logloss:0.64765 validation-auc:0.93143 validation-aucpr:0.90569
[3] validation-logloss:0.63401 validation-auc:0.94414 validation-aucpr:0.93825
[4] validation-logloss:0.62057 validation-auc:0.95023 validation-aucpr:0.94818
[5] validation-logloss:0.60790 validation-auc:0.95327 validation-aucpr:0.94992
[6] validation-logloss:0.59554 validation-auc:0.95592 validation-aucpr:0.95984
[7] validation-logloss:0.58397 validation-auc:0.95867 validation-aucpr:0.96543
[8] validation-logloss:0.57317 validation-auc:0.95943 validation-aucpr:0.96475
[9] validation-logloss:0.56250 validation-auc:0.96107 validation-aucpr:0.96689
[10] validation-logloss:0.55227 validation-auc:0.96284 validation-aucpr:0.96913
[11] validation-logloss:0.54184 validation-auc:0.96332 validation-aucpr:0.96981
[12] validation-logloss:0.53188 validation-auc:0.96431 validation-aucpr:0.97036
[13] validation-logloss:0.52222 validation-auc:0.96476 validation-aucpr:0.97071
[14] validation-logloss:0.51292 validation-auc:0.96536 validation-aucpr:0.97109
[15] validation-logloss:0.50398 validation-auc:0.96553 validation-aucpr:0.97130
[16] validation-logloss:0.49560 validation-auc:0.96553 validation-aucpr:0.97135
[17] validation-logloss:0.48745 validation-auc:0.96538 validation-aucpr:0.97099
[18] validation-logloss:0.47929 validation-auc:0.96561 validation-aucpr:0.97158
[19] validation-logloss:0.47114 validation-auc:0.96582 validation-aucpr:0.97170
[20] validation-logloss:0.46362 validation-auc:0.96625 validation-aucpr:0.97191
[21] validation-logloss:0.45683 validation-auc:0.96628 validation-aucpr:0.97182
[22] validation-logloss:0.44954 validation-auc:0.96640 validation-aucpr:0.97198
[23] validation-logloss:0.44249 validation-auc:0.96643 validation-aucpr:0.97197
[24] validation-logloss:0.43555 validation-auc:0.96692 validation-aucpr:0.97235
[25] validation-logloss:0.42903 validation-auc:0.96713 validation-aucpr:0.97245
[26] validation-logloss:0.42291 validation-auc:0.96725 validation-aucpr:0.97252
[27] validation-logloss:0.41705 validation-auc:0.96701 validation-aucpr:0.97235
[28] validation-logloss:0.41130 validation-auc:0.96701 validation-aucpr:0.97239
[29] validation-logloss:0.40564 validation-auc:0.96711 validation-aucpr:0.97246
[30] validation-logloss:0.40002 validation-auc:0.96723 validation-aucpr:0.97256
[31] validation-logloss:0.39462 validation-auc:0.96709 validation-aucpr:0.97251
[32] validation-logloss:0.38943 validation-auc:0.96708 validation-aucpr:0.97249
[33] validation-logloss:0.38404 validation-auc:0.96728 validation-aucpr:0.97264
[34] validation-logloss:0.37898 validation-auc:0.96734 validation-aucpr:0.97288
[35] validation-logloss:0.37418 validation-auc:0.96724 validation-aucpr:0.97285
[36] validation-logloss:0.36968 validation-auc:0.96734 validation-aucpr:0.97287
[37] validation-logloss:0.36536 validation-auc:0.96737 validation-aucpr:0.97289
[38] validation-logloss:0.36079 validation-auc:0.96746 validation-aucpr:0.97294
[39] validation-logloss:0.35652 validation-auc:0.96745 validation-aucpr:0.97293
[40] validation-logloss:0.35209 validation-auc:0.96760 validation-aucpr:0.97305
[41] validation-logloss:0.34816 validation-auc:0.96741 validation-aucpr:0.97290
[42] validation-logloss:0.34414 validation-auc:0.96743 validation-aucpr:0.97290
[43] validation-logloss:0.34052 validation-auc:0.96740 validation-aucpr:0.97289
[44] validation-logloss:0.33680 validation-auc:0.96750 validation-aucpr:0.97296
[45] validation-logloss:0.33305 validation-auc:0.96747 validation-aucpr:0.97292
[46] validation-logloss:0.32903 validation-auc:0.96780 validation-aucpr:0.97315
[47] validation-logloss:0.32529 validation-auc:0.96809 validation-aucpr:0.97334
{'best_iteration': '47', 'best_score': '0.9733443455114036'}
Trial 27, Fold 4: Log loss = 0.32529408301928064, Average precision = 0.973338719744536, ROC-AUC = 0.9680886988514298, Elapsed Time = 13.645755100000315 seconds
Trial 27, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 27, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.67738 validation-auc:0.81385 validation-aucpr:0.73173
[1] validation-logloss:0.66237 validation-auc:0.90118 validation-aucpr:0.84005
[2] validation-logloss:0.64846 validation-auc:0.92205 validation-aucpr:0.89053
[3] validation-logloss:0.63505 validation-auc:0.93435 validation-aucpr:0.92671
[4] validation-logloss:0.62169 validation-auc:0.93970 validation-aucpr:0.93789
[5] validation-logloss:0.60894 validation-auc:0.94506 validation-aucpr:0.94859
[6] validation-logloss:0.59665 validation-auc:0.95013 validation-aucpr:0.95372
[7] validation-logloss:0.58482 validation-auc:0.95344 validation-aucpr:0.95548
[8] validation-logloss:0.57389 validation-auc:0.95436 validation-aucpr:0.95889
[9] validation-logloss:0.56268 validation-auc:0.95629 validation-aucpr:0.96026
[10] validation-logloss:0.55208 validation-auc:0.95708 validation-aucpr:0.96116
[11] validation-logloss:0.54197 validation-auc:0.95843 validation-aucpr:0.96321
[12] validation-logloss:0.53196 validation-auc:0.95966 validation-aucpr:0.96381
[13] validation-logloss:0.52262 validation-auc:0.96025 validation-aucpr:0.96402
[14] validation-logloss:0.51390 validation-auc:0.96038 validation-aucpr:0.96556
[15] validation-logloss:0.50478 validation-auc:0.96132 validation-aucpr:0.96608
[16] validation-logloss:0.49603 validation-auc:0.96212 validation-aucpr:0.96658
[17] validation-logloss:0.48764 validation-auc:0.96282 validation-aucpr:0.96701
[18] validation-logloss:0.47971 validation-auc:0.96276 validation-aucpr:0.96692
[19] validation-logloss:0.47242 validation-auc:0.96261 validation-aucpr:0.96620
[20] validation-logloss:0.46483 validation-auc:0.96338 validation-aucpr:0.96802
[21] validation-logloss:0.45752 validation-auc:0.96341 validation-aucpr:0.96798
[22] validation-logloss:0.45059 validation-auc:0.96360 validation-aucpr:0.96803
[23] validation-logloss:0.44397 validation-auc:0.96331 validation-aucpr:0.96787
[24] validation-logloss:0.43755 validation-auc:0.96321 validation-aucpr:0.96770
[25] validation-logloss:0.43091 validation-auc:0.96369 validation-aucpr:0.96790
[26] validation-logloss:0.42472 validation-auc:0.96361 validation-aucpr:0.96895
[27] validation-logloss:0.41870 validation-auc:0.96364 validation-aucpr:0.96895
[28] validation-logloss:0.41276 validation-auc:0.96368 validation-aucpr:0.96899
[29] validation-logloss:0.40704 validation-auc:0.96383 validation-aucpr:0.96911
[30] validation-logloss:0.40219 validation-auc:0.96372 validation-aucpr:0.96847
[31] validation-logloss:0.39687 validation-auc:0.96387 validation-aucpr:0.96860
[32] validation-logloss:0.39155 validation-auc:0.96408 validation-aucpr:0.96882
[33] validation-logloss:0.38672 validation-auc:0.96430 validation-aucpr:0.96881
[34] validation-logloss:0.38164 validation-auc:0.96450 validation-aucpr:0.96895
[35] validation-logloss:0.37686 validation-auc:0.96452 validation-aucpr:0.96902
[36] validation-logloss:0.37210 validation-auc:0.96465 validation-aucpr:0.96915
[37] validation-logloss:0.36770 validation-auc:0.96477 validation-aucpr:0.96914
[38] validation-logloss:0.36353 validation-auc:0.96478 validation-aucpr:0.96915
[39] validation-logloss:0.35957 validation-auc:0.96477 validation-aucpr:0.96868
[40] validation-logloss:0.35559 validation-auc:0.96487 validation-aucpr:0.96852
[41] validation-logloss:0.35143 validation-auc:0.96494 validation-aucpr:0.96859
[42] validation-logloss:0.34795 validation-auc:0.96491 validation-aucpr:0.96830
[43] validation-logloss:0.34400 validation-auc:0.96510 validation-aucpr:0.96823
[44] validation-logloss:0.34021 validation-auc:0.96519 validation-aucpr:0.96818
[45] validation-logloss:0.33661 validation-auc:0.96519 validation-aucpr:0.96822
[46] validation-logloss:0.33316 validation-auc:0.96532 validation-aucpr:0.96853
[47] validation-logloss:0.32985 validation-auc:0.96533 validation-aucpr:0.96866
{'best_iteration': '38', 'best_score': '0.9691535642346878'}
Trial 27, Fold 5: Log loss = 0.32985425076024333, Average precision = 0.9688231578351952, ROC-AUC = 0.9653324415212828, Elapsed Time = 14.10808200000065 seconds
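Each fold closes with a summary line reporting log loss, average precision, and ROC-AUC on the validation fold. A hedged sketch of how such a line could be produced from a fitted model's predicted probabilities, using the same scikit-learn metric functions imported at the top of the notebook (the classifier here is a simple stand-in for the tuned gradient-boosted model):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in split; the notebook scores each CV validation fold.
X, y = make_classification(n_samples=2000, random_state=0)
X_tr, X_va, y_tr, y_va = train_test_split(X, y, stratify=y, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
proba = clf.predict_proba(X_va)[:, 1]  # probability of the positive class

lloss = log_loss(y_va, proba)
ap = average_precision_score(y_va, proba)  # area under the PR curve
auc = roc_auc_score(y_va, proba)
print(f"Log loss = {lloss}, Average precision = {ap}, ROC-AUC = {auc}")
```

All three metrics take the positive-class probabilities rather than hard labels, which is why the summary values track the `validation-logloss` / `validation-aucpr` / `validation-auc` columns at the booster's best iteration.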
Optimization Progress: 28%|##8 | 28/100 [48:52<3:03:46, 153.15s/it]
Trial 28, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 28, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.67453 validation-auc:0.92966 validation-aucpr:0.92412
[1] validation-logloss:0.65734 validation-auc:0.94220 validation-aucpr:0.94079
[2] validation-logloss:0.64190 validation-auc:0.94381 validation-aucpr:0.94793
[3] validation-logloss:0.62513 validation-auc:0.95034 validation-aucpr:0.95588
[4] validation-logloss:0.61047 validation-auc:0.95093 validation-aucpr:0.95709
[5] validation-logloss:0.59542 validation-auc:0.95285 validation-aucpr:0.95884
[6] validation-logloss:0.58235 validation-auc:0.95329 validation-aucpr:0.95917
[7] validation-logloss:0.56761 validation-auc:0.95724 validation-aucpr:0.96378
[8] validation-logloss:0.55330 validation-auc:0.95864 validation-aucpr:0.96548
[9] validation-logloss:0.54248 validation-auc:0.95811 validation-aucpr:0.96489
[10] validation-logloss:0.53158 validation-auc:0.95876 validation-aucpr:0.96530
[11] validation-logloss:0.51977 validation-auc:0.95895 validation-aucpr:0.96559
[12] validation-logloss:0.50953 validation-auc:0.95900 validation-aucpr:0.96570
[13] validation-logloss:0.50093 validation-auc:0.95857 validation-aucpr:0.96533
[14] validation-logloss:0.49037 validation-auc:0.95885 validation-aucpr:0.96570
[15] validation-logloss:0.48179 validation-auc:0.95892 validation-aucpr:0.96568
[16] validation-logloss:0.47358 validation-auc:0.95888 validation-aucpr:0.96558
[17] validation-logloss:0.46601 validation-auc:0.95864 validation-aucpr:0.96541
[18] validation-logloss:0.45761 validation-auc:0.95848 validation-aucpr:0.96536
[19] validation-logloss:0.44815 validation-auc:0.95907 validation-aucpr:0.96591
[20] validation-logloss:0.43952 validation-auc:0.95952 validation-aucpr:0.96641
[21] validation-logloss:0.43300 validation-auc:0.95957 validation-aucpr:0.96643
[22] validation-logloss:0.42651 validation-auc:0.95975 validation-aucpr:0.96658
[23] validation-logloss:0.41861 validation-auc:0.95982 validation-aucpr:0.96674
[24] validation-logloss:0.41035 validation-auc:0.96038 validation-aucpr:0.96730
{'best_iteration': '24', 'best_score': '0.967297137732974'}
Trial 28, Fold 1: Log loss = 0.41034771616691573, Average precision = 0.9673015836050309, ROC-AUC = 0.9603838517992763, Elapsed Time = 0.9977183999999397 seconds
Trial 28, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 28, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.67493 validation-auc:0.92659 validation-aucpr:0.91603
[1] validation-logloss:0.65440 validation-auc:0.95558 validation-aucpr:0.96034
[2] validation-logloss:0.63804 validation-auc:0.95563 validation-aucpr:0.96028
[3] validation-logloss:0.62193 validation-auc:0.95796 validation-aucpr:0.96183
[4] validation-logloss:0.60719 validation-auc:0.95712 validation-aucpr:0.96109
[5] validation-logloss:0.59324 validation-auc:0.95693 validation-aucpr:0.96071
[6] validation-logloss:0.57882 validation-auc:0.95797 validation-aucpr:0.96157
[7] validation-logloss:0.56650 validation-auc:0.95736 validation-aucpr:0.96121
[8] validation-logloss:0.55490 validation-auc:0.95750 validation-aucpr:0.96111
[9] validation-logloss:0.54254 validation-auc:0.95863 validation-aucpr:0.96241
[10] validation-logloss:0.53208 validation-auc:0.95793 validation-aucpr:0.96168
[11] validation-logloss:0.51964 validation-auc:0.95966 validation-aucpr:0.96375
[12] validation-logloss:0.50705 validation-auc:0.96118 validation-aucpr:0.96578
[13] validation-logloss:0.49555 validation-auc:0.96162 validation-aucpr:0.96616
[14] validation-logloss:0.48666 validation-auc:0.96141 validation-aucpr:0.96585
[15] validation-logloss:0.47513 validation-auc:0.96223 validation-aucpr:0.96669
[16] validation-logloss:0.46474 validation-auc:0.96290 validation-aucpr:0.96736
[17] validation-logloss:0.45657 validation-auc:0.96301 validation-aucpr:0.96737
[18] validation-logloss:0.44932 validation-auc:0.96257 validation-aucpr:0.96691
[19] validation-logloss:0.44048 validation-auc:0.96280 validation-aucpr:0.96714
[20] validation-logloss:0.43353 validation-auc:0.96272 validation-aucpr:0.96704
[21] validation-logloss:0.42738 validation-auc:0.96251 validation-aucpr:0.96683
[22] validation-logloss:0.42151 validation-auc:0.96242 validation-aucpr:0.96672
[23] validation-logloss:0.41597 validation-auc:0.96235 validation-aucpr:0.96664
[24] validation-logloss:0.41013 validation-auc:0.96245 validation-aucpr:0.96669
{'best_iteration': '17', 'best_score': '0.9673667641165824'}
Trial 28, Fold 2: Log loss = 0.4101304550323175, Average precision = 0.9666330371412952, ROC-AUC = 0.9624493276207374, Elapsed Time = 1.2822998999999982 seconds
Trial 28, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 28, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.67449 validation-auc:0.93199 validation-aucpr:0.92687
[1] validation-logloss:0.65477 validation-auc:0.95745 validation-aucpr:0.96278
[2] validation-logloss:0.63839 validation-auc:0.95725 validation-aucpr:0.96317
[3] validation-logloss:0.62300 validation-auc:0.95666 validation-aucpr:0.96261
[4] validation-logloss:0.60752 validation-auc:0.95706 validation-aucpr:0.96320
[5] validation-logloss:0.59387 validation-auc:0.95680 validation-aucpr:0.96336
[6] validation-logloss:0.58044 validation-auc:0.95810 validation-aucpr:0.96379
[7] validation-logloss:0.56757 validation-auc:0.95870 validation-aucpr:0.96417
[8] validation-logloss:0.55673 validation-auc:0.95799 validation-aucpr:0.96354
[9] validation-logloss:0.54440 validation-auc:0.95855 validation-aucpr:0.96395
[10] validation-logloss:0.53330 validation-auc:0.95814 validation-aucpr:0.96352
[11] validation-logloss:0.52290 validation-auc:0.95761 validation-aucpr:0.96295
[12] validation-logloss:0.51239 validation-auc:0.95829 validation-aucpr:0.96379
[13] validation-logloss:0.50305 validation-auc:0.95823 validation-aucpr:0.96375
[14] validation-logloss:0.49428 validation-auc:0.95778 validation-aucpr:0.96334
[15] validation-logloss:0.48541 validation-auc:0.95800 validation-aucpr:0.96341
[16] validation-logloss:0.47637 validation-auc:0.95854 validation-aucpr:0.96369
[17] validation-logloss:0.46880 validation-auc:0.95807 validation-aucpr:0.96322
[18] validation-logloss:0.46171 validation-auc:0.95788 validation-aucpr:0.96306
[19] validation-logloss:0.45486 validation-auc:0.95767 validation-aucpr:0.96290
[20] validation-logloss:0.44828 validation-auc:0.95755 validation-aucpr:0.96278
[21] validation-logloss:0.43986 validation-auc:0.95857 validation-aucpr:0.96410
[22] validation-logloss:0.43406 validation-auc:0.95835 validation-aucpr:0.96395
[23] validation-logloss:0.42816 validation-auc:0.95807 validation-aucpr:0.96371
[24] validation-logloss:0.42208 validation-auc:0.95836 validation-aucpr:0.96387
{'best_iteration': '7', 'best_score': '0.9641735778234024'}
Trial 28, Fold 3: Log loss = 0.42207755835198346, Average precision = 0.9638774698876118, ROC-AUC = 0.9583618753069894, Elapsed Time = 1.3180508999994345 seconds
Trial 28, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 28, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.67487 validation-auc:0.92848 validation-aucpr:0.92519
[1] validation-logloss:0.65758 validation-auc:0.93797 validation-aucpr:0.93847
[2] validation-logloss:0.64007 validation-auc:0.95223 validation-aucpr:0.95651
[3] validation-logloss:0.62384 validation-auc:0.95442 validation-aucpr:0.96065
[4] validation-logloss:0.60735 validation-auc:0.95637 validation-aucpr:0.96316
[5] validation-logloss:0.59402 validation-auc:0.95596 validation-aucpr:0.96283
[6] validation-logloss:0.58079 validation-auc:0.95533 validation-aucpr:0.96204
[7] validation-logloss:0.56572 validation-auc:0.95708 validation-aucpr:0.96393
[8] validation-logloss:0.55312 validation-auc:0.95708 validation-aucpr:0.96397
[9] validation-logloss:0.53975 validation-auc:0.95803 validation-aucpr:0.96481
[10] validation-logloss:0.52881 validation-auc:0.95869 validation-aucpr:0.96522
[11] validation-logloss:0.51717 validation-auc:0.95914 validation-aucpr:0.96566
[12] validation-logloss:0.50687 validation-auc:0.95932 validation-aucpr:0.96581
[13] validation-logloss:0.49786 validation-auc:0.95874 validation-aucpr:0.96533
[14] validation-logloss:0.48824 validation-auc:0.95920 validation-aucpr:0.96564
[15] validation-logloss:0.47747 validation-auc:0.96011 validation-aucpr:0.96646
[16] validation-logloss:0.46911 validation-auc:0.96021 validation-aucpr:0.96650
[17] validation-logloss:0.46103 validation-auc:0.96070 validation-aucpr:0.96690
[18] validation-logloss:0.45382 validation-auc:0.96053 validation-aucpr:0.96667
[19] validation-logloss:0.44491 validation-auc:0.96111 validation-aucpr:0.96724
[20] validation-logloss:0.43808 validation-auc:0.96107 validation-aucpr:0.96717
[21] validation-logloss:0.42941 validation-auc:0.96140 validation-aucpr:0.96752
[22] validation-logloss:0.42274 validation-auc:0.96152 validation-aucpr:0.96760
[23] validation-logloss:0.41489 validation-auc:0.96189 validation-aucpr:0.96801
[24] validation-logloss:0.40935 validation-auc:0.96188 validation-aucpr:0.96799
{'best_iteration': '23', 'best_score': '0.9680064618656231'}
Trial 28, Fold 4: Log loss = 0.409350621791954, Average precision = 0.9679871158277571, ROC-AUC = 0.9618803664122502, Elapsed Time = 1.2848967999998422 seconds
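The `{'best_iteration': '23', ...}` dict printed after each fold comes from the booster's stored attributes (XGBoost keeps these as strings, which is why the numbers appear quoted). As a minimal illustration of the bookkeeping behind early stopping — the scores and `patience` value below are illustrative, not the notebook's actual configuration — the best iteration is simply the round with the best validation score so far, with training halted once no improvement is seen for a fixed number of rounds:

```python
# Minimal sketch of early-stopping bookkeeping: given per-round validation
# aucpr scores, keep the round with the best score and stop once no
# improvement has been seen for `patience` consecutive rounds.
def best_round(scores, patience=3):
    best_idx, best = 0, float("-inf")
    for i, s in enumerate(scores):
        if s > best:
            best_idx, best = i, s           # new best: remember this round
        elif i - best_idx >= patience:
            break                           # early stop: no recent improvement
    return best_idx, best

# First few validation-aucpr values from the fold above (illustrative input).
aucpr = [0.92519, 0.93847, 0.95651, 0.96065, 0.96316]
idx, score = best_round(aucpr)
```

In the actual training loop this is handled by XGBoost's `early_stopping_rounds` mechanism rather than hand-rolled code; the sketch only shows why `best_iteration` can be smaller than the last round logged.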
Trial 28, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 28, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.67495 validation-auc:0.91969 validation-aucpr:0.90610
[1] validation-logloss:0.65813 validation-auc:0.93588 validation-aucpr:0.93828
[2] validation-logloss:0.64248 validation-auc:0.93797 validation-aucpr:0.94127
[3] validation-logloss:0.62754 validation-auc:0.93957 validation-aucpr:0.94455
[4] validation-logloss:0.61008 validation-auc:0.94994 validation-aucpr:0.95687
[5] validation-logloss:0.59344 validation-auc:0.95363 validation-aucpr:0.96018
[6] validation-logloss:0.58120 validation-auc:0.95313 validation-aucpr:0.96021
[7] validation-logloss:0.56863 validation-auc:0.95286 validation-aucpr:0.96003
[8] validation-logloss:0.55453 validation-auc:0.95443 validation-aucpr:0.96131
[9] validation-logloss:0.54319 validation-auc:0.95455 validation-aucpr:0.96132
[10] validation-logloss:0.53230 validation-auc:0.95490 validation-aucpr:0.96160
[11] validation-logloss:0.52226 validation-auc:0.95486 validation-aucpr:0.96151
[12] validation-logloss:0.51321 validation-auc:0.95474 validation-aucpr:0.96134
[13] validation-logloss:0.50445 validation-auc:0.95453 validation-aucpr:0.96117
[14] validation-logloss:0.49649 validation-auc:0.95444 validation-aucpr:0.96145
[15] validation-logloss:0.48890 validation-auc:0.95432 validation-aucpr:0.96129
[16] validation-logloss:0.48053 validation-auc:0.95437 validation-aucpr:0.96124
[17] validation-logloss:0.47301 validation-auc:0.95430 validation-aucpr:0.96114
[18] validation-logloss:0.46379 validation-auc:0.95513 validation-aucpr:0.96189
[19] validation-logloss:0.45706 validation-auc:0.95482 validation-aucpr:0.96159
[20] validation-logloss:0.44790 validation-auc:0.95561 validation-aucpr:0.96244
[21] validation-logloss:0.44038 validation-auc:0.95607 validation-aucpr:0.96272
[22] validation-logloss:0.43436 validation-auc:0.95596 validation-aucpr:0.96254
[23] validation-logloss:0.42855 validation-auc:0.95596 validation-aucpr:0.96258
[24] validation-logloss:0.42321 validation-auc:0.95581 validation-aucpr:0.96247
{'best_iteration': '21', 'best_score': '0.9627214154090976'}
Trial 28, Fold 5: Log loss = 0.42320609848369695, Average precision = 0.9624687066772707, ROC-AUC = 0.9558063281496758, Elapsed Time = 1.2925592999999935 seconds
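Each fold summary above reports three metrics on the held-out validation split. A minimal sketch of how they are computed with the scikit-learn functions imported at the top of the notebook — the labels and probabilities below are synthetic stand-ins, not the notebook's actual folds:

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

# Synthetic validation labels and predicted probabilities (illustrative only):
# positives get probabilities centred near 0.85, negatives near 0.15.
rng = np.random.default_rng(42)
y_val = rng.integers(0, 2, size=1000)
proba = np.clip(y_val * 0.7 + rng.normal(0.15, 0.2, size=1000), 0.001, 0.999)

lloss = log_loss(y_val, proba)               # lower is better
ap = average_precision_score(y_val, proba)   # area under the PR curve
auc = roc_auc_score(y_val, proba)            # area under the ROC curve
```

The per-fold values are then aggregated across the StratifiedGroupKFold splits to score each Optuna trial.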
Optimization Progress: 29%|##9 | 29/100 [49:06<2:11:52, 111.45s/it]
Trial 29, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 29, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[18:48:01] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[0] validation-logloss:0.64541 validation-auc:0.91169 validation-aucpr:0.91761
[1]	validation-logloss:0.60390	validation-auc:0.93151	validation-aucpr:0.93490
[2]	validation-logloss:0.56258	validation-auc:0.94565	validation-aucpr:0.95033
[3]	validation-logloss:0.53228	validation-auc:0.94971	validation-aucpr:0.95664
[4]	validation-logloss:0.50397	validation-auc:0.95238	validation-aucpr:0.95835
[5]	validation-logloss:0.48025	validation-auc:0.95444	validation-aucpr:0.95879
[6]	validation-logloss:0.45484	validation-auc:0.95602	validation-aucpr:0.96062
[7]	validation-logloss:0.43727	validation-auc:0.95625	validation-aucpr:0.96108
[8]	validation-logloss:0.41788	validation-auc:0.95818	validation-aucpr:0.96269
[9]	validation-logloss:0.40385	validation-auc:0.95779	validation-aucpr:0.96240
[10]	validation-logloss:0.38933	validation-auc:0.95868	validation-aucpr:0.96319
[11]	validation-logloss:0.37717	validation-auc:0.95881	validation-aucpr:0.96220
[12]	validation-logloss:0.36023	validation-auc:0.96053	validation-aucpr:0.96401
[13]	validation-logloss:0.35017	validation-auc:0.96065	validation-aucpr:0.96393
[14]	validation-logloss:0.34203	validation-auc:0.96039	validation-aucpr:0.96372
[15]	validation-logloss:0.33393	validation-auc:0.96071	validation-aucpr:0.96352
[16]	validation-logloss:0.32278	validation-auc:0.96130	validation-aucpr:0.96421
[17]	validation-logloss:0.31714	validation-auc:0.96100	validation-aucpr:0.96305
[18]	validation-logloss:0.31066	validation-auc:0.96119	validation-aucpr:0.96318
[19]	validation-logloss:0.30438	validation-auc:0.96131	validation-aucpr:0.96372
[20]	validation-logloss:0.29466	validation-auc:0.96206	validation-aucpr:0.96483
[21]	validation-logloss:0.28839	validation-auc:0.96270	validation-aucpr:0.96702
[22]	validation-logloss:0.28440	validation-auc:0.96293	validation-aucpr:0.96716
[23]	validation-logloss:0.27877	validation-auc:0.96347	validation-aucpr:0.96743
[24]	validation-logloss:0.27457	validation-auc:0.96363	validation-aucpr:0.96750
[25]	validation-logloss:0.27128	validation-auc:0.96368	validation-aucpr:0.96756
[26]	validation-logloss:0.26828	validation-auc:0.96387	validation-aucpr:0.96876
[27]	validation-logloss:0.26574	validation-auc:0.96382	validation-aucpr:0.96772
[28]	validation-logloss:0.26306	validation-auc:0.96390	validation-aucpr:0.96766
[29]	validation-logloss:0.26011	validation-auc:0.96420	validation-aucpr:0.96892
[30]	validation-logloss:0.25810	validation-auc:0.96391	validation-aucpr:0.96855
[31]	validation-logloss:0.25614	validation-auc:0.96390	validation-aucpr:0.96878
[32]	validation-logloss:0.25481	validation-auc:0.96400	validation-aucpr:0.96956
[33]	validation-logloss:0.25265	validation-auc:0.96420	validation-aucpr:0.96960
[34]	validation-logloss:0.24764	validation-auc:0.96488	validation-aucpr:0.97028
[35]	validation-logloss:0.24315	validation-auc:0.96533	validation-aucpr:0.97079
[36]	validation-logloss:0.24109	validation-auc:0.96570	validation-aucpr:0.97108
[37]	validation-logloss:0.23990	validation-auc:0.96591	validation-aucpr:0.97124
[38]	validation-logloss:0.23887	validation-auc:0.96590	validation-aucpr:0.97122
[39]	validation-logloss:0.23697	validation-auc:0.96600	validation-aucpr:0.97138
[40]	validation-logloss:0.23386	validation-auc:0.96620	validation-aucpr:0.97163
[41]	validation-logloss:0.23076	validation-auc:0.96660	validation-aucpr:0.97203
[42]	validation-logloss:0.22987	validation-auc:0.96673	validation-aucpr:0.97209
[43]	validation-logloss:0.22815	validation-auc:0.96679	validation-aucpr:0.97213
[44]	validation-logloss:0.22617	validation-auc:0.96694	validation-aucpr:0.97227
[45]	validation-logloss:0.22627	validation-auc:0.96680	validation-aucpr:0.97210
[46]	validation-logloss:0.22582	validation-auc:0.96678	validation-aucpr:0.97208
[47]	validation-logloss:0.22572	validation-auc:0.96665	validation-aucpr:0.97185
[48]	validation-logloss:0.22484	validation-auc:0.96671	validation-aucpr:0.97197
[49]	validation-logloss:0.22269	validation-auc:0.96704	validation-aucpr:0.97232
[50]	validation-logloss:0.22300	validation-auc:0.96678	validation-aucpr:0.97210
[51]	validation-logloss:0.22248	validation-auc:0.96691	validation-aucpr:0.97215
[52]	validation-logloss:0.22211	validation-auc:0.96701	validation-aucpr:0.97218
[53]	validation-logloss:0.22179	validation-auc:0.96700	validation-aucpr:0.97213
{'best_iteration': '49', 'best_score': '0.9723227413480227'}
Trial 29, Fold 1: Log loss = 0.2217877447091866, Average precision = 0.972137945326637, ROC-AUC = 0.9670048524779781, Elapsed Time = 6.752285800001118 seconds
Trial 29, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 29, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0]	validation-logloss:0.64490	validation-auc:0.91025	validation-aucpr:0.89715
[1]	validation-logloss:0.59718	validation-auc:0.94638	validation-aucpr:0.94511
[2]	validation-logloss:0.56146	validation-auc:0.95100	validation-aucpr:0.95479
[3]	validation-logloss:0.53075	validation-auc:0.95366	validation-aucpr:0.95701
[4]	validation-logloss:0.50243	validation-auc:0.95516	validation-aucpr:0.95991
[5]	validation-logloss:0.47286	validation-auc:0.95755	validation-aucpr:0.96278
[6]	validation-logloss:0.45099	validation-auc:0.95856	validation-aucpr:0.96356
[7]	validation-logloss:0.43191	validation-auc:0.95870	validation-aucpr:0.96129
[8]	validation-logloss:0.40901	validation-auc:0.96055	validation-aucpr:0.96300
[9]	validation-logloss:0.39460	validation-auc:0.96081	validation-aucpr:0.96304
[10]	validation-logloss:0.38144	validation-auc:0.96124	validation-aucpr:0.96586
[11]	validation-logloss:0.36900	validation-auc:0.96137	validation-aucpr:0.96592
[12]	validation-logloss:0.35892	validation-auc:0.96110	validation-aucpr:0.96563
[13]	validation-logloss:0.34433	validation-auc:0.96190	validation-aucpr:0.96650
[14]	validation-logloss:0.33530	validation-auc:0.96207	validation-aucpr:0.96607
[15]	validation-logloss:0.32704	validation-auc:0.96184	validation-aucpr:0.96605
[16]	validation-logloss:0.31647	validation-auc:0.96227	validation-aucpr:0.96651
[17]	validation-logloss:0.30982	validation-auc:0.96262	validation-aucpr:0.96681
[18]	validation-logloss:0.30323	validation-auc:0.96276	validation-aucpr:0.96685
[19]	validation-logloss:0.29368	validation-auc:0.96338	validation-aucpr:0.96740
[20]	validation-logloss:0.28486	validation-auc:0.96413	validation-aucpr:0.96818
[21]	validation-logloss:0.27980	validation-auc:0.96427	validation-aucpr:0.96827
[22]	validation-logloss:0.27518	validation-auc:0.96453	validation-aucpr:0.96854
[23]	validation-logloss:0.26847	validation-auc:0.96491	validation-aucpr:0.96882
[24]	validation-logloss:0.26388	validation-auc:0.96527	validation-aucpr:0.96918
[25]	validation-logloss:0.26109	validation-auc:0.96506	validation-aucpr:0.96895
[26]	validation-logloss:0.25792	validation-auc:0.96526	validation-aucpr:0.96906
[27]	validation-logloss:0.25419	validation-auc:0.96561	validation-aucpr:0.96941
[28]	validation-logloss:0.24922	validation-auc:0.96584	validation-aucpr:0.96966
[29]	validation-logloss:0.24651	validation-auc:0.96613	validation-aucpr:0.96981
[30]	validation-logloss:0.24519	validation-auc:0.96592	validation-aucpr:0.96970
[31]	validation-logloss:0.24185	validation-auc:0.96668	validation-aucpr:0.97060
[32]	validation-logloss:0.23883	validation-auc:0.96671	validation-aucpr:0.97075
[33]	validation-logloss:0.23528	validation-auc:0.96686	validation-aucpr:0.97077
[34]	validation-logloss:0.23331	validation-auc:0.96702	validation-aucpr:0.97079
[35]	validation-logloss:0.22974	validation-auc:0.96736	validation-aucpr:0.97121
[36]	validation-logloss:0.22777	validation-auc:0.96776	validation-aucpr:0.97112
[37]	validation-logloss:0.22637	validation-auc:0.96798	validation-aucpr:0.97108
[38]	validation-logloss:0.22523	validation-auc:0.96796	validation-aucpr:0.97093
[39]	validation-logloss:0.22371	validation-auc:0.96822	validation-aucpr:0.97094
[40]	validation-logloss:0.22283	validation-auc:0.96808	validation-aucpr:0.97074
[41]	validation-logloss:0.22280	validation-auc:0.96778	validation-aucpr:0.97048
[42]	validation-logloss:0.22129	validation-auc:0.96768	validation-aucpr:0.97051
[43]	validation-logloss:0.21993	validation-auc:0.96750	validation-aucpr:0.97077
[44]	validation-logloss:0.21782	validation-auc:0.96776	validation-aucpr:0.97092
[45]	validation-logloss:0.21496	validation-auc:0.96826	validation-aucpr:0.97128
[46]	validation-logloss:0.21431	validation-auc:0.96839	validation-aucpr:0.97142
[47]	validation-logloss:0.21447	validation-auc:0.96830	validation-aucpr:0.97150
[48]	validation-logloss:0.21373	validation-auc:0.96839	validation-aucpr:0.97154
[49]	validation-logloss:0.21232	validation-auc:0.96859	validation-aucpr:0.97166
[50]	validation-logloss:0.21266	validation-auc:0.96828	validation-aucpr:0.97137
[51]	validation-logloss:0.21169	validation-auc:0.96818	validation-aucpr:0.97115
[52]	validation-logloss:0.21162	validation-auc:0.96829	validation-aucpr:0.97087
[53]	validation-logloss:0.21039	validation-auc:0.96847	validation-aucpr:0.97084
{'best_iteration': '49', 'best_score': '0.9716618943586721'}
Trial 29, Fold 2: Log loss = 0.21039004558810057, Average precision = 0.970847171830053, ROC-AUC = 0.968472955964917, Elapsed Time = 6.9477287999998225 seconds
Trial 29, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 29, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[per-iteration log, iterations 0-53: validation-logloss 0.64386 -> 0.22047, validation-auc 0.92319 -> 0.96717, validation-aucpr 0.92384 -> 0.97147; repeated XGBoost INFO lines omitted]
{'best_iteration': '53', 'best_score': '0.9714691673218635'}
Trial 29, Fold 3: Log loss = 0.220469281968947, Average precision = 0.9714746681701518, ROC-AUC = 0.9671723160713082, Elapsed Time = 7.089355300000534 seconds
Trial 29, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 29, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[per-iteration log, iterations 0-53: validation-logloss 0.64628 -> 0.21202, validation-auc 0.89556 -> 0.96815, validation-aucpr 0.91012 -> 0.97323; repeated XGBoost INFO lines omitted]
{'best_iteration': '50', 'best_score': '0.9736780566332014'}
Trial 29, Fold 4: Log loss = 0.21202301627995604, Average precision = 0.9732314089812211, ROC-AUC = 0.968145179557978, Elapsed Time = 9.093965699999899 seconds
Trial 29, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 29, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[per-iteration log, iterations 0-19 (log continues): validation-logloss 0.64529 -> 0.31259, validation-auc 0.90132 -> 0.95757, validation-aucpr 0.91196 -> 0.96298; repeated XGBoost INFO lines omitted]
[20] validation-logloss:0.30643 validation-auc:0.95817 validation-aucpr:0.96298
[18:48:35] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[21] validation-logloss:0.30059 validation-auc:0.95866 validation-aucpr:0.96323
[18:48:35] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[22] validation-logloss:0.29786 validation-auc:0.95810 validation-aucpr:0.96202
[18:48:35] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[23] validation-logloss:0.28972 validation-auc:0.95917 validation-aucpr:0.96395
[18:48:35] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[24] validation-logloss:0.28543 validation-auc:0.95940 validation-aucpr:0.96381
[18:48:35] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[25] validation-logloss:0.28207 validation-auc:0.95944 validation-aucpr:0.96413
[18:48:36] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[26] validation-logloss:0.27893 validation-auc:0.95961 validation-aucpr:0.96416
[18:48:36] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[27] validation-logloss:0.27676 validation-auc:0.95941 validation-aucpr:0.96401
[18:48:36] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[28] validation-logloss:0.27162 validation-auc:0.95997 validation-aucpr:0.96443
[18:48:36] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[29] validation-logloss:0.26585 validation-auc:0.96053 validation-aucpr:0.96585
[18:48:36] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[30] validation-logloss:0.26275 validation-auc:0.96089 validation-aucpr:0.96607
[18:48:36] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[31] validation-logloss:0.26138 validation-auc:0.96084 validation-aucpr:0.96527
[18:48:37] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[32] validation-logloss:0.25685 validation-auc:0.96118 validation-aucpr:0.96511
[18:48:37] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[33] validation-logloss:0.25283 validation-auc:0.96169 validation-aucpr:0.96662
[18:48:37] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[34] validation-logloss:0.24905 validation-auc:0.96202 validation-aucpr:0.96667
[18:48:37] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[35] validation-logloss:0.24578 validation-auc:0.96216 validation-aucpr:0.96671
[18:48:37] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[36] validation-logloss:0.24385 validation-auc:0.96239 validation-aucpr:0.96697
[18:48:38] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[37] validation-logloss:0.24066 validation-auc:0.96285 validation-aucpr:0.96735
[18:48:38] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[38] validation-logloss:0.23951 validation-auc:0.96292 validation-aucpr:0.96729
[18:48:38] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[39] validation-logloss:0.23658 validation-auc:0.96348 validation-aucpr:0.96777
[18:48:38] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[40] validation-logloss:0.23554 validation-auc:0.96363 validation-aucpr:0.96805
[18:48:38] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[41] validation-logloss:0.23492 validation-auc:0.96364 validation-aucpr:0.96807
[18:48:39] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[42] validation-logloss:0.23292 validation-auc:0.96398 validation-aucpr:0.96795
[18:48:39] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[43] validation-logloss:0.23274 validation-auc:0.96380 validation-aucpr:0.96787
[18:48:39] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[44] validation-logloss:0.23152 validation-auc:0.96393 validation-aucpr:0.96792
[18:48:39] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[45] validation-logloss:0.23065 validation-auc:0.96412 validation-aucpr:0.96807
[18:48:39] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[46] validation-logloss:0.23042 validation-auc:0.96413 validation-aucpr:0.96830
[18:48:40] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[47] validation-logloss:0.22993 validation-auc:0.96429 validation-aucpr:0.96835
[18:48:40] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[48] validation-logloss:0.22977 validation-auc:0.96417 validation-aucpr:0.96822
[18:48:40] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[49] validation-logloss:0.22903 validation-auc:0.96428 validation-aucpr:0.96845
[18:48:41] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[50] validation-logloss:0.22735 validation-auc:0.96465 validation-aucpr:0.96932
[18:48:41] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[51] validation-logloss:0.22616 validation-auc:0.96502 validation-aucpr:0.96959
[18:48:41] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[52] validation-logloss:0.22491 validation-auc:0.96514 validation-aucpr:0.96973
[18:48:41] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[53] validation-logloss:0.22447 validation-auc:0.96525 validation-aucpr:0.96987
{'best_iteration': '53', 'best_score': '0.969869479613832'}
Trial 29, Fold 5: Log loss = 0.2244748179461408, Average precision = 0.9698752741540864, ROC-AUC = 0.9652471248351076, Elapsed Time = 9.2844757999992 seconds
Optimization Progress: 30%|### | 30/100 [49:53<1:47:34, 92.21s/it]
Trial 30, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 30, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.68429 validation-auc:0.94592 validation-aucpr:0.95150
[1] validation-logloss:0.67552 validation-auc:0.93777 validation-aucpr:0.94722
[2] validation-logloss:0.66675 validation-auc:0.94673 validation-aucpr:0.94831
[3] validation-logloss:0.65861 validation-auc:0.95024 validation-aucpr:0.95688
[4] validation-logloss:0.65139 validation-auc:0.95150 validation-aucpr:0.95454
[5] validation-logloss:0.64332 validation-auc:0.95423 validation-aucpr:0.96099
[6] validation-logloss:0.63530 validation-auc:0.95723 validation-aucpr:0.96523
[7] validation-logloss:0.62740 validation-auc:0.95863 validation-aucpr:0.96553
[8] validation-logloss:0.61993 validation-auc:0.95944 validation-aucpr:0.96701
[9] validation-logloss:0.61339 validation-auc:0.95979 validation-aucpr:0.96727
[10] validation-logloss:0.60710 validation-auc:0.96040 validation-aucpr:0.96584
[11] validation-logloss:0.60038 validation-auc:0.96130 validation-aucpr:0.96669
[12] validation-logloss:0.59437 validation-auc:0.96237 validation-aucpr:0.96906
[13] validation-logloss:0.58734 validation-auc:0.96312 validation-aucpr:0.96844
[14] validation-logloss:0.58055 validation-auc:0.96386 validation-aucpr:0.96908
[15] validation-logloss:0.57401 validation-auc:0.96429 validation-aucpr:0.96993
[16] validation-logloss:0.56841 validation-auc:0.96440 validation-aucpr:0.97014
[17] validation-logloss:0.56276 validation-auc:0.96505 validation-aucpr:0.97077
[18] validation-logloss:0.55744 validation-auc:0.96532 validation-aucpr:0.97094
[19] validation-logloss:0.55132 validation-auc:0.96545 validation-aucpr:0.97093
[20] validation-logloss:0.54604 validation-auc:0.96553 validation-aucpr:0.97092
[21] validation-logloss:0.54070 validation-auc:0.96584 validation-aucpr:0.97118
[22] validation-logloss:0.53573 validation-auc:0.96563 validation-aucpr:0.97098
[23] validation-logloss:0.53025 validation-auc:0.96542 validation-aucpr:0.97088
[24] validation-logloss:0.52468 validation-auc:0.96565 validation-aucpr:0.97105
[25] validation-logloss:0.51943 validation-auc:0.96566 validation-aucpr:0.97112
[26] validation-logloss:0.51424 validation-auc:0.96584 validation-aucpr:0.97123
[27] validation-logloss:0.50921 validation-auc:0.96563 validation-aucpr:0.97109
[28] validation-logloss:0.50410 validation-auc:0.96567 validation-aucpr:0.97114
[29] validation-logloss:0.50007 validation-auc:0.96536 validation-aucpr:0.97089
[30] validation-logloss:0.49528 validation-auc:0.96537 validation-aucpr:0.97086
[31] validation-logloss:0.49038 validation-auc:0.96559 validation-aucpr:0.97102
[32] validation-logloss:0.48545 validation-auc:0.96580 validation-aucpr:0.97126
[33] validation-logloss:0.48088 validation-auc:0.96576 validation-aucpr:0.97130
[34] validation-logloss:0.47632 validation-auc:0.96618 validation-aucpr:0.97157
[35] validation-logloss:0.47214 validation-auc:0.96644 validation-aucpr:0.97172
[36] validation-logloss:0.46841 validation-auc:0.96627 validation-aucpr:0.97160
[37] validation-logloss:0.46417 validation-auc:0.96637 validation-aucpr:0.97169
[38] validation-logloss:0.46038 validation-auc:0.96639 validation-aucpr:0.97168
[39] validation-logloss:0.45608 validation-auc:0.96650 validation-aucpr:0.97175
[40] validation-logloss:0.45191 validation-auc:0.96643 validation-aucpr:0.97156
[41] validation-logloss:0.44851 validation-auc:0.96650 validation-aucpr:0.97159
[42] validation-logloss:0.44494 validation-auc:0.96667 validation-aucpr:0.97172
[43] validation-logloss:0.44144 validation-auc:0.96668 validation-aucpr:0.97178
[44] validation-logloss:0.43765 validation-auc:0.96676 validation-aucpr:0.97183
[45] validation-logloss:0.43384 validation-auc:0.96688 validation-aucpr:0.97194
[46] validation-logloss:0.43084 validation-auc:0.96702 validation-aucpr:0.97192
[47] validation-logloss:0.42791 validation-auc:0.96685 validation-aucpr:0.97178
[48] validation-logloss:0.42500 validation-auc:0.96676 validation-aucpr:0.97166
[49] validation-logloss:0.42140 validation-auc:0.96680 validation-aucpr:0.97169
[50] validation-logloss:0.41782 validation-auc:0.96689 validation-aucpr:0.97177
[51] validation-logloss:0.41481 validation-auc:0.96687 validation-aucpr:0.97197
[52] validation-logloss:0.41186 validation-auc:0.96690 validation-aucpr:0.97195
[53] validation-logloss:0.40913 validation-auc:0.96685 validation-aucpr:0.97190
[54] validation-logloss:0.40636 validation-auc:0.96678 validation-aucpr:0.97186
[55] validation-logloss:0.40372 validation-auc:0.96691 validation-aucpr:0.97204
[56] validation-logloss:0.40052 validation-auc:0.96683 validation-aucpr:0.97201
[57] validation-logloss:0.39801 validation-auc:0.96671 validation-aucpr:0.97183
[58] validation-logloss:0.39534 validation-auc:0.96668 validation-aucpr:0.97181
[59] validation-logloss:0.39227 validation-auc:0.96675 validation-aucpr:0.97157
[60] validation-logloss:0.38929 validation-auc:0.96680 validation-aucpr:0.97163
[61] validation-logloss:0.38617 validation-auc:0.96697 validation-aucpr:0.97180
[62] validation-logloss:0.38322 validation-auc:0.96704 validation-aucpr:0.97185
[63] validation-logloss:0.38056 validation-auc:0.96701 validation-aucpr:0.97182
[64] validation-logloss:0.37818 validation-auc:0.96702 validation-aucpr:0.97170
[65] validation-logloss:0.37542 validation-auc:0.96715 validation-aucpr:0.97181
[66] validation-logloss:0.37280 validation-auc:0.96710 validation-aucpr:0.97180
[67] validation-logloss:0.37019 validation-auc:0.96713 validation-aucpr:0.97179
[68] validation-logloss:0.36763 validation-auc:0.96709 validation-aucpr:0.97181
[69] validation-logloss:0.36545 validation-auc:0.96710 validation-aucpr:0.97183
[70] validation-logloss:0.36288 validation-auc:0.96709 validation-aucpr:0.97185
[71] validation-logloss:0.36042 validation-auc:0.96709 validation-aucpr:0.97182
[72] validation-logloss:0.35796 validation-auc:0.96718 validation-aucpr:0.97188
[73] validation-logloss:0.35570 validation-auc:0.96737 validation-aucpr:0.97197
[74] validation-logloss:0.35379 validation-auc:0.96736 validation-aucpr:0.97193
[75] validation-logloss:0.35178 validation-auc:0.96737 validation-aucpr:0.97193
[76] validation-logloss:0.34978 validation-auc:0.96735 validation-aucpr:0.97190
[77] validation-logloss:0.34744 validation-auc:0.96738 validation-aucpr:0.97193
[78] validation-logloss:0.34553 validation-auc:0.96737 validation-aucpr:0.97192
[79] validation-logloss:0.34330 validation-auc:0.96739 validation-aucpr:0.97194
[80] validation-logloss:0.34108 validation-auc:0.96743 validation-aucpr:0.97198
[81] validation-logloss:0.33942 validation-auc:0.96744 validation-aucpr:0.97189
[82] validation-logloss:0.33732 validation-auc:0.96746 validation-aucpr:0.97191
[83] validation-logloss:0.33519 validation-auc:0.96746 validation-aucpr:0.97195
[84] validation-logloss:0.33347 validation-auc:0.96745 validation-aucpr:0.97195
[85] validation-logloss:0.33180 validation-auc:0.96749 validation-aucpr:0.97199
[86] validation-logloss:0.33005 validation-auc:0.96758 validation-aucpr:0.97196
[87] validation-logloss:0.32842 validation-auc:0.96757 validation-aucpr:0.97191
[88] validation-logloss:0.32691 validation-auc:0.96755 validation-aucpr:0.97186
[89] validation-logloss:0.32524 validation-auc:0.96766 validation-aucpr:0.97191
[90] validation-logloss:0.32311 validation-auc:0.96779 validation-aucpr:0.97203
[91] validation-logloss:0.32119 validation-auc:0.96784 validation-aucpr:0.97207
[92] validation-logloss:0.31967 validation-auc:0.96783 validation-aucpr:0.97206
[93] validation-logloss:0.31793 validation-auc:0.96785 validation-aucpr:0.97211
{'best_iteration': '93', 'best_score': '0.9721051971930318'}
Trial 30, Fold 1: Log loss = 0.31792534158671387, Average precision = 0.9721165311715719, ROC-AUC = 0.9678542751035975, Elapsed Time = 2.6847978999994666 seconds
Trial 30, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 30, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.68365 validation-auc:0.94604 validation-aucpr:0.95049
[1] validation-logloss:0.67572 validation-auc:0.94530 validation-aucpr:0.94160
[2] validation-logloss:0.66691 validation-auc:0.95222 validation-aucpr:0.94586
[3] validation-logloss:0.65860 validation-auc:0.95589 validation-aucpr:0.94929
[4] validation-logloss:0.65023 validation-auc:0.95917 validation-aucpr:0.95637
[5] validation-logloss:0.64297 validation-auc:0.96098 validation-aucpr:0.96079
[6] validation-logloss:0.63576 validation-auc:0.96114 validation-aucpr:0.95928
[7] validation-logloss:0.62904 validation-auc:0.96144 validation-aucpr:0.96299
[8] validation-logloss:0.62169 validation-auc:0.96206 validation-aucpr:0.96346
[9] validation-logloss:0.61536 validation-auc:0.96258 validation-aucpr:0.96540
[10] validation-logloss:0.60804 validation-auc:0.96356 validation-aucpr:0.96631
[11] validation-logloss:0.60181 validation-auc:0.96407 validation-aucpr:0.96660
[12] validation-logloss:0.59564 validation-auc:0.96426 validation-aucpr:0.96675
[13] validation-logloss:0.58906 validation-auc:0.96467 validation-aucpr:0.96698
[14] validation-logloss:0.58317 validation-auc:0.96439 validation-aucpr:0.96662
[15] validation-logloss:0.57749 validation-auc:0.96410 validation-aucpr:0.96610
[16] validation-logloss:0.57090 validation-auc:0.96498 validation-aucpr:0.96923
[17] validation-logloss:0.56455 validation-auc:0.96538 validation-aucpr:0.96953
[18] validation-logloss:0.55918 validation-auc:0.96536 validation-aucpr:0.96948
[19] validation-logloss:0.55271 validation-auc:0.96597 validation-aucpr:0.97018
[20] validation-logloss:0.54766 validation-auc:0.96568 validation-aucpr:0.96987
[21] validation-logloss:0.54268 validation-auc:0.96579 validation-aucpr:0.96983
[22] validation-logloss:0.53673 validation-auc:0.96638 validation-aucpr:0.97042
[23] validation-logloss:0.53114 validation-auc:0.96660 validation-aucpr:0.97059
[24] validation-logloss:0.52581 validation-auc:0.96666 validation-aucpr:0.97070
[25] validation-logloss:0.52097 validation-auc:0.96651 validation-aucpr:0.97060
[26] validation-logloss:0.51576 validation-auc:0.96663 validation-aucpr:0.97071
[27] validation-logloss:0.51054 validation-auc:0.96691 validation-aucpr:0.97095
[28] validation-logloss:0.50583 validation-auc:0.96709 validation-aucpr:0.97109
[29] validation-logloss:0.50127 validation-auc:0.96707 validation-aucpr:0.97107
[30] validation-logloss:0.49647 validation-auc:0.96730 validation-aucpr:0.97126
[31] validation-logloss:0.49155 validation-auc:0.96760 validation-aucpr:0.97151
[32] validation-logloss:0.48744 validation-auc:0.96764 validation-aucpr:0.97146
[33] validation-logloss:0.48312 validation-auc:0.96791 validation-aucpr:0.97167
[34] validation-logloss:0.47846 validation-auc:0.96789 validation-aucpr:0.97166
[35] validation-logloss:0.47445 validation-auc:0.96760 validation-aucpr:0.97146
[36] validation-logloss:0.47048 validation-auc:0.96763 validation-aucpr:0.97155
[37] validation-logloss:0.46636 validation-auc:0.96774 validation-aucpr:0.97165
[38] validation-logloss:0.46207 validation-auc:0.96798 validation-aucpr:0.97181
[39] validation-logloss:0.45823 validation-auc:0.96821 validation-aucpr:0.97197
[40] validation-logloss:0.45420 validation-auc:0.96827 validation-aucpr:0.97206
[41] validation-logloss:0.45065 validation-auc:0.96828 validation-aucpr:0.97205
[42] validation-logloss:0.44746 validation-auc:0.96802 validation-aucpr:0.97183
[43] validation-logloss:0.44361 validation-auc:0.96794 validation-aucpr:0.97174
[44] validation-logloss:0.44031 validation-auc:0.96803 validation-aucpr:0.97174
[45] validation-logloss:0.43642 validation-auc:0.96826 validation-aucpr:0.97190
[46] validation-logloss:0.43256 validation-auc:0.96825 validation-aucpr:0.97194
[47] validation-logloss:0.42921 validation-auc:0.96842 validation-aucpr:0.97204
[48] validation-logloss:0.42610 validation-auc:0.96845 validation-aucpr:0.97206
[49] validation-logloss:0.42307 validation-auc:0.96852 validation-aucpr:0.97241
[50] validation-logloss:0.41996 validation-auc:0.96847 validation-aucpr:0.97237
[51] validation-logloss:0.41643 validation-auc:0.96855 validation-aucpr:0.97247
[52] validation-logloss:0.41358 validation-auc:0.96848 validation-aucpr:0.97239
[53] validation-logloss:0.41055 validation-auc:0.96855 validation-aucpr:0.97241
[54] validation-logloss:0.40719 validation-auc:0.96855 validation-aucpr:0.97244
[55] validation-logloss:0.40378 validation-auc:0.96861 validation-aucpr:0.97252
[56] validation-logloss:0.40052 validation-auc:0.96877 validation-aucpr:0.97266
[57] validation-logloss:0.39732 validation-auc:0.96879 validation-aucpr:0.97269
[58] validation-logloss:0.39411 validation-auc:0.96880 validation-aucpr:0.97270
[59] validation-logloss:0.39130 validation-auc:0.96902 validation-aucpr:0.97284
[60] validation-logloss:0.38876 validation-auc:0.96907 validation-aucpr:0.97287
[61] validation-logloss:0.38592 validation-auc:0.96904 validation-aucpr:0.97286
[62] validation-logloss:0.38348 validation-auc:0.96902 validation-aucpr:0.97282
[63] validation-logloss:0.38096 validation-auc:0.96905 validation-aucpr:0.97285
[64] validation-logloss:0.37865 validation-auc:0.96914 validation-aucpr:0.97291
[65] validation-logloss:0.37629 validation-auc:0.96906 validation-aucpr:0.97285
[66] validation-logloss:0.37340 validation-auc:0.96917 validation-aucpr:0.97294
[67] validation-logloss:0.37049 validation-auc:0.96926 validation-aucpr:0.97303
[68] validation-logloss:0.36816 validation-auc:0.96932 validation-aucpr:0.97309
[69] validation-logloss:0.36543 validation-auc:0.96931 validation-aucpr:0.97308
[70] validation-logloss:0.36335 validation-auc:0.96927 validation-aucpr:0.97306
[71] validation-logloss:0.36117 validation-auc:0.96936 validation-aucpr:0.97312
[72] validation-logloss:0.35853 validation-auc:0.96937 validation-aucpr:0.97314
[73] validation-logloss:0.35584 validation-auc:0.96960 validation-aucpr:0.97333
[74] validation-logloss:0.35337 validation-auc:0.96967 validation-aucpr:0.97337
[75] validation-logloss:0.35132 validation-auc:0.96964 validation-aucpr:0.97335
[76] validation-logloss:0.34943 validation-auc:0.96954 validation-aucpr:0.97326
[77] validation-logloss:0.34704 validation-auc:0.96961 validation-aucpr:0.97331
[78] validation-logloss:0.34465 validation-auc:0.96956 validation-aucpr:0.97328
[79] validation-logloss:0.34274 validation-auc:0.96949 validation-aucpr:0.97323
[80] validation-logloss:0.34039 validation-auc:0.96957 validation-aucpr:0.97330
[81] validation-logloss:0.33814 validation-auc:0.96965 validation-aucpr:0.97338
[82] validation-logloss:0.33642 validation-auc:0.96960 validation-aucpr:0.97334
[83] validation-logloss:0.33423 validation-auc:0.96971 validation-aucpr:0.97343
[84] validation-logloss:0.33225 validation-auc:0.96965 validation-aucpr:0.97339
[85] validation-logloss:0.33006 validation-auc:0.96971 validation-aucpr:0.97346
[86] validation-logloss:0.32813 validation-auc:0.96970 validation-aucpr:0.97345
[87] validation-logloss:0.32600 validation-auc:0.96969 validation-aucpr:0.97346
[88] validation-logloss:0.32441 validation-auc:0.96962 validation-aucpr:0.97341
[89] validation-logloss:0.32238 validation-auc:0.96966 validation-aucpr:0.97344
[90] validation-logloss:0.32048 validation-auc:0.96966 validation-aucpr:0.97341
[91] validation-logloss:0.31844 validation-auc:0.96977 validation-aucpr:0.97348
[92] validation-logloss:0.31664 validation-auc:0.96981 validation-aucpr:0.97352
[93] validation-logloss:0.31517 validation-auc:0.96981 validation-aucpr:0.97351
{'best_iteration': '92', 'best_score': '0.9735212751789194'}
Trial 30, Fold 2: Log loss = 0.3151728598347162, Average precision = 0.97351458863413, ROC-AUC = 0.9698109920632872, Elapsed Time = 2.9415685000003577 seconds
Trial 30, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 30, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.68408 validation-auc:0.94187 validation-aucpr:0.94867
[1] validation-logloss:0.67524 validation-auc:0.94617 validation-aucpr:0.94137
[2] validation-logloss:0.66632 validation-auc:0.95575 validation-aucpr:0.95035
[3] validation-logloss:0.65793 validation-auc:0.96003 validation-aucpr:0.96118
[4] validation-logloss:0.64957 validation-auc:0.96238 validation-aucpr:0.96455
[5] validation-logloss:0.64207 validation-auc:0.96387 validation-aucpr:0.96392
[6] validation-logloss:0.63507 validation-auc:0.96440 validation-aucpr:0.96713
[7] validation-logloss:0.62749 validation-auc:0.96389 validation-aucpr:0.96640
[8] validation-logloss:0.62102 validation-auc:0.96349 validation-aucpr:0.96595
[9] validation-logloss:0.61401 validation-auc:0.96381 validation-aucpr:0.96586
[10] validation-logloss:0.60659 validation-auc:0.96456 validation-aucpr:0.96801
[11] validation-logloss:0.59936 validation-auc:0.96502 validation-aucpr:0.96831
[12] validation-logloss:0.59221 validation-auc:0.96563 validation-aucpr:0.97013
[13] validation-logloss:0.58612 validation-auc:0.96543 validation-aucpr:0.97010
[14] validation-logloss:0.57933 validation-auc:0.96590 validation-aucpr:0.97037
[15] validation-logloss:0.57365 validation-auc:0.96584 validation-aucpr:0.97090
[16] validation-logloss:0.56810 validation-auc:0.96577 validation-aucpr:0.97072
[17] validation-logloss:0.56177 validation-auc:0.96637 validation-aucpr:0.97125
[18] validation-logloss:0.55568 validation-auc:0.96698 validation-aucpr:0.97236
[19] validation-logloss:0.54971 validation-auc:0.96694 validation-aucpr:0.97227
[20] validation-logloss:0.54382 validation-auc:0.96721 validation-aucpr:0.97253
[21] validation-logloss:0.53799 validation-auc:0.96767 validation-aucpr:0.97289
[22] validation-logloss:0.53244 validation-auc:0.96710 validation-aucpr:0.97250
[23] validation-logloss:0.52740 validation-auc:0.96719 validation-aucpr:0.97251
[24] validation-logloss:0.52207 validation-auc:0.96704 validation-aucpr:0.97234
[25] validation-logloss:0.51728 validation-auc:0.96701 validation-aucpr:0.97233
[26] validation-logloss:0.51189 validation-auc:0.96722 validation-aucpr:0.97246
[27] validation-logloss:0.50708 validation-auc:0.96742 validation-aucpr:0.97263
[28] validation-logloss:0.50200 validation-auc:0.96738 validation-aucpr:0.97210
[29] validation-logloss:0.49688 validation-auc:0.96758 validation-aucpr:0.97225
[30] validation-logloss:0.49187 validation-auc:0.96786 validation-aucpr:0.97251
[31] validation-logloss:0.48776 validation-auc:0.96775 validation-aucpr:0.97230
[32] validation-logloss:0.48288 validation-auc:0.96808 validation-aucpr:0.97260
[33] validation-logloss:0.47830 validation-auc:0.96819 validation-aucpr:0.97268
[34] validation-logloss:0.47379 validation-auc:0.96802 validation-aucpr:0.97254
[35] validation-logloss:0.46974 validation-auc:0.96815 validation-aucpr:0.97261
[36] validation-logloss:0.46537 validation-auc:0.96806 validation-aucpr:0.97118
[37] validation-logloss:0.46151 validation-auc:0.96815 validation-aucpr:0.97130
[38] validation-logloss:0.45769 validation-auc:0.96810 validation-aucpr:0.97128
[per-round eval log condensed: rounds 39-71, validation-logloss 0.45390 -> 0.35635, validation-auc 0.96821 -> 0.96926, validation-aucpr 0.97132 -> 0.97251]
{'best_iteration': '21', 'best_score': '0.9728873063164621'}
Trial 30, Fold 3: Log loss = 0.35634576391738043, Average precision = 0.9725254290879971, ROC-AUC = 0.9692625516464255, Elapsed Time = 2.331017199998314 seconds
Trial 30, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 30, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[per-round eval log condensed: rounds 0-93, validation-logloss 0.68408 -> 0.32051, validation-auc 0.93053 -> 0.96739, validation-aucpr 0.92858 -> 0.97268]
{'best_iteration': '92', 'best_score': '0.9727730022275809'}
Trial 30, Fold 4: Log loss = 0.3205108791725433, Average precision = 0.9726860324319256, ROC-AUC = 0.9673858195574059, Elapsed Time = 2.7896488000005775 seconds
Trial 30, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 30, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[per-round eval log condensed: rounds 0-93, validation-logloss 0.68389 -> 0.31976, validation-auc 0.94408 -> 0.96669, validation-aucpr 0.95018 -> 0.97140]
{'best_iteration': '70', 'best_score': '0.9715477136630459'}
Trial 30, Fold 5: Log loss = 0.3197561209559464, Average precision = 0.9714092461106858, ROC-AUC = 0.9666948330639318, Elapsed Time = 2.8984010000003764 seconds
Optimization Progress: 31%|###1 | 31/100 [50:17<1:22:15, 71.52s/it]
Trial 31, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 31, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[per-round eval log condensed: rounds 0-42, validation-logloss 0.66183 -> 0.24514, validation-auc 0.95646 -> 0.97099, validation-aucpr 0.95706 -> 0.97567]
{'best_iteration': '42', 'best_score': '0.975667931459085'}
Trial 31, Fold 1: Log loss = 0.24514445785594513, Average precision = 0.9756598868388844, ROC-AUC = 0.9709922248032004, Elapsed Time = 1.0101247000002331 seconds
Trial 31, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 31, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[per-round eval log condensed: rounds 0-42, validation-logloss 0.66193 -> 0.24319, validation-auc 0.95879 -> 0.97103, validation-aucpr 0.95950 -> 0.97428]
{'best_iteration': '42', 'best_score': '0.9742822625501332'}
Trial 31, Fold 2: Log loss = 0.2431944995877271, Average precision = 0.9742661387249827, ROC-AUC = 0.9710278575549607, Elapsed Time = 1.205074200001036 seconds
Trial 31, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 31, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[per-round eval log condensed: rounds 0-42, validation-logloss 0.66155 -> 0.24382, validation-auc 0.96269 -> 0.97133, validation-aucpr 0.96502 -> 0.97495]
{'best_iteration': '39', 'best_score': '0.9749729169827692'}
Trial 31, Fold 3: Log loss = 0.24381864069012466, Average precision = 0.9749243711198644, ROC-AUC = 0.9713275057545434, Elapsed Time = 1.1850333000002138 seconds
Trial 31, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 31, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[per-round eval log condensed: rounds 0-42, validation-logloss 0.66180 -> 0.24301, validation-auc 0.96164 -> 0.97172, validation-aucpr 0.96373 -> 0.97602]
{'best_iteration': '42', 'best_score': '0.9760215168892972'}
Trial 31, Fold 4: Log loss = 0.24300942956000274, Average precision = 0.9760034541082719, ROC-AUC = 0.9717249237584973, Elapsed Time = 1.2454550999991625 seconds
Trial 31, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 31, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.66161 validation-auc:0.96173 validation-aucpr:0.96283
...  (iterations [1]–[41] elided)
[42] validation-logloss:0.24800 validation-auc:0.97108 validation-aucpr:0.97485
{'best_iteration': '42', 'best_score': '0.974853958215763'}
Trial 31, Fold 5: Log loss = 0.24799681727472822, Average precision = 0.9748078353358505, ROC-AUC = 0.9710814335878715, Elapsed Time = 1.209885000000213 seconds
Optimization Progress: 32%|###2 | 32/100 [50:30<1:01:27, 54.22s/it]
Trial 32, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 32, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.67263 validation-auc:0.94340 validation-aucpr:0.95202
...  (iterations [1]–[96] elided)
[97] validation-logloss:0.22077 validation-auc:0.96801 validation-aucpr:0.97346
{'best_iteration': '97', 'best_score': '0.9734587937630889'}
Trial 32, Fold 1: Log loss = 0.22077211741617622, Average precision = 0.973463242617084, ROC-AUC = 0.9680081181323278, Elapsed Time = 1103.3959042000006 seconds
Trial 32, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 32, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.67236 validation-auc:0.94406 validation-aucpr:0.94290
...  (iterations [1]–[91] elided)
[92] validation-logloss:0.21493 validation-auc:0.97110 validation-aucpr:0.97485
...  (iterations [93]–[96] elided)
[97] validation-logloss:0.21159 validation-auc:0.97079 validation-aucpr:0.97462
{'best_iteration': '92', 'best_score': '0.9748518836757191'}
Trial 32, Fold 2: Log loss = 0.21159174969704342, Average precision = 0.9746231769873027, ROC-AUC = 0.9707887030450622, Elapsed Time = 1091.1235703999992 seconds
Trial 32, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 32, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.67319 validation-auc:0.93892 validation-aucpr:0.94781
...  (iterations [1]–[93] elided)
[94] validation-logloss:0.21419 validation-auc:0.97154 validation-aucpr:0.97541
...  (iterations [95]–[96] elided)
[97] validation-logloss:0.21248 validation-auc:0.97140 validation-aucpr:0.97527
{'best_iteration': '94', 'best_score': '0.9754120472963778'}
Trial 32, Fold 3: Log loss = 0.21247699408509904, Average precision = 0.9752734811271323, ROC-AUC = 0.9714012433666246, Elapsed Time = 1086.049289999999 seconds
Trial 32, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 32, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.67231 validation-auc:0.94524 validation-aucpr:0.94865
...  (iterations [1]–[64] elided)
[65] validation-logloss:0.25132 validation-auc:0.96765 validation-aucpr:0.97302
[66] validation-logloss:0.24974 validation-auc:0.96765 validation-aucpr:0.97304
[67] validation-logloss:0.24811 validation-auc:0.96772 validation-aucpr:0.97308
[68] validation-logloss:0.24646 validation-auc:0.96775 validation-aucpr:0.97311
[69] validation-logloss:0.24493 validation-auc:0.96781 validation-aucpr:0.97315
[70] validation-logloss:0.24330 validation-auc:0.96787 validation-aucpr:0.97318
[71] validation-logloss:0.24159 validation-auc:0.96793 validation-aucpr:0.97327
[72] validation-logloss:0.24032 validation-auc:0.96800 validation-aucpr:0.97331
[73] validation-logloss:0.23899 validation-auc:0.96813 validation-aucpr:0.97342
[74] validation-logloss:0.23771 validation-auc:0.96810 validation-aucpr:0.97340
[75] validation-logloss:0.23639 validation-auc:0.96811 validation-aucpr:0.97340
[76] validation-logloss:0.23500 validation-auc:0.96818 validation-aucpr:0.97345
[77] validation-logloss:0.23383 validation-auc:0.96824 validation-aucpr:0.97349
[78] validation-logloss:0.23314 validation-auc:0.96813 validation-aucpr:0.97340
[79] validation-logloss:0.23194 validation-auc:0.96818 validation-aucpr:0.97346
[80] validation-logloss:0.23113 validation-auc:0.96816 validation-aucpr:0.97344
[81] validation-logloss:0.23001 validation-auc:0.96813 validation-aucpr:0.97344
[82] validation-logloss:0.22882 validation-auc:0.96825 validation-aucpr:0.97350
[83] validation-logloss:0.22773 validation-auc:0.96836 validation-aucpr:0.97357
[84] validation-logloss:0.22709 validation-auc:0.96834 validation-aucpr:0.97354
[85] validation-logloss:0.22644 validation-auc:0.96837 validation-aucpr:0.97352
[86] validation-logloss:0.22556 validation-auc:0.96832 validation-aucpr:0.97349
[87] validation-logloss:0.22452 validation-auc:0.96836 validation-aucpr:0.97352
[88] validation-logloss:0.22396 validation-auc:0.96833 validation-aucpr:0.97350
[89] validation-logloss:0.22279 validation-auc:0.96841 validation-aucpr:0.97357
[90] validation-logloss:0.22209 validation-auc:0.96842 validation-aucpr:0.97361
[91] validation-logloss:0.22123 validation-auc:0.96847 validation-aucpr:0.97366
[92] validation-logloss:0.22042 validation-auc:0.96857 validation-aucpr:0.97374
[93] validation-logloss:0.21988 validation-auc:0.96856 validation-aucpr:0.97373
[94] validation-logloss:0.21943 validation-auc:0.96852 validation-aucpr:0.97370
[95] validation-logloss:0.21840 validation-auc:0.96868 validation-aucpr:0.97381
[96] validation-logloss:0.21767 validation-auc:0.96872 validation-aucpr:0.97383
[97] validation-logloss:0.21687 validation-auc:0.96880 validation-aucpr:0.97390
{'best_iteration': '97', 'best_score': '0.9738957097915883'}
Trial 32, Fold 4: Log loss = 0.21687468327294243, Average precision = 0.9738999480902333, ROC-AUC = 0.9687965705034205, Elapsed Time = 1075.9968526999983 seconds
Trial 32, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 32, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0]..[97] validation-logloss:0.67284 -> 0.21967, validation-auc:0.93883 -> 0.96859, validation-aucpr:0.94575 -> 0.97277 (per-iteration log condensed)
{'best_iteration': '89', 'best_score': '0.972873718449354'}
Trial 32, Fold 5: Log loss = 0.21967443811519236, Average precision = 0.9727732177067483, ROC-AUC = 0.9685915686602381, Elapsed Time = 1074.4078993000003 seconds
Optimization Progress: 33%|###3 | 33/100 [2:21:10<31:04:44, 1669.92s/it]
Trial 33, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 33, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0]..[94] validation-logloss:0.68334 -> 0.29630, validation-auc:0.96185 -> 0.97121, validation-aucpr:0.96505 -> 0.97559 (per-iteration log condensed)
{'best_iteration': '94', 'best_score': '0.9755942644594567'}
Trial 33, Fold 1: Log loss = 0.2962993850750307, Average precision = 0.9755984835078575, ROC-AUC = 0.9712136542110571, Elapsed Time = 2.482240000001184 seconds
Trial 33, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 33, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0]..[94] validation-logloss:0.68357 -> 0.29366, validation-auc:0.96349 -> 0.97215, validation-aucpr:0.96723 -> 0.97473 (per-iteration log condensed)
{'best_iteration': '93', 'best_score': '0.9747449932848273'}
Trial 33, Fold 2: Log loss = 0.2936584776398004, Average precision = 0.9746742058070549, ROC-AUC = 0.9721450247515813, Elapsed Time = 2.6043852000002516 seconds
Trial 33, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 33, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0]..[59] validation-logloss:0.68338 -> 0.36966, validation-auc:0.96407 -> 0.97182, validation-aucpr:0.96861 -> 0.97587 (per-iteration log condensed; fold continues)
[60] validation-logloss:0.36683 validation-auc:0.97177 validation-aucpr:0.97583
[61] validation-logloss:0.36405 validation-auc:0.97178 validation-aucpr:0.97584
[62] validation-logloss:0.36119 validation-auc:0.97183 validation-aucpr:0.97588
[63] validation-logloss:0.35843 validation-auc:0.97186 validation-aucpr:0.97590
[64] validation-logloss:0.35569 validation-auc:0.97185 validation-aucpr:0.97591
[65] validation-logloss:0.35301 validation-auc:0.97193 validation-aucpr:0.97595
[66] validation-logloss:0.35035 validation-auc:0.97197 validation-aucpr:0.97599
[67] validation-logloss:0.34778 validation-auc:0.97196 validation-aucpr:0.97597
[68] validation-logloss:0.34528 validation-auc:0.97195 validation-aucpr:0.97597
[69] validation-logloss:0.34276 validation-auc:0.97196 validation-aucpr:0.97599
[70] validation-logloss:0.34059 validation-auc:0.97197 validation-aucpr:0.97601
[71] validation-logloss:0.33819 validation-auc:0.97197 validation-aucpr:0.97600
[72] validation-logloss:0.33594 validation-auc:0.97199 validation-aucpr:0.97602
[73] validation-logloss:0.33366 validation-auc:0.97198 validation-aucpr:0.97600
[74] validation-logloss:0.33135 validation-auc:0.97199 validation-aucpr:0.97601
[75] validation-logloss:0.32909 validation-auc:0.97199 validation-aucpr:0.97600
[76] validation-logloss:0.32684 validation-auc:0.97202 validation-aucpr:0.97602
[77] validation-logloss:0.32465 validation-auc:0.97203 validation-aucpr:0.97603
[78] validation-logloss:0.32257 validation-auc:0.97202 validation-aucpr:0.97600
[79] validation-logloss:0.32047 validation-auc:0.97206 validation-aucpr:0.97602
[80] validation-logloss:0.31844 validation-auc:0.97207 validation-aucpr:0.97606
[81] validation-logloss:0.31642 validation-auc:0.97207 validation-aucpr:0.97605
[82] validation-logloss:0.31437 validation-auc:0.97212 validation-aucpr:0.97609
[83] validation-logloss:0.31240 validation-auc:0.97212 validation-aucpr:0.97608
[84] validation-logloss:0.31048 validation-auc:0.97213 validation-aucpr:0.97607
[85] validation-logloss:0.30870 validation-auc:0.97213 validation-aucpr:0.97607
[86] validation-logloss:0.30680 validation-auc:0.97210 validation-aucpr:0.97604
[87] validation-logloss:0.30495 validation-auc:0.97211 validation-aucpr:0.97605
[88] validation-logloss:0.30311 validation-auc:0.97213 validation-aucpr:0.97607
[89] validation-logloss:0.30130 validation-auc:0.97214 validation-aucpr:0.97608
[90] validation-logloss:0.29953 validation-auc:0.97214 validation-aucpr:0.97608
[91] validation-logloss:0.29802 validation-auc:0.97213 validation-aucpr:0.97607
[92] validation-logloss:0.29652 validation-auc:0.97217 validation-aucpr:0.97611
[93] validation-logloss:0.29481 validation-auc:0.97217 validation-aucpr:0.97611
[94] validation-logloss:0.29313 validation-auc:0.97219 validation-aucpr:0.97613
{'best_iteration': '94', 'best_score': '0.9761276928325958'}
Trial 33, Fold 3: Log loss = 0.29313219894809645, Average precision = 0.9761308604244551, ROC-AUC = 0.9721853450559081, Elapsed Time = 2.70938169999863 seconds
Trial 33, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 33, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.68351 validation-auc:0.96007 validation-aucpr:0.96605
[1] validation-logloss:0.67398 validation-auc:0.96567 validation-aucpr:0.97100
[2] validation-logloss:0.66490 validation-auc:0.96713 validation-aucpr:0.97232
[3] validation-logloss:0.65596 validation-auc:0.96765 validation-aucpr:0.97245
[4] validation-logloss:0.64728 validation-auc:0.96747 validation-aucpr:0.97243
[5] validation-logloss:0.63882 validation-auc:0.96748 validation-aucpr:0.97239
[6] validation-logloss:0.63045 validation-auc:0.96818 validation-aucpr:0.97301
[7] validation-logloss:0.62224 validation-auc:0.96833 validation-aucpr:0.97314
[8] validation-logloss:0.61430 validation-auc:0.96872 validation-aucpr:0.97340
[9] validation-logloss:0.60653 validation-auc:0.96893 validation-aucpr:0.97357
[10] validation-logloss:0.59906 validation-auc:0.96913 validation-aucpr:0.97374
[11] validation-logloss:0.59158 validation-auc:0.96956 validation-aucpr:0.97415
[12] validation-logloss:0.58455 validation-auc:0.96975 validation-aucpr:0.97434
[13] validation-logloss:0.57736 validation-auc:0.97008 validation-aucpr:0.97457
[14] validation-logloss:0.57060 validation-auc:0.97025 validation-aucpr:0.97477
[15] validation-logloss:0.56375 validation-auc:0.97052 validation-aucpr:0.97497
[16] validation-logloss:0.55709 validation-auc:0.97046 validation-aucpr:0.97495
[17] validation-logloss:0.55058 validation-auc:0.97053 validation-aucpr:0.97499
[18] validation-logloss:0.54423 validation-auc:0.97069 validation-aucpr:0.97511
[19] validation-logloss:0.53826 validation-auc:0.97056 validation-aucpr:0.97499
[20] validation-logloss:0.53217 validation-auc:0.97067 validation-aucpr:0.97506
[21] validation-logloss:0.52617 validation-auc:0.97077 validation-aucpr:0.97514
[22] validation-logloss:0.52039 validation-auc:0.97066 validation-aucpr:0.97507
[23] validation-logloss:0.51468 validation-auc:0.97074 validation-aucpr:0.97512
[24] validation-logloss:0.50909 validation-auc:0.97086 validation-aucpr:0.97519
[25] validation-logloss:0.50367 validation-auc:0.97091 validation-aucpr:0.97523
[26] validation-logloss:0.49841 validation-auc:0.97100 validation-aucpr:0.97529
[27] validation-logloss:0.49313 validation-auc:0.97104 validation-aucpr:0.97534
[28] validation-logloss:0.48871 validation-auc:0.97086 validation-aucpr:0.97534
[29] validation-logloss:0.48364 validation-auc:0.97085 validation-aucpr:0.97534
[30] validation-logloss:0.47871 validation-auc:0.97096 validation-aucpr:0.97542
[31] validation-logloss:0.47394 validation-auc:0.97090 validation-aucpr:0.97537
[32] validation-logloss:0.46924 validation-auc:0.97091 validation-aucpr:0.97538
[33] validation-logloss:0.46449 validation-auc:0.97099 validation-aucpr:0.97545
[34] validation-logloss:0.46021 validation-auc:0.97107 validation-aucpr:0.97556
[35] validation-logloss:0.45575 validation-auc:0.97101 validation-aucpr:0.97551
[36] validation-logloss:0.45131 validation-auc:0.97106 validation-aucpr:0.97555
[37] validation-logloss:0.44701 validation-auc:0.97107 validation-aucpr:0.97556
[38] validation-logloss:0.44273 validation-auc:0.97097 validation-aucpr:0.97548
[39] validation-logloss:0.43915 validation-auc:0.97086 validation-aucpr:0.97543
[40] validation-logloss:0.43552 validation-auc:0.97084 validation-aucpr:0.97543
[41] validation-logloss:0.43153 validation-auc:0.97081 validation-aucpr:0.97540
[42] validation-logloss:0.42761 validation-auc:0.97082 validation-aucpr:0.97541
[43] validation-logloss:0.42368 validation-auc:0.97088 validation-aucpr:0.97546
[44] validation-logloss:0.41989 validation-auc:0.97094 validation-aucpr:0.97549
[45] validation-logloss:0.41615 validation-auc:0.97089 validation-aucpr:0.97545
[46] validation-logloss:0.41293 validation-auc:0.97080 validation-aucpr:0.97540
[47] validation-logloss:0.40932 validation-auc:0.97090 validation-aucpr:0.97547
[48] validation-logloss:0.40573 validation-auc:0.97093 validation-aucpr:0.97550
[49] validation-logloss:0.40267 validation-auc:0.97089 validation-aucpr:0.97547
[50] validation-logloss:0.39917 validation-auc:0.97096 validation-aucpr:0.97552
[51] validation-logloss:0.39579 validation-auc:0.97095 validation-aucpr:0.97552
[52] validation-logloss:0.39245 validation-auc:0.97098 validation-aucpr:0.97553
[53] validation-logloss:0.38920 validation-auc:0.97110 validation-aucpr:0.97562
[54] validation-logloss:0.38599 validation-auc:0.97118 validation-aucpr:0.97568
[55] validation-logloss:0.38289 validation-auc:0.97119 validation-aucpr:0.97568
[56] validation-logloss:0.37976 validation-auc:0.97121 validation-aucpr:0.97570
[57] validation-logloss:0.37668 validation-auc:0.97129 validation-aucpr:0.97575
[58] validation-logloss:0.37368 validation-auc:0.97133 validation-aucpr:0.97577
[59] validation-logloss:0.37074 validation-auc:0.97132 validation-aucpr:0.97576
[60] validation-logloss:0.36789 validation-auc:0.97138 validation-aucpr:0.97580
[61] validation-logloss:0.36504 validation-auc:0.97144 validation-aucpr:0.97585
[62] validation-logloss:0.36224 validation-auc:0.97152 validation-aucpr:0.97591
[63] validation-logloss:0.35951 validation-auc:0.97148 validation-aucpr:0.97588
[64] validation-logloss:0.35687 validation-auc:0.97147 validation-aucpr:0.97588
[65] validation-logloss:0.35411 validation-auc:0.97151 validation-aucpr:0.97592
[66] validation-logloss:0.35148 validation-auc:0.97152 validation-aucpr:0.97593
[67] validation-logloss:0.34893 validation-auc:0.97155 validation-aucpr:0.97595
[68] validation-logloss:0.34645 validation-auc:0.97156 validation-aucpr:0.97596
[69] validation-logloss:0.34401 validation-auc:0.97159 validation-aucpr:0.97598
[70] validation-logloss:0.34156 validation-auc:0.97161 validation-aucpr:0.97600
[71] validation-logloss:0.33909 validation-auc:0.97166 validation-aucpr:0.97603
[72] validation-logloss:0.33703 validation-auc:0.97165 validation-aucpr:0.97603
[73] validation-logloss:0.33470 validation-auc:0.97168 validation-aucpr:0.97605
[74] validation-logloss:0.33239 validation-auc:0.97174 validation-aucpr:0.97609
[75] validation-logloss:0.33015 validation-auc:0.97177 validation-aucpr:0.97612
[76] validation-logloss:0.32794 validation-auc:0.97185 validation-aucpr:0.97617
[77] validation-logloss:0.32574 validation-auc:0.97190 validation-aucpr:0.97621
[78] validation-logloss:0.32361 validation-auc:0.97192 validation-aucpr:0.97623
[79] validation-logloss:0.32151 validation-auc:0.97195 validation-aucpr:0.97625
[80] validation-logloss:0.31968 validation-auc:0.97195 validation-aucpr:0.97625
[81] validation-logloss:0.31758 validation-auc:0.97203 validation-aucpr:0.97631
[82] validation-logloss:0.31559 validation-auc:0.97205 validation-aucpr:0.97631
[83] validation-logloss:0.31363 validation-auc:0.97204 validation-aucpr:0.97632
[84] validation-logloss:0.31168 validation-auc:0.97209 validation-aucpr:0.97635
[85] validation-logloss:0.30977 validation-auc:0.97213 validation-aucpr:0.97638
[86] validation-logloss:0.30793 validation-auc:0.97211 validation-aucpr:0.97636
[87] validation-logloss:0.30606 validation-auc:0.97212 validation-aucpr:0.97637
[88] validation-logloss:0.30447 validation-auc:0.97209 validation-aucpr:0.97635
[89] validation-logloss:0.30288 validation-auc:0.97207 validation-aucpr:0.97634
[90] validation-logloss:0.30108 validation-auc:0.97212 validation-aucpr:0.97638
[91] validation-logloss:0.29930 validation-auc:0.97220 validation-aucpr:0.97643
[92] validation-logloss:0.29757 validation-auc:0.97223 validation-aucpr:0.97645
[93] validation-logloss:0.29584 validation-auc:0.97226 validation-aucpr:0.97648
[94] validation-logloss:0.29421 validation-auc:0.97225 validation-aucpr:0.97646
{'best_iteration': '93', 'best_score': '0.9764766823302508'}
Trial 33, Fold 4: Log loss = 0.2942099781841265, Average precision = 0.97645302945903, ROC-AUC = 0.9722454702175296, Elapsed Time = 2.9326142000027176 seconds
Trial 33, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 33, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.68343 validation-auc:0.95976 validation-aucpr:0.96214
[1] validation-logloss:0.67403 validation-auc:0.96579 validation-aucpr:0.96994
[2] validation-logloss:0.66571 validation-auc:0.96529 validation-aucpr:0.96930
[3] validation-logloss:0.65698 validation-auc:0.96665 validation-aucpr:0.97030
[4] validation-logloss:0.64817 validation-auc:0.96694 validation-aucpr:0.97075
[5] validation-logloss:0.63960 validation-auc:0.96740 validation-aucpr:0.97115
[6] validation-logloss:0.63132 validation-auc:0.96779 validation-aucpr:0.97176
[7] validation-logloss:0.62312 validation-auc:0.96820 validation-aucpr:0.97206
[8] validation-logloss:0.61508 validation-auc:0.96860 validation-aucpr:0.97224
[9] validation-logloss:0.60731 validation-auc:0.96879 validation-aucpr:0.97238
[10] validation-logloss:0.59967 validation-auc:0.96916 validation-aucpr:0.97264
[11] validation-logloss:0.59227 validation-auc:0.96939 validation-aucpr:0.97279
[12] validation-logloss:0.58516 validation-auc:0.96957 validation-aucpr:0.97289
[13] validation-logloss:0.57806 validation-auc:0.96978 validation-aucpr:0.97305
[14] validation-logloss:0.57131 validation-auc:0.96951 validation-aucpr:0.97299
[15] validation-logloss:0.56515 validation-auc:0.96946 validation-aucpr:0.97304
[16] validation-logloss:0.55846 validation-auc:0.96954 validation-aucpr:0.97311
[17] validation-logloss:0.55205 validation-auc:0.96956 validation-aucpr:0.97314
[18] validation-logloss:0.54573 validation-auc:0.96958 validation-aucpr:0.97317
[19] validation-logloss:0.53949 validation-auc:0.96961 validation-aucpr:0.97319
[20] validation-logloss:0.53346 validation-auc:0.96964 validation-aucpr:0.97315
[21] validation-logloss:0.52755 validation-auc:0.96976 validation-aucpr:0.97324
[22] validation-logloss:0.52168 validation-auc:0.96990 validation-aucpr:0.97334
[23] validation-logloss:0.51598 validation-auc:0.96985 validation-aucpr:0.97331
[24] validation-logloss:0.51038 validation-auc:0.96996 validation-aucpr:0.97337
[25] validation-logloss:0.50547 validation-auc:0.96981 validation-aucpr:0.97334
[26] validation-logloss:0.50011 validation-auc:0.96976 validation-aucpr:0.97330
[27] validation-logloss:0.49500 validation-auc:0.96982 validation-aucpr:0.97335
[28] validation-logloss:0.49040 validation-auc:0.96966 validation-aucpr:0.97327
[29] validation-logloss:0.48545 validation-auc:0.96964 validation-aucpr:0.97347
[30] validation-logloss:0.48056 validation-auc:0.96970 validation-aucpr:0.97350
[31] validation-logloss:0.47573 validation-auc:0.96967 validation-aucpr:0.97350
[32] validation-logloss:0.47092 validation-auc:0.96975 validation-aucpr:0.97356
[33] validation-logloss:0.46622 validation-auc:0.96984 validation-aucpr:0.97363
[34] validation-logloss:0.46165 validation-auc:0.96980 validation-aucpr:0.97360
[35] validation-logloss:0.45722 validation-auc:0.96986 validation-aucpr:0.97364
[36] validation-logloss:0.45288 validation-auc:0.96994 validation-aucpr:0.97404
[37] validation-logloss:0.44858 validation-auc:0.96994 validation-aucpr:0.97405
[38] validation-logloss:0.44438 validation-auc:0.97004 validation-aucpr:0.97411
[39] validation-logloss:0.44020 validation-auc:0.97015 validation-aucpr:0.97420
[40] validation-logloss:0.43615 validation-auc:0.97019 validation-aucpr:0.97423
[41] validation-logloss:0.43217 validation-auc:0.97031 validation-aucpr:0.97432
[42] validation-logloss:0.42826 validation-auc:0.97034 validation-aucpr:0.97433
[43] validation-logloss:0.42438 validation-auc:0.97039 validation-aucpr:0.97437
[44] validation-logloss:0.42058 validation-auc:0.97039 validation-aucpr:0.97438
[45] validation-logloss:0.41696 validation-auc:0.97039 validation-aucpr:0.97437
[46] validation-logloss:0.41333 validation-auc:0.97049 validation-aucpr:0.97450
[47] validation-logloss:0.40980 validation-auc:0.97052 validation-aucpr:0.97450
[48] validation-logloss:0.40665 validation-auc:0.97047 validation-aucpr:0.97449
[49] validation-logloss:0.40317 validation-auc:0.97058 validation-aucpr:0.97456
[50] validation-logloss:0.39981 validation-auc:0.97057 validation-aucpr:0.97454
[51] validation-logloss:0.39653 validation-auc:0.97058 validation-aucpr:0.97455
[52] validation-logloss:0.39333 validation-auc:0.97063 validation-aucpr:0.97456
[53] validation-logloss:0.39049 validation-auc:0.97055 validation-aucpr:0.97451
[54] validation-logloss:0.38737 validation-auc:0.97049 validation-aucpr:0.97447
[55] validation-logloss:0.38436 validation-auc:0.97046 validation-aucpr:0.97443
[56] validation-logloss:0.38130 validation-auc:0.97044 validation-aucpr:0.97443
[57] validation-logloss:0.37828 validation-auc:0.97051 validation-aucpr:0.97449
[58] validation-logloss:0.37539 validation-auc:0.97052 validation-aucpr:0.97453
[59] validation-logloss:0.37250 validation-auc:0.97060 validation-aucpr:0.97458
[60] validation-logloss:0.36969 validation-auc:0.97064 validation-aucpr:0.97459
[61] validation-logloss:0.36683 validation-auc:0.97072 validation-aucpr:0.97467
[62] validation-logloss:0.36410 validation-auc:0.97079 validation-aucpr:0.97471
[63] validation-logloss:0.36170 validation-auc:0.97084 validation-aucpr:0.97476
[64] validation-logloss:0.35902 validation-auc:0.97089 validation-aucpr:0.97480
[65] validation-logloss:0.35636 validation-auc:0.97100 validation-aucpr:0.97488
[66] validation-logloss:0.35377 validation-auc:0.97108 validation-aucpr:0.97493
[67] validation-logloss:0.35123 validation-auc:0.97112 validation-aucpr:0.97496
[68] validation-logloss:0.34877 validation-auc:0.97109 validation-aucpr:0.97493
[69] validation-logloss:0.34639 validation-auc:0.97109 validation-aucpr:0.97492
[70] validation-logloss:0.34398 validation-auc:0.97115 validation-aucpr:0.97496
[71] validation-logloss:0.34166 validation-auc:0.97116 validation-aucpr:0.97496
[72] validation-logloss:0.33933 validation-auc:0.97122 validation-aucpr:0.97501
[73] validation-logloss:0.33706 validation-auc:0.97125 validation-aucpr:0.97501
[74] validation-logloss:0.33487 validation-auc:0.97129 validation-aucpr:0.97505
[75] validation-logloss:0.33272 validation-auc:0.97133 validation-aucpr:0.97506
[76] validation-logloss:0.33052 validation-auc:0.97137 validation-aucpr:0.97508
[77] validation-logloss:0.32841 validation-auc:0.97137 validation-aucpr:0.97509
[78] validation-logloss:0.32638 validation-auc:0.97133 validation-aucpr:0.97506
[79] validation-logloss:0.32430 validation-auc:0.97135 validation-aucpr:0.97507
[80] validation-logloss:0.32242 validation-auc:0.97141 validation-aucpr:0.97513
[81] validation-logloss:0.32044 validation-auc:0.97141 validation-aucpr:0.97514
[82] validation-logloss:0.31847 validation-auc:0.97143 validation-aucpr:0.97516
[83] validation-logloss:0.31667 validation-auc:0.97142 validation-aucpr:0.97515
[84] validation-logloss:0.31470 validation-auc:0.97148 validation-aucpr:0.97518
[85] validation-logloss:0.31280 validation-auc:0.97155 validation-aucpr:0.97524
[86] validation-logloss:0.31092 validation-auc:0.97160 validation-aucpr:0.97527
[87] validation-logloss:0.30912 validation-auc:0.97162 validation-aucpr:0.97528
[88] validation-logloss:0.30735 validation-auc:0.97163 validation-aucpr:0.97528
[89] validation-logloss:0.30556 validation-auc:0.97166 validation-aucpr:0.97530
[90] validation-logloss:0.30381 validation-auc:0.97169 validation-aucpr:0.97531
[91] validation-logloss:0.30209 validation-auc:0.97173 validation-aucpr:0.97536
[92] validation-logloss:0.30039 validation-auc:0.97175 validation-aucpr:0.97537
[93] validation-logloss:0.29871 validation-auc:0.97177 validation-aucpr:0.97538
[94] validation-logloss:0.29705 validation-auc:0.97176 validation-aucpr:0.97537
{'best_iteration': '93', 'best_score': '0.9753777106220222'}
Trial 33, Fold 5: Log loss = 0.29705146661846155, Average precision = 0.9753714880967246, ROC-AUC = 0.9717643386656263, Elapsed Time = 2.967479999999341 seconds
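The `validation-logloss` column in the logs above is binary cross-entropy averaged over the held-out fold, the same quantity reported in each fold summary line. As a point of reference (not the notebook's own code, which uses `sklearn.metrics.log_loss`), a minimal stdlib sketch of that metric:

```python
import math

def binary_log_loss(y_true, p, eps=1e-15):
    """Average binary cross-entropy, as reported in the
    validation-logloss column. Probabilities are clipped to
    [eps, 1-eps] to avoid log(0)."""
    total = 0.0
    for y, q in zip(y_true, p):
        q = min(max(q, eps), 1.0 - eps)
        total += -(y * math.log(q) + (1 - y) * math.log(1.0 - q))
    return total / len(y_true)
```

Lower is better, which is why the column decreases monotonically as boosting rounds accumulate even while AUC/AUCPR plateau; early stopping here is keyed to `aucpr` (the `best_score` printed after each fold), not to log loss.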
Optimization Progress: 34%|###4 | 34/100 [2:21:33<21:33:26, 1175.85s/it]
Trial 34, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 34, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.68198 validation-auc:0.95130 validation-aucpr:0.95540
[1] validation-logloss:0.67112 validation-auc:0.95892 validation-aucpr:0.96265
[2] validation-logloss:0.66069 validation-auc:0.96145 validation-aucpr:0.96424
[3] validation-logloss:0.65041 validation-auc:0.96213 validation-aucpr:0.96619
[4] validation-logloss:0.64063 validation-auc:0.96271 validation-aucpr:0.96646
[5] validation-logloss:0.63112 validation-auc:0.96362 validation-aucpr:0.96797
{'best_iteration': '5', 'best_score': '0.967973486067347'}
Trial 34, Fold 1: Log loss = 0.6311222487753312, Average precision = 0.9682110235622168, ROC-AUC = 0.9636233677702738, Elapsed Time = 0.641079200002423 seconds
Trial 34, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 34, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.68210 validation-auc:0.95181 validation-aucpr:0.95322
[1] validation-logloss:0.67115 validation-auc:0.95940 validation-aucpr:0.96118
[2] validation-logloss:0.66057 validation-auc:0.96273 validation-aucpr:0.96656
[3] validation-logloss:0.65033 validation-auc:0.96360 validation-aucpr:0.96698
[4] validation-logloss:0.64081 validation-auc:0.96418 validation-aucpr:0.96799
[5] validation-logloss:0.63121 validation-auc:0.96547 validation-aucpr:0.96980
{'best_iteration': '5', 'best_score': '0.9698008921278516'}
Trial 34, Fold 2: Log loss = 0.6312073776093977, Average precision = 0.969641383985065, ROC-AUC = 0.9654703984077407, Elapsed Time = 0.6488423999981023 seconds
Trial 34, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 34, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.68197 validation-auc:0.95564 validation-aucpr:0.95792
[1] validation-logloss:0.67107 validation-auc:0.96225 validation-aucpr:0.96621
[2] validation-logloss:0.66058 validation-auc:0.96390 validation-aucpr:0.96893
[3] validation-logloss:0.65033 validation-auc:0.96350 validation-aucpr:0.96671
[4] validation-logloss:0.64048 validation-auc:0.96404 validation-aucpr:0.96703
[5] validation-logloss:0.63090 validation-auc:0.96491 validation-aucpr:0.96755
{'best_iteration': '2', 'best_score': '0.968934987538646'}
Trial 34, Fold 3: Log loss = 0.6309028230443496, Average precision = 0.9679194572067682, ROC-AUC = 0.9649145155348595, Elapsed Time = 0.7778741000001901 seconds
Trial 34, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 34, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.68201 validation-auc:0.95272 validation-aucpr:0.95267
[1] validation-logloss:0.67121 validation-auc:0.95835 validation-aucpr:0.95833
[2] validation-logloss:0.66074 validation-auc:0.96139 validation-aucpr:0.96191
[3] validation-logloss:0.65168 validation-auc:0.96197 validation-aucpr:0.96553
[4] validation-logloss:0.64167 validation-auc:0.96350 validation-aucpr:0.96626
[5] validation-logloss:0.63191 validation-auc:0.96580 validation-aucpr:0.97126
{'best_iteration': '5', 'best_score': '0.9712628373370135'}
Trial 34, Fold 4: Log loss = 0.6319119314165961, Average precision = 0.971030496877912, ROC-AUC = 0.9657958653142293, Elapsed Time = 0.7649254999996629 seconds
Trial 34, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 34, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.68213 validation-auc:0.94976 validation-aucpr:0.95254
[1] validation-logloss:0.67129 validation-auc:0.95654 validation-aucpr:0.95866
[2] validation-logloss:0.66076 validation-auc:0.95844 validation-aucpr:0.95956
[3] validation-logloss:0.65057 validation-auc:0.95909 validation-aucpr:0.96011
[4] validation-logloss:0.64076 validation-auc:0.96032 validation-aucpr:0.96077
[5] validation-logloss:0.63112 validation-auc:0.96092 validation-aucpr:0.96102
{'best_iteration': '5', 'best_score': '0.9610187676182963'}
Trial 34, Fold 5: Log loss = 0.6311226248417754, Average precision = 0.9608989400500028, ROC-AUC = 0.9609236200137488, Elapsed Time = 0.7530755000007048 seconds
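The `validation-auc` column (and the ROC-AUC in each fold summary) has a useful probabilistic reading: it is the fraction of (positive, negative) pairs that the model ranks in the right order, with ties counting half. A small illustrative sketch of that pairwise definition (O(n²), fine for intuition; the notebook itself uses `sklearn.metrics.roc_auc_score`):

```python
def roc_auc_pairwise(y_true, scores):
    """ROC-AUC as the probability that a random positive
    outscores a random negative (ties count 0.5)."""
    pos = [s for y, s in zip(y_true, scores) if y == 1]
    neg = [s for y, s in zip(y_true, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

This is why Trial 34's AUC of ~0.96 after only 6 boosting rounds can coexist with a log loss of ~0.63: ranking quality is already high while the probabilities themselves are still poorly calibrated, and the two objectives diverge in how they reward further rounds.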
Optimization Progress: 35%|###5 | 35/100 [2:21:45<14:55:20, 826.47s/it]
Trial 35, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 35, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.66299 validation-auc:0.94914 validation-aucpr:0.93790
[1] validation-logloss:0.63898 validation-auc:0.95939 validation-aucpr:0.95686
[2] validation-logloss:0.61353 validation-auc:0.96323 validation-aucpr:0.96870
[3] validation-logloss:0.59262 validation-auc:0.96437 validation-aucpr:0.96937
[4] validation-logloss:0.57084 validation-auc:0.96506 validation-aucpr:0.97017
[5] validation-logloss:0.55311 validation-auc:0.96458 validation-aucpr:0.96955
[6] validation-logloss:0.53632 validation-auc:0.96413 validation-aucpr:0.96899
[7] validation-logloss:0.52077 validation-auc:0.96405 validation-aucpr:0.96897
[8] validation-logloss:0.50527 validation-auc:0.96454 validation-aucpr:0.96942
[9] validation-logloss:0.49136 validation-auc:0.96459 validation-aucpr:0.96880
[10] validation-logloss:0.47835 validation-auc:0.96473 validation-aucpr:0.96880
[11] validation-logloss:0.46590 validation-auc:0.96515 validation-aucpr:0.97025
[12] validation-logloss:0.45412 validation-auc:0.96518 validation-aucpr:0.97015
[13] validation-logloss:0.44027 validation-auc:0.96578 validation-aucpr:0.97078
[14] validation-logloss:0.42795 validation-auc:0.96597 validation-aucpr:0.97101
[15] validation-logloss:0.41839 validation-auc:0.96600 validation-aucpr:0.97100
[16] validation-logloss:0.40945 validation-auc:0.96598 validation-aucpr:0.97097
[17] validation-logloss:0.40100 validation-auc:0.96604 validation-aucpr:0.97100
[18] validation-logloss:0.39091 validation-auc:0.96619 validation-aucpr:0.97119
[19] validation-logloss:0.38088 validation-auc:0.96644 validation-aucpr:0.97143
[20] validation-logloss:0.37389 validation-auc:0.96625 validation-aucpr:0.97126
[21] validation-logloss:0.36672 validation-auc:0.96642 validation-aucpr:0.97136
[22] validation-logloss:0.36027 validation-auc:0.96638 validation-aucpr:0.97131
[23] validation-logloss:0.35225 validation-auc:0.96640 validation-aucpr:0.97010
[24] validation-logloss:0.34465 validation-auc:0.96655 validation-aucpr:0.96938
[25] validation-logloss:0.33930 validation-auc:0.96643 validation-aucpr:0.96926
[26] validation-logloss:0.33384 validation-auc:0.96637 validation-aucpr:0.96920
[27] validation-logloss:0.32846 validation-auc:0.96654 validation-aucpr:0.97149
[28] validation-logloss:0.32191 validation-auc:0.96675 validation-aucpr:0.97172
[29] validation-logloss:0.31730 validation-auc:0.96677 validation-aucpr:0.97172
[30] validation-logloss:0.31262 validation-auc:0.96684 validation-aucpr:0.97181
[31] validation-logloss:0.30650 validation-auc:0.96725 validation-aucpr:0.97215
[32] validation-logloss:0.30122 validation-auc:0.96743 validation-aucpr:0.97222
[33] validation-logloss:0.29752 validation-auc:0.96742 validation-aucpr:0.97228
[34] validation-logloss:0.29412 validation-auc:0.96743 validation-aucpr:0.97225
[35] validation-logloss:0.28944 validation-auc:0.96760 validation-aucpr:0.97240
[36] validation-logloss:0.28606 validation-auc:0.96768 validation-aucpr:0.97252
[37] validation-logloss:0.28312 validation-auc:0.96760 validation-aucpr:0.97244
[38] validation-logloss:0.27888 validation-auc:0.96773 validation-aucpr:0.97229
[39] validation-logloss:0.27477 validation-auc:0.96780 validation-aucpr:0.97157
[40] validation-logloss:0.27189 validation-auc:0.96785 validation-aucpr:0.97158
[41] validation-logloss:0.26814 validation-auc:0.96821 validation-aucpr:0.97297
[42] validation-logloss:0.26437 validation-auc:0.96848 validation-aucpr:0.97320
[43] validation-logloss:0.26082 validation-auc:0.96871 validation-aucpr:0.97338
[44] validation-logloss:0.25867 validation-auc:0.96862 validation-aucpr:0.97327
[45] validation-logloss:0.25570 validation-auc:0.96879 validation-aucpr:0.97338
[46] validation-logloss:0.25341 validation-auc:0.96880 validation-aucpr:0.97338
[47] validation-logloss:0.25128 validation-auc:0.96887 validation-aucpr:0.97342
[48] validation-logloss:0.24854 validation-auc:0.96898 validation-aucpr:0.97351
[49] validation-logloss:0.24581 validation-auc:0.96909 validation-aucpr:0.97359
[50] validation-logloss:0.24395 validation-auc:0.96915 validation-aucpr:0.97361
[51] validation-logloss:0.24234 validation-auc:0.96914 validation-aucpr:0.97356
[52] validation-logloss:0.23986 validation-auc:0.96931 validation-aucpr:0.97374
[53] validation-logloss:0.23849 validation-auc:0.96930 validation-aucpr:0.97369
[54] validation-logloss:0.23644 validation-auc:0.96936 validation-aucpr:0.97373
[55] validation-logloss:0.23424 validation-auc:0.96956 validation-aucpr:0.97390
[56] validation-logloss:0.23228 validation-auc:0.96962 validation-aucpr:0.97378
[57] validation-logloss:0.23057 validation-auc:0.96972 validation-aucpr:0.97397
[58] validation-logloss:0.22912 validation-auc:0.96987 validation-aucpr:0.97407
[59] validation-logloss:0.22752 validation-auc:0.96996 validation-aucpr:0.97414
[60] validation-logloss:0.22577 validation-auc:0.97004 validation-aucpr:0.97396
{'best_iteration': '59', 'best_score': '0.9741431676411648'}
Trial 35, Fold 1: Log loss = 0.22577467774862892, Average precision = 0.9739730615777419, ROC-AUC = 0.9700445218737902, Elapsed Time = 1.6055976000025112 seconds
Trial 35, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 35, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.66261 validation-auc:0.94859 validation-aucpr:0.93978
[1] validation-logloss:0.63879 validation-auc:0.95982 validation-aucpr:0.95983
[2] validation-logloss:0.61305 validation-auc:0.96397 validation-aucpr:0.96862
[3] validation-logloss:0.58926 validation-auc:0.96611 validation-aucpr:0.97050
[4] validation-logloss:0.56954 validation-auc:0.96643 validation-aucpr:0.97023
[5] validation-logloss:0.55182 validation-auc:0.96710 validation-aucpr:0.97047
[6] validation-logloss:0.53210 validation-auc:0.96826 validation-aucpr:0.97156
[7] validation-logloss:0.51619 validation-auc:0.96785 validation-aucpr:0.97099
[8] validation-logloss:0.49923 validation-auc:0.96793 validation-aucpr:0.97126
[9] validation-logloss:0.48561 validation-auc:0.96722 validation-aucpr:0.97071
[10] validation-logloss:0.47272 validation-auc:0.96712 validation-aucpr:0.97061
[11] validation-logloss:0.46048 validation-auc:0.96695 validation-aucpr:0.97050
[12] validation-logloss:0.44716 validation-auc:0.96704 validation-aucpr:0.97059
[13] validation-logloss:0.43394 validation-auc:0.96738 validation-aucpr:0.97105
[14] validation-logloss:0.42353 validation-auc:0.96727 validation-aucpr:0.97090
[15] validation-logloss:0.41392 validation-auc:0.96723 validation-aucpr:0.97081
[16] validation-logloss:0.40440 validation-auc:0.96741 validation-aucpr:0.97100
[17] validation-logloss:0.39350 validation-auc:0.96788 validation-aucpr:0.97141
[18] validation-logloss:0.38477 validation-auc:0.96810 validation-aucpr:0.97158
[19] validation-logloss:0.37523 validation-auc:0.96844 validation-aucpr:0.97186
[20] validation-logloss:0.36598 validation-auc:0.96897 validation-aucpr:0.97231
[21] validation-logloss:0.35845 validation-auc:0.96919 validation-aucpr:0.97298
[22] validation-logloss:0.35027 validation-auc:0.96937 validation-aucpr:0.97314
[23] validation-logloss:0.34423 validation-auc:0.96939 validation-aucpr:0.97310
[24] validation-logloss:0.33807 validation-auc:0.96938 validation-aucpr:0.97308
[25] validation-logloss:0.33220 validation-auc:0.96923 validation-aucpr:0.97295
[26] validation-logloss:0.32661 validation-auc:0.96923 validation-aucpr:0.97292
[27] validation-logloss:0.32000 validation-auc:0.96944 validation-aucpr:0.97310
[28] validation-logloss:0.31496 validation-auc:0.96952 validation-aucpr:0.97315
[29] validation-logloss:0.31060 validation-auc:0.96941 validation-aucpr:0.97300
[30] validation-logloss:0.30565 validation-auc:0.96953 validation-aucpr:0.97309
[31] validation-logloss:0.30133 validation-auc:0.96959 validation-aucpr:0.97314
[32] validation-logloss:0.29609 validation-auc:0.96981 validation-aucpr:0.97333
[33] validation-logloss:0.29227 validation-auc:0.96976 validation-aucpr:0.97329
[34] validation-logloss:0.28763 validation-auc:0.96982 validation-aucpr:0.97333
[35] validation-logloss:0.28274 validation-auc:0.97004 validation-aucpr:0.97351
[36] validation-logloss:0.27824 validation-auc:0.97021 validation-aucpr:0.97363
[37] validation-logloss:0.27404 validation-auc:0.97030 validation-aucpr:0.97370
[38] validation-logloss:0.27089 validation-auc:0.97038 validation-aucpr:0.97376
[39] validation-logloss:0.26802 validation-auc:0.97033 validation-aucpr:0.97371
[40] validation-logloss:0.26510 validation-auc:0.97035 validation-aucpr:0.97369
[41] validation-logloss:0.26152 validation-auc:0.97045 validation-aucpr:0.97379
[42] validation-logloss:0.25807 validation-auc:0.97049 validation-aucpr:0.97385
[43] validation-logloss:0.25475 validation-auc:0.97067 validation-aucpr:0.97398
[44] validation-logloss:0.25226 validation-auc:0.97078 validation-aucpr:0.97409
[45] validation-logloss:0.25012 validation-auc:0.97069 validation-aucpr:0.97402
[46] validation-logloss:0.24785 validation-auc:0.97072 validation-aucpr:0.97404
[47] validation-logloss:0.24556 validation-auc:0.97085 validation-aucpr:0.97411
[48] validation-logloss:0.24263 validation-auc:0.97096 validation-aucpr:0.97419
[49] validation-logloss:0.24080 validation-auc:0.97083 validation-aucpr:0.97409
[50] validation-logloss:0.23812 validation-auc:0.97099 validation-aucpr:0.97421
[51] validation-logloss:0.23563 validation-auc:0.97109 validation-aucpr:0.97427
[52] validation-logloss:0.23325 validation-auc:0.97119 validation-aucpr:0.97437
[53] validation-logloss:0.23165 validation-auc:0.97121 validation-aucpr:0.97438
[54] validation-logloss:0.22932 validation-auc:0.97135 validation-aucpr:0.97449
[55] validation-logloss:0.22703 validation-auc:0.97149 validation-aucpr:0.97461
[56] validation-logloss:0.22506 validation-auc:0.97147 validation-aucpr:0.97452
[57] validation-logloss:0.22372 validation-auc:0.97154 validation-aucpr:0.97458
[58] validation-logloss:0.22178 validation-auc:0.97164 validation-aucpr:0.97463
[59] validation-logloss:0.22058 validation-auc:0.97161 validation-aucpr:0.97462
[60] validation-logloss:0.21927 validation-auc:0.97172 validation-aucpr:0.97503
{'best_iteration': '60', 'best_score': '0.9750271561781708'}
Trial 35, Fold 2: Log loss = 0.21927145707866819, Average precision = 0.9750233171169024, ROC-AUC = 0.9717249110075112, Elapsed Time = 1.8591325000015786 seconds
Trial 35, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 35, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.66260 validation-auc:0.95283 validation-aucpr:0.93270
[1] validation-logloss:0.63823 validation-auc:0.96152 validation-aucpr:0.96266
[2] validation-logloss:0.61253 validation-auc:0.96408 validation-aucpr:0.96896
[3] validation-logloss:0.59181 validation-auc:0.96459 validation-aucpr:0.96976
[4] validation-logloss:0.56957 validation-auc:0.96579 validation-aucpr:0.97085
[5] validation-logloss:0.54908 validation-auc:0.96653 validation-aucpr:0.97138
[6] validation-logloss:0.52994 validation-auc:0.96730 validation-aucpr:0.97207
[7] validation-logloss:0.51413 validation-auc:0.96684 validation-aucpr:0.97162
[8] validation-logloss:0.49927 validation-auc:0.96672 validation-aucpr:0.97147
[9] validation-logloss:0.48541 validation-auc:0.96688 validation-aucpr:0.97153
[10] validation-logloss:0.47233 validation-auc:0.96691 validation-aucpr:0.97154
[11] validation-logloss:0.45849 validation-auc:0.96728 validation-aucpr:0.97189
[12] validation-logloss:0.44407 validation-auc:0.96802 validation-aucpr:0.97252
[13] validation-logloss:0.43130 validation-auc:0.96804 validation-aucpr:0.97261
[14] validation-logloss:0.42121 validation-auc:0.96815 validation-aucpr:0.97274
[15] validation-logloss:0.40972 validation-auc:0.96853 validation-aucpr:0.97303
[16] validation-logloss:0.39902 validation-auc:0.96898 validation-aucpr:0.97343
[17] validation-logloss:0.39055 validation-auc:0.96862 validation-aucpr:0.97313
[18] validation-logloss:0.38225 validation-auc:0.96873 validation-aucpr:0.97313
[19] validation-logloss:0.37452 validation-auc:0.96863 validation-aucpr:0.97300
[20] validation-logloss:0.36668 validation-auc:0.96891 validation-aucpr:0.97326
[21] validation-logloss:0.35786 validation-auc:0.96914 validation-aucpr:0.97347
[22] validation-logloss:0.35073 validation-auc:0.96913 validation-aucpr:0.97346
[23] validation-logloss:0.34449 validation-auc:0.96912 validation-aucpr:0.97347
[24] validation-logloss:0.33722 validation-auc:0.96923 validation-aucpr:0.97356
[25] validation-logloss:0.33148 validation-auc:0.96930 validation-aucpr:0.97361
[26] validation-logloss:0.32607 validation-auc:0.96938 validation-aucpr:0.97365
[27] validation-logloss:0.32062 validation-auc:0.96956 validation-aucpr:0.97379
[28] validation-logloss:0.31548 validation-auc:0.96969 validation-aucpr:0.97389
[29] validation-logloss:0.31120 validation-auc:0.96952 validation-aucpr:0.97376
[30] validation-logloss:0.30652 validation-auc:0.96960 validation-aucpr:0.97379
[31] validation-logloss:0.30069 validation-auc:0.96985 validation-aucpr:0.97400
[32] validation-logloss:0.29542 validation-auc:0.96996 validation-aucpr:0.97413
[33] validation-logloss:0.28986 validation-auc:0.97030 validation-aucpr:0.97438
[34] validation-logloss:0.28615 validation-auc:0.97037 validation-aucpr:0.97443
[35] validation-logloss:0.28120 validation-auc:0.97064 validation-aucpr:0.97467
[36] validation-logloss:0.27658 validation-auc:0.97075 validation-aucpr:0.97478
[37] validation-logloss:0.27228 validation-auc:0.97084 validation-aucpr:0.97487
[38] validation-logloss:0.26845 validation-auc:0.97087 validation-aucpr:0.97492
[39] validation-logloss:0.26457 validation-auc:0.97100 validation-aucpr:0.97504
[40] validation-logloss:0.26170 validation-auc:0.97095 validation-aucpr:0.97501
[41] validation-logloss:0.25895 validation-auc:0.97102 validation-aucpr:0.97507
[42] validation-logloss:0.25632 validation-auc:0.97106 validation-aucpr:0.97509
[43] validation-logloss:0.25296 validation-auc:0.97115 validation-aucpr:0.97518
[44] validation-logloss:0.25074 validation-auc:0.97116 validation-aucpr:0.97523
[45] validation-logloss:0.24847 validation-auc:0.97115 validation-aucpr:0.97520
[46] validation-logloss:0.24541 validation-auc:0.97131 validation-aucpr:0.97534
[47] validation-logloss:0.24273 validation-auc:0.97134 validation-aucpr:0.97539
[48] validation-logloss:0.23982 validation-auc:0.97152 validation-aucpr:0.97553
[49] validation-logloss:0.23798 validation-auc:0.97152 validation-aucpr:0.97547
[50] validation-logloss:0.23599 validation-auc:0.97143 validation-aucpr:0.97539
[51] validation-logloss:0.23368 validation-auc:0.97152 validation-aucpr:0.97546
[52] validation-logloss:0.23136 validation-auc:0.97160 validation-aucpr:0.97553
[53] validation-logloss:0.22918 validation-auc:0.97174 validation-aucpr:0.97565
[54] validation-logloss:0.22728 validation-auc:0.97179 validation-aucpr:0.97573
[55] validation-logloss:0.22531 validation-auc:0.97181 validation-aucpr:0.97576
[56] validation-logloss:0.22389 validation-auc:0.97192 validation-aucpr:0.97587
[57] validation-logloss:0.22264 validation-auc:0.97193 validation-aucpr:0.97588
[58] validation-logloss:0.22086 validation-auc:0.97206 validation-aucpr:0.97599
[59] validation-logloss:0.21965 validation-auc:0.97204 validation-aucpr:0.97598
[60] validation-logloss:0.21796 validation-auc:0.97219 validation-aucpr:0.97610
{'best_iteration': '60', 'best_score': '0.9760955890250614'}
Trial 35, Fold 3: Log loss = 0.21795609703426969, Average precision = 0.976099486575763, ROC-AUC = 0.9721864736928277, Elapsed Time = 1.8033400999993319 seconds
Trial 35, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 35, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.66301 validation-auc:0.94282 validation-aucpr:0.92186
[1] validation-logloss:0.63903 validation-auc:0.95582 validation-aucpr:0.95447
[2] validation-logloss:0.61316 validation-auc:0.96062 validation-aucpr:0.96570
[3] validation-logloss:0.59228 validation-auc:0.96298 validation-aucpr:0.96858
[4] validation-logloss:0.57368 validation-auc:0.96267 validation-aucpr:0.96830
[5] validation-logloss:0.55249 validation-auc:0.96377 validation-aucpr:0.96944
[6] validation-logloss:0.53476 validation-auc:0.96462 validation-aucpr:0.97020
[7] validation-logloss:0.51660 validation-auc:0.96565 validation-aucpr:0.97095
[8] validation-logloss:0.50056 validation-auc:0.96608 validation-aucpr:0.97129
[9] validation-logloss:0.48715 validation-auc:0.96573 validation-aucpr:0.97089
[10] validation-logloss:0.47515 validation-auc:0.96500 validation-aucpr:0.97036
[11] validation-logloss:0.46019 validation-auc:0.96547 validation-aucpr:0.97085
[12] validation-logloss:0.44636 validation-auc:0.96597 validation-aucpr:0.97138
[13] validation-logloss:0.43538 validation-auc:0.96586 validation-aucpr:0.97128
[14] validation-logloss:0.42275 validation-auc:0.96628 validation-aucpr:0.97165
[15] validation-logloss:0.41109 validation-auc:0.96656 validation-aucpr:0.97191
[16] validation-logloss:0.40068 validation-auc:0.96705 validation-aucpr:0.97228
[17] validation-logloss:0.39193 validation-auc:0.96724 validation-aucpr:0.97245
[18] validation-logloss:0.38182 validation-auc:0.96751 validation-aucpr:0.97271
[19] validation-logloss:0.37227 validation-auc:0.96781 validation-aucpr:0.97297
[20] validation-logloss:0.36498 validation-auc:0.96778 validation-aucpr:0.97296
[21] validation-logloss:0.35633 validation-auc:0.96824 validation-aucpr:0.97328
[22] validation-logloss:0.34981 validation-auc:0.96829 validation-aucpr:0.97325
[23] validation-logloss:0.34193 validation-auc:0.96864 validation-aucpr:0.97354
[24] validation-logloss:0.33579 validation-auc:0.96868 validation-aucpr:0.97353
[25] validation-logloss:0.32875 validation-auc:0.96892 validation-aucpr:0.97376
[26] validation-logloss:0.32210 validation-auc:0.96894 validation-aucpr:0.97380
[27] validation-logloss:0.31715 validation-auc:0.96884 validation-aucpr:0.97372
[28] validation-logloss:0.31096 validation-auc:0.96905 validation-aucpr:0.97388
[29] validation-logloss:0.30498 validation-auc:0.96925 validation-aucpr:0.97406
[30] validation-logloss:0.30051 validation-auc:0.96929 validation-aucpr:0.97407
[31] validation-logloss:0.29631 validation-auc:0.96926 validation-aucpr:0.97400
[32] validation-logloss:0.29247 validation-auc:0.96916 validation-aucpr:0.97391
[33] validation-logloss:0.28859 validation-auc:0.96915 validation-aucpr:0.97390
[34] validation-logloss:0.28427 validation-auc:0.96936 validation-aucpr:0.97405
[35] validation-logloss:0.28084 validation-auc:0.96915 validation-aucpr:0.97390
[36] validation-logloss:0.27657 validation-auc:0.96933 validation-aucpr:0.97406
[37] validation-logloss:0.27340 validation-auc:0.96922 validation-aucpr:0.97397
[38] validation-logloss:0.26995 validation-auc:0.96925 validation-aucpr:0.97402
[39] validation-logloss:0.26634 validation-auc:0.96928 validation-aucpr:0.97406
[40] validation-logloss:0.26382 validation-auc:0.96925 validation-aucpr:0.97404
[41] validation-logloss:0.26121 validation-auc:0.96926 validation-aucpr:0.97401
[42] validation-logloss:0.25830 validation-auc:0.96928 validation-aucpr:0.97404
[43] validation-logloss:0.25512 validation-auc:0.96937 validation-aucpr:0.97413
[44] validation-logloss:0.25287 validation-auc:0.96937 validation-aucpr:0.97412
[45] validation-logloss:0.25029 validation-auc:0.96945 validation-aucpr:0.97421
[46] validation-logloss:0.24812 validation-auc:0.96950 validation-aucpr:0.97425
[47] validation-logloss:0.24534 validation-auc:0.96955 validation-aucpr:0.97429
[48] validation-logloss:0.24248 validation-auc:0.96972 validation-aucpr:0.97445
[49] validation-logloss:0.24045 validation-auc:0.96981 validation-aucpr:0.97453
[50] validation-logloss:0.23790 validation-auc:0.97003 validation-aucpr:0.97468
[51] validation-logloss:0.23557 validation-auc:0.97013 validation-aucpr:0.97479
[52] validation-logloss:0.23350 validation-auc:0.97004 validation-aucpr:0.97473
[53] validation-logloss:0.23181 validation-auc:0.97014 validation-aucpr:0.97478
[54] validation-logloss:0.23027 validation-auc:0.97018 validation-aucpr:0.97480
[55] validation-logloss:0.22864 validation-auc:0.97021 validation-aucpr:0.97482
[56] validation-logloss:0.22677 validation-auc:0.97021 validation-aucpr:0.97483
[57] validation-logloss:0.22553 validation-auc:0.97026 validation-aucpr:0.97485
[58] validation-logloss:0.22373 validation-auc:0.97043 validation-aucpr:0.97497
[59] validation-logloss:0.22182 validation-auc:0.97058 validation-aucpr:0.97510
[60] validation-logloss:0.22012 validation-auc:0.97073 validation-aucpr:0.97523
{'best_iteration': '60', 'best_score': '0.9752255955912826'}
Trial 35, Fold 4: Log loss = 0.22011990067822448, Average precision = 0.9752247920061735, ROC-AUC = 0.9707328602664338, Elapsed Time = 1.9176872999996704 seconds
Trial 35, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 35, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.66319 validation-auc:0.94499 validation-aucpr:0.93567
[1] validation-logloss:0.63932 validation-auc:0.95795 validation-aucpr:0.96326
[2] validation-logloss:0.61763 validation-auc:0.95886 validation-aucpr:0.96384
[3] validation-logloss:0.59363 validation-auc:0.96106 validation-aucpr:0.96636
[4] validation-logloss:0.57361 validation-auc:0.96176 validation-aucpr:0.96686
[5] validation-logloss:0.55533 validation-auc:0.96193 validation-aucpr:0.96692
[6] validation-logloss:0.53569 validation-auc:0.96279 validation-aucpr:0.96770
[7] validation-logloss:0.52028 validation-auc:0.96292 validation-aucpr:0.96761
[8] validation-logloss:0.50606 validation-auc:0.96280 validation-aucpr:0.96730
[9] validation-logloss:0.49035 validation-auc:0.96285 validation-aucpr:0.96737
[10] validation-logloss:0.47752 validation-auc:0.96294 validation-aucpr:0.96783
[11] validation-logloss:0.46509 validation-auc:0.96295 validation-aucpr:0.96784
[12] validation-logloss:0.45079 validation-auc:0.96386 validation-aucpr:0.96867
[13] validation-logloss:0.43756 validation-auc:0.96434 validation-aucpr:0.96906
[14] validation-logloss:0.42528 validation-auc:0.96485 validation-aucpr:0.96943
[15] validation-logloss:0.41590 validation-auc:0.96468 validation-aucpr:0.96926
[16] validation-logloss:0.40674 validation-auc:0.96483 validation-aucpr:0.96931
[17] validation-logloss:0.39652 validation-auc:0.96496 validation-aucpr:0.96938
[18] validation-logloss:0.38922 validation-auc:0.96470 validation-aucpr:0.96923
[19] validation-logloss:0.38148 validation-auc:0.96490 validation-aucpr:0.96931
[20] validation-logloss:0.37235 validation-auc:0.96524 validation-aucpr:0.96961
[21] validation-logloss:0.36377 validation-auc:0.96566 validation-aucpr:0.96991
[22] validation-logloss:0.35592 validation-auc:0.96576 validation-aucpr:0.97002
[23] validation-logloss:0.34805 validation-auc:0.96609 validation-aucpr:0.97036
[24] validation-logloss:0.34070 validation-auc:0.96642 validation-aucpr:0.97070
[25] validation-logloss:0.33506 validation-auc:0.96651 validation-aucpr:0.97075
[26] validation-logloss:0.32833 validation-auc:0.96683 validation-aucpr:0.97101
[27] validation-logloss:0.32313 validation-auc:0.96670 validation-aucpr:0.97078
[28] validation-logloss:0.31723 validation-auc:0.96687 validation-aucpr:0.97094
[29] validation-logloss:0.31288 validation-auc:0.96674 validation-aucpr:0.97078
[30] validation-logloss:0.30735 validation-auc:0.96694 validation-aucpr:0.97096
[31] validation-logloss:0.30260 validation-auc:0.96702 validation-aucpr:0.97096
[32] validation-logloss:0.29824 validation-auc:0.96730 validation-aucpr:0.97115
[33] validation-logloss:0.29336 validation-auc:0.96755 validation-aucpr:0.97134
[34] validation-logloss:0.28961 validation-auc:0.96759 validation-aucpr:0.97137
[35] validation-logloss:0.28632 validation-auc:0.96756 validation-aucpr:0.97129
[36] validation-logloss:0.28333 validation-auc:0.96745 validation-aucpr:0.97118
[37] validation-logloss:0.27999 validation-auc:0.96759 validation-aucpr:0.97128
[38] validation-logloss:0.27603 validation-auc:0.96777 validation-aucpr:0.97154
[39] validation-logloss:0.27261 validation-auc:0.96789 validation-aucpr:0.97165
[40] validation-logloss:0.26908 validation-auc:0.96808 validation-aucpr:0.97181
[41] validation-logloss:0.26554 validation-auc:0.96833 validation-aucpr:0.97200
[42] validation-logloss:0.26292 validation-auc:0.96836 validation-aucpr:0.97204
[43] validation-logloss:0.26004 validation-auc:0.96842 validation-aucpr:0.97207
[44] validation-logloss:0.25794 validation-auc:0.96831 validation-aucpr:0.97202
[45] validation-logloss:0.25498 validation-auc:0.96855 validation-aucpr:0.97224
[46] validation-logloss:0.25197 validation-auc:0.96882 validation-aucpr:0.97245
[47] validation-logloss:0.24991 validation-auc:0.96893 validation-aucpr:0.97248
[48] validation-logloss:0.24722 validation-auc:0.96916 validation-aucpr:0.97303
[49] validation-logloss:0.24475 validation-auc:0.96933 validation-aucpr:0.97310
[50] validation-logloss:0.24300 validation-auc:0.96933 validation-aucpr:0.97310
[51] validation-logloss:0.24116 validation-auc:0.96942 validation-aucpr:0.97316
[52] validation-logloss:0.23944 validation-auc:0.96944 validation-aucpr:0.97322
[53] validation-logloss:0.23726 validation-auc:0.96960 validation-aucpr:0.97338
[54] validation-logloss:0.23515 validation-auc:0.96985 validation-aucpr:0.97373
[55] validation-logloss:0.23370 validation-auc:0.96995 validation-aucpr:0.97380
[56] validation-logloss:0.23228 validation-auc:0.96998 validation-aucpr:0.97381
[57] validation-logloss:0.23042 validation-auc:0.97012 validation-aucpr:0.97393
[58] validation-logloss:0.22864 validation-auc:0.97019 validation-aucpr:0.97394
[59] validation-logloss:0.22669 validation-auc:0.97041 validation-aucpr:0.97414
[60] validation-logloss:0.22562 validation-auc:0.97026 validation-aucpr:0.97401
{'best_iteration': '59', 'best_score': '0.9741436480585423'}
Trial 35, Fold 5: Log loss = 0.22562292321361685, Average precision = 0.9740183756693583, ROC-AUC = 0.9702630473960947, Elapsed Time = 1.9858647999972163 seconds
Optimization Progress: 36%|###6 | 36/100 [2:22:02<10:22:32, 583.63s/it]
Trial 36, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 36, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.62882 validation-auc:0.93107 validation-aucpr:0.88995
[1] validation-logloss:0.57594 validation-auc:0.95804 validation-aucpr:0.94536
[2] validation-logloss:0.53013 validation-auc:0.96362 validation-aucpr:0.96144
[3] validation-logloss:0.49113 validation-auc:0.96601 validation-aucpr:0.96749
[4] validation-logloss:0.45755 validation-auc:0.96711 validation-aucpr:0.97002
[5] validation-logloss:0.42871 validation-auc:0.96667 validation-aucpr:0.96973
[6] validation-logloss:0.40344 validation-auc:0.96731 validation-aucpr:0.97032
[7] validation-logloss:0.38041 validation-auc:0.96826 validation-aucpr:0.97101
[8] validation-logloss:0.36071 validation-auc:0.96869 validation-aucpr:0.97158
[9] validation-logloss:0.34296 validation-auc:0.96910 validation-aucpr:0.97160
[10] validation-logloss:0.32781 validation-auc:0.96963 validation-aucpr:0.97423
[11] validation-logloss:0.31392 validation-auc:0.96985 validation-aucpr:0.97439
[12] validation-logloss:0.30155 validation-auc:0.96985 validation-aucpr:0.97434
[13] validation-logloss:0.29024 validation-auc:0.97019 validation-aucpr:0.97461
[14] validation-logloss:0.28052 validation-auc:0.97022 validation-aucpr:0.97461
[15] validation-logloss:0.27164 validation-auc:0.97014 validation-aucpr:0.97456
[16] validation-logloss:0.26314 validation-auc:0.97074 validation-aucpr:0.97493
[17] validation-logloss:0.25685 validation-auc:0.97086 validation-aucpr:0.97499
[18] validation-logloss:0.24995 validation-auc:0.97108 validation-aucpr:0.97519
[19] validation-logloss:0.24424 validation-auc:0.97108 validation-aucpr:0.97519
[20] validation-logloss:0.23914 validation-auc:0.97104 validation-aucpr:0.97512
[21] validation-logloss:0.23465 validation-auc:0.97108 validation-aucpr:0.97517
[22] validation-logloss:0.23038 validation-auc:0.97117 validation-aucpr:0.97521
[23] validation-logloss:0.22614 validation-auc:0.97137 validation-aucpr:0.97528
[24] validation-logloss:0.22267 validation-auc:0.97151 validation-aucpr:0.97535
[25] validation-logloss:0.21939 validation-auc:0.97155 validation-aucpr:0.97536
[26] validation-logloss:0.21622 validation-auc:0.97172 validation-aucpr:0.97543
[27] validation-logloss:0.21372 validation-auc:0.97176 validation-aucpr:0.97543
[28] validation-logloss:0.21157 validation-auc:0.97177 validation-aucpr:0.97542
[29] validation-logloss:0.20965 validation-auc:0.97172 validation-aucpr:0.97538
[30] validation-logloss:0.20771 validation-auc:0.97182 validation-aucpr:0.97548
[31] validation-logloss:0.20587 validation-auc:0.97189 validation-aucpr:0.97554
[32] validation-logloss:0.20471 validation-auc:0.97175 validation-aucpr:0.97536
[33] validation-logloss:0.20335 validation-auc:0.97179 validation-aucpr:0.97541
[34] validation-logloss:0.20202 validation-auc:0.97200 validation-aucpr:0.97578
[35] validation-logloss:0.20077 validation-auc:0.97205 validation-aucpr:0.97586
[36] validation-logloss:0.19983 validation-auc:0.97201 validation-aucpr:0.97582
[37] validation-logloss:0.19879 validation-auc:0.97223 validation-aucpr:0.97625
[38] validation-logloss:0.19818 validation-auc:0.97228 validation-aucpr:0.97627
[39] validation-logloss:0.19741 validation-auc:0.97232 validation-aucpr:0.97628
[40] validation-logloss:0.19680 validation-auc:0.97239 validation-aucpr:0.97638
[41] validation-logloss:0.19655 validation-auc:0.97233 validation-aucpr:0.97631
[42] validation-logloss:0.19579 validation-auc:0.97242 validation-aucpr:0.97640
[43] validation-logloss:0.19570 validation-auc:0.97227 validation-aucpr:0.97630
[44] validation-logloss:0.19512 validation-auc:0.97241 validation-aucpr:0.97640
[45] validation-logloss:0.19445 validation-auc:0.97254 validation-aucpr:0.97652
[46] validation-logloss:0.19409 validation-auc:0.97253 validation-aucpr:0.97652
[47] validation-logloss:0.19421 validation-auc:0.97248 validation-aucpr:0.97653
[48] validation-logloss:0.19414 validation-auc:0.97242 validation-aucpr:0.97646
[49] validation-logloss:0.19399 validation-auc:0.97239 validation-aucpr:0.97640
[50] validation-logloss:0.19410 validation-auc:0.97238 validation-aucpr:0.97638
[51] validation-logloss:0.19412 validation-auc:0.97235 validation-aucpr:0.97633
[52] validation-logloss:0.19433 validation-auc:0.97224 validation-aucpr:0.97624
[53] validation-logloss:0.19445 validation-auc:0.97225 validation-aucpr:0.97622
[54] validation-logloss:0.19440 validation-auc:0.97225 validation-aucpr:0.97621
[55] validation-logloss:0.19401 validation-auc:0.97239 validation-aucpr:0.97628
[56] validation-logloss:0.19375 validation-auc:0.97249 validation-aucpr:0.97638
[57] validation-logloss:0.19420 validation-auc:0.97239 validation-aucpr:0.97624
[58] validation-logloss:0.19450 validation-auc:0.97230 validation-aucpr:0.97621
[59] validation-logloss:0.19450 validation-auc:0.97239 validation-aucpr:0.97631
[60] validation-logloss:0.19462 validation-auc:0.97241 validation-aucpr:0.97632
{'best_iteration': '47', 'best_score': '0.9765331381787777'}
Trial 36, Fold 1: Log loss = 0.19461938434271853, Average precision = 0.9763247913170843, ROC-AUC = 0.9724131070675786, Elapsed Time = 18.183157800001936 seconds
Trial 36, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 36, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.62898 validation-auc:0.93228 validation-aucpr:0.89005
[1] validation-logloss:0.57487 validation-auc:0.96346 validation-aucpr:0.95779
[2] validation-logloss:0.52876 validation-auc:0.96766 validation-aucpr:0.96886
[3] validation-logloss:0.48957 validation-auc:0.96903 validation-aucpr:0.97222
[4] validation-logloss:0.45558 validation-auc:0.96988 validation-aucpr:0.97319
[5] validation-logloss:0.42594 validation-auc:0.97049 validation-aucpr:0.97372
[6] validation-logloss:0.40042 validation-auc:0.97067 validation-aucpr:0.97366
[7] validation-logloss:0.37774 validation-auc:0.97056 validation-aucpr:0.97366
[8] validation-logloss:0.35764 validation-auc:0.97058 validation-aucpr:0.97373
[9] validation-logloss:0.33995 validation-auc:0.97072 validation-aucpr:0.97388
[10] validation-logloss:0.32435 validation-auc:0.97072 validation-aucpr:0.97397
[11] validation-logloss:0.31019 validation-auc:0.97094 validation-aucpr:0.97417
[12] validation-logloss:0.29750 validation-auc:0.97092 validation-aucpr:0.97415
[13] validation-logloss:0.28614 validation-auc:0.97111 validation-aucpr:0.97431
[14] validation-logloss:0.27602 validation-auc:0.97132 validation-aucpr:0.97448
[15] validation-logloss:0.26686 validation-auc:0.97143 validation-aucpr:0.97457
[16] validation-logloss:0.25854 validation-auc:0.97170 validation-aucpr:0.97473
[17] validation-logloss:0.25114 validation-auc:0.97165 validation-aucpr:0.97469
[18] validation-logloss:0.24463 validation-auc:0.97166 validation-aucpr:0.97462
[19] validation-logloss:0.23913 validation-auc:0.97150 validation-aucpr:0.97448
[20] validation-logloss:0.23364 validation-auc:0.97162 validation-aucpr:0.97449
[21] validation-logloss:0.22870 validation-auc:0.97175 validation-aucpr:0.97463
[22] validation-logloss:0.22466 validation-auc:0.97171 validation-aucpr:0.97453
[23] validation-logloss:0.22048 validation-auc:0.97193 validation-aucpr:0.97480
[24] validation-logloss:0.21650 validation-auc:0.97216 validation-aucpr:0.97498
[25] validation-logloss:0.21295 validation-auc:0.97230 validation-aucpr:0.97510
[26] validation-logloss:0.20982 validation-auc:0.97233 validation-aucpr:0.97499
[27] validation-logloss:0.20727 validation-auc:0.97226 validation-aucpr:0.97492
[28] validation-logloss:0.20469 validation-auc:0.97229 validation-aucpr:0.97496
[29] validation-logloss:0.20227 validation-auc:0.97250 validation-aucpr:0.97504
[30] validation-logloss:0.20029 validation-auc:0.97258 validation-aucpr:0.97501
[31] validation-logloss:0.19834 validation-auc:0.97273 validation-aucpr:0.97508
[32] validation-logloss:0.19651 validation-auc:0.97279 validation-aucpr:0.97512
[33] validation-logloss:0.19503 validation-auc:0.97273 validation-aucpr:0.97508
[34] validation-logloss:0.19346 validation-auc:0.97283 validation-aucpr:0.97515
[35] validation-logloss:0.19209 validation-auc:0.97288 validation-aucpr:0.97517
[36] validation-logloss:0.19092 validation-auc:0.97298 validation-aucpr:0.97537
[37] validation-logloss:0.19027 validation-auc:0.97285 validation-aucpr:0.97520
[38] validation-logloss:0.18949 validation-auc:0.97287 validation-aucpr:0.97527
[39] validation-logloss:0.18834 validation-auc:0.97303 validation-aucpr:0.97545
[40] validation-logloss:0.18734 validation-auc:0.97313 validation-aucpr:0.97552
[41] validation-logloss:0.18682 validation-auc:0.97300 validation-aucpr:0.97523
[42] validation-logloss:0.18600 validation-auc:0.97300 validation-aucpr:0.97527
[43] validation-logloss:0.18513 validation-auc:0.97312 validation-aucpr:0.97511
[44] validation-logloss:0.18442 validation-auc:0.97317 validation-aucpr:0.97514
[45] validation-logloss:0.18417 validation-auc:0.97312 validation-aucpr:0.97512
[46] validation-logloss:0.18365 validation-auc:0.97311 validation-aucpr:0.97518
[47] validation-logloss:0.18316 validation-auc:0.97310 validation-aucpr:0.97513
[48] validation-logloss:0.18263 validation-auc:0.97319 validation-aucpr:0.97527
[49] validation-logloss:0.18203 validation-auc:0.97330 validation-aucpr:0.97539
[50] validation-logloss:0.18156 validation-auc:0.97351 validation-aucpr:0.97596
[51] validation-logloss:0.18130 validation-auc:0.97356 validation-aucpr:0.97603
[52] validation-logloss:0.18101 validation-auc:0.97360 validation-aucpr:0.97599
[53] validation-logloss:0.18080 validation-auc:0.97366 validation-aucpr:0.97597
[54] validation-logloss:0.18062 validation-auc:0.97370 validation-aucpr:0.97594
[55] validation-logloss:0.18033 validation-auc:0.97374 validation-aucpr:0.97601
[56] validation-logloss:0.18008 validation-auc:0.97378 validation-aucpr:0.97660
[57] validation-logloss:0.17958 validation-auc:0.97388 validation-aucpr:0.97668
[58] validation-logloss:0.18019 validation-auc:0.97355 validation-aucpr:0.97621
[59] validation-logloss:0.18002 validation-auc:0.97355 validation-aucpr:0.97620
[60] validation-logloss:0.17990 validation-auc:0.97354 validation-aucpr:0.97615
{'best_iteration': '57', 'best_score': '0.9766846401509044'}
Trial 36, Fold 2: Log loss = 0.17990457088399403, Average precision = 0.9761546471904696, ROC-AUC = 0.9735385246655404, Elapsed Time = 18.41648890000215 seconds
Trial 36, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 36, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.62844 validation-auc:0.93154 validation-aucpr:0.89432
[1] validation-logloss:0.57382 validation-auc:0.96068 validation-aucpr:0.95541
[2] validation-logloss:0.52780 validation-auc:0.96813 validation-aucpr:0.97256
[3] validation-logloss:0.48854 validation-auc:0.96884 validation-aucpr:0.97302
[4] validation-logloss:0.45469 validation-auc:0.96933 validation-aucpr:0.97345
[5] validation-logloss:0.42741 validation-auc:0.97040 validation-aucpr:0.97430
[6] validation-logloss:0.40123 validation-auc:0.97079 validation-aucpr:0.97480
[7] validation-logloss:0.37826 validation-auc:0.97124 validation-aucpr:0.97489
[8] validation-logloss:0.35859 validation-auc:0.97111 validation-aucpr:0.97500
[9] validation-logloss:0.34059 validation-auc:0.97112 validation-aucpr:0.97494
[10] validation-logloss:0.32423 validation-auc:0.97145 validation-aucpr:0.97517
[11] validation-logloss:0.30996 validation-auc:0.97150 validation-aucpr:0.97527
[12] validation-logloss:0.29720 validation-auc:0.97180 validation-aucpr:0.97552
[13] validation-logloss:0.28679 validation-auc:0.97161 validation-aucpr:0.97450
[14] validation-logloss:0.27664 validation-auc:0.97167 validation-aucpr:0.97471
[15] validation-logloss:0.26764 validation-auc:0.97175 validation-aucpr:0.97496
[16] validation-logloss:0.25914 validation-auc:0.97208 validation-aucpr:0.97522
[17] validation-logloss:0.25169 validation-auc:0.97210 validation-aucpr:0.97526
[18] validation-logloss:0.24488 validation-auc:0.97216 validation-aucpr:0.97512
[19] validation-logloss:0.23906 validation-auc:0.97217 validation-aucpr:0.97505
[20] validation-logloss:0.23373 validation-auc:0.97189 validation-aucpr:0.97315
[21] validation-logloss:0.22861 validation-auc:0.97207 validation-aucpr:0.97326
[22] validation-logloss:0.22423 validation-auc:0.97196 validation-aucpr:0.97310
[23] validation-logloss:0.22045 validation-auc:0.97197 validation-aucpr:0.97317
[24] validation-logloss:0.21691 validation-auc:0.97202 validation-aucpr:0.97319
[25] validation-logloss:0.21342 validation-auc:0.97227 validation-aucpr:0.97396
[26] validation-logloss:0.21050 validation-auc:0.97236 validation-aucpr:0.97406
[27] validation-logloss:0.20817 validation-auc:0.97228 validation-aucpr:0.97395
[28] validation-logloss:0.20590 validation-auc:0.97235 validation-aucpr:0.97431
[29] validation-logloss:0.20358 validation-auc:0.97249 validation-aucpr:0.97441
[30] validation-logloss:0.20130 validation-auc:0.97263 validation-aucpr:0.97454
[31] validation-logloss:0.19972 validation-auc:0.97261 validation-aucpr:0.97452
[32] validation-logloss:0.19800 validation-auc:0.97279 validation-aucpr:0.97463
[33] validation-logloss:0.19725 validation-auc:0.97258 validation-aucpr:0.97449
[34] validation-logloss:0.19583 validation-auc:0.97276 validation-aucpr:0.97495
[35] validation-logloss:0.19457 validation-auc:0.97283 validation-aucpr:0.97506
[36] validation-logloss:0.19368 validation-auc:0.97285 validation-aucpr:0.97503
[37] validation-logloss:0.19274 validation-auc:0.97293 validation-aucpr:0.97504
[38] validation-logloss:0.19166 validation-auc:0.97299 validation-aucpr:0.97507
[39] validation-logloss:0.19095 validation-auc:0.97301 validation-aucpr:0.97509
[40] validation-logloss:0.18987 validation-auc:0.97312 validation-aucpr:0.97531
[41] validation-logloss:0.18945 validation-auc:0.97303 validation-aucpr:0.97512
[42] validation-logloss:0.18894 validation-auc:0.97305 validation-aucpr:0.97495
[43] validation-logloss:0.18837 validation-auc:0.97302 validation-aucpr:0.97489
[44] validation-logloss:0.18801 validation-auc:0.97299 validation-aucpr:0.97490
[45] validation-logloss:0.18761 validation-auc:0.97310 validation-aucpr:0.97511
[46] validation-logloss:0.18711 validation-auc:0.97321 validation-aucpr:0.97525
[47] validation-logloss:0.18673 validation-auc:0.97330 validation-aucpr:0.97523
[48] validation-logloss:0.18641 validation-auc:0.97339 validation-aucpr:0.97572
[49] validation-logloss:0.18576 validation-auc:0.97354 validation-aucpr:0.97591
[50] validation-logloss:0.18567 validation-auc:0.97354 validation-aucpr:0.97591
[51] validation-logloss:0.18538 validation-auc:0.97364 validation-aucpr:0.97572
[52] validation-logloss:0.18547 validation-auc:0.97360 validation-aucpr:0.97567
[53] validation-logloss:0.18523 validation-auc:0.97369 validation-aucpr:0.97577
[54] validation-logloss:0.18514 validation-auc:0.97369 validation-aucpr:0.97578
[55] validation-logloss:0.18505 validation-auc:0.97380 validation-aucpr:0.97587
[56] validation-logloss:0.18471 validation-auc:0.97397 validation-aucpr:0.97609
[57] validation-logloss:0.18473 validation-auc:0.97393 validation-aucpr:0.97602
[58] validation-logloss:0.18511 validation-auc:0.97378 validation-aucpr:0.97558
[59] validation-logloss:0.18507 validation-auc:0.97396 validation-aucpr:0.97735
[60] validation-logloss:0.18504 validation-auc:0.97401 validation-aucpr:0.97734
{'best_iteration': '59', 'best_score': '0.9773455256355663'}
Trial 36, Fold 3: Log loss = 0.1850426326602382, Average precision = 0.9773415631367709, ROC-AUC = 0.9740058364072388, Elapsed Time = 17.800311499999225 seconds
Trial 36, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 36, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.62889 validation-auc:0.92818 validation-aucpr:0.88664
[1] validation-logloss:0.57557 validation-auc:0.95670 validation-aucpr:0.94196
[2] validation-logloss:0.52984 validation-auc:0.96495 validation-aucpr:0.96483
[3] validation-logloss:0.49059 validation-auc:0.96722 validation-aucpr:0.96900
[4] validation-logloss:0.45661 validation-auc:0.96855 validation-aucpr:0.97389
[5] validation-logloss:0.42682 validation-auc:0.96938 validation-aucpr:0.97416
[6] validation-logloss:0.40070 validation-auc:0.97000 validation-aucpr:0.97468
[7] validation-logloss:0.38005 validation-auc:0.97005 validation-aucpr:0.97475
[8] validation-logloss:0.35980 validation-auc:0.97060 validation-aucpr:0.97517
[9] validation-logloss:0.34155 validation-auc:0.97098 validation-aucpr:0.97542
[10] validation-logloss:0.32525 validation-auc:0.97119 validation-aucpr:0.97562
[11] validation-logloss:0.31127 validation-auc:0.97120 validation-aucpr:0.97557
[12] validation-logloss:0.29845 validation-auc:0.97163 validation-aucpr:0.97585
[13] validation-logloss:0.28705 validation-auc:0.97167 validation-aucpr:0.97435
[14] validation-logloss:0.27739 validation-auc:0.97151 validation-aucpr:0.97415
[15] validation-logloss:0.26812 validation-auc:0.97156 validation-aucpr:0.97417
[16] validation-logloss:0.25988 validation-auc:0.97156 validation-aucpr:0.97405
[17] validation-logloss:0.25240 validation-auc:0.97166 validation-aucpr:0.97375
[18] validation-logloss:0.24577 validation-auc:0.97173 validation-aucpr:0.97380
[19] validation-logloss:0.23970 validation-auc:0.97188 validation-aucpr:0.97390
[20] validation-logloss:0.23487 validation-auc:0.97200 validation-aucpr:0.97398
[21] validation-logloss:0.23005 validation-auc:0.97186 validation-aucpr:0.97388
[22] validation-logloss:0.22576 validation-auc:0.97176 validation-aucpr:0.97353
[23] validation-logloss:0.22147 validation-auc:0.97199 validation-aucpr:0.97370
[24] validation-logloss:0.21736 validation-auc:0.97224 validation-aucpr:0.97387
[25] validation-logloss:0.21437 validation-auc:0.97204 validation-aucpr:0.97371
[26] validation-logloss:0.21144 validation-auc:0.97198 validation-aucpr:0.97367
[27] validation-logloss:0.20868 validation-auc:0.97205 validation-aucpr:0.97368
[28] validation-logloss:0.20614 validation-auc:0.97210 validation-aucpr:0.97372
[29] validation-logloss:0.20383 validation-auc:0.97222 validation-aucpr:0.97380
[30] validation-logloss:0.20181 validation-auc:0.97223 validation-aucpr:0.97380
[31] validation-logloss:0.19971 validation-auc:0.97237 validation-aucpr:0.97392
[32] validation-logloss:0.19797 validation-auc:0.97244 validation-aucpr:0.97425
[33] validation-logloss:0.19667 validation-auc:0.97242 validation-aucpr:0.97430
[34] validation-logloss:0.19558 validation-auc:0.97236 validation-aucpr:0.97434
[35] validation-logloss:0.19445 validation-auc:0.97242 validation-aucpr:0.97640
[36] validation-logloss:0.19343 validation-auc:0.97248 validation-aucpr:0.97652
[37] validation-logloss:0.19229 validation-auc:0.97253 validation-aucpr:0.97656
[38] validation-logloss:0.19126 validation-auc:0.97263 validation-aucpr:0.97665
[39] validation-logloss:0.19080 validation-auc:0.97242 validation-aucpr:0.97648
[40] validation-logloss:0.19028 validation-auc:0.97235 validation-aucpr:0.97640
[41] validation-logloss:0.18969 validation-auc:0.97239 validation-aucpr:0.97640
[42] validation-logloss:0.18913 validation-auc:0.97244 validation-aucpr:0.97642
[43] validation-logloss:0.18836 validation-auc:0.97256 validation-aucpr:0.97652
[44] validation-logloss:0.18776 validation-auc:0.97269 validation-aucpr:0.97659
[45] validation-logloss:0.18727 validation-auc:0.97274 validation-aucpr:0.97663
[46] validation-logloss:0.18678 validation-auc:0.97283 validation-aucpr:0.97670
[47] validation-logloss:0.18671 validation-auc:0.97279 validation-aucpr:0.97666
[48] validation-logloss:0.18654 validation-auc:0.97284 validation-aucpr:0.97667
[49] validation-logloss:0.18618 validation-auc:0.97294 validation-aucpr:0.97671
[50] validation-logloss:0.18629 validation-auc:0.97288 validation-aucpr:0.97665
[51] validation-logloss:0.18620 validation-auc:0.97290 validation-aucpr:0.97666
[52] validation-logloss:0.18590 validation-auc:0.97292 validation-aucpr:0.97668
[53] validation-logloss:0.18561 validation-auc:0.97298 validation-aucpr:0.97678
[54] validation-logloss:0.18563 validation-auc:0.97295 validation-aucpr:0.97675
[55] validation-logloss:0.18602 validation-auc:0.97286 validation-aucpr:0.97670
[56] validation-logloss:0.18585 validation-auc:0.97298 validation-aucpr:0.97678
[57] validation-logloss:0.18564 validation-auc:0.97307 validation-aucpr:0.97687
[58] validation-logloss:0.18598 validation-auc:0.97298 validation-aucpr:0.97682
[59] validation-logloss:0.18606 validation-auc:0.97296 validation-aucpr:0.97680
[60] validation-logloss:0.18627 validation-auc:0.97295 validation-aucpr:0.97680
{'best_iteration': '57', 'best_score': '0.9768669727658384'}
Trial 36, Fold 4: Log loss = 0.1862653102870968, Average precision = 0.976799427274251, ROC-AUC = 0.9729520751518272, Elapsed Time = 18.60203040000124 seconds
Trial 36, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 36, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.62917 validation-auc:0.92813 validation-aucpr:0.88436
[1] validation-logloss:0.57570 validation-auc:0.95772 validation-aucpr:0.94855
[2] validation-logloss:0.53065 validation-auc:0.96252 validation-aucpr:0.96071
[3] validation-logloss:0.49165 validation-auc:0.96473 validation-aucpr:0.96366
[4] validation-logloss:0.45789 validation-auc:0.96504 validation-aucpr:0.96325
[5] validation-logloss:0.42881 validation-auc:0.96618 validation-aucpr:0.96591
[6] validation-logloss:0.40314 validation-auc:0.96712 validation-aucpr:0.96937
[7] validation-logloss:0.38033 validation-auc:0.96801 validation-aucpr:0.96974
[8] validation-logloss:0.36129 validation-auc:0.96796 validation-aucpr:0.96962
[9] validation-logloss:0.34395 validation-auc:0.96835 validation-aucpr:0.97219
[10] validation-logloss:0.32870 validation-auc:0.96836 validation-aucpr:0.97209
[11] validation-logloss:0.31462 validation-auc:0.96888 validation-aucpr:0.97246
[12] validation-logloss:0.30195 validation-auc:0.96961 validation-aucpr:0.97300
[13] validation-logloss:0.29093 validation-auc:0.97000 validation-aucpr:0.97331
[14] validation-logloss:0.28113 validation-auc:0.97029 validation-aucpr:0.97357
[15] validation-logloss:0.27252 validation-auc:0.97024 validation-aucpr:0.97352
[16] validation-logloss:0.26465 validation-auc:0.97028 validation-aucpr:0.97357
[17] validation-logloss:0.25799 validation-auc:0.97019 validation-aucpr:0.97349
[18] validation-logloss:0.25179 validation-auc:0.97000 validation-aucpr:0.97252
[19] validation-logloss:0.24679 validation-auc:0.97000 validation-aucpr:0.97260
[20] validation-logloss:0.24130 validation-auc:0.97029 validation-aucpr:0.97258
[21] validation-logloss:0.23693 validation-auc:0.97021 validation-aucpr:0.97252
[22] validation-logloss:0.23325 validation-auc:0.97013 validation-aucpr:0.97247
[23] validation-logloss:0.22951 validation-auc:0.97020 validation-aucpr:0.97251
[24] validation-logloss:0.22572 validation-auc:0.97045 validation-aucpr:0.97275
[25] validation-logloss:0.22259 validation-auc:0.97041 validation-aucpr:0.97277
[26] validation-logloss:0.21962 validation-auc:0.97073 validation-aucpr:0.97409
[27] validation-logloss:0.21706 validation-auc:0.97084 validation-aucpr:0.97419
[28] validation-logloss:0.21488 validation-auc:0.97077 validation-aucpr:0.97412
[29] validation-logloss:0.21306 validation-auc:0.97066 validation-aucpr:0.97405
[30] validation-logloss:0.21141 validation-auc:0.97053 validation-aucpr:0.97344
[31] validation-logloss:0.20985 validation-auc:0.97051 validation-aucpr:0.97341
[32] validation-logloss:0.20822 validation-auc:0.97067 validation-aucpr:0.97353
[33] validation-logloss:0.20678 validation-auc:0.97080 validation-aucpr:0.97370
[34] validation-logloss:0.20558 validation-auc:0.97089 validation-aucpr:0.97379
[35] validation-logloss:0.20438 validation-auc:0.97098 validation-aucpr:0.97347
[36] validation-logloss:0.20380 validation-auc:0.97084 validation-aucpr:0.97332
[37] validation-logloss:0.20259 validation-auc:0.97101 validation-aucpr:0.97333
[38] validation-logloss:0.20168 validation-auc:0.97101 validation-aucpr:0.97337
[39] validation-logloss:0.20099 validation-auc:0.97101 validation-aucpr:0.97325
[40] validation-logloss:0.20020 validation-auc:0.97102 validation-aucpr:0.97325
[41] validation-logloss:0.19923 validation-auc:0.97115 validation-aucpr:0.97317
[42] validation-logloss:0.19866 validation-auc:0.97124 validation-aucpr:0.97408
[43] validation-logloss:0.19784 validation-auc:0.97137 validation-aucpr:0.97409
[44] validation-logloss:0.19775 validation-auc:0.97125 validation-aucpr:0.97398
[45] validation-logloss:0.19728 validation-auc:0.97129 validation-aucpr:0.97399
[46] validation-logloss:0.19669 validation-auc:0.97150 validation-aucpr:0.97428
[47] validation-logloss:0.19636 validation-auc:0.97148 validation-aucpr:0.97428
[48] validation-logloss:0.19622 validation-auc:0.97153 validation-aucpr:0.97507
[49] validation-logloss:0.19650 validation-auc:0.97137 validation-aucpr:0.97492
[50] validation-logloss:0.19648 validation-auc:0.97135 validation-aucpr:0.97486
[51] validation-logloss:0.19648 validation-auc:0.97133 validation-aucpr:0.97480
[52] validation-logloss:0.19646 validation-auc:0.97132 validation-aucpr:0.97469
[53] validation-logloss:0.19640 validation-auc:0.97121 validation-aucpr:0.97454
[54] validation-logloss:0.19612 validation-auc:0.97132 validation-aucpr:0.97448
[55] validation-logloss:0.19625 validation-auc:0.97128 validation-aucpr:0.97444
[56] validation-logloss:0.19652 validation-auc:0.97122 validation-aucpr:0.97430
[57] validation-logloss:0.19664 validation-auc:0.97123 validation-aucpr:0.97405
[58] validation-logloss:0.19649 validation-auc:0.97129 validation-aucpr:0.97404
[59] validation-logloss:0.19664 validation-auc:0.97132 validation-aucpr:0.97415
[60] validation-logloss:0.19683 validation-auc:0.97129 validation-aucpr:0.97414
{'best_iteration': '48', 'best_score': '0.975065095160132'}
Trial 36, Fold 5: Log loss = 0.1968293727947802, Average precision = 0.9741463844987054, ROC-AUC = 0.9712943537149547, Elapsed Time = 19.465487499997835 seconds
Optimization Progress: 37%|###7 | 37/100 [2:23:42<7:40:34, 438.64s/it]
Trial 37, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 37, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.65316 validation-auc:0.94886 validation-aucpr:0.95153
[1] validation-logloss:0.61796 validation-auc:0.95469 validation-aucpr:0.95909
[2] validation-logloss:0.58590 validation-auc:0.95620 validation-aucpr:0.96087
[3] validation-logloss:0.55682 validation-auc:0.95743 validation-aucpr:0.96198
[4] validation-logloss:0.53131 validation-auc:0.95800 validation-aucpr:0.96257
[5] validation-logloss:0.50323 validation-auc:0.96314 validation-aucpr:0.96681
[6] validation-logloss:0.48294 validation-auc:0.96276 validation-aucpr:0.96647
[7] validation-logloss:0.46407 validation-auc:0.96293 validation-aucpr:0.96638
[8] validation-logloss:0.44304 validation-auc:0.96503 validation-aucpr:0.96855
[9] validation-logloss:0.42751 validation-auc:0.96508 validation-aucpr:0.96846
[10] validation-logloss:0.40959 validation-auc:0.96634 validation-aucpr:0.96977
[11] validation-logloss:0.39313 validation-auc:0.96718 validation-aucpr:0.97054
[12] validation-logloss:0.37775 validation-auc:0.96762 validation-aucpr:0.97098
[13] validation-logloss:0.36383 validation-auc:0.96812 validation-aucpr:0.97195
[14] validation-logloss:0.35081 validation-auc:0.96847 validation-aucpr:0.97236
[15] validation-logloss:0.34183 validation-auc:0.96840 validation-aucpr:0.97217
[16] validation-logloss:0.33367 validation-auc:0.96803 validation-aucpr:0.97206
[17] validation-logloss:0.32332 validation-auc:0.96854 validation-aucpr:0.97373
[18] validation-logloss:0.31365 validation-auc:0.96890 validation-aucpr:0.97402
[19] validation-logloss:0.30657 validation-auc:0.96890 validation-aucpr:0.97403
[20] validation-logloss:0.30042 validation-auc:0.96882 validation-aucpr:0.97400
[21] validation-logloss:0.29247 validation-auc:0.96911 validation-aucpr:0.97426
[22] validation-logloss:0.28734 validation-auc:0.96910 validation-aucpr:0.97421
[23] validation-logloss:0.28114 validation-auc:0.96921 validation-aucpr:0.97428
[24] validation-logloss:0.27623 validation-auc:0.96925 validation-aucpr:0.97427
[25] validation-logloss:0.27208 validation-auc:0.96917 validation-aucpr:0.97424
[26] validation-logloss:0.26751 validation-auc:0.96931 validation-aucpr:0.97434
[27] validation-logloss:0.26231 validation-auc:0.96947 validation-aucpr:0.97449
[28] validation-logloss:0.25727 validation-auc:0.96966 validation-aucpr:0.97469
[29] validation-logloss:0.25408 validation-auc:0.96966 validation-aucpr:0.97465
[30] validation-logloss:0.25026 validation-auc:0.96980 validation-aucpr:0.97476
[31] validation-logloss:0.24728 validation-auc:0.96987 validation-aucpr:0.97480
[32] validation-logloss:0.24335 validation-auc:0.97005 validation-aucpr:0.97494
[33] validation-logloss:0.24098 validation-auc:0.97000 validation-aucpr:0.97492
[34] validation-logloss:0.23906 validation-auc:0.96995 validation-aucpr:0.97486
[35] validation-logloss:0.23669 validation-auc:0.96999 validation-aucpr:0.97487
[36] validation-logloss:0.23455 validation-auc:0.97005 validation-aucpr:0.97490
[37] validation-logloss:0.23129 validation-auc:0.97024 validation-aucpr:0.97507
[38] validation-logloss:0.22958 validation-auc:0.97020 validation-aucpr:0.97502
[39] validation-logloss:0.22691 validation-auc:0.97034 validation-aucpr:0.97516
[40] validation-logloss:0.22449 validation-auc:0.97043 validation-aucpr:0.97523
[41] validation-logloss:0.22303 validation-auc:0.97039 validation-aucpr:0.97519
[42] validation-logloss:0.22078 validation-auc:0.97051 validation-aucpr:0.97528
[43] validation-logloss:0.21867 validation-auc:0.97061 validation-aucpr:0.97539
[44] validation-logloss:0.21678 validation-auc:0.97078 validation-aucpr:0.97555
[45] validation-logloss:0.21507 validation-auc:0.97090 validation-aucpr:0.97568
[46] validation-logloss:0.21394 validation-auc:0.97096 validation-aucpr:0.97570
[47] validation-logloss:0.21287 validation-auc:0.97096 validation-aucpr:0.97569
[48] validation-logloss:0.21220 validation-auc:0.97092 validation-aucpr:0.97565
[49] validation-logloss:0.21063 validation-auc:0.97104 validation-aucpr:0.97573
[50] validation-logloss:0.20909 validation-auc:0.97112 validation-aucpr:0.97581
[51] validation-logloss:0.20830 validation-auc:0.97112 validation-aucpr:0.97580
[52] validation-logloss:0.20754 validation-auc:0.97112 validation-aucpr:0.97578
[53] validation-logloss:0.20697 validation-auc:0.97113 validation-aucpr:0.97579
[54] validation-logloss:0.20648 validation-auc:0.97113 validation-aucpr:0.97577
[55] validation-logloss:0.20582 validation-auc:0.97117 validation-aucpr:0.97579
[56] validation-logloss:0.20521 validation-auc:0.97121 validation-aucpr:0.97579
[57] validation-logloss:0.20455 validation-auc:0.97127 validation-aucpr:0.97584
[58] validation-logloss:0.20388 validation-auc:0.97137 validation-aucpr:0.97591
[59] validation-logloss:0.20330 validation-auc:0.97140 validation-aucpr:0.97592
[60] validation-logloss:0.20250 validation-auc:0.97143 validation-aucpr:0.97597
[61] validation-logloss:0.20166 validation-auc:0.97152 validation-aucpr:0.97604
[62] validation-logloss:0.20138 validation-auc:0.97149 validation-aucpr:0.97600
[63] validation-logloss:0.20084 validation-auc:0.97160 validation-aucpr:0.97606
[64] validation-logloss:0.20045 validation-auc:0.97162 validation-aucpr:0.97607
[65] validation-logloss:0.20002 validation-auc:0.97167 validation-aucpr:0.97609
[66] validation-logloss:0.19963 validation-auc:0.97170 validation-aucpr:0.97611
[67] validation-logloss:0.19927 validation-auc:0.97172 validation-aucpr:0.97610
[68] validation-logloss:0.19852 validation-auc:0.97177 validation-aucpr:0.97614
[69] validation-logloss:0.19784 validation-auc:0.97186 validation-aucpr:0.97622
[70] validation-logloss:0.19714 validation-auc:0.97193 validation-aucpr:0.97628
[71] validation-logloss:0.19693 validation-auc:0.97194 validation-aucpr:0.97629
[72] validation-logloss:0.19627 validation-auc:0.97200 validation-aucpr:0.97633
[73] validation-logloss:0.19613 validation-auc:0.97204 validation-aucpr:0.97636
[74] validation-logloss:0.19555 validation-auc:0.97212 validation-aucpr:0.97644
[75] validation-logloss:0.19499 validation-auc:0.97220 validation-aucpr:0.97650
[76] validation-logloss:0.19451 validation-auc:0.97224 validation-aucpr:0.97655
[77] validation-logloss:0.19434 validation-auc:0.97226 validation-aucpr:0.97657
[78] validation-logloss:0.19393 validation-auc:0.97230 validation-aucpr:0.97662
[79] validation-logloss:0.19369 validation-auc:0.97232 validation-aucpr:0.97662
{'best_iteration': '79', 'best_score': '0.976617268810572'}
Trial 37, Fold 1: Log loss = 0.19369070806982336, Average precision = 0.9766212346890932, ROC-AUC = 0.9723205025260129, Elapsed Time = 2.6088954000006197 seconds
Trial 37, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 37, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.65269 validation-auc:0.94884 validation-aucpr:0.95081
[1] validation-logloss:0.61671 validation-auc:0.95616 validation-aucpr:0.95908
[2] validation-logloss:0.58483 validation-auc:0.95774 validation-aucpr:0.96032
[3] validation-logloss:0.55442 validation-auc:0.96198 validation-aucpr:0.96484
[4] validation-logloss:0.52515 validation-auc:0.96510 validation-aucpr:0.96840
[5] validation-logloss:0.50258 validation-auc:0.96459 validation-aucpr:0.96786
[6] validation-logloss:0.48280 validation-auc:0.96415 validation-aucpr:0.96738
[7] validation-logloss:0.46364 validation-auc:0.96450 validation-aucpr:0.96800
[8] validation-logloss:0.44581 validation-auc:0.96446 validation-aucpr:0.96775
[9] validation-logloss:0.42836 validation-auc:0.96536 validation-aucpr:0.96883
[10] validation-logloss:0.41042 validation-auc:0.96686 validation-aucpr:0.97040
[11] validation-logloss:0.39408 validation-auc:0.96753 validation-aucpr:0.97108
[12] validation-logloss:0.37884 validation-auc:0.96815 validation-aucpr:0.97168
[13] validation-logloss:0.36442 validation-auc:0.96865 validation-aucpr:0.97220
[14] validation-logloss:0.35229 validation-auc:0.96899 validation-aucpr:0.97246
[15] validation-logloss:0.34173 validation-auc:0.96890 validation-aucpr:0.97240
[16] validation-logloss:0.33220 validation-auc:0.96898 validation-aucpr:0.97242
[17] validation-logloss:0.32436 validation-auc:0.96889 validation-aucpr:0.97233
[18] validation-logloss:0.31710 validation-auc:0.96893 validation-aucpr:0.97228
[19] validation-logloss:0.30827 validation-auc:0.96933 validation-aucpr:0.97263
[20] validation-logloss:0.30015 validation-auc:0.96961 validation-aucpr:0.97283
[21] validation-logloss:0.29428 validation-auc:0.96960 validation-aucpr:0.97278
[22] validation-logloss:0.28852 validation-auc:0.96971 validation-aucpr:0.97285
[23] validation-logloss:0.28341 validation-auc:0.96957 validation-aucpr:0.97274
[24] validation-logloss:0.27684 validation-auc:0.96988 validation-aucpr:0.97303
[25] validation-logloss:0.27056 validation-auc:0.97009 validation-aucpr:0.97321
[26] validation-logloss:0.26590 validation-auc:0.97023 validation-aucpr:0.97329
[27] validation-logloss:0.26199 validation-auc:0.97022 validation-aucpr:0.97329
[28] validation-logloss:0.25691 validation-auc:0.97045 validation-aucpr:0.97345
[29] validation-logloss:0.25338 validation-auc:0.97046 validation-aucpr:0.97348
[30] validation-logloss:0.25018 validation-auc:0.97051 validation-aucpr:0.97350
[31] validation-logloss:0.24555 validation-auc:0.97080 validation-aucpr:0.97374
[32] validation-logloss:0.24275 validation-auc:0.97081 validation-aucpr:0.97372
[33] validation-logloss:0.24004 validation-auc:0.97085 validation-aucpr:0.97374
[34] validation-logloss:0.23790 validation-auc:0.97084 validation-aucpr:0.97373
[35] validation-logloss:0.23430 validation-auc:0.97102 validation-aucpr:0.97389
[36] validation-logloss:0.23130 validation-auc:0.97113 validation-aucpr:0.97243
[37] validation-logloss:0.22800 validation-auc:0.97141 validation-aucpr:0.97264
[38] validation-logloss:0.22620 validation-auc:0.97139 validation-aucpr:0.97260
[39] validation-logloss:0.22448 validation-auc:0.97149 validation-aucpr:0.97255
[40] validation-logloss:0.22293 validation-auc:0.97145 validation-aucpr:0.97251
[41] validation-logloss:0.22053 validation-auc:0.97158 validation-aucpr:0.97260
[42] validation-logloss:0.21903 validation-auc:0.97162 validation-aucpr:0.97462
[43] validation-logloss:0.21649 validation-auc:0.97178 validation-aucpr:0.97475
[44] validation-logloss:0.21429 validation-auc:0.97185 validation-aucpr:0.97482
[45] validation-logloss:0.21316 validation-auc:0.97183 validation-aucpr:0.97476
[46] validation-logloss:0.21103 validation-auc:0.97198 validation-aucpr:0.97488
[47] validation-logloss:0.20989 validation-auc:0.97198 validation-aucpr:0.97487
[48] validation-logloss:0.20877 validation-auc:0.97207 validation-aucpr:0.97491
[49] validation-logloss:0.20703 validation-auc:0.97215 validation-aucpr:0.97499
[50] validation-logloss:0.20558 validation-auc:0.97222 validation-aucpr:0.97492
[51] validation-logloss:0.20405 validation-auc:0.97235 validation-aucpr:0.97502
[52] validation-logloss:0.20265 validation-auc:0.97240 validation-aucpr:0.97506
[53] validation-logloss:0.20200 validation-auc:0.97240 validation-aucpr:0.97505
[54] validation-logloss:0.20119 validation-auc:0.97244 validation-aucpr:0.97508
[55] validation-logloss:0.19999 validation-auc:0.97253 validation-aucpr:0.97515
[56] validation-logloss:0.19909 validation-auc:0.97258 validation-aucpr:0.97519
[57] validation-logloss:0.19842 validation-auc:0.97264 validation-aucpr:0.97513
[58] validation-logloss:0.19732 validation-auc:0.97272 validation-aucpr:0.97518
[59] validation-logloss:0.19639 validation-auc:0.97275 validation-aucpr:0.97519
[60] validation-logloss:0.19557 validation-auc:0.97280 validation-aucpr:0.97522
[61] validation-logloss:0.19479 validation-auc:0.97276 validation-aucpr:0.97507
[62] validation-logloss:0.19421 validation-auc:0.97291 validation-aucpr:0.97551
[63] validation-logloss:0.19387 validation-auc:0.97290 validation-aucpr:0.97557
[64] validation-logloss:0.19326 validation-auc:0.97293 validation-aucpr:0.97559
[65] validation-logloss:0.19250 validation-auc:0.97298 validation-aucpr:0.97561
[66] validation-logloss:0.19180 validation-auc:0.97311 validation-aucpr:0.97591
[67] validation-logloss:0.19156 validation-auc:0.97310 validation-aucpr:0.97587
[68] validation-logloss:0.19102 validation-auc:0.97319 validation-aucpr:0.97600
[69] validation-logloss:0.19071 validation-auc:0.97322 validation-aucpr:0.97600
[70] validation-logloss:0.19019 validation-auc:0.97327 validation-aucpr:0.97608
[71] validation-logloss:0.18960 validation-auc:0.97334 validation-aucpr:0.97611
[72] validation-logloss:0.18935 validation-auc:0.97337 validation-aucpr:0.97610
[73] validation-logloss:0.18892 validation-auc:0.97339 validation-aucpr:0.97608
[74] validation-logloss:0.18867 validation-auc:0.97343 validation-aucpr:0.97610
[75] validation-logloss:0.18821 validation-auc:0.97347 validation-aucpr:0.97612
[76] validation-logloss:0.18780 validation-auc:0.97349 validation-aucpr:0.97612
[77] validation-logloss:0.18750 validation-auc:0.97356 validation-aucpr:0.97613
[78] validation-logloss:0.18719 validation-auc:0.97359 validation-aucpr:0.97613
[79] validation-logloss:0.18681 validation-auc:0.97360 validation-aucpr:0.97621
{'best_iteration': '79', 'best_score': '0.9762111427657041'}
Trial 37, Fold 2: Log loss = 0.1868082831728122, Average precision = 0.9762157308120819, ROC-AUC = 0.9736024863571153, Elapsed Time = 2.777645799997117 seconds
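The per-fold summary lines report log loss, average precision (area under the PR curve), and ROC-AUC computed from the validation predictions. A minimal sketch with the sklearn metrics imported in this notebook (the arrays below are toy values for illustration):

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

y_true = np.array([0, 1, 1, 0, 1, 0])
y_prob = np.array([0.1, 0.9, 0.7, 0.3, 0.8, 0.2])  # predicted P(class=1)

lloss = log_loss(y_true, y_prob)
ap = average_precision_score(y_true, y_prob)  # area under the PR curve
auc = roc_auc_score(y_true, y_prob)
print(f"Log loss = {lloss}, Average precision = {ap}, ROC-AUC = {auc}")
```

With perfectly separated toy scores as above, AP and ROC-AUC are both 1.0 while log loss stays positive, since it penalizes any probability short of certainty.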
Trial 37, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 37, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.65310 validation-auc:0.94907 validation-aucpr:0.95224
[1] validation-logloss:0.61741 validation-auc:0.95518 validation-aucpr:0.95762
[2] validation-logloss:0.58641 validation-auc:0.95650 validation-aucpr:0.95934
[3] validation-logloss:0.55738 validation-auc:0.95841 validation-aucpr:0.96151
[4] validation-logloss:0.52935 validation-auc:0.96307 validation-aucpr:0.96726
[5] validation-logloss:0.50634 validation-auc:0.96323 validation-aucpr:0.96723
[6] validation-logloss:0.48031 validation-auc:0.96591 validation-aucpr:0.97008
[7] validation-logloss:0.45933 validation-auc:0.96649 validation-aucpr:0.97114
[8] validation-logloss:0.44152 validation-auc:0.96671 validation-aucpr:0.97127
[9] validation-logloss:0.42556 validation-auc:0.96673 validation-aucpr:0.97124
[10] validation-logloss:0.41120 validation-auc:0.96658 validation-aucpr:0.97108
[11] validation-logloss:0.39792 validation-auc:0.96661 validation-aucpr:0.97107
[12] validation-logloss:0.38564 validation-auc:0.96698 validation-aucpr:0.97118
[13] validation-logloss:0.37386 validation-auc:0.96698 validation-aucpr:0.97115
[14] validation-logloss:0.36398 validation-auc:0.96696 validation-aucpr:0.97101
[15] validation-logloss:0.35103 validation-auc:0.96784 validation-aucpr:0.97206
[16] validation-logloss:0.34179 validation-auc:0.96789 validation-aucpr:0.97208
[17] validation-logloss:0.33339 validation-auc:0.96794 validation-aucpr:0.97230
[18] validation-logloss:0.32395 validation-auc:0.96831 validation-aucpr:0.97264
[19] validation-logloss:0.31691 validation-auc:0.96827 validation-aucpr:0.97261
[20] validation-logloss:0.30721 validation-auc:0.96881 validation-aucpr:0.97315
[21] validation-logloss:0.29818 validation-auc:0.96919 validation-aucpr:0.97356
[22] validation-logloss:0.28990 validation-auc:0.96938 validation-aucpr:0.97377
[23] validation-logloss:0.28265 validation-auc:0.96962 validation-aucpr:0.97407
[24] validation-logloss:0.27700 validation-auc:0.96972 validation-aucpr:0.97414
[25] validation-logloss:0.27280 validation-auc:0.96976 validation-aucpr:0.97420
[26] validation-logloss:0.26661 validation-auc:0.96991 validation-aucpr:0.97437
[27] validation-logloss:0.26116 validation-auc:0.97011 validation-aucpr:0.97456
[28] validation-logloss:0.25748 validation-auc:0.97022 validation-aucpr:0.97460
[29] validation-logloss:0.25306 validation-auc:0.97031 validation-aucpr:0.97468
[30] validation-logloss:0.24859 validation-auc:0.97041 validation-aucpr:0.97474
[31] validation-logloss:0.24414 validation-auc:0.97056 validation-aucpr:0.97488
[32] validation-logloss:0.24142 validation-auc:0.97061 validation-aucpr:0.97490
[33] validation-logloss:0.23759 validation-auc:0.97077 validation-aucpr:0.97504
[34] validation-logloss:0.23566 validation-auc:0.97071 validation-aucpr:0.97500
[35] validation-logloss:0.23357 validation-auc:0.97065 validation-aucpr:0.97494
[36] validation-logloss:0.23137 validation-auc:0.97073 validation-aucpr:0.97497
[37] validation-logloss:0.22909 validation-auc:0.97089 validation-aucpr:0.97511
[38] validation-logloss:0.22632 validation-auc:0.97100 validation-aucpr:0.97519
[39] validation-logloss:0.22459 validation-auc:0.97109 validation-aucpr:0.97524
[40] validation-logloss:0.22192 validation-auc:0.97119 validation-aucpr:0.97534
[41] validation-logloss:0.22030 validation-auc:0.97126 validation-aucpr:0.97537
[42] validation-logloss:0.21794 validation-auc:0.97131 validation-aucpr:0.97540
[43] validation-logloss:0.21652 validation-auc:0.97134 validation-aucpr:0.97541
[44] validation-logloss:0.21435 validation-auc:0.97142 validation-aucpr:0.97549
[45] validation-logloss:0.21327 validation-auc:0.97143 validation-aucpr:0.97550
[46] validation-logloss:0.21135 validation-auc:0.97150 validation-aucpr:0.97552
[47] validation-logloss:0.21017 validation-auc:0.97159 validation-aucpr:0.97560
[48] validation-logloss:0.20944 validation-auc:0.97157 validation-aucpr:0.97556
[49] validation-logloss:0.20829 validation-auc:0.97166 validation-aucpr:0.97561
[50] validation-logloss:0.20657 validation-auc:0.97179 validation-aucpr:0.97574
[51] validation-logloss:0.20504 validation-auc:0.97190 validation-aucpr:0.97581
[52] validation-logloss:0.20432 validation-auc:0.97191 validation-aucpr:0.97582
[53] validation-logloss:0.20348 validation-auc:0.97198 validation-aucpr:0.97587
[54] validation-logloss:0.20226 validation-auc:0.97208 validation-aucpr:0.97598
[55] validation-logloss:0.20179 validation-auc:0.97206 validation-aucpr:0.97593
[56] validation-logloss:0.20071 validation-auc:0.97210 validation-aucpr:0.97587
[57] validation-logloss:0.20013 validation-auc:0.97217 validation-aucpr:0.97593
[58] validation-logloss:0.19899 validation-auc:0.97226 validation-aucpr:0.97599
[59] validation-logloss:0.19801 validation-auc:0.97236 validation-aucpr:0.97602
[60] validation-logloss:0.19695 validation-auc:0.97246 validation-aucpr:0.97624
[61] validation-logloss:0.19656 validation-auc:0.97251 validation-aucpr:0.97633
[62] validation-logloss:0.19602 validation-auc:0.97258 validation-aucpr:0.97638
[63] validation-logloss:0.19503 validation-auc:0.97272 validation-aucpr:0.97649
[64] validation-logloss:0.19453 validation-auc:0.97276 validation-aucpr:0.97653
[65] validation-logloss:0.19410 validation-auc:0.97280 validation-aucpr:0.97653
[66] validation-logloss:0.19338 validation-auc:0.97285 validation-aucpr:0.97658
[67] validation-logloss:0.19247 validation-auc:0.97297 validation-aucpr:0.97665
[68] validation-logloss:0.19222 validation-auc:0.97295 validation-aucpr:0.97663
[69] validation-logloss:0.19179 validation-auc:0.97296 validation-aucpr:0.97666
[70] validation-logloss:0.19130 validation-auc:0.97300 validation-aucpr:0.97669
[71] validation-logloss:0.19095 validation-auc:0.97306 validation-aucpr:0.97672
[72] validation-logloss:0.19068 validation-auc:0.97306 validation-aucpr:0.97670
[73] validation-logloss:0.19030 validation-auc:0.97312 validation-aucpr:0.97678
[74] validation-logloss:0.18990 validation-auc:0.97313 validation-aucpr:0.97677
[75] validation-logloss:0.18957 validation-auc:0.97321 validation-aucpr:0.97686
[76] validation-logloss:0.18928 validation-auc:0.97324 validation-aucpr:0.97688
[77] validation-logloss:0.18914 validation-auc:0.97326 validation-aucpr:0.97687
[78] validation-logloss:0.18896 validation-auc:0.97328 validation-aucpr:0.97687
[79] validation-logloss:0.18883 validation-auc:0.97329 validation-aucpr:0.97684
{'best_iteration': '76', 'best_score': '0.976880019972459'}
Trial 37, Fold 3: Log loss = 0.1888314692820942, Average precision = 0.9768399914184289, ROC-AUC = 0.9732865184771408, Elapsed Time = 2.7473267999994277 seconds
Trial 37, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 37, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.65303 validation-auc:0.94703 validation-aucpr:0.95109
[1] validation-logloss:0.61721 validation-auc:0.95355 validation-aucpr:0.95846
[2] validation-logloss:0.58656 validation-auc:0.95563 validation-aucpr:0.96027
[3] validation-logloss:0.55582 validation-auc:0.96015 validation-aucpr:0.96540
[4] validation-logloss:0.53055 validation-auc:0.96035 validation-aucpr:0.96553
[5] validation-logloss:0.50782 validation-auc:0.96031 validation-aucpr:0.96557
[6] validation-logloss:0.48737 validation-auc:0.96048 validation-aucpr:0.96587
[7] validation-logloss:0.46371 validation-auc:0.96353 validation-aucpr:0.96925
[8] validation-logloss:0.44741 validation-auc:0.96318 validation-aucpr:0.96888
[9] validation-logloss:0.43167 validation-auc:0.96334 validation-aucpr:0.96892
[10] validation-logloss:0.41721 validation-auc:0.96346 validation-aucpr:0.96883
[11] validation-logloss:0.40354 validation-auc:0.96362 validation-aucpr:0.96895
[12] validation-logloss:0.39173 validation-auc:0.96366 validation-aucpr:0.96893
[13] validation-logloss:0.37707 validation-auc:0.96486 validation-aucpr:0.97026
[14] validation-logloss:0.36340 validation-auc:0.96558 validation-aucpr:0.97110
[15] validation-logloss:0.35063 validation-auc:0.96630 validation-aucpr:0.97180
[16] validation-logloss:0.33855 validation-auc:0.96679 validation-aucpr:0.97223
[17] validation-logloss:0.32798 validation-auc:0.96704 validation-aucpr:0.97249
[18] validation-logloss:0.32022 validation-auc:0.96713 validation-aucpr:0.97257
[19] validation-logloss:0.31076 validation-auc:0.96737 validation-aucpr:0.97281
[20] validation-logloss:0.30286 validation-auc:0.96745 validation-aucpr:0.97292
[21] validation-logloss:0.29484 validation-auc:0.96773 validation-aucpr:0.97313
[22] validation-logloss:0.28908 validation-auc:0.96779 validation-aucpr:0.97318
[23] validation-logloss:0.28391 validation-auc:0.96789 validation-aucpr:0.97324
[24] validation-logloss:0.27774 validation-auc:0.96800 validation-aucpr:0.97336
[25] validation-logloss:0.27336 validation-auc:0.96802 validation-aucpr:0.97338
[26] validation-logloss:0.26932 validation-auc:0.96813 validation-aucpr:0.97344
[27] validation-logloss:0.26575 validation-auc:0.96801 validation-aucpr:0.97332
[28] validation-logloss:0.26234 validation-auc:0.96805 validation-aucpr:0.97333
[29] validation-logloss:0.25729 validation-auc:0.96821 validation-aucpr:0.97349
[30] validation-logloss:0.25258 validation-auc:0.96844 validation-aucpr:0.97370
[31] validation-logloss:0.24814 validation-auc:0.96866 validation-aucpr:0.97387
[32] validation-logloss:0.24408 validation-auc:0.96889 validation-aucpr:0.97405
[33] validation-logloss:0.24014 validation-auc:0.96912 validation-aucpr:0.97423
[34] validation-logloss:0.23727 validation-auc:0.96929 validation-aucpr:0.97435
[35] validation-logloss:0.23377 validation-auc:0.96946 validation-aucpr:0.97447
[36] validation-logloss:0.23197 validation-auc:0.96944 validation-aucpr:0.97445
[37] validation-logloss:0.22912 validation-auc:0.96961 validation-aucpr:0.97458
[38] validation-logloss:0.22741 validation-auc:0.96963 validation-aucpr:0.97458
[39] validation-logloss:0.22604 validation-auc:0.96955 validation-aucpr:0.97453
[40] validation-logloss:0.22425 validation-auc:0.96960 validation-aucpr:0.97457
[41] validation-logloss:0.22183 validation-auc:0.96972 validation-aucpr:0.97467
[42] validation-logloss:0.22049 validation-auc:0.96971 validation-aucpr:0.97467
[43] validation-logloss:0.21913 validation-auc:0.96978 validation-aucpr:0.97473
[44] validation-logloss:0.21678 validation-auc:0.97002 validation-aucpr:0.97491
[45] validation-logloss:0.21478 validation-auc:0.97014 validation-aucpr:0.97502
[46] validation-logloss:0.21301 validation-auc:0.97025 validation-aucpr:0.97510
[47] validation-logloss:0.21191 validation-auc:0.97029 validation-aucpr:0.97514
[48] validation-logloss:0.21089 validation-auc:0.97032 validation-aucpr:0.97516
[49] validation-logloss:0.21005 validation-auc:0.97032 validation-aucpr:0.97516
[50] validation-logloss:0.20863 validation-auc:0.97043 validation-aucpr:0.97523
[51] validation-logloss:0.20790 validation-auc:0.97036 validation-aucpr:0.97519
[52] validation-logloss:0.20701 validation-auc:0.97038 validation-aucpr:0.97520
[53] validation-logloss:0.20627 validation-auc:0.97037 validation-aucpr:0.97519
[54] validation-logloss:0.20580 validation-auc:0.97037 validation-aucpr:0.97518
[55] validation-logloss:0.20511 validation-auc:0.97043 validation-aucpr:0.97523
[56] validation-logloss:0.20473 validation-auc:0.97044 validation-aucpr:0.97523
[57] validation-logloss:0.20411 validation-auc:0.97047 validation-aucpr:0.97525
[58] validation-logloss:0.20364 validation-auc:0.97051 validation-aucpr:0.97528
[59] validation-logloss:0.20249 validation-auc:0.97061 validation-aucpr:0.97536
[60] validation-logloss:0.20137 validation-auc:0.97072 validation-aucpr:0.97544
[61] validation-logloss:0.20036 validation-auc:0.97085 validation-aucpr:0.97555
[62] validation-logloss:0.19988 validation-auc:0.97088 validation-aucpr:0.97556
[63] validation-logloss:0.19908 validation-auc:0.97098 validation-aucpr:0.97564
[64] validation-logloss:0.19847 validation-auc:0.97109 validation-aucpr:0.97572
[65] validation-logloss:0.19757 validation-auc:0.97119 validation-aucpr:0.97579
[66] validation-logloss:0.19717 validation-auc:0.97120 validation-aucpr:0.97580
[67] validation-logloss:0.19628 validation-auc:0.97130 validation-aucpr:0.97587
[68] validation-logloss:0.19594 validation-auc:0.97129 validation-aucpr:0.97585
[69] validation-logloss:0.19516 validation-auc:0.97140 validation-aucpr:0.97594
[70] validation-logloss:0.19477 validation-auc:0.97145 validation-aucpr:0.97596
[71] validation-logloss:0.19402 validation-auc:0.97154 validation-aucpr:0.97603
[72] validation-logloss:0.19345 validation-auc:0.97162 validation-aucpr:0.97608
[73] validation-logloss:0.19302 validation-auc:0.97163 validation-aucpr:0.97609
[74] validation-logloss:0.19276 validation-auc:0.97167 validation-aucpr:0.97611
[75] validation-logloss:0.19229 validation-auc:0.97172 validation-aucpr:0.97614
[76] validation-logloss:0.19203 validation-auc:0.97175 validation-aucpr:0.97616
[77] validation-logloss:0.19176 validation-auc:0.97177 validation-aucpr:0.97617
[78] validation-logloss:0.19149 validation-auc:0.97181 validation-aucpr:0.97619
[79] validation-logloss:0.19112 validation-auc:0.97191 validation-aucpr:0.97626
{'best_iteration': '79', 'best_score': '0.9762577490225597'}
Trial 37, Fold 4: Log loss = 0.19112030204925323, Average precision = 0.9762616418832818, ROC-AUC = 0.9719088213624041, Elapsed Time = 2.7423531000022194 seconds
Trial 37, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 37, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.65387 validation-auc:0.94223 validation-aucpr:0.94643
[1] validation-logloss:0.61840 validation-auc:0.95093 validation-aucpr:0.95391
[2] validation-logloss:0.58759 validation-auc:0.95284 validation-aucpr:0.95590
[3] validation-logloss:0.55965 validation-auc:0.95482 validation-aucpr:0.95770
[4] validation-logloss:0.53436 validation-auc:0.95566 validation-aucpr:0.95856
[5] validation-logloss:0.51293 validation-auc:0.95564 validation-aucpr:0.95874
[6] validation-logloss:0.48753 validation-auc:0.96107 validation-aucpr:0.96501
[7] validation-logloss:0.46971 validation-auc:0.96100 validation-aucpr:0.96524
[8] validation-logloss:0.44757 validation-auc:0.96321 validation-aucpr:0.96772
[9] validation-logloss:0.42865 validation-auc:0.96397 validation-aucpr:0.96857
[10] validation-logloss:0.41510 validation-auc:0.96374 validation-aucpr:0.96847
[11] validation-logloss:0.40202 validation-auc:0.96379 validation-aucpr:0.96842
[12] validation-logloss:0.38965 validation-auc:0.96398 validation-aucpr:0.96883
[13] validation-logloss:0.37527 validation-auc:0.96452 validation-aucpr:0.96937
[14] validation-logloss:0.36189 validation-auc:0.96513 validation-aucpr:0.96997
[15] validation-logloss:0.35057 validation-auc:0.96543 validation-aucpr:0.97017
[16] validation-logloss:0.34157 validation-auc:0.96562 validation-aucpr:0.97033
[17] validation-logloss:0.33128 validation-auc:0.96604 validation-aucpr:0.97071
[18] validation-logloss:0.32384 validation-auc:0.96602 validation-aucpr:0.97068
[19] validation-logloss:0.31740 validation-auc:0.96595 validation-aucpr:0.97056
[20] validation-logloss:0.30846 validation-auc:0.96631 validation-aucpr:0.97089
[21] validation-logloss:0.30046 validation-auc:0.96671 validation-aucpr:0.97121
[22] validation-logloss:0.29461 validation-auc:0.96687 validation-aucpr:0.97131
[23] validation-logloss:0.29002 validation-auc:0.96674 validation-aucpr:0.97118
[24] validation-logloss:0.28530 validation-auc:0.96680 validation-aucpr:0.97120
[25] validation-logloss:0.28115 validation-auc:0.96677 validation-aucpr:0.97117
[26] validation-logloss:0.27504 validation-auc:0.96700 validation-aucpr:0.97141
[27] validation-logloss:0.26922 validation-auc:0.96735 validation-aucpr:0.97178
[28] validation-logloss:0.26420 validation-auc:0.96761 validation-aucpr:0.97206
[29] validation-logloss:0.26120 validation-auc:0.96754 validation-aucpr:0.97200
[30] validation-logloss:0.25630 validation-auc:0.96787 validation-aucpr:0.97229
[31] validation-logloss:0.25349 validation-auc:0.96782 validation-aucpr:0.97224
[32] validation-logloss:0.24938 validation-auc:0.96807 validation-aucpr:0.97243
[33] validation-logloss:0.24594 validation-auc:0.96828 validation-aucpr:0.97258
[34] validation-logloss:0.24349 validation-auc:0.96834 validation-aucpr:0.97262
[35] validation-logloss:0.24120 validation-auc:0.96841 validation-aucpr:0.97268
[36] validation-logloss:0.23897 validation-auc:0.96842 validation-aucpr:0.97269
[37] validation-logloss:0.23701 validation-auc:0.96848 validation-aucpr:0.97274
[38] validation-logloss:0.23489 validation-auc:0.96861 validation-aucpr:0.97281
[39] validation-logloss:0.23192 validation-auc:0.96884 validation-aucpr:0.97301
[40] validation-logloss:0.23039 validation-auc:0.96883 validation-aucpr:0.97298
[41] validation-logloss:0.22926 validation-auc:0.96875 validation-aucpr:0.97289
[42] validation-logloss:0.22796 validation-auc:0.96872 validation-aucpr:0.97279
[43] validation-logloss:0.22564 validation-auc:0.96893 validation-aucpr:0.97293
[44] validation-logloss:0.22449 validation-auc:0.96890 validation-aucpr:0.97291
[45] validation-logloss:0.22336 validation-auc:0.96890 validation-aucpr:0.97286
[46] validation-logloss:0.22230 validation-auc:0.96895 validation-aucpr:0.97291
[47] validation-logloss:0.22127 validation-auc:0.96899 validation-aucpr:0.97286
[48] validation-logloss:0.22018 validation-auc:0.96903 validation-aucpr:0.97286
[49] validation-logloss:0.21837 validation-auc:0.96914 validation-aucpr:0.97293
[50] validation-logloss:0.21654 validation-auc:0.96930 validation-aucpr:0.97307
[51] validation-logloss:0.21580 validation-auc:0.96933 validation-aucpr:0.97310
[52] validation-logloss:0.21522 validation-auc:0.96929 validation-aucpr:0.97312
[53] validation-logloss:0.21437 validation-auc:0.96941 validation-aucpr:0.97320
[54] validation-logloss:0.21277 validation-auc:0.96965 validation-aucpr:0.97337
[55] validation-logloss:0.21130 validation-auc:0.96981 validation-aucpr:0.97349
[56] validation-logloss:0.21076 validation-auc:0.96981 validation-aucpr:0.97351
[57] validation-logloss:0.20930 validation-auc:0.97000 validation-aucpr:0.97362
[58] validation-logloss:0.20876 validation-auc:0.97002 validation-aucpr:0.97356
[59] validation-logloss:0.20822 validation-auc:0.97004 validation-aucpr:0.97356
[60] validation-logloss:0.20710 validation-auc:0.97015 validation-aucpr:0.97362
[61] validation-logloss:0.20651 validation-auc:0.97023 validation-aucpr:0.97369
[62] validation-logloss:0.20600 validation-auc:0.97028 validation-aucpr:0.97368
[63] validation-logloss:0.20568 validation-auc:0.97021 validation-aucpr:0.97359
[64] validation-logloss:0.20454 validation-auc:0.97038 validation-aucpr:0.97374
[65] validation-logloss:0.20413 validation-auc:0.97040 validation-aucpr:0.97374
[66] validation-logloss:0.20366 validation-auc:0.97045 validation-aucpr:0.97389
[67] validation-logloss:0.20330 validation-auc:0.97050 validation-aucpr:0.97390
[68] validation-logloss:0.20304 validation-auc:0.97051 validation-aucpr:0.97397
[69] validation-logloss:0.20220 validation-auc:0.97065 validation-aucpr:0.97400
[70] validation-logloss:0.20151 validation-auc:0.97075 validation-aucpr:0.97399
[71] validation-logloss:0.20081 validation-auc:0.97083 validation-aucpr:0.97406
[72] validation-logloss:0.20058 validation-auc:0.97085 validation-aucpr:0.97407
[73] validation-logloss:0.19996 validation-auc:0.97099 validation-aucpr:0.97414
[74] validation-logloss:0.19932 validation-auc:0.97110 validation-aucpr:0.97420
[75] validation-logloss:0.19900 validation-auc:0.97114 validation-aucpr:0.97423
[76] validation-logloss:0.19883 validation-auc:0.97116 validation-aucpr:0.97423
[77] validation-logloss:0.19831 validation-auc:0.97123 validation-aucpr:0.97416
[78] validation-logloss:0.19779 validation-auc:0.97133 validation-aucpr:0.97421
[79] validation-logloss:0.19763 validation-auc:0.97134 validation-aucpr:0.97420
{'best_iteration': '75', 'best_score': '0.9742342907888454'}
Trial 37, Fold 5: Log loss = 0.19762597901567752, Average precision = 0.9742038468289268, ROC-AUC = 0.9713423629303457, Elapsed Time = 3.3143954999977723 seconds
Optimization Progress: 38%|###8 | 38/100 [2:24:04<5:24:16, 313.82s/it]
Trial 38, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 38, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[20:22:58] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[0] validation-logloss:0.68047 validation-auc:0.91070 validation-aucpr:0.89779
[1]	validation-logloss:0.66627	validation-auc:0.93986	validation-aucpr:0.93921
[2]	validation-logloss:0.65357	validation-auc:0.94255	validation-aucpr:0.94553
[3]	validation-logloss:0.63943	validation-auc:0.95464	validation-aucpr:0.95920
[4]	validation-logloss:0.62532	validation-auc:0.95771	validation-aucpr:0.96321
[5]	validation-logloss:0.61454	validation-auc:0.95667	validation-aucpr:0.96179
[6]	validation-logloss:0.60442	validation-auc:0.95639	validation-aucpr:0.96159
[7]	validation-logloss:0.59377	validation-auc:0.95875	validation-aucpr:0.96465
[8]	validation-logloss:0.58136	validation-auc:0.96004	validation-aucpr:0.96635
[9]	validation-logloss:0.57201	validation-auc:0.96003	validation-aucpr:0.96629
[10]	validation-logloss:0.56250	validation-auc:0.95979	validation-aucpr:0.96599
[11]	validation-logloss:0.55420	validation-auc:0.95908	validation-aucpr:0.96513
[12]	validation-logloss:0.54528	validation-auc:0.95904	validation-aucpr:0.96511
[13]	validation-logloss:0.53712	validation-auc:0.95877	validation-aucpr:0.96484
[14]	validation-logloss:0.52924	validation-auc:0.95881	validation-aucpr:0.96484
[15]	validation-logloss:0.52129	validation-auc:0.95893	validation-aucpr:0.96493
[16]	validation-logloss:0.51374	validation-auc:0.95879	validation-aucpr:0.96480
[17]	validation-logloss:0.50583	validation-auc:0.95890	validation-aucpr:0.96506
[18]	validation-logloss:0.49861	validation-auc:0.95907	validation-aucpr:0.96517
[19]	validation-logloss:0.49061	validation-auc:0.95970	validation-aucpr:0.96592
[20]	validation-logloss:0.48259	validation-auc:0.95984	validation-aucpr:0.96610
[21]	validation-logloss:0.47688	validation-auc:0.95943	validation-aucpr:0.96573
[22]	validation-logloss:0.47153	validation-auc:0.95913	validation-aucpr:0.96540
[23]	validation-logloss:0.46471	validation-auc:0.95910	validation-aucpr:0.96547
[24]	validation-logloss:0.45931	validation-auc:0.95885	validation-aucpr:0.96517
[25]	validation-logloss:0.45356	validation-auc:0.95911	validation-aucpr:0.96538
[26]	validation-logloss:0.44761	validation-auc:0.95927	validation-aucpr:0.96555
[27]	validation-logloss:0.44187	validation-auc:0.95935	validation-aucpr:0.96556
[28]	validation-logloss:0.43537	validation-auc:0.95955	validation-aucpr:0.96578
[29]	validation-logloss:0.43072	validation-auc:0.95964	validation-aucpr:0.96587
[30]	validation-logloss:0.42670	validation-auc:0.95926	validation-aucpr:0.96549
[31]	validation-logloss:0.42200	validation-auc:0.95945	validation-aucpr:0.96561
[32]	validation-logloss:0.41743	validation-auc:0.95952	validation-aucpr:0.96564
[33]	validation-logloss:0.41275	validation-auc:0.95984	validation-aucpr:0.96595
[34]	validation-logloss:0.40867	validation-auc:0.95973	validation-aucpr:0.96582
[35]	validation-logloss:0.40456	validation-auc:0.95979	validation-aucpr:0.96588
[36]	validation-logloss:0.40057	validation-auc:0.95981	validation-aucpr:0.96594
[37]	validation-logloss:0.39640	validation-auc:0.95994	validation-aucpr:0.96604
[38]	validation-logloss:0.39223	validation-auc:0.96027	validation-aucpr:0.96632
[39]	validation-logloss:0.38729	validation-auc:0.96050	validation-aucpr:0.96656
[40]	validation-logloss:0.38388	validation-auc:0.96046	validation-aucpr:0.96653
[41]	validation-logloss:0.37950	validation-auc:0.96058	validation-aucpr:0.96670
[42]	validation-logloss:0.37575	validation-auc:0.96085	validation-aucpr:0.96692
[43]	validation-logloss:0.37287	validation-auc:0.96080	validation-aucpr:0.96685
[44]	validation-logloss:0.36943	validation-auc:0.96092	validation-aucpr:0.96695
[45]	validation-logloss:0.36624	validation-auc:0.96100	validation-aucpr:0.96696
[46]	validation-logloss:0.36348	validation-auc:0.96098	validation-aucpr:0.96692
{'best_iteration': '45', 'best_score': '0.96695969606383'}
Trial 38, Fold 1: Log loss = 0.36347848486850604, Average precision = 0.9669244598053782, ROC-AUC = 0.9609798815139827, Elapsed Time = 3.9911858000014035 seconds
Trial 38, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 38, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0]	validation-logloss:0.67991	validation-auc:0.91700	validation-aucpr:0.90474
[1]	validation-logloss:0.66559	validation-auc:0.94392	validation-aucpr:0.94367
[2]	validation-logloss:0.65247	validation-auc:0.94750	validation-aucpr:0.94769
[3]	validation-logloss:0.64102	validation-auc:0.94661	validation-aucpr:0.94698
[4]	validation-logloss:0.62646	validation-auc:0.95777	validation-aucpr:0.96131
[5]	validation-logloss:0.61526	validation-auc:0.95792	validation-aucpr:0.96174
[6]	validation-logloss:0.60483	validation-auc:0.95767	validation-aucpr:0.96140
[7]	validation-logloss:0.59424	validation-auc:0.95757	validation-aucpr:0.96126
[8]	validation-logloss:0.58437	validation-auc:0.95771	validation-aucpr:0.96131
[9]	validation-logloss:0.57301	validation-auc:0.95937	validation-aucpr:0.96349
[10]	validation-logloss:0.56199	validation-auc:0.95999	validation-aucpr:0.96423
[11]	validation-logloss:0.55212	validation-auc:0.96082	validation-aucpr:0.96498
[12]	validation-logloss:0.54278	validation-auc:0.96080	validation-aucpr:0.96502
[13]	validation-logloss:0.53517	validation-auc:0.96042	validation-aucpr:0.96461
[14]	validation-logloss:0.52775	validation-auc:0.96045	validation-aucpr:0.96472
[15]	validation-logloss:0.52030	validation-auc:0.96059	validation-aucpr:0.96479
[16]	validation-logloss:0.51314	validation-auc:0.96115	validation-aucpr:0.96548
[17]	validation-logloss:0.50611	validation-auc:0.96084	validation-aucpr:0.96507
[18] validation-logloss:0.49955 validation-auc:0.96053 validation-aucpr:0.96480
[20:23:04] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[19] validation-logloss:0.49278 validation-auc:0.96044 validation-aucpr:0.96468
[20:23:04] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[20] validation-logloss:0.48583 validation-auc:0.96048 validation-aucpr:0.96474
[20:23:04] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[21] validation-logloss:0.47870 validation-auc:0.96057 validation-aucpr:0.96491
[20:23:04] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[22] validation-logloss:0.47333 validation-auc:0.96036 validation-aucpr:0.96471
[20:23:04] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[23] validation-logloss:0.46818 validation-auc:0.96025 validation-aucpr:0.96466
[20:23:04] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[24] validation-logloss:0.46213 validation-auc:0.96033 validation-aucpr:0.96471
[20:23:04] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[25] validation-logloss:0.45619 validation-auc:0.96062 validation-aucpr:0.96497
[20:23:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[26] validation-logloss:0.45052 validation-auc:0.96064 validation-aucpr:0.96498
[20:23:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[27] validation-logloss:0.44513 validation-auc:0.96064 validation-aucpr:0.96491
[20:23:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[28] validation-logloss:0.43872 validation-auc:0.96113 validation-aucpr:0.96544
[20:23:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[29] validation-logloss:0.43388 validation-auc:0.96123 validation-aucpr:0.96551
[20:23:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[30] validation-logloss:0.42963 validation-auc:0.96118 validation-aucpr:0.96541
[20:23:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[31] validation-logloss:0.42376 validation-auc:0.96136 validation-aucpr:0.96565
[20:23:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[32] validation-logloss:0.41946 validation-auc:0.96145 validation-aucpr:0.96568
[20:23:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[33] validation-logloss:0.41429 validation-auc:0.96176 validation-aucpr:0.96599
[20:23:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[34] validation-logloss:0.40889 validation-auc:0.96198 validation-aucpr:0.96621
[20:23:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[35] validation-logloss:0.40497 validation-auc:0.96186 validation-aucpr:0.96609
[20:23:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[36] validation-logloss:0.39914 validation-auc:0.96212 validation-aucpr:0.96636
[20:23:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[37] validation-logloss:0.39472 validation-auc:0.96224 validation-aucpr:0.96646
[20:23:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[38] validation-logloss:0.39100 validation-auc:0.96226 validation-aucpr:0.96644
[20:23:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[39] validation-logloss:0.38873 validation-auc:0.96218 validation-aucpr:0.96624
[20:23:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[40] validation-logloss:0.38544 validation-auc:0.96214 validation-aucpr:0.96613
[20:23:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[41] validation-logloss:0.38067 validation-auc:0.96236 validation-aucpr:0.96638
[20:23:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[42] validation-logloss:0.37808 validation-auc:0.96238 validation-aucpr:0.96629
[20:23:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[43] validation-logloss:0.37478 validation-auc:0.96243 validation-aucpr:0.96632
[20:23:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[44] validation-logloss:0.37209 validation-auc:0.96237 validation-aucpr:0.96625
[20:23:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[45] validation-logloss:0.36884 validation-auc:0.96237 validation-aucpr:0.96626
[20:23:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[46] validation-logloss:0.36678 validation-auc:0.96238 validation-aucpr:0.96620
{'best_iteration': '37', 'best_score': '0.9664614916691174'}
Trial 38, Fold 2: Log loss = 0.36678439359062504, Average precision = 0.9661942089559218, ROC-AUC = 0.9623813446128473, Elapsed Time = 4.435023800000636 seconds
Trial 38, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 38, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.67990 validation-auc:0.91670 validation-aucpr:0.90637
[1] validation-logloss:0.66606 validation-auc:0.93974 validation-aucpr:0.94017
[2] validation-logloss:0.65833 validation-auc:0.93809 validation-aucpr:0.94062
[3] validation-logloss:0.64593 validation-auc:0.94326 validation-aucpr:0.94557
[4] validation-logloss:0.63434 validation-auc:0.94596 validation-aucpr:0.94895
[5] validation-logloss:0.62376 validation-auc:0.94660 validation-aucpr:0.94965
[6] validation-logloss:0.61339 validation-auc:0.94600 validation-aucpr:0.94934
[7] validation-logloss:0.60377 validation-auc:0.94632 validation-aucpr:0.94958
[8] validation-logloss:0.59333 validation-auc:0.94843 validation-aucpr:0.95202
[9] validation-logloss:0.58330 validation-auc:0.94953 validation-aucpr:0.95288
[10] validation-logloss:0.57362 validation-auc:0.95136 validation-aucpr:0.95496
[11] validation-logloss:0.56403 validation-auc:0.95211 validation-aucpr:0.95568
[12] validation-logloss:0.55528 validation-auc:0.95223 validation-aucpr:0.95587
[13] validation-logloss:0.54731 validation-auc:0.95235 validation-aucpr:0.95601
[14] validation-logloss:0.53963 validation-auc:0.95233 validation-aucpr:0.95595
[15] validation-logloss:0.53109 validation-auc:0.95374 validation-aucpr:0.95744
[16] validation-logloss:0.52385 validation-auc:0.95377 validation-aucpr:0.95745
[17] validation-logloss:0.51606 validation-auc:0.95432 validation-aucpr:0.95798
[18] validation-logloss:0.50888 validation-auc:0.95406 validation-aucpr:0.95774
[19] validation-logloss:0.50197 validation-auc:0.95451 validation-aucpr:0.95816
[20] validation-logloss:0.49534 validation-auc:0.95449 validation-aucpr:0.95812
[21] validation-logloss:0.48733 validation-auc:0.95861 validation-aucpr:0.96279
[22] validation-logloss:0.48157 validation-auc:0.95843 validation-aucpr:0.96263
[23] validation-logloss:0.47534 validation-auc:0.95935 validation-aucpr:0.96362
[24] validation-logloss:0.46954 validation-auc:0.95919 validation-aucpr:0.96341
[25] validation-logloss:0.46341 validation-auc:0.95972 validation-aucpr:0.96403
[26] validation-logloss:0.45837 validation-auc:0.95946 validation-aucpr:0.96376
[27] validation-logloss:0.45358 validation-auc:0.95920 validation-aucpr:0.96349
[28] validation-logloss:0.44845 validation-auc:0.95923 validation-aucpr:0.96352
[29] validation-logloss:0.44362 validation-auc:0.95931 validation-aucpr:0.96352
[30] validation-logloss:0.43884 validation-auc:0.95931 validation-aucpr:0.96360
[31] validation-logloss:0.43456 validation-auc:0.95958 validation-aucpr:0.96389
[32] validation-logloss:0.42962 validation-auc:0.95970 validation-aucpr:0.96394
[33] validation-logloss:0.42423 validation-auc:0.96023 validation-aucpr:0.96449
[34] validation-logloss:0.41740 validation-auc:0.96159 validation-aucpr:0.96600
[35] validation-logloss:0.41321 validation-auc:0.96152 validation-aucpr:0.96590
[36] validation-logloss:0.40969 validation-auc:0.96137 validation-aucpr:0.96574
[37] validation-logloss:0.40516 validation-auc:0.96153 validation-aucpr:0.96582
[38] validation-logloss:0.40094 validation-auc:0.96164 validation-aucpr:0.96593
[39] validation-logloss:0.39723 validation-auc:0.96180 validation-aucpr:0.96607
[40] validation-logloss:0.39317 validation-auc:0.96192 validation-aucpr:0.96615
[41] validation-logloss:0.38947 validation-auc:0.96197 validation-aucpr:0.96616
[42] validation-logloss:0.38641 validation-auc:0.96184 validation-aucpr:0.96606
[43] validation-logloss:0.38126 validation-auc:0.96249 validation-aucpr:0.96686
[44] validation-logloss:0.37767 validation-auc:0.96267 validation-aucpr:0.96697
[45] validation-logloss:0.37454 validation-auc:0.96261 validation-aucpr:0.96687
[46] validation-logloss:0.37180 validation-auc:0.96255 validation-aucpr:0.96681
{'best_iteration': '44', 'best_score': '0.9669745897838138'}
Trial 38, Fold 3: Log loss = 0.37179536140253167, Average precision = 0.966817583403075, ROC-AUC = 0.9625506231279676, Elapsed Time = 4.31358569999793 seconds
Trial 38, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 38, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.67987 validation-auc:0.91555 validation-aucpr:0.90252
[1] validation-logloss:0.66681 validation-auc:0.93186 validation-aucpr:0.93114
[2] validation-logloss:0.65318 validation-auc:0.94676 validation-aucpr:0.94888
[3] validation-logloss:0.64077 validation-auc:0.94892 validation-aucpr:0.95104
[4] validation-logloss:0.62862 validation-auc:0.95050 validation-aucpr:0.95489
[5] validation-logloss:0.61797 validation-auc:0.94978 validation-aucpr:0.95426
[6] validation-logloss:0.60792 validation-auc:0.94967 validation-aucpr:0.95429
[7] validation-logloss:0.59825 validation-auc:0.94948 validation-aucpr:0.95417
[8] validation-logloss:0.58748 validation-auc:0.95159 validation-aucpr:0.95614
[9] validation-logloss:0.57750 validation-auc:0.95210 validation-aucpr:0.95651
[10] validation-logloss:0.56846 validation-auc:0.95197 validation-aucpr:0.95638
[11] validation-logloss:0.55924 validation-auc:0.95239 validation-aucpr:0.95673
[12] validation-logloss:0.55056 validation-auc:0.95225 validation-aucpr:0.95667
[13] validation-logloss:0.54232 validation-auc:0.95224 validation-aucpr:0.95675
[14] validation-logloss:0.53515 validation-auc:0.95197 validation-aucpr:0.95643
[15] validation-logloss:0.52784 validation-auc:0.95205 validation-aucpr:0.95653
[16] validation-logloss:0.52119 validation-auc:0.95183 validation-aucpr:0.95636
[17] validation-logloss:0.51427 validation-auc:0.95169 validation-aucpr:0.95629
[18] validation-logloss:0.50712 validation-auc:0.95176 validation-aucpr:0.95628
[19] validation-logloss:0.50106 validation-auc:0.95172 validation-aucpr:0.95635
[20] validation-logloss:0.49408 validation-auc:0.95235 validation-aucpr:0.95702
[21] validation-logloss:0.48501 validation-auc:0.95667 validation-aucpr:0.96192
[22] validation-logloss:0.47918 validation-auc:0.95680 validation-aucpr:0.96203
[23] validation-logloss:0.47372 validation-auc:0.95661 validation-aucpr:0.96181
[24] validation-logloss:0.46891 validation-auc:0.95635 validation-aucpr:0.96158
[25] validation-logloss:0.46284 validation-auc:0.95633 validation-aucpr:0.96156
[26] validation-logloss:0.45758 validation-auc:0.95631 validation-aucpr:0.96148
[27] validation-logloss:0.45264 validation-auc:0.95617 validation-aucpr:0.96137
[28] validation-logloss:0.44717 validation-auc:0.95624 validation-aucpr:0.96152
[29] validation-logloss:0.44178 validation-auc:0.95638 validation-aucpr:0.96160
[30] validation-logloss:0.43737 validation-auc:0.95608 validation-aucpr:0.96128
[31] validation-logloss:0.43282 validation-auc:0.95604 validation-aucpr:0.96131
[32] validation-logloss:0.42890 validation-auc:0.95573 validation-aucpr:0.96100
[33] validation-logloss:0.42447 validation-auc:0.95582 validation-aucpr:0.96106
[34] validation-logloss:0.42014 validation-auc:0.95600 validation-aucpr:0.96122
[35] validation-logloss:0.41676 validation-auc:0.95572 validation-aucpr:0.96091
[36] validation-logloss:0.41308 validation-auc:0.95575 validation-aucpr:0.96095
[37] validation-logloss:0.40913 validation-auc:0.95580 validation-aucpr:0.96100
[38] validation-logloss:0.40364 validation-auc:0.95729 validation-aucpr:0.96271
[39] validation-logloss:0.40021 validation-auc:0.95724 validation-aucpr:0.96266
[40] validation-logloss:0.39668 validation-auc:0.95722 validation-aucpr:0.96261
[41] validation-logloss:0.39355 validation-auc:0.95725 validation-aucpr:0.96265
[42] validation-logloss:0.39022 validation-auc:0.95736 validation-aucpr:0.96273
[43] validation-logloss:0.38482 validation-auc:0.95855 validation-aucpr:0.96416
[44] validation-logloss:0.37962 validation-auc:0.95947 validation-aucpr:0.96523
[45] validation-logloss:0.37661 validation-auc:0.95948 validation-aucpr:0.96524
[20:23:17] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[46] validation-logloss:0.37360 validation-auc:0.95957 validation-aucpr:0.96532
{'best_iteration': '46', 'best_score': '0.965317789327765'}
Trial 38, Fold 4: Log loss = 0.3736001626733337, Average precision = 0.9653230628179911, ROC-AUC = 0.9595715126218731, Elapsed Time = 4.244987300000503 seconds
Trial 38, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 38, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.68022 validation-auc:0.91274 validation-aucpr:0.90548
[1] validation-logloss:0.66699 validation-auc:0.93385 validation-aucpr:0.93541
[2] validation-logloss:0.65498 validation-auc:0.93962 validation-aucpr:0.94399
[3] validation-logloss:0.64212 validation-auc:0.94736 validation-aucpr:0.95141
[4] validation-logloss:0.63031 validation-auc:0.94783 validation-aucpr:0.95200
[5] validation-logloss:0.61926 validation-auc:0.94828 validation-aucpr:0.95263
[6] validation-logloss:0.60923 validation-auc:0.94794 validation-aucpr:0.95237
[7] validation-logloss:0.59829 validation-auc:0.94880 validation-aucpr:0.95319
[8] validation-logloss:0.58820 validation-auc:0.94991 validation-aucpr:0.95413
[9] validation-logloss:0.57889 validation-auc:0.95053 validation-aucpr:0.95482
[10] validation-logloss:0.56908 validation-auc:0.95076 validation-aucpr:0.95501
[11] validation-logloss:0.56091 validation-auc:0.95049 validation-aucpr:0.95488
[12] validation-logloss:0.55240 validation-auc:0.95070 validation-aucpr:0.95499
[13] validation-logloss:0.54541 validation-auc:0.95058 validation-aucpr:0.95487
[14] validation-logloss:0.53729 validation-auc:0.95096 validation-aucpr:0.95544
[15] validation-logloss:0.52967 validation-auc:0.95131 validation-aucpr:0.95573
[16] validation-logloss:0.52200 validation-auc:0.95163 validation-aucpr:0.95597
[17] validation-logloss:0.51475 validation-auc:0.95194 validation-aucpr:0.95640
[18] validation-logloss:0.50637 validation-auc:0.95498 validation-aucpr:0.96015
[19] validation-logloss:0.50118 validation-auc:0.95488 validation-aucpr:0.96009
[20] validation-logloss:0.49526 validation-auc:0.95476 validation-aucpr:0.95982
[21] validation-logloss:0.48938 validation-auc:0.95430 validation-aucpr:0.95942
[22] validation-logloss:0.48135 validation-auc:0.95583 validation-aucpr:0.96132
[23] validation-logloss:0.47587 validation-auc:0.95569 validation-aucpr:0.96112
[24] validation-logloss:0.47063 validation-auc:0.95557 validation-aucpr:0.96103
[25] validation-logloss:0.46535 validation-auc:0.95558 validation-aucpr:0.96102
[26] validation-logloss:0.45953 validation-auc:0.95600 validation-aucpr:0.96133
[27] validation-logloss:0.45418 validation-auc:0.95630 validation-aucpr:0.96150
[28] validation-logloss:0.44674 validation-auc:0.95741 validation-aucpr:0.96280
[29] validation-logloss:0.44186 validation-auc:0.95735 validation-aucpr:0.96273
[30] validation-logloss:0.43789 validation-auc:0.95724 validation-aucpr:0.96269
[31] validation-logloss:0.43354 validation-auc:0.95731 validation-aucpr:0.96271
[32] validation-logloss:0.42876 validation-auc:0.95736 validation-aucpr:0.96274
[33] validation-logloss:0.42533 validation-auc:0.95717 validation-aucpr:0.96254
[34] validation-logloss:0.41907 validation-auc:0.95782 validation-aucpr:0.96329
[35] validation-logloss:0.41473 validation-auc:0.95794 validation-aucpr:0.96337
[36] validation-logloss:0.41125 validation-auc:0.95783 validation-aucpr:0.96324
[37] validation-logloss:0.40745 validation-auc:0.95794 validation-aucpr:0.96332
[38] validation-logloss:0.40192 validation-auc:0.95830 validation-aucpr:0.96374
[39] validation-logloss:0.39675 validation-auc:0.95840 validation-aucpr:0.96388
[40] validation-logloss:0.39333 validation-auc:0.95845 validation-aucpr:0.96391
[41] validation-logloss:0.38985 validation-auc:0.95859 validation-aucpr:0.96402
[42] validation-logloss:0.38569 validation-auc:0.95890 validation-aucpr:0.96431
[43] validation-logloss:0.38316 validation-auc:0.95882 validation-aucpr:0.96424
[44] validation-logloss:0.38020 validation-auc:0.95886 validation-aucpr:0.96424
[45] validation-logloss:0.37699 validation-auc:0.95910 validation-aucpr:0.96443
[46] validation-logloss:0.37411 validation-auc:0.95900 validation-aucpr:0.96434
{'best_iteration': '45', 'best_score': '0.9644308472576195'}
Trial 38, Fold 5: Log loss = 0.37410844472793614, Average precision = 0.9643466108012383, ROC-AUC = 0.9589996841498989, Elapsed Time = 4.2861682000002475 seconds
Optimization Progress: 39%|###9 | 39/100 [2:24:34<3:52:14, 228.43s/it]
Trial 39, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 39, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.69226 validation-auc:0.63014 validation-aucpr:0.67987
[1] validation-logloss:0.68619 validation-auc:0.89844 validation-aucpr:0.87978
[2] validation-logloss:0.68015 validation-auc:0.92588 validation-aucpr:0.92816
[3] validation-logloss:0.67398 validation-auc:0.93739 validation-aucpr:0.94171
[4] validation-logloss:0.66769 validation-auc:0.94633 validation-aucpr:0.95256
[5] validation-logloss:0.66270 validation-auc:0.94563 validation-aucpr:0.95211
[6] validation-logloss:0.65710 validation-auc:0.94542 validation-aucpr:0.95179
[7] validation-logloss:0.65143 validation-auc:0.94734 validation-aucpr:0.95300
[8] validation-logloss:0.64719 validation-auc:0.94662 validation-aucpr:0.95258
[9] validation-logloss:0.64204 validation-auc:0.94688 validation-aucpr:0.95271
[10] validation-logloss:0.63700 validation-auc:0.94868 validation-aucpr:0.95488
[11] validation-logloss:0.63243 validation-auc:0.94884 validation-aucpr:0.95534
[12] validation-logloss:0.62740 validation-auc:0.94945 validation-aucpr:0.95623
[13] validation-logloss:0.62268 validation-auc:0.94926 validation-aucpr:0.95599
[14] validation-logloss:0.61683 validation-auc:0.95290 validation-aucpr:0.95988
[15] validation-logloss:0.61203 validation-auc:0.95280 validation-aucpr:0.95976
[16] validation-logloss:0.60998 validation-auc:0.95253 validation-aucpr:0.95944
[17] validation-logloss:0.60539 validation-auc:0.95241 validation-aucpr:0.95924
[18] validation-logloss:0.60083 validation-auc:0.95325 validation-aucpr:0.96025
[19] validation-logloss:0.59550 validation-auc:0.95373 validation-aucpr:0.96071
[20] validation-logloss:0.59187 validation-auc:0.95387 validation-aucpr:0.96089
[21] validation-logloss:0.58728 validation-auc:0.95381 validation-aucpr:0.96070
[22] validation-logloss:0.58309 validation-auc:0.95427 validation-aucpr:0.96106
[23] validation-logloss:0.57867 validation-auc:0.95426 validation-aucpr:0.96112
[24] validation-logloss:0.57463 validation-auc:0.95403 validation-aucpr:0.96094
[25] validation-logloss:0.57059 validation-auc:0.95388 validation-aucpr:0.96077
[26] validation-logloss:0.56706 validation-auc:0.95368 validation-aucpr:0.96059
[27] validation-logloss:0.56328 validation-auc:0.95333 validation-aucpr:0.96026
[28] validation-logloss:0.55978 validation-auc:0.95341 validation-aucpr:0.96029
[29] validation-logloss:0.55624 validation-auc:0.95323 validation-aucpr:0.96015
[30] validation-logloss:0.55136 validation-auc:0.95472 validation-aucpr:0.96176
[31] validation-logloss:0.54812 validation-auc:0.95459 validation-aucpr:0.96163
[32] validation-logloss:0.54459 validation-auc:0.95463 validation-aucpr:0.96168
[33] validation-logloss:0.53999 validation-auc:0.95551 validation-aucpr:0.96262
[34] validation-logloss:0.53558 validation-auc:0.95607 validation-aucpr:0.96315
{'best_iteration': '34', 'best_score': '0.9631540251948735'}
Trial 39, Fold 1: Log loss = 0.5355809136638554, Average precision = 0.9631595856202696, ROC-AUC = 0.9560693016542158, Elapsed Time = 2.151727899999969 seconds
Trial 39, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 39, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.69183 validation-auc:0.62838 validation-aucpr:0.65085
[1] validation-logloss:0.68522 validation-auc:0.91819 validation-aucpr:0.88707
[2] validation-logloss:0.67930 validation-auc:0.93380 validation-aucpr:0.92814
[3] validation-logloss:0.67332 validation-auc:0.93813 validation-aucpr:0.93811
[4] validation-logloss:0.66756 validation-auc:0.94129 validation-aucpr:0.94215
[5] validation-logloss:0.66199 validation-auc:0.94103 validation-aucpr:0.94281
[6] validation-logloss:0.65615 validation-auc:0.94408 validation-aucpr:0.94565
[7] validation-logloss:0.65107 validation-auc:0.94330 validation-aucpr:0.94496
[8] validation-logloss:0.64544 validation-auc:0.94814 validation-aucpr:0.95118
[9] validation-logloss:0.64058 validation-auc:0.94744 validation-aucpr:0.95039
[10] validation-logloss:0.63569 validation-auc:0.94655 validation-aucpr:0.94948
[11] validation-logloss:0.63155 validation-auc:0.94802 validation-aucpr:0.95053
[12] validation-logloss:0.62661 validation-auc:0.94878 validation-aucpr:0.95132
[13] validation-logloss:0.62266 validation-auc:0.94955 validation-aucpr:0.95189
[14] validation-logloss:0.61744 validation-auc:0.95020 validation-aucpr:0.95256
[15] validation-logloss:0.61261 validation-auc:0.95021 validation-aucpr:0.95271
[16] validation-logloss:0.60739 validation-auc:0.95347 validation-aucpr:0.95646
[17] validation-logloss:0.60288 validation-auc:0.95307 validation-aucpr:0.95631
[18] validation-logloss:0.59799 validation-auc:0.95324 validation-aucpr:0.95624
[19] validation-logloss:0.59362 validation-auc:0.95318 validation-aucpr:0.95621
[20] validation-logloss:0.58867 validation-auc:0.95502 validation-aucpr:0.95846
[21] validation-logloss:0.58432 validation-auc:0.95498 validation-aucpr:0.95835
[22] validation-logloss:0.58034 validation-auc:0.95496 validation-aucpr:0.95829
[23] validation-logloss:0.57656 validation-auc:0.95457 validation-aucpr:0.95789
[24] validation-logloss:0.57325 validation-auc:0.95445 validation-aucpr:0.95795
[25] validation-logloss:0.56989 validation-auc:0.95414 validation-aucpr:0.95760
[26] validation-logloss:0.56605 validation-auc:0.95402 validation-aucpr:0.95744
[27] validation-logloss:0.56203 validation-auc:0.95412 validation-aucpr:0.95745
[28] validation-logloss:0.55888 validation-auc:0.95443 validation-aucpr:0.95764
[29] validation-logloss:0.55520 validation-auc:0.95454 validation-aucpr:0.95777
[30] validation-logloss:0.55145 validation-auc:0.95470 validation-aucpr:0.95789
[31] validation-logloss:0.54783 validation-auc:0.95467 validation-aucpr:0.95783
[32] validation-logloss:0.54450 validation-auc:0.95466 validation-aucpr:0.95759
[33] validation-logloss:0.54120 validation-auc:0.95454 validation-aucpr:0.95752
[34] validation-logloss:0.53651 validation-auc:0.95581 validation-aucpr:0.95902
{'best_iteration': '34', 'best_score': '0.9590245603636176'}
Trial 39, Fold 2: Log loss = 0.5365051738060637, Average precision = 0.9590721938204927, ROC-AUC = 0.9558121071060149, Elapsed Time = 2.1664930999977514 seconds
Trial 39, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 39, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.69216 validation-auc:0.61740 validation-aucpr:0.65215
[1] validation-logloss:0.68566 validation-auc:0.86971 validation-aucpr:0.81245
[2] validation-logloss:0.68091 validation-auc:0.91964 validation-aucpr:0.91568
[3] validation-logloss:0.67510 validation-auc:0.92775 validation-aucpr:0.92560
[4] validation-logloss:0.66989 validation-auc:0.93403 validation-aucpr:0.93147
[5] validation-logloss:0.66449 validation-auc:0.93752 validation-aucpr:0.93762
[6] validation-logloss:0.65879 validation-auc:0.94044 validation-aucpr:0.94151
[7] validation-logloss:0.65311 validation-auc:0.94236 validation-aucpr:0.94400
[8] validation-logloss:0.64733 validation-auc:0.94527 validation-aucpr:0.94700
[9] validation-logloss:0.64229 validation-auc:0.94581 validation-aucpr:0.94753
[10] validation-logloss:0.63729 validation-auc:0.94573 validation-aucpr:0.94738
[11] validation-logloss:0.63265 validation-auc:0.94497 validation-aucpr:0.94660
[12] validation-logloss:0.62648 validation-auc:0.95397 validation-aucpr:0.95737
[13] validation-logloss:0.62120 validation-auc:0.95544 validation-aucpr:0.95867
[14] validation-logloss:0.61602 validation-auc:0.95613 validation-aucpr:0.95968
[15] validation-logloss:0.61144 validation-auc:0.95604 validation-aucpr:0.95994
[16] validation-logloss:0.60563 validation-auc:0.95853 validation-aucpr:0.96229
[17] validation-logloss:0.60105 validation-auc:0.95813 validation-aucpr:0.96182
[18] validation-logloss:0.59685 validation-auc:0.95805 validation-aucpr:0.96171
[19] validation-logloss:0.59245 validation-auc:0.95819 validation-aucpr:0.96223
[20] validation-logloss:0.58849 validation-auc:0.95778 validation-aucpr:0.96192
[21] validation-logloss:0.58433 validation-auc:0.95738 validation-aucpr:0.96139
[22] validation-logloss:0.58010 validation-auc:0.95719 validation-aucpr:0.96118
[23] validation-logloss:0.57543 validation-auc:0.95789 validation-aucpr:0.96197
[24] validation-logloss:0.57184 validation-auc:0.95796 validation-aucpr:0.96218
[25] validation-logloss:0.56730 validation-auc:0.95900 validation-aucpr:0.96386
[26] validation-logloss:0.56301 validation-auc:0.95894 validation-aucpr:0.96379
[27] validation-logloss:0.55920 validation-auc:0.95895 validation-aucpr:0.96379
[28] validation-logloss:0.55554 validation-auc:0.95877 validation-aucpr:0.96364
[29] validation-logloss:0.55210 validation-auc:0.95848 validation-aucpr:0.96330
[30] validation-logloss:0.54750 validation-auc:0.95914 validation-aucpr:0.96416
[31] validation-logloss:0.54393 validation-auc:0.95902 validation-aucpr:0.96397
[32] validation-logloss:0.54075 validation-auc:0.95900 validation-aucpr:0.96390
[33] validation-logloss:0.53771 validation-auc:0.95873 validation-aucpr:0.96357
[34] validation-logloss:0.53390 validation-auc:0.95867 validation-aucpr:0.96350
{'best_iteration': '30', 'best_score': '0.9641562408617427'}
Trial 39, Fold 3: Log loss = 0.5338974948694302, Average precision = 0.9635049136418115, ROC-AUC = 0.9586714227928076, Elapsed Time = 1.9661199000001943 seconds
Trial 39, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 39, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.69196 validation-auc:0.62449 validation-aucpr:0.66023
[1] validation-logloss:0.68575 validation-auc:0.90475 validation-aucpr:0.88475
[2] validation-logloss:0.68009 validation-auc:0.92448 validation-aucpr:0.91614
[3] validation-logloss:0.67503 validation-auc:0.92851 validation-aucpr:0.92872
[4] validation-logloss:0.66948 validation-auc:0.93151 validation-aucpr:0.93417
[5] validation-logloss:0.66433 validation-auc:0.93189 validation-aucpr:0.93452
[6] validation-logloss:0.65829 validation-auc:0.93704 validation-aucpr:0.94134
[7] validation-logloss:0.65246 validation-auc:0.94025 validation-aucpr:0.94543
[8] validation-logloss:0.64834 validation-auc:0.94096 validation-aucpr:0.94616
[9] validation-logloss:0.64347 validation-auc:0.94169 validation-aucpr:0.94701
[10] validation-logloss:0.63824 validation-auc:0.94215 validation-aucpr:0.94760
[11] validation-logloss:0.63301 validation-auc:0.94292 validation-aucpr:0.94808
[12] validation-logloss:0.62816 validation-auc:0.94324 validation-aucpr:0.94850
[13] validation-logloss:0.62360 validation-auc:0.94278 validation-aucpr:0.94809
[14] validation-logloss:0.61860 validation-auc:0.94390 validation-aucpr:0.94929
[15] validation-logloss:0.61283 validation-auc:0.94953 validation-aucpr:0.95586
[16] validation-logloss:0.60798 validation-auc:0.95062 validation-aucpr:0.95677
[17] validation-logloss:0.60376 validation-auc:0.95029 validation-aucpr:0.95645
[18] validation-logloss:0.59899 validation-auc:0.95100 validation-aucpr:0.95702
[19] validation-logloss:0.59477 validation-auc:0.95127 validation-aucpr:0.95716
[20] validation-logloss:0.59084 validation-auc:0.95114 validation-aucpr:0.95692
[21] validation-logloss:0.58635 validation-auc:0.95153 validation-aucpr:0.95726
[22] validation-logloss:0.58247 validation-auc:0.95180 validation-aucpr:0.95739
[23] validation-logloss:0.57815 validation-auc:0.95215 validation-aucpr:0.95748
[24] validation-logloss:0.57444 validation-auc:0.95205 validation-aucpr:0.95738
[25] validation-logloss:0.57106 validation-auc:0.95199 validation-aucpr:0.95744
[26] validation-logloss:0.56674 validation-auc:0.95170 validation-aucpr:0.95698
[27] validation-logloss:0.56320 validation-auc:0.95158 validation-aucpr:0.95697
[28] validation-logloss:0.55932 validation-auc:0.95203 validation-aucpr:0.95747
[29] validation-logloss:0.55530 validation-auc:0.95201 validation-aucpr:0.95732
[30] validation-logloss:0.55240 validation-auc:0.95193 validation-aucpr:0.95721
[31] validation-logloss:0.54903 validation-auc:0.95176 validation-aucpr:0.95697
[32] validation-logloss:0.54522 validation-auc:0.95214 validation-aucpr:0.95734
[33] validation-logloss:0.54188 validation-auc:0.95239 validation-aucpr:0.95763
[34] validation-logloss:0.53804 validation-auc:0.95277 validation-aucpr:0.95800
{'best_iteration': '34', 'best_score': '0.9580042989195655'}
Trial 39, Fold 4: Log loss = 0.5380433135204978, Average precision = 0.9580086411722494, ROC-AUC = 0.9527683444566049, Elapsed Time = 1.9444853999993938 seconds
Trial 39, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 39, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.68732 validation-auc:0.89226 validation-aucpr:0.85523
[1] validation-logloss:0.68097 validation-auc:0.93162 validation-aucpr:0.93449
[2] validation-logloss:0.67451 validation-auc:0.94454 validation-aucpr:0.94874
[3] validation-logloss:0.66929 validation-auc:0.94362 validation-aucpr:0.94765
[4] validation-logloss:0.66235 validation-auc:0.95061 validation-aucpr:0.95586
[5] validation-logloss:0.65705 validation-auc:0.94966 validation-aucpr:0.95443
[6] validation-logloss:0.65287 validation-auc:0.94918 validation-aucpr:0.95490
[7] validation-logloss:0.64715 validation-auc:0.95098 validation-aucpr:0.95671
[8] validation-logloss:0.64217 validation-auc:0.95160 validation-aucpr:0.95717
[9] validation-logloss:0.63721 validation-auc:0.95198 validation-aucpr:0.95740
[10] validation-logloss:0.63220 validation-auc:0.95255 validation-aucpr:0.95795
[11] validation-logloss:0.62710 validation-auc:0.95267 validation-aucpr:0.95814
[12] validation-logloss:0.62116 validation-auc:0.95375 validation-aucpr:0.95939
[13] validation-logloss:0.61702 validation-auc:0.95357 validation-aucpr:0.95912
[14] validation-logloss:0.61229 validation-auc:0.95371 validation-aucpr:0.95920
[15] validation-logloss:0.60752 validation-auc:0.95408 validation-aucpr:0.95947
[16] validation-logloss:0.60282 validation-auc:0.95411 validation-aucpr:0.95929
[17] validation-logloss:0.59803 validation-auc:0.95475 validation-aucpr:0.95973
[18] validation-logloss:0.59392 validation-auc:0.95436 validation-aucpr:0.95932
[19] validation-logloss:0.58845 validation-auc:0.95475 validation-aucpr:0.95990
[20] validation-logloss:0.58439 validation-auc:0.95441 validation-aucpr:0.95961
[21] validation-logloss:0.57988 validation-auc:0.95471 validation-aucpr:0.95984
[22] validation-logloss:0.57582 validation-auc:0.95438 validation-aucpr:0.95944
[23] validation-logloss:0.57088 validation-auc:0.95479 validation-aucpr:0.95987
[24] validation-logloss:0.56769 validation-auc:0.95462 validation-aucpr:0.96014
[25] validation-logloss:0.56391 validation-auc:0.95457 validation-aucpr:0.96006
[26] validation-logloss:0.56038 validation-auc:0.95463 validation-aucpr:0.95986
[27] validation-logloss:0.55691 validation-auc:0.95456 validation-aucpr:0.95976
[28] validation-logloss:0.55317 validation-auc:0.95451 validation-aucpr:0.96033
[29] validation-logloss:0.54919 validation-auc:0.95472 validation-aucpr:0.96047
[30] validation-logloss:0.54521 validation-auc:0.95509 validation-aucpr:0.96065
[31] validation-logloss:0.54210 validation-auc:0.95479 validation-aucpr:0.96037
[32] validation-logloss:0.53876 validation-auc:0.95480 validation-aucpr:0.96033
[33] validation-logloss:0.53487 validation-auc:0.95501 validation-aucpr:0.96040
[34] validation-logloss:0.53123 validation-auc:0.95518 validation-aucpr:0.96049
{'best_iteration': '30', 'best_score': '0.9606467506066156'}
Trial 39, Fold 5: Log loss = 0.53122527454656, Average precision = 0.9604999865216366, ROC-AUC = 0.9551781208776917, Elapsed Time = 1.9197061000013491 seconds
Optimization Progress: 40%|#### | 40/100 [2:24:52<2:45:16, 165.27s/it]
Trial 40, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 40, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.63086 validation-auc:0.92500 validation-aucpr:0.89044
[1] validation-logloss:0.58262 validation-auc:0.95148 validation-aucpr:0.93704
[2] validation-logloss:0.53810 validation-auc:0.95822 validation-aucpr:0.95171
[3] validation-logloss:0.49924 validation-auc:0.96113 validation-aucpr:0.95731
[4] validation-logloss:0.46479 validation-auc:0.96298 validation-aucpr:0.95741
[5] validation-logloss:0.43537 validation-auc:0.96380 validation-aucpr:0.96024
[6] validation-logloss:0.40962 validation-auc:0.96541 validation-aucpr:0.96554
[7] validation-logloss:0.38675 validation-auc:0.96655 validation-aucpr:0.97040
[8] validation-logloss:0.36661 validation-auc:0.96709 validation-aucpr:0.97055
[9] validation-logloss:0.34893 validation-auc:0.96723 validation-aucpr:0.97031
[20:23:46] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[10] validation-logloss:0.33281 validation-auc:0.96752 validation-aucpr:0.97008
[20:23:46] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[11] validation-logloss:0.31876 validation-auc:0.96801 validation-aucpr:0.97031
[20:23:46] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[12] validation-logloss:0.30626 validation-auc:0.96853 validation-aucpr:0.97081
[20:23:46] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[13] validation-logloss:0.29482 validation-auc:0.96873 validation-aucpr:0.97098
[20:23:46] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[14] validation-logloss:0.28462 validation-auc:0.96877 validation-aucpr:0.97108
[20:23:46] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[15] validation-logloss:0.27554 validation-auc:0.96882 validation-aucpr:0.97108
[20:23:47] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[16] validation-logloss:0.26827 validation-auc:0.96887 validation-aucpr:0.97061
[20:23:47] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[17] validation-logloss:0.26081 validation-auc:0.96929 validation-aucpr:0.97242
[20:23:47] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[18] validation-logloss:0.25427 validation-auc:0.96958 validation-aucpr:0.97268
[20:23:47] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[19] validation-logloss:0.24872 validation-auc:0.96971 validation-aucpr:0.97294
[20:23:47] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[20] validation-logloss:0.24336 validation-auc:0.96967 validation-aucpr:0.97285
[20:23:47] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[21] validation-logloss:0.23875 validation-auc:0.96958 validation-aucpr:0.97293
[20:23:47] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[22] validation-logloss:0.23440 validation-auc:0.96957 validation-aucpr:0.97291
[20:23:47] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[23] validation-logloss:0.23050 validation-auc:0.96972 validation-aucpr:0.97297
[20:23:47] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[24] validation-logloss:0.22743 validation-auc:0.96960 validation-aucpr:0.97235
[20:23:48] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[25] validation-logloss:0.22443 validation-auc:0.96979 validation-aucpr:0.97322
[20:23:48] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[26] validation-logloss:0.22213 validation-auc:0.96969 validation-aucpr:0.97311
[20:23:48] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[27] validation-logloss:0.21927 validation-auc:0.97009 validation-aucpr:0.97401
[20:23:48] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[28] validation-logloss:0.21714 validation-auc:0.97031 validation-aucpr:0.97417
[20:23:48] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[29] validation-logloss:0.21412 validation-auc:0.97078 validation-aucpr:0.97455
[20:23:48] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[30] validation-logloss:0.21255 validation-auc:0.97087 validation-aucpr:0.97457
[20:23:48] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[31] validation-logloss:0.21084 validation-auc:0.97087 validation-aucpr:0.97457
[20:23:49] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[32] validation-logloss:0.20880 validation-auc:0.97104 validation-aucpr:0.97470
[20:23:49] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[33] validation-logloss:0.20764 validation-auc:0.97097 validation-aucpr:0.97464
[20:23:49] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[34] validation-logloss:0.20657 validation-auc:0.97095 validation-aucpr:0.97480
[20:23:49] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[35] validation-logloss:0.20569 validation-auc:0.97096 validation-aucpr:0.97471
[20:23:49] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[36] validation-logloss:0.20429 validation-auc:0.97109 validation-aucpr:0.97478
[20:23:49] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[37] validation-logloss:0.20324 validation-auc:0.97115 validation-aucpr:0.97476
[20:23:49] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[38] validation-logloss:0.20260 validation-auc:0.97114 validation-aucpr:0.97459
[20:23:50] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[39] validation-logloss:0.20206 validation-auc:0.97119 validation-aucpr:0.97455
[20:23:50] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[40] validation-logloss:0.20156 validation-auc:0.97118 validation-aucpr:0.97451
[20:23:50] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[41] validation-logloss:0.20090 validation-auc:0.97117 validation-aucpr:0.97434
[20:23:50] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[42] validation-logloss:0.20026 validation-auc:0.97142 validation-aucpr:0.97565
[20:23:50] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[43] validation-logloss:0.19967 validation-auc:0.97146 validation-aucpr:0.97565
[20:23:51] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[44] validation-logloss:0.19959 validation-auc:0.97144 validation-aucpr:0.97563
[20:23:51] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[45] validation-logloss:0.19928 validation-auc:0.97149 validation-aucpr:0.97568
[20:23:51] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[46] validation-logloss:0.19988 validation-auc:0.97127 validation-aucpr:0.97548
[20:23:51] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[47] validation-logloss:0.19996 validation-auc:0.97125 validation-aucpr:0.97542
[20:23:51] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[48] validation-logloss:0.19972 validation-auc:0.97137 validation-aucpr:0.97557
[20:23:52] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[49] validation-logloss:0.20021 validation-auc:0.97123 validation-aucpr:0.97545
[20:23:52] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[50] validation-logloss:0.20051 validation-auc:0.97113 validation-aucpr:0.97529
[20:23:52] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[51] validation-logloss:0.20041 validation-auc:0.97122 validation-aucpr:0.97535
[20:23:52] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[52] validation-logloss:0.20049 validation-auc:0.97120 validation-aucpr:0.97530
[20:23:52] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[53] validation-logloss:0.20077 validation-auc:0.97124 validation-aucpr:0.97537
[20:23:53] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[54] validation-logloss:0.20081 validation-auc:0.97125 validation-aucpr:0.97533
[20:23:53] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[55] validation-logloss:0.20059 validation-auc:0.97143 validation-aucpr:0.97552
[20:23:53] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[56] validation-logloss:0.20097 validation-auc:0.97139 validation-aucpr:0.97547
[20:23:53] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[57] validation-logloss:0.20147 validation-auc:0.97130 validation-aucpr:0.97533
[20:23:54] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[58] validation-logloss:0.20180 validation-auc:0.97125 validation-aucpr:0.97525
[20:23:54] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[59] validation-logloss:0.20167 validation-auc:0.97149 validation-aucpr:0.97556
[20:23:54] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[60] validation-logloss:0.20178 validation-auc:0.97149 validation-aucpr:0.97552
[20:23:54] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[61] validation-logloss:0.20223 validation-auc:0.97151 validation-aucpr:0.97549
[20:23:54] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[62] validation-logloss:0.20266 validation-auc:0.97155 validation-aucpr:0.97556
[20:23:55] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[63] validation-logloss:0.20272 validation-auc:0.97159 validation-aucpr:0.97550
[20:23:55] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[64] validation-logloss:0.20292 validation-auc:0.97164 validation-aucpr:0.97561
[20:23:55] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[65] validation-logloss:0.20348 validation-auc:0.97160 validation-aucpr:0.97567
{'best_iteration': '45', 'best_score': '0.9756752977752717'}
Trial 40, Fold 1: Log loss = 0.20347556267412154, Average precision = 0.9756785766613285, ROC-AUC = 0.9715972162477355, Elapsed Time = 10.003865800001222 seconds
Trial 40, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 40, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0]	validation-logloss:0.62933	validation-auc:0.93299	validation-aucpr:0.89951
[1]	validation-logloss:0.57635	validation-auc:0.95740	validation-aucpr:0.95010
[2]	validation-logloss:0.53071	validation-auc:0.96401	validation-aucpr:0.95772
[3]	validation-logloss:0.49117	validation-auc:0.96651	validation-aucpr:0.96493
[4]	validation-logloss:0.45771	validation-auc:0.96900	validation-aucpr:0.96967
[5]	validation-logloss:0.42846	validation-auc:0.96988	validation-aucpr:0.97048
[6]	validation-logloss:0.40255	validation-auc:0.97074	validation-aucpr:0.97400
[7]	validation-logloss:0.37990	validation-auc:0.97084	validation-aucpr:0.97404
[8]	validation-logloss:0.35973	validation-auc:0.97108	validation-aucpr:0.97430
[9]	validation-logloss:0.34202	validation-auc:0.97155	validation-aucpr:0.97474
[10]	validation-logloss:0.32611	validation-auc:0.97216	validation-aucpr:0.97520
[11]	validation-logloss:0.31223	validation-auc:0.97221	validation-aucpr:0.97525
[12]	validation-logloss:0.29923	validation-auc:0.97264	validation-aucpr:0.97556
[13]	validation-logloss:0.28799	validation-auc:0.97276	validation-aucpr:0.97569
[14]	validation-logloss:0.27745	validation-auc:0.97295	validation-aucpr:0.97581
[15]	validation-logloss:0.26811	validation-auc:0.97308	validation-aucpr:0.97597
[16]	validation-logloss:0.25979	validation-auc:0.97321	validation-aucpr:0.97608
[17]	validation-logloss:0.25219	validation-auc:0.97341	validation-aucpr:0.97624
[18]	validation-logloss:0.24536	validation-auc:0.97338	validation-aucpr:0.97620
[19]	validation-logloss:0.23901	validation-auc:0.97341	validation-aucpr:0.97624
[20]	validation-logloss:0.23343	validation-auc:0.97348	validation-aucpr:0.97628
[21]	validation-logloss:0.22848	validation-auc:0.97342	validation-aucpr:0.97592
[22]	validation-logloss:0.22433	validation-auc:0.97349	validation-aucpr:0.97595
[23]	validation-logloss:0.22092	validation-auc:0.97339	validation-aucpr:0.97589
[24]	validation-logloss:0.21711	validation-auc:0.97353	validation-aucpr:0.97597
[25]	validation-logloss:0.21399	validation-auc:0.97327	validation-aucpr:0.97578
[26]	validation-logloss:0.21080	validation-auc:0.97343	validation-aucpr:0.97592
[27]	validation-logloss:0.20806	validation-auc:0.97339	validation-aucpr:0.97592
[28]	validation-logloss:0.20592	validation-auc:0.97331	validation-aucpr:0.97582
[29]	validation-logloss:0.20447	validation-auc:0.97303	validation-aucpr:0.97564
[30]	validation-logloss:0.20214	validation-auc:0.97325	validation-aucpr:0.97565
[31]	validation-logloss:0.19967	validation-auc:0.97343	validation-aucpr:0.97581
[32]	validation-logloss:0.19832	validation-auc:0.97339	validation-aucpr:0.97576
[33]	validation-logloss:0.19675	validation-auc:0.97343	validation-aucpr:0.97577
[34]	validation-logloss:0.19511	validation-auc:0.97362	validation-aucpr:0.97602
[35]	validation-logloss:0.19399	validation-auc:0.97358	validation-aucpr:0.97600
[36]	validation-logloss:0.19249	validation-auc:0.97365	validation-aucpr:0.97607
[37]	validation-logloss:0.19150	validation-auc:0.97367	validation-aucpr:0.97608
[38]	validation-logloss:0.18998	validation-auc:0.97389	validation-aucpr:0.97626
[39]	validation-logloss:0.18927	validation-auc:0.97370	validation-aucpr:0.97602
[40]	validation-logloss:0.18866	validation-auc:0.97367	validation-aucpr:0.97592
[41]	validation-logloss:0.18808	validation-auc:0.97368	validation-aucpr:0.97597
[42]	validation-logloss:0.18730	validation-auc:0.97377	validation-aucpr:0.97603
[43]	validation-logloss:0.18624	validation-auc:0.97389	validation-aucpr:0.97612
[44]	validation-logloss:0.18553	validation-auc:0.97384	validation-aucpr:0.97593
[45]	validation-logloss:0.18510	validation-auc:0.97384	validation-aucpr:0.97585
[46]	validation-logloss:0.18454	validation-auc:0.97395	validation-aucpr:0.97593
[47]	validation-logloss:0.18430	validation-auc:0.97390	validation-aucpr:0.97586
[48]	validation-logloss:0.18375	validation-auc:0.97393	validation-aucpr:0.97592
[49]	validation-logloss:0.18336	validation-auc:0.97401	validation-aucpr:0.97589
[50]	validation-logloss:0.18318	validation-auc:0.97401	validation-aucpr:0.97580
[51]	validation-logloss:0.18312	validation-auc:0.97416	validation-aucpr:0.97649
[52]	validation-logloss:0.18314	validation-auc:0.97414	validation-aucpr:0.97645
[53]	validation-logloss:0.18305	validation-auc:0.97405	validation-aucpr:0.97637
[54]	validation-logloss:0.18301	validation-auc:0.97394	validation-aucpr:0.97641
[55]	validation-logloss:0.18250	validation-auc:0.97420	validation-aucpr:0.97640
[56]	validation-logloss:0.18288	validation-auc:0.97422	validation-aucpr:0.97701
[57]	validation-logloss:0.18335	validation-auc:0.97407	validation-aucpr:0.97681
[58]	validation-logloss:0.18342	validation-auc:0.97409	validation-aucpr:0.97677
[59]	validation-logloss:0.18362	validation-auc:0.97402	validation-aucpr:0.97661
[60]	validation-logloss:0.18394	validation-auc:0.97388	validation-aucpr:0.97657
[61]	validation-logloss:0.18416	validation-auc:0.97381	validation-aucpr:0.97653
[62]	validation-logloss:0.18443	validation-auc:0.97375	validation-aucpr:0.97648
[63]	validation-logloss:0.18457	validation-auc:0.97370	validation-aucpr:0.97646
[64]	validation-logloss:0.18491	validation-auc:0.97366	validation-aucpr:0.97626
[65] validation-logloss:0.18488 validation-auc:0.97371 validation-aucpr:0.97642
{'best_iteration': '56', 'best_score': '0.9770064115831377'}
Trial 40, Fold 2: Log loss = 0.18488184562218168, Average precision = 0.9764291748176318, ROC-AUC = 0.9737075716985524, Elapsed Time = 10.05679930000042 seconds
Trial 40, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 40, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.62968 validation-auc:0.92847 validation-aucpr:0.88851
[1] validation-logloss:0.57559 validation-auc:0.95534 validation-aucpr:0.92830
[2] validation-logloss:0.53082 validation-auc:0.96108 validation-aucpr:0.94587
[3] validation-logloss:0.49208 validation-auc:0.96333 validation-aucpr:0.95202
[4] validation-logloss:0.45802 validation-auc:0.96630 validation-aucpr:0.96477
[5] validation-logloss:0.42909 validation-auc:0.96659 validation-aucpr:0.96669
[6] validation-logloss:0.40297 validation-auc:0.96826 validation-aucpr:0.97081
[7] validation-logloss:0.38010 validation-auc:0.96913 validation-aucpr:0.97153
[8] validation-logloss:0.36031 validation-auc:0.96964 validation-aucpr:0.97258
[9] validation-logloss:0.34277 validation-auc:0.96987 validation-aucpr:0.97347
[10] validation-logloss:0.32704 validation-auc:0.96995 validation-aucpr:0.97257
[11] validation-logloss:0.31287 validation-auc:0.97034 validation-aucpr:0.97299
[12] validation-logloss:0.30041 validation-auc:0.97057 validation-aucpr:0.97432
[13] validation-logloss:0.28884 validation-auc:0.97076 validation-aucpr:0.97447
[14] validation-logloss:0.27891 validation-auc:0.97067 validation-aucpr:0.97403
[15] validation-logloss:0.27020 validation-auc:0.97053 validation-aucpr:0.97397
[16] validation-logloss:0.26189 validation-auc:0.97048 validation-aucpr:0.97396
[17] validation-logloss:0.25574 validation-auc:0.97030 validation-aucpr:0.97375
[18] validation-logloss:0.24918 validation-auc:0.97039 validation-aucpr:0.97379
[19] validation-logloss:0.24247 validation-auc:0.97063 validation-aucpr:0.97363
[20] validation-logloss:0.23732 validation-auc:0.97066 validation-aucpr:0.97362
[21] validation-logloss:0.23235 validation-auc:0.97082 validation-aucpr:0.97385
[22] validation-logloss:0.22836 validation-auc:0.97068 validation-aucpr:0.97372
[23] validation-logloss:0.22460 validation-auc:0.97064 validation-aucpr:0.97356
[24] validation-logloss:0.22082 validation-auc:0.97085 validation-aucpr:0.97371
[25] validation-logloss:0.21764 validation-auc:0.97089 validation-aucpr:0.97371
[26] validation-logloss:0.21469 validation-auc:0.97097 validation-aucpr:0.97379
[27] validation-logloss:0.21218 validation-auc:0.97114 validation-aucpr:0.97386
[28] validation-logloss:0.20991 validation-auc:0.97135 validation-aucpr:0.97407
[29] validation-logloss:0.20801 validation-auc:0.97128 validation-aucpr:0.97400
[30] validation-logloss:0.20642 validation-auc:0.97116 validation-aucpr:0.97394
[31] validation-logloss:0.20502 validation-auc:0.97119 validation-aucpr:0.97411
[32] validation-logloss:0.20418 validation-auc:0.97090 validation-aucpr:0.97403
[33] validation-logloss:0.20262 validation-auc:0.97097 validation-aucpr:0.97407
[34] validation-logloss:0.20138 validation-auc:0.97107 validation-aucpr:0.97441
[35] validation-logloss:0.20005 validation-auc:0.97127 validation-aucpr:0.97472
[36] validation-logloss:0.19936 validation-auc:0.97120 validation-aucpr:0.97458
[37] validation-logloss:0.19834 validation-auc:0.97131 validation-aucpr:0.97458
[38] validation-logloss:0.19748 validation-auc:0.97142 validation-aucpr:0.97461
[39] validation-logloss:0.19682 validation-auc:0.97142 validation-aucpr:0.97437
[40] validation-logloss:0.19620 validation-auc:0.97147 validation-aucpr:0.97452
[41] validation-logloss:0.19542 validation-auc:0.97158 validation-aucpr:0.97465
[42] validation-logloss:0.19514 validation-auc:0.97156 validation-aucpr:0.97469
[43] validation-logloss:0.19438 validation-auc:0.97169 validation-aucpr:0.97482
[44] validation-logloss:0.19381 validation-auc:0.97180 validation-aucpr:0.97483
[45] validation-logloss:0.19365 validation-auc:0.97192 validation-aucpr:0.97514
[46] validation-logloss:0.19326 validation-auc:0.97204 validation-aucpr:0.97521
[47] validation-logloss:0.19287 validation-auc:0.97212 validation-aucpr:0.97524
[48] validation-logloss:0.19252 validation-auc:0.97218 validation-aucpr:0.97544
[49] validation-logloss:0.19272 validation-auc:0.97224 validation-aucpr:0.97587
[50] validation-logloss:0.19284 validation-auc:0.97218 validation-aucpr:0.97577
[51] validation-logloss:0.19275 validation-auc:0.97219 validation-aucpr:0.97570
[52] validation-logloss:0.19298 validation-auc:0.97213 validation-aucpr:0.97566
[53] validation-logloss:0.19312 validation-auc:0.97212 validation-aucpr:0.97560
[54] validation-logloss:0.19317 validation-auc:0.97208 validation-aucpr:0.97554
[55] validation-logloss:0.19329 validation-auc:0.97209 validation-aucpr:0.97555
[56] validation-logloss:0.19391 validation-auc:0.97190 validation-aucpr:0.97515
[57] validation-logloss:0.19401 validation-auc:0.97205 validation-aucpr:0.97550
[58] validation-logloss:0.19429 validation-auc:0.97203 validation-aucpr:0.97545
[59] validation-logloss:0.19472 validation-auc:0.97192 validation-aucpr:0.97526
[60] validation-logloss:0.19485 validation-auc:0.97195 validation-aucpr:0.97534
[61] validation-logloss:0.19518 validation-auc:0.97198 validation-aucpr:0.97548
[62] validation-logloss:0.19588 validation-auc:0.97180 validation-aucpr:0.97537
[63] validation-logloss:0.19550 validation-auc:0.97199 validation-aucpr:0.97541
[64] validation-logloss:0.19632 validation-auc:0.97184 validation-aucpr:0.97521
[65] validation-logloss:0.19636 validation-auc:0.97199 validation-aucpr:0.97540
{'best_iteration': '49', 'best_score': '0.9758689164852605'}
Trial 40, Fold 3: Log loss = 0.1963552571288972, Average precision = 0.9754003522438767, ROC-AUC = 0.9719943796891102, Elapsed Time = 10.401604899998347 seconds
Trial 40, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 40, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[20:24:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[0] validation-logloss:0.62979 validation-auc:0.92368 validation-aucpr:0.88066
[20:24:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[1] validation-logloss:0.57616 validation-auc:0.95172 validation-aucpr:0.93060
[20:24:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[2] validation-logloss:0.53068 validation-auc:0.96166 validation-aucpr:0.94972
[20:24:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[3] validation-logloss:0.49535 validation-auc:0.96485 validation-aucpr:0.96724
[20:24:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[4] validation-logloss:0.46143 validation-auc:0.96570 validation-aucpr:0.96976
[20:24:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[5] validation-logloss:0.43121 validation-auc:0.96641 validation-aucpr:0.96948
[20:24:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[6] validation-logloss:0.40529 validation-auc:0.96674 validation-aucpr:0.97005
[20:24:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[7] validation-logloss:0.38467 validation-auc:0.96644 validation-aucpr:0.97005
[20:24:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[8] validation-logloss:0.36431 validation-auc:0.96686 validation-aucpr:0.97039
[20:24:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[9] validation-logloss:0.34664 validation-auc:0.96729 validation-aucpr:0.97045
[20:24:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[10] validation-logloss:0.33041 validation-auc:0.96759 validation-aucpr:0.97054
[20:24:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[11] validation-logloss:0.31703 validation-auc:0.96865 validation-aucpr:0.97089
[20:24:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[12] validation-logloss:0.30367 validation-auc:0.96933 validation-aucpr:0.97101
[20:24:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[13] validation-logloss:0.29249 validation-auc:0.96944 validation-aucpr:0.97073
[20:24:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[14] validation-logloss:0.28241 validation-auc:0.96939 validation-aucpr:0.96966
[20:24:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[15] validation-logloss:0.27276 validation-auc:0.96989 validation-aucpr:0.97023
[20:24:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[16] validation-logloss:0.26428 validation-auc:0.96987 validation-aucpr:0.97029
[20:24:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[17] validation-logloss:0.25633 validation-auc:0.97044 validation-aucpr:0.97060
[20:24:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[18] validation-logloss:0.24953 validation-auc:0.97050 validation-aucpr:0.97066
[20:24:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[19] validation-logloss:0.24389 validation-auc:0.97061 validation-aucpr:0.97073
[20:24:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[20] validation-logloss:0.23843 validation-auc:0.97060 validation-aucpr:0.97075
[20:24:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[21] validation-logloss:0.23385 validation-auc:0.97044 validation-aucpr:0.97065
[20:24:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[22] validation-logloss:0.22903 validation-auc:0.97076 validation-aucpr:0.97089
[20:24:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[23] validation-logloss:0.22443 validation-auc:0.97106 validation-aucpr:0.97111
[20:24:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[24] validation-logloss:0.22085 validation-auc:0.97117 validation-aucpr:0.97109
[20:24:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[25] validation-logloss:0.21764 validation-auc:0.97137 validation-aucpr:0.97167
[20:24:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[26] validation-logloss:0.21432 validation-auc:0.97143 validation-aucpr:0.97169
[20:24:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[27] validation-logloss:0.21143 validation-auc:0.97150 validation-aucpr:0.97194
[20:24:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[28] validation-logloss:0.20944 validation-auc:0.97131 validation-aucpr:0.97181
[20:24:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[29] validation-logloss:0.20740 validation-auc:0.97129 validation-aucpr:0.97179
[20:24:21] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[30] validation-logloss:0.20496 validation-auc:0.97151 validation-aucpr:0.97179
[20:24:21] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[31] validation-logloss:0.20366 validation-auc:0.97132 validation-aucpr:0.97188
[20:24:21] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[32] validation-logloss:0.20189 validation-auc:0.97167 validation-aucpr:0.97355
[20:24:21] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[33] validation-logloss:0.20018 validation-auc:0.97186 validation-aucpr:0.97366
[20:24:21] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[34] validation-logloss:0.19918 validation-auc:0.97181 validation-aucpr:0.97393
[20:24:21] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[35] validation-logloss:0.19726 validation-auc:0.97214 validation-aucpr:0.97419
[20:24:21] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[36] validation-logloss:0.19609 validation-auc:0.97215 validation-aucpr:0.97392
[20:24:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[37] validation-logloss:0.19531 validation-auc:0.97216 validation-aucpr:0.97410
[20:24:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[38] validation-logloss:0.19406 validation-auc:0.97237 validation-aucpr:0.97408
[39]	validation-logloss:0.19343	validation-auc:0.97241	validation-aucpr:0.97410
[40]	validation-logloss:0.19290	validation-auc:0.97238	validation-aucpr:0.97456
[41]	validation-logloss:0.19252	validation-auc:0.97233	validation-aucpr:0.97439
[42]	validation-logloss:0.19170	validation-auc:0.97267	validation-aucpr:0.97656
[43]	validation-logloss:0.19197	validation-auc:0.97251	validation-aucpr:0.97649
[44]	validation-logloss:0.19176	validation-auc:0.97251	validation-aucpr:0.97645
[45]	validation-logloss:0.19156	validation-auc:0.97247	validation-aucpr:0.97638
[46]	validation-logloss:0.19144	validation-auc:0.97253	validation-aucpr:0.97641
[47]	validation-logloss:0.19106	validation-auc:0.97253	validation-aucpr:0.97641
[48]	validation-logloss:0.19144	validation-auc:0.97244	validation-aucpr:0.97643
[49]	validation-logloss:0.19094	validation-auc:0.97255	validation-aucpr:0.97652
[50]	validation-logloss:0.19094	validation-auc:0.97258	validation-aucpr:0.97664
[51]	validation-logloss:0.19141	validation-auc:0.97246	validation-aucpr:0.97653
[52]	validation-logloss:0.19122	validation-auc:0.97247	validation-aucpr:0.97656
[53]	validation-logloss:0.19111	validation-auc:0.97246	validation-aucpr:0.97657
[54]	validation-logloss:0.19121	validation-auc:0.97246	validation-aucpr:0.97657
[55]	validation-logloss:0.19154	validation-auc:0.97240	validation-aucpr:0.97652
[56]	validation-logloss:0.19184	validation-auc:0.97234	validation-aucpr:0.97647
[57]	validation-logloss:0.19200	validation-auc:0.97231	validation-aucpr:0.97646
[58]	validation-logloss:0.19232	validation-auc:0.97227	validation-aucpr:0.97644
[59]	validation-logloss:0.19242	validation-auc:0.97230	validation-aucpr:0.97644
[60]	validation-logloss:0.19260	validation-auc:0.97233	validation-aucpr:0.97645
[61]	validation-logloss:0.19250	validation-auc:0.97232	validation-aucpr:0.97645
[62]	validation-logloss:0.19268	validation-auc:0.97230	validation-aucpr:0.97645
[63]	validation-logloss:0.19290	validation-auc:0.97236	validation-aucpr:0.97653
[64]	validation-logloss:0.19347	validation-auc:0.97233	validation-aucpr:0.97649
[65] validation-logloss:0.19370 validation-auc:0.97231 validation-aucpr:0.97643
{'best_iteration': '50', 'best_score': '0.9766366978311805'}
Trial 40, Fold 4: Log loss = 0.19370183926702098, Average precision = 0.9764367725265992, ROC-AUC = 0.9723107434351237, Elapsed Time = 10.073183600001357 seconds
Trial 40, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 40, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0]	validation-logloss:0.63070	validation-auc:0.91775	validation-aucpr:0.88462
[1]	validation-logloss:0.57687	validation-auc:0.95126	validation-aucpr:0.92715
[2]	validation-logloss:0.53179	validation-auc:0.96085	validation-aucpr:0.95436
[3]	validation-logloss:0.49368	validation-auc:0.96162	validation-aucpr:0.95762
[4]	validation-logloss:0.45990	validation-auc:0.96465	validation-aucpr:0.96513
[5]	validation-logloss:0.43151	validation-auc:0.96584	validation-aucpr:0.96723
[6]	validation-logloss:0.40641	validation-auc:0.96743	validation-aucpr:0.97168
[7]	validation-logloss:0.38427	validation-auc:0.96770	validation-aucpr:0.97184
[8]	validation-logloss:0.36531	validation-auc:0.96746	validation-aucpr:0.97163
[9]	validation-logloss:0.34792	validation-auc:0.96732	validation-aucpr:0.97018
[10]	validation-logloss:0.33225	validation-auc:0.96775	validation-aucpr:0.97056
[11]	validation-logloss:0.31820	validation-auc:0.96843	validation-aucpr:0.97095
[12]	validation-logloss:0.30570	validation-auc:0.96908	validation-aucpr:0.97274
[13]	validation-logloss:0.29523	validation-auc:0.96941	validation-aucpr:0.97297
[14]	validation-logloss:0.28562	validation-auc:0.96956	validation-aucpr:0.97314
[15]	validation-logloss:0.27628	validation-auc:0.96959	validation-aucpr:0.97322
[16]	validation-logloss:0.26786	validation-auc:0.97016	validation-aucpr:0.97375
[17]	validation-logloss:0.26049	validation-auc:0.97005	validation-aucpr:0.97331
[18]	validation-logloss:0.25419	validation-auc:0.96992	validation-aucpr:0.97318
[19]	validation-logloss:0.24805	validation-auc:0.97006	validation-aucpr:0.97332
[20]	validation-logloss:0.24329	validation-auc:0.97002	validation-aucpr:0.97332
[21]	validation-logloss:0.23852	validation-auc:0.97006	validation-aucpr:0.97339
[22]	validation-logloss:0.23398	validation-auc:0.97003	validation-aucpr:0.97341
[23]	validation-logloss:0.23034	validation-auc:0.97003	validation-aucpr:0.97337
[24]	validation-logloss:0.22673	validation-auc:0.97034	validation-aucpr:0.97352
[25]	validation-logloss:0.22371	validation-auc:0.97046	validation-aucpr:0.97350
[26]	validation-logloss:0.22108	validation-auc:0.97055	validation-aucpr:0.97346
[27]	validation-logloss:0.21870	validation-auc:0.97071	validation-aucpr:0.97361
[28]	validation-logloss:0.21629	validation-auc:0.97073	validation-aucpr:0.97366
[29]	validation-logloss:0.21376	validation-auc:0.97093	validation-aucpr:0.97380
[30]	validation-logloss:0.21172	validation-auc:0.97110	validation-aucpr:0.97394
[31]	validation-logloss:0.21004	validation-auc:0.97112	validation-aucpr:0.97398
[32]	validation-logloss:0.20861	validation-auc:0.97105	validation-aucpr:0.97385
[33]	validation-logloss:0.20687	validation-auc:0.97120	validation-aucpr:0.97401
[34]	validation-logloss:0.20634	validation-auc:0.97091	validation-aucpr:0.97377
[35]	validation-logloss:0.20532	validation-auc:0.97099	validation-aucpr:0.97393
[36]	validation-logloss:0.20447	validation-auc:0.97098	validation-aucpr:0.97431
[37]	validation-logloss:0.20382	validation-auc:0.97093	validation-aucpr:0.97429
[38]	validation-logloss:0.20278	validation-auc:0.97099	validation-aucpr:0.97436
[39]	validation-logloss:0.20214	validation-auc:0.97095	validation-aucpr:0.97441
[40]	validation-logloss:0.20184	validation-auc:0.97096	validation-aucpr:0.97464
[41]	validation-logloss:0.20114	validation-auc:0.97102	validation-aucpr:0.97404
[42]	validation-logloss:0.20094	validation-auc:0.97086	validation-aucpr:0.97391
[43]	validation-logloss:0.20071	validation-auc:0.97078	validation-aucpr:0.97380
[44]	validation-logloss:0.20060	validation-auc:0.97073	validation-aucpr:0.97376
[45]	validation-logloss:0.20071	validation-auc:0.97066	validation-aucpr:0.97386
[46]	validation-logloss:0.20065	validation-auc:0.97057	validation-aucpr:0.97381
[47]	validation-logloss:0.20006	validation-auc:0.97071	validation-aucpr:0.97388
[48]	validation-logloss:0.19982	validation-auc:0.97083	validation-aucpr:0.97462
[49]	validation-logloss:0.19978	validation-auc:0.97082	validation-aucpr:0.97459
[50]	validation-logloss:0.20035	validation-auc:0.97060	validation-aucpr:0.97443
[51]	validation-logloss:0.19996	validation-auc:0.97072	validation-aucpr:0.97452
[52]	validation-logloss:0.20049	validation-auc:0.97051	validation-aucpr:0.97431
[53]	validation-logloss:0.20053	validation-auc:0.97068	validation-aucpr:0.97460
[54]	validation-logloss:0.20073	validation-auc:0.97060	validation-aucpr:0.97446
[55]	validation-logloss:0.20136	validation-auc:0.97047	validation-aucpr:0.97442
[56]	validation-logloss:0.20190	validation-auc:0.97035	validation-aucpr:0.97430
[57]	validation-logloss:0.20211	validation-auc:0.97035	validation-aucpr:0.97424
[58]	validation-logloss:0.20246	validation-auc:0.97038	validation-aucpr:0.97427
[59]	validation-logloss:0.20281	validation-auc:0.97041	validation-aucpr:0.97428
[60]	validation-logloss:0.20269	validation-auc:0.97049	validation-aucpr:0.97428
[61]	validation-logloss:0.20249	validation-auc:0.97060	validation-aucpr:0.97435
[62]	validation-logloss:0.20295	validation-auc:0.97063	validation-aucpr:0.97442
[63]	validation-logloss:0.20341	validation-auc:0.97058	validation-aucpr:0.97431
[64]	validation-logloss:0.20377	validation-auc:0.97061	validation-aucpr:0.97424
[65] validation-logloss:0.20425 validation-auc:0.97057 validation-aucpr:0.97414
{'best_iteration': '40', 'best_score': '0.9746363737266491'}
Trial 40, Fold 5: Log loss = 0.20424930220962945, Average precision = 0.9741473211463513, ROC-AUC = 0.9705699050591755, Elapsed Time = 13.895724099998915 seconds
Optimization Progress: 41%|####1 | 41/100 [2:25:54<2:12:09, 134.40s/it]
Trial 41, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 41, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.68331 validation-auc:0.95047 validation-aucpr:0.95393
[1] validation-logloss:0.67501 validation-auc:0.95415 validation-aucpr:0.95984
[2] validation-logloss:0.66635 validation-auc:0.95465 validation-aucpr:0.96120
[3] validation-logloss:0.65800 validation-auc:0.95560 validation-aucpr:0.96189
[4] validation-logloss:0.65015 validation-auc:0.95545 validation-aucpr:0.96146
[5] validation-logloss:0.64237 validation-auc:0.95490 validation-aucpr:0.96079
[6] validation-logloss:0.63338 validation-auc:0.95775 validation-aucpr:0.96387
[7] validation-logloss:0.62638 validation-auc:0.95698 validation-aucpr:0.96304
[8] validation-logloss:0.61959 validation-auc:0.95631 validation-aucpr:0.96217
[9] validation-logloss:0.61143 validation-auc:0.95731 validation-aucpr:0.96141
[10] validation-logloss:0.60467 validation-auc:0.95773 validation-aucpr:0.96159
[11] validation-logloss:0.59794 validation-auc:0.95751 validation-aucpr:0.96142
[12] validation-logloss:0.59150 validation-auc:0.95714 validation-aucpr:0.96107
[13] validation-logloss:0.58518 validation-auc:0.95685 validation-aucpr:0.96086
[14] validation-logloss:0.57757 validation-auc:0.95794 validation-aucpr:0.96208
[15] validation-logloss:0.57045 validation-auc:0.95840 validation-aucpr:0.96264
[16] validation-logloss:0.56431 validation-auc:0.95861 validation-aucpr:0.96285
[17] validation-logloss:0.55851 validation-auc:0.95866 validation-aucpr:0.96288
[18] validation-logloss:0.55287 validation-auc:0.95854 validation-aucpr:0.96279
[19] validation-logloss:0.54760 validation-auc:0.95810 validation-aucpr:0.96242
[20] validation-logloss:0.54115 validation-auc:0.95854 validation-aucpr:0.96284
[21] validation-logloss:0.53599 validation-auc:0.95821 validation-aucpr:0.96253
[22] validation-logloss:0.52952 validation-auc:0.95895 validation-aucpr:0.96315
[23] validation-logloss:0.52320 validation-auc:0.95932 validation-aucpr:0.96358
[24] validation-logloss:0.51706 validation-auc:0.95968 validation-aucpr:0.96391
[25] validation-logloss:0.51138 validation-auc:0.95966 validation-aucpr:0.96395
[26] validation-logloss:0.50640 validation-auc:0.95978 validation-aucpr:0.96405
[27] validation-logloss:0.50172 validation-auc:0.95990 validation-aucpr:0.96411
[28] validation-logloss:0.49741 validation-auc:0.95991 validation-aucpr:0.96404
[29] validation-logloss:0.49232 validation-auc:0.96003 validation-aucpr:0.96425
[30] validation-logloss:0.48809 validation-auc:0.96007 validation-aucpr:0.96424
[31] validation-logloss:0.48318 validation-auc:0.96020 validation-aucpr:0.96490
[32] validation-logloss:0.47804 validation-auc:0.96044 validation-aucpr:0.96511
[33] validation-logloss:0.47418 validation-auc:0.96033 validation-aucpr:0.96501
[34] validation-logloss:0.46926 validation-auc:0.96059 validation-aucpr:0.96528
[35] validation-logloss:0.46435 validation-auc:0.96088 validation-aucpr:0.96559
[36] validation-logloss:0.45970 validation-auc:0.96086 validation-aucpr:0.96559
[37] validation-logloss:0.45593 validation-auc:0.96089 validation-aucpr:0.96559
[38] validation-logloss:0.45131 validation-auc:0.96104 validation-aucpr:0.96574
[39] validation-logloss:0.44726 validation-auc:0.96107 validation-aucpr:0.96579
[40] validation-logloss:0.44277 validation-auc:0.96125 validation-aucpr:0.96602
[41] validation-logloss:0.43859 validation-auc:0.96130 validation-aucpr:0.96609
[42] validation-logloss:0.43545 validation-auc:0.96132 validation-aucpr:0.96611
[43] validation-logloss:0.43150 validation-auc:0.96136 validation-aucpr:0.96617
[44] validation-logloss:0.42759 validation-auc:0.96152 validation-aucpr:0.96822
[45] validation-logloss:0.42366 validation-auc:0.96164 validation-aucpr:0.96832
[46] validation-logloss:0.41974 validation-auc:0.96166 validation-aucpr:0.96838
[47] validation-logloss:0.41686 validation-auc:0.96159 validation-aucpr:0.96831
[48] validation-logloss:0.41391 validation-auc:0.96163 validation-aucpr:0.96832
{'best_iteration': '46', 'best_score': '0.9683846721459073'}
Trial 41, Fold 1: Log loss = 0.4139050734398529, Average precision = 0.9682114877618576, ROC-AUC = 0.9616322954455294, Elapsed Time = 4.119934400001512 seconds
Trial 41, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 41, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.68371 validation-auc:0.94526 validation-aucpr:0.94741
[1] validation-logloss:0.67530 validation-auc:0.95126 validation-aucpr:0.95521
[2] validation-logloss:0.66527 validation-auc:0.95994 validation-aucpr:0.96478
[3] validation-logloss:0.65699 validation-auc:0.95987 validation-aucpr:0.96462
[4] validation-logloss:0.64865 validation-auc:0.96036 validation-aucpr:0.96485
[5] validation-logloss:0.63981 validation-auc:0.96165 validation-aucpr:0.96635
[6] validation-logloss:0.63116 validation-auc:0.96223 validation-aucpr:0.96684
[7] validation-logloss:0.62251 validation-auc:0.96268 validation-aucpr:0.96721
[8] validation-logloss:0.61413 validation-auc:0.96318 validation-aucpr:0.96766
[9] validation-logloss:0.60700 validation-auc:0.96329 validation-aucpr:0.96768
[10] validation-logloss:0.59867 validation-auc:0.96395 validation-aucpr:0.96826
[11] validation-logloss:0.59178 validation-auc:0.96351 validation-aucpr:0.96787
[12] validation-logloss:0.58515 validation-auc:0.96352 validation-aucpr:0.96780
[13] validation-logloss:0.57791 validation-auc:0.96353 validation-aucpr:0.96789
[14] validation-logloss:0.57170 validation-auc:0.96322 validation-aucpr:0.96765
[15] validation-logloss:0.56460 validation-auc:0.96329 validation-aucpr:0.96768
[16] validation-logloss:0.55746 validation-auc:0.96352 validation-aucpr:0.96788
[17] validation-logloss:0.55055 validation-auc:0.96372 validation-aucpr:0.96801
[18] validation-logloss:0.54412 validation-auc:0.96359 validation-aucpr:0.96793
[19] validation-logloss:0.53864 validation-auc:0.96363 validation-aucpr:0.96786
[20] validation-logloss:0.53364 validation-auc:0.96322 validation-aucpr:0.96752
[21] validation-logloss:0.52860 validation-auc:0.96300 validation-aucpr:0.96725
[22] validation-logloss:0.52228 validation-auc:0.96335 validation-aucpr:0.96755
[23] validation-logloss:0.51648 validation-auc:0.96336 validation-aucpr:0.96756
[24] validation-logloss:0.51122 validation-auc:0.96327 validation-aucpr:0.96748
[25] validation-logloss:0.50565 validation-auc:0.96325 validation-aucpr:0.96750
[26] validation-logloss:0.50024 validation-auc:0.96327 validation-aucpr:0.96755
[27] validation-logloss:0.49564 validation-auc:0.96315 validation-aucpr:0.96753
[28] validation-logloss:0.49108 validation-auc:0.96323 validation-aucpr:0.96754
[29] validation-logloss:0.48579 validation-auc:0.96339 validation-aucpr:0.96763
[30] validation-logloss:0.48156 validation-auc:0.96343 validation-aucpr:0.96763
[31] validation-logloss:0.47673 validation-auc:0.96349 validation-aucpr:0.96764
[32] validation-logloss:0.47228 validation-auc:0.96346 validation-aucpr:0.96761
[33] validation-logloss:0.46826 validation-auc:0.96346 validation-aucpr:0.96755
[34] validation-logloss:0.46334 validation-auc:0.96364 validation-aucpr:0.96769
[35] validation-logloss:0.45901 validation-auc:0.96358 validation-aucpr:0.96769
[36] validation-logloss:0.45518 validation-auc:0.96368 validation-aucpr:0.96803
[37] validation-logloss:0.45090 validation-auc:0.96357 validation-aucpr:0.96793
[38] validation-logloss:0.44655 validation-auc:0.96361 validation-aucpr:0.96791
[39] validation-logloss:0.44311 validation-auc:0.96358 validation-aucpr:0.96792
[40] validation-logloss:0.43960 validation-auc:0.96348 validation-aucpr:0.96779
[41] validation-logloss:0.43520 validation-auc:0.96355 validation-aucpr:0.96786
[42] validation-logloss:0.43172 validation-auc:0.96367 validation-aucpr:0.96791
[43] validation-logloss:0.42754 validation-auc:0.96370 validation-aucpr:0.96797
[44] validation-logloss:0.42439 validation-auc:0.96379 validation-aucpr:0.96802
[45] validation-logloss:0.42052 validation-auc:0.96388 validation-aucpr:0.96813
[46] validation-logloss:0.41743 validation-auc:0.96393 validation-aucpr:0.96815
[47] validation-logloss:0.41383 validation-auc:0.96398 validation-aucpr:0.96821
[48] validation-logloss:0.41082 validation-auc:0.96399 validation-aucpr:0.96820
{'best_iteration': '10', 'best_score': '0.9682580044293139'}
Trial 41, Fold 2: Log loss = 0.41081987104648127, Average precision = 0.9681047990157229, ROC-AUC = 0.9639932095900958, Elapsed Time = 4.259771599998203 seconds
Trial 41, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 41, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.68306 validation-auc:0.95170 validation-aucpr:0.95260
... (per-iteration evaluation log for rounds 1-47 omitted) ...
[48] validation-logloss:0.41149 validation-auc:0.96487 validation-aucpr:0.96990
{'best_iteration': '47', 'best_score': '0.9699236434103997'}
Trial 41, Fold 3: Log loss = 0.4114919188101245, Average precision = 0.9698635641105825, ROC-AUC = 0.9648704986949949, Elapsed Time = 4.189911100002064 seconds
Trial 41, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 41, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.68323 validation-auc:0.94846 validation-aucpr:0.95009
... (per-iteration evaluation log for rounds 1-47 omitted) ...
[48] validation-logloss:0.41574 validation-auc:0.96199 validation-aucpr:0.96827
{'best_iteration': '47', 'best_score': '0.968289622878098'}
Trial 41, Fold 4: Log loss = 0.4157433809493005, Average precision = 0.9680932874566976, ROC-AUC = 0.9619874413137145, Elapsed Time = 4.13615320000099 seconds
Trial 41, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 41, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.68440 validation-auc:0.92976 validation-aucpr:0.92645
... (per-iteration evaluation log for rounds 1-47 omitted) ...
[48] validation-logloss:0.41818 validation-auc:0.96047 validation-aucpr:0.96630
{'best_iteration': '48', 'best_score': '0.9662968778042819'}
Trial 41, Fold 5: Log loss = 0.4181768956842521, Average precision = 0.9661507923621591, ROC-AUC = 0.9604730319751779, Elapsed Time = 4.168298500000674 seconds
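The "Train size = ... where 0 = ..., 1 = ..., 0/1 = ..." lines above summarize the class balance of each fold split. A small sketch of producing that summary with `collections.Counter` — the label list below is an illustrative stand-in built to match Fold 3's counts:

```python
from collections import Counter

def class_balance(y):
    """Return (size, n_neg, n_pos, neg/pos ratio) for a binary label sequence."""
    counts = Counter(y)
    n_neg, n_pos = counts[0], counts[1]
    return len(y), n_neg, n_pos, n_neg / n_pos

# Illustrative labels standing in for one fold's training split.
y_train = [0] * 10517 + [1] * 10165
size, n0, n1, ratio = class_balance(y_train)
print(f"Train size = {size} where 0 = {n0}, 1 = {n1}, 0/1 = {ratio}")
# → Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
```

A 0/1 ratio close to 1.0 in every fold confirms the stratified splits kept the classes near-balanced.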
Optimization Progress: 42%|####2 | 42/100 [2:26:24<1:39:34, 103.01s/it]
Trial 42, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 42, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.68113 validation-auc:0.94891 validation-aucpr:0.92942
... (per-iteration evaluation log for rounds 1-71 omitted) ...
[72] validation-logloss:0.31367 validation-auc:0.96972 validation-aucpr:0.97422
{'best_iteration': '58', 'best_score': '0.9743230383233791'}
Trial 42, Fold 1: Log loss = 0.3136689057542144, Average precision = 0.9742187899161252, ROC-AUC = 0.969719808529655, Elapsed Time = 1.9366607000010845 seconds
Trial 42, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 42, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.68136 validation-auc:0.94753 validation-aucpr:0.90845
... (per-iteration evaluation log for rounds 1-71 omitted) ...
[72] validation-logloss:0.31005 validation-auc:0.97089 validation-aucpr:0.97369
{'best_iteration': '72', 'best_score': '0.973692450730977'}
Trial 42, Fold 2: Log loss = 0.3100492448380225, Average precision = 0.9736645921803798, ROC-AUC = 0.9708852146366203, Elapsed Time = 2.2306970999998157 seconds
Trial 42, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 42, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.68110 validation-auc:0.94677 validation-aucpr:0.91224
... (per-iteration evaluation log for rounds 1-71 omitted) ...
[72] validation-logloss:0.31030 validation-auc:0.97119 validation-aucpr:0.97549
{'best_iteration': '72', 'best_score': '0.9754871959768607'}
Trial 42, Fold 3: Log loss = 0.3102979970624399, Average precision = 0.9754888562852468, ROC-AUC = 0.9711920693241902, Elapsed Time = 2.2501545999984955 seconds
Trial 42, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 42, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.68115 validation-auc:0.94672 validation-aucpr:0.93040
... (per-iteration evaluation log for rounds 1-30 omitted) ...
[31] validation-logloss:0.45232 validation-auc:0.96761 validation-aucpr:0.97259
[32] validation-logloss:0.44776 validation-auc:0.96758 validation-aucpr:0.97255
[33] validation-logloss:0.44268 validation-auc:0.96768 validation-aucpr:0.97266
[34] validation-logloss:0.43752 validation-auc:0.96782 validation-aucpr:0.97280
[35] validation-logloss:0.43263 validation-auc:0.96785 validation-aucpr:0.97286
[36] validation-logloss:0.42772 validation-auc:0.96805 validation-aucpr:0.97301
[37] validation-logloss:0.42365 validation-auc:0.96798 validation-aucpr:0.97294
[38] validation-logloss:0.41966 validation-auc:0.96788 validation-aucpr:0.97285
[39] validation-logloss:0.41582 validation-auc:0.96789 validation-aucpr:0.97285
[40] validation-logloss:0.41125 validation-auc:0.96802 validation-aucpr:0.97295
[41] validation-logloss:0.40760 validation-auc:0.96804 validation-aucpr:0.97295
[42] validation-logloss:0.40323 validation-auc:0.96811 validation-aucpr:0.97304
[43] validation-logloss:0.39895 validation-auc:0.96815 validation-aucpr:0.97308
[44] validation-logloss:0.39477 validation-auc:0.96828 validation-aucpr:0.97321
[45] validation-logloss:0.39065 validation-auc:0.96840 validation-aucpr:0.97332
[46] validation-logloss:0.38721 validation-auc:0.96845 validation-aucpr:0.97334
[47] validation-logloss:0.38329 validation-auc:0.96853 validation-aucpr:0.97342
[48] validation-logloss:0.37944 validation-auc:0.96866 validation-aucpr:0.97354
[49] validation-logloss:0.37563 validation-auc:0.96875 validation-aucpr:0.97362
[50] validation-logloss:0.37190 validation-auc:0.96887 validation-aucpr:0.97372
[51] validation-logloss:0.36835 validation-auc:0.96894 validation-aucpr:0.97378
[52] validation-logloss:0.36537 validation-auc:0.96901 validation-aucpr:0.97381
[53] validation-logloss:0.36251 validation-auc:0.96897 validation-aucpr:0.97376
[54] validation-logloss:0.35966 validation-auc:0.96894 validation-aucpr:0.97375
[55] validation-logloss:0.35628 validation-auc:0.96905 validation-aucpr:0.97384
[56] validation-logloss:0.35350 validation-auc:0.96904 validation-aucpr:0.97382
[57] validation-logloss:0.35029 validation-auc:0.96905 validation-aucpr:0.97386
[58] validation-logloss:0.34781 validation-auc:0.96895 validation-aucpr:0.97379
[59] validation-logloss:0.34464 validation-auc:0.96911 validation-aucpr:0.97394
[60] validation-logloss:0.34155 validation-auc:0.96925 validation-aucpr:0.97405
[61] validation-logloss:0.33858 validation-auc:0.96929 validation-aucpr:0.97410
[62] validation-logloss:0.33617 validation-auc:0.96932 validation-aucpr:0.97411
[63] validation-logloss:0.33338 validation-auc:0.96936 validation-aucpr:0.97415
[64] validation-logloss:0.33052 validation-auc:0.96945 validation-aucpr:0.97422
[65] validation-logloss:0.32845 validation-auc:0.96936 validation-aucpr:0.97416
[66] validation-logloss:0.32628 validation-auc:0.96931 validation-aucpr:0.97410
[67] validation-logloss:0.32409 validation-auc:0.96932 validation-aucpr:0.97410
[68] validation-logloss:0.32146 validation-auc:0.96936 validation-aucpr:0.97415
[69] validation-logloss:0.31887 validation-auc:0.96948 validation-aucpr:0.97425
[70] validation-logloss:0.31683 validation-auc:0.96947 validation-aucpr:0.97423
[71] validation-logloss:0.31490 validation-auc:0.96943 validation-aucpr:0.97420
[72] validation-logloss:0.31291 validation-auc:0.96942 validation-aucpr:0.97418
{'best_iteration': '69', 'best_score': '0.9742472920910086'}
Trial 42, Fold 4: Log loss = 0.3129062231328123, Average precision = 0.9741810472294776, ROC-AUC = 0.9694151013016492, Elapsed Time = 2.174041499998566 seconds
Trial 42, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 42, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.68107 validation-auc:0.95236 validation-aucpr:0.94188
[1] validation-logloss:0.66936 validation-auc:0.96419 validation-aucpr:0.96656
[2] validation-logloss:0.65987 validation-auc:0.96348 validation-aucpr:0.96789
[3] validation-logloss:0.64997 validation-auc:0.96523 validation-aucpr:0.96913
[4] validation-logloss:0.64050 validation-auc:0.96514 validation-aucpr:0.96886
[5] validation-logloss:0.63016 validation-auc:0.96583 validation-aucpr:0.96954
[6] validation-logloss:0.62177 validation-auc:0.96545 validation-aucpr:0.96914
[7] validation-logloss:0.61182 validation-auc:0.96661 validation-aucpr:0.97020
[8] validation-logloss:0.60338 validation-auc:0.96661 validation-aucpr:0.97008
[9] validation-logloss:0.59418 validation-auc:0.96700 validation-aucpr:0.97075
[10] validation-logloss:0.58521 validation-auc:0.96728 validation-aucpr:0.97095
[11] validation-logloss:0.57658 validation-auc:0.96764 validation-aucpr:0.97121
[12] validation-logloss:0.56951 validation-auc:0.96722 validation-aucpr:0.97089
[13] validation-logloss:0.56125 validation-auc:0.96745 validation-aucpr:0.97106
[14] validation-logloss:0.55343 validation-auc:0.96740 validation-aucpr:0.97112
[15] validation-logloss:0.54564 validation-auc:0.96751 validation-aucpr:0.97119
[16] validation-logloss:0.53801 validation-auc:0.96765 validation-aucpr:0.97132
[17] validation-logloss:0.53125 validation-auc:0.96762 validation-aucpr:0.97134
[18] validation-logloss:0.52475 validation-auc:0.96760 validation-aucpr:0.97127
[19] validation-logloss:0.51776 validation-auc:0.96781 validation-aucpr:0.97146
[20] validation-logloss:0.51073 validation-auc:0.96806 validation-aucpr:0.97171
[21] validation-logloss:0.50410 validation-auc:0.96813 validation-aucpr:0.97179
[22] validation-logloss:0.49836 validation-auc:0.96804 validation-aucpr:0.97169
[23] validation-logloss:0.49223 validation-auc:0.96803 validation-aucpr:0.97172
[24] validation-logloss:0.48686 validation-auc:0.96796 validation-aucpr:0.97161
[25] validation-logloss:0.48154 validation-auc:0.96785 validation-aucpr:0.97151
[26] validation-logloss:0.47635 validation-auc:0.96768 validation-aucpr:0.97134
[27] validation-logloss:0.47032 validation-auc:0.96790 validation-aucpr:0.97154
[28] validation-logloss:0.46449 validation-auc:0.96821 validation-aucpr:0.97177
[29] validation-logloss:0.45888 validation-auc:0.96828 validation-aucpr:0.97186
[30] validation-logloss:0.45402 validation-auc:0.96822 validation-aucpr:0.97180
[31] validation-logloss:0.44865 validation-auc:0.96825 validation-aucpr:0.97185
[32] validation-logloss:0.44361 validation-auc:0.96837 validation-aucpr:0.97200
[33] validation-logloss:0.43918 validation-auc:0.96836 validation-aucpr:0.97198
[34] validation-logloss:0.43520 validation-auc:0.96834 validation-aucpr:0.97235
[35] validation-logloss:0.43101 validation-auc:0.96834 validation-aucpr:0.97234
[36] validation-logloss:0.42621 validation-auc:0.96841 validation-aucpr:0.97241
[37] validation-logloss:0.42152 validation-auc:0.96849 validation-aucpr:0.97248
[38] validation-logloss:0.41764 validation-auc:0.96840 validation-aucpr:0.97241
[39] validation-logloss:0.41305 validation-auc:0.96862 validation-aucpr:0.97260
[40] validation-logloss:0.40867 validation-auc:0.96865 validation-aucpr:0.97262
[41] validation-logloss:0.40445 validation-auc:0.96874 validation-aucpr:0.97271
[42] validation-logloss:0.40086 validation-auc:0.96860 validation-aucpr:0.97261
[43] validation-logloss:0.39727 validation-auc:0.96851 validation-aucpr:0.97253
[44] validation-logloss:0.39325 validation-auc:0.96864 validation-aucpr:0.97263
[45] validation-logloss:0.38929 validation-auc:0.96874 validation-aucpr:0.97273
[46] validation-logloss:0.38553 validation-auc:0.96885 validation-aucpr:0.97282
[47] validation-logloss:0.38237 validation-auc:0.96871 validation-aucpr:0.97270
[48] validation-logloss:0.37869 validation-auc:0.96879 validation-aucpr:0.97278
[49] validation-logloss:0.37506 validation-auc:0.96891 validation-aucpr:0.97286
[50] validation-logloss:0.37199 validation-auc:0.96888 validation-aucpr:0.97281
[51] validation-logloss:0.36922 validation-auc:0.96878 validation-aucpr:0.97271
[52] validation-logloss:0.36576 validation-auc:0.96892 validation-aucpr:0.97284
[53] validation-logloss:0.36240 validation-auc:0.96896 validation-aucpr:0.97288
[54] validation-logloss:0.35906 validation-auc:0.96909 validation-aucpr:0.97297
[55] validation-logloss:0.35579 validation-auc:0.96923 validation-aucpr:0.97309
[56] validation-logloss:0.35271 validation-auc:0.96928 validation-aucpr:0.97315
[57] validation-logloss:0.35008 validation-auc:0.96921 validation-aucpr:0.97307
[58] validation-logloss:0.34699 validation-auc:0.96931 validation-aucpr:0.97315
[59] validation-logloss:0.34447 validation-auc:0.96934 validation-aucpr:0.97317
[60] validation-logloss:0.34200 validation-auc:0.96934 validation-aucpr:0.97315
[61] validation-logloss:0.33909 validation-auc:0.96945 validation-aucpr:0.97324
[62] validation-logloss:0.33630 validation-auc:0.96952 validation-aucpr:0.97330
[63] validation-logloss:0.33347 validation-auc:0.96967 validation-aucpr:0.97341
[64] validation-logloss:0.33082 validation-auc:0.96975 validation-aucpr:0.97348
[65] validation-logloss:0.32813 validation-auc:0.96983 validation-aucpr:0.97353
[66] validation-logloss:0.32592 validation-auc:0.96981 validation-aucpr:0.97355
[67] validation-logloss:0.32335 validation-auc:0.96990 validation-aucpr:0.97363
[68] validation-logloss:0.32088 validation-auc:0.96992 validation-aucpr:0.97366
[69] validation-logloss:0.31838 validation-auc:0.96998 validation-aucpr:0.97372
[70] validation-logloss:0.31637 validation-auc:0.96990 validation-aucpr:0.97364
[71] validation-logloss:0.31395 validation-auc:0.96999 validation-aucpr:0.97372
[72] validation-logloss:0.31165 validation-auc:0.97001 validation-aucpr:0.97374
{'best_iteration': '72', 'best_score': '0.9737411445974385'}
Trial 42, Fold 5: Log loss = 0.3116536302078631, Average precision = 0.9737242713120442, ROC-AUC = 0.9700108132211136, Elapsed Time = 2.2429255999995803 seconds
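Each per-fold summary line above combines three sklearn metrics computed on the held-out fold's predicted probabilities. A minimal sketch with synthetic labels and scores (the helper name `fold_summary` is an assumption; the notebook's actual reporting code may differ):

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

def fold_summary(trial, fold, y_true, y_prob, elapsed):
    # Score the held-out fold with the three metrics reported in the log.
    ll = log_loss(y_true, y_prob)
    ap = average_precision_score(y_true, y_prob)
    auc = roc_auc_score(y_true, y_prob)
    return (f"Trial {trial}, Fold {fold}: Log loss = {ll}, "
            f"Average precision = {ap}, ROC-AUC = {auc}, "
            f"Elapsed Time = {elapsed} seconds")

# Synthetic example: two classes, four validation samples.
y_true = np.array([0, 0, 1, 1])
y_prob = np.array([0.1, 0.4, 0.35, 0.8])
print(fold_summary(42, 5, y_true, y_prob, 2.24))
```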
Optimization Progress: 43%|####3 | 43/100 [2:26:42<1:13:49, 77.72s/it]
Trial 43, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 43, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[20:25:37] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[0] validation-logloss:0.67385 validation-auc:0.94143 validation-aucpr:0.94742
[1]	validation-logloss:0.65498	validation-auc:0.95671	validation-aucpr:0.96049
[2]	validation-logloss:0.63800	validation-auc:0.95647	validation-aucpr:0.95995
[3]	validation-logloss:0.62230	validation-auc:0.95708	validation-aucpr:0.96084
[4]	validation-logloss:0.60781	validation-auc:0.95682	validation-aucpr:0.96138
[5]	validation-logloss:0.59385	validation-auc:0.95659	validation-aucpr:0.96136
[6]	validation-logloss:0.57868	validation-auc:0.95931	validation-aucpr:0.96438
[7]	validation-logloss:0.56312	validation-auc:0.96151	validation-aucpr:0.96677
[8]	validation-logloss:0.55050	validation-auc:0.96175	validation-aucpr:0.96706
[9]	validation-logloss:0.53826	validation-auc:0.96186	validation-aucpr:0.96713
[10]	validation-logloss:0.52539	validation-auc:0.96263	validation-aucpr:0.96795
[11]	validation-logloss:0.51261	validation-auc:0.96324	validation-aucpr:0.96890
[12]	validation-logloss:0.50210	validation-auc:0.96361	validation-aucpr:0.96916
[13]	validation-logloss:0.49218	validation-auc:0.96384	validation-aucpr:0.96935
[14]	validation-logloss:0.48074	validation-auc:0.96445	validation-aucpr:0.96984
[15]	validation-logloss:0.47129	validation-auc:0.96483	validation-aucpr:0.97009
[16]	validation-logloss:0.46290	validation-auc:0.96465	validation-aucpr:0.96996
[17]	validation-logloss:0.45273	validation-auc:0.96500	validation-aucpr:0.97034
[18]	validation-logloss:0.44265	validation-auc:0.96554	validation-aucpr:0.97084
[19]	validation-logloss:0.43481	validation-auc:0.96553	validation-aucpr:0.97078
[20]	validation-logloss:0.42595	validation-auc:0.96580	validation-aucpr:0.97105
[21]	validation-logloss:0.41885	validation-auc:0.96592	validation-aucpr:0.97106
[22]	validation-logloss:0.41081	validation-auc:0.96601	validation-aucpr:0.97118
[23]	validation-logloss:0.40276	validation-auc:0.96633	validation-aucpr:0.97152
[24]	validation-logloss:0.39522	validation-auc:0.96666	validation-aucpr:0.97186
[25]	validation-logloss:0.38782	validation-auc:0.96693	validation-aucpr:0.97209
[26]	validation-logloss:0.38213	validation-auc:0.96705	validation-aucpr:0.97216
[27]	validation-logloss:0.37615	validation-auc:0.96706	validation-aucpr:0.97216
[28]	validation-logloss:0.36947	validation-auc:0.96729	validation-aucpr:0.97241
[29]	validation-logloss:0.36354	validation-auc:0.96742	validation-aucpr:0.97253
[30]	validation-logloss:0.35877	validation-auc:0.96738	validation-aucpr:0.97246
[31]	validation-logloss:0.35410	validation-auc:0.96729	validation-aucpr:0.97239
[32]	validation-logloss:0.34979	validation-auc:0.96724	validation-aucpr:0.97232
{'best_iteration': '29', 'best_score': '0.9725345395665433'}
Trial 43, Fold 1: Log loss = 0.3497921846878542, Average precision = 0.972323647074988, ROC-AUC = 0.9672356170210733, Elapsed Time = 2.4518600999981572 seconds
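The `{'best_iteration': …, 'best_score': …}` dict printed after each fold comes from early stopping on validation aucpr: training halts once the metric stops improving for a fixed number of rounds (here, training ran to iteration 32 but the best aucpr was at 29). The bookkeeping can be sketched in plain Python (the patience value of 3 is an assumption inferred from the log):

```python
def early_stop(scores, patience=3):
    """Track the best score of a maximize-metric sequence and stop
    once `patience` rounds pass with no improvement."""
    best_i, best_s = 0, float("-inf")
    for i, s in enumerate(scores):
        if s > best_s:
            best_i, best_s = i, s
        elif i - best_i >= patience:
            break  # no improvement for `patience` rounds
    return best_i, best_s

# aucpr improves, then plateaus: best at index 2, stop at index 5.
print(early_stop([0.950, 0.960, 0.972, 0.971, 0.970, 0.969]))
```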
Trial 43, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 43, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0]	validation-logloss:0.67176	validation-auc:0.95300	validation-aucpr:0.95682
[1]	validation-logloss:0.65324	validation-auc:0.96051	validation-aucpr:0.96394
[2]	validation-logloss:0.63595	validation-auc:0.96163	validation-aucpr:0.96546
[3]	validation-logloss:0.61979	validation-auc:0.96285	validation-aucpr:0.96620
[4]	validation-logloss:0.60494	validation-auc:0.96213	validation-aucpr:0.96530
[5]	validation-logloss:0.59031	validation-auc:0.96224	validation-aucpr:0.96529
[6]	validation-logloss:0.57627	validation-auc:0.96246	validation-aucpr:0.96533
[7]	validation-logloss:0.56302	validation-auc:0.96253	validation-aucpr:0.96525
[8]	validation-logloss:0.55059	validation-auc:0.96256	validation-aucpr:0.96517
[9]	validation-logloss:0.53914	validation-auc:0.96235	validation-aucpr:0.96488
[10]	validation-logloss:0.52789	validation-auc:0.96254	validation-aucpr:0.96513
[11]	validation-logloss:0.51472	validation-auc:0.96476	validation-aucpr:0.96765
[12]	validation-logloss:0.50443	validation-auc:0.96453	validation-aucpr:0.96733
[13]	validation-logloss:0.49465	validation-auc:0.96431	validation-aucpr:0.96703
[14]	validation-logloss:0.48296	validation-auc:0.96552	validation-aucpr:0.96843
[15]	validation-logloss:0.47463	validation-auc:0.96530	validation-aucpr:0.96819
[16]	validation-logloss:0.46602	validation-auc:0.96511	validation-aucpr:0.96782
[17]	validation-logloss:0.45579	validation-auc:0.96585	validation-aucpr:0.96909
[18]	validation-logloss:0.44767	validation-auc:0.96589	validation-aucpr:0.96910
[19]	validation-logloss:0.43946	validation-auc:0.96600	validation-aucpr:0.96919
[20]	validation-logloss:0.43205	validation-auc:0.96600	validation-aucpr:0.96913
[21]	validation-logloss:0.42504	validation-auc:0.96606	validation-aucpr:0.96914
[22]	validation-logloss:0.41813	validation-auc:0.96600	validation-aucpr:0.96905
[23]	validation-logloss:0.41036	validation-auc:0.96645	validation-aucpr:0.96956
[24]	validation-logloss:0.40417	validation-auc:0.96641	validation-aucpr:0.96954
[25]	validation-logloss:0.39650	validation-auc:0.96692	validation-aucpr:0.97011
[26]	validation-logloss:0.38941	validation-auc:0.96724	validation-aucpr:0.97045
[27]	validation-logloss:0.38340	validation-auc:0.96734	validation-aucpr:0.97057
[28]	validation-logloss:0.37810	validation-auc:0.96732	validation-aucpr:0.97067
[29]	validation-logloss:0.37114	validation-auc:0.96769	validation-aucpr:0.97104
[30]	validation-logloss:0.36624	validation-auc:0.96782	validation-aucpr:0.97121
[31]	validation-logloss:0.36149	validation-auc:0.96773	validation-aucpr:0.97116
[32]	validation-logloss:0.35701	validation-auc:0.96767	validation-aucpr:0.97112
{'best_iteration': '30', 'best_score': '0.9712114953884028'}
Trial 43, Fold 2: Log loss = 0.35701467220388006, Average precision = 0.9710963596730398, ROC-AUC = 0.9676705895492514, Elapsed Time = 2.6748434000001 seconds
Trial 43, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 43, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[20:25:43] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[0] validation-logloss:0.67382 validation-auc:0.94070 validation-aucpr:0.94001
[1] validation-logloss:0.65604 validation-auc:0.95100 validation-aucpr:0.95679
[2] validation-logloss:0.63855 validation-auc:0.95879 validation-aucpr:0.96296
[3] validation-logloss:0.62278 validation-auc:0.95898 validation-aucpr:0.96299
[4] validation-logloss:0.60629 validation-auc:0.96177 validation-aucpr:0.96583
[5] validation-logloss:0.59131 validation-auc:0.96180 validation-aucpr:0.96575
[6] validation-logloss:0.57491 validation-auc:0.96501 validation-aucpr:0.96936
[7] validation-logloss:0.56225 validation-auc:0.96471 validation-aucpr:0.96908
[8] validation-logloss:0.54733 validation-auc:0.96631 validation-aucpr:0.97065
[9] validation-logloss:0.53580 validation-auc:0.96619 validation-aucpr:0.97036
[10] validation-logloss:0.52216 validation-auc:0.96698 validation-aucpr:0.97118
[11] validation-logloss:0.50873 validation-auc:0.96790 validation-aucpr:0.97198
[12] validation-logloss:0.49808 validation-auc:0.96792 validation-aucpr:0.97201
[13] validation-logloss:0.48663 validation-auc:0.96808 validation-aucpr:0.97226
[14] validation-logloss:0.47519 validation-auc:0.96843 validation-aucpr:0.97256
[15] validation-logloss:0.46613 validation-auc:0.96842 validation-aucpr:0.97248
[16] validation-logloss:0.45656 validation-auc:0.96843 validation-aucpr:0.97237
[17] validation-logloss:0.44822 validation-auc:0.96833 validation-aucpr:0.97225
[18] validation-logloss:0.43848 validation-auc:0.96848 validation-aucpr:0.97267
[19] validation-logloss:0.42922 validation-auc:0.96867 validation-aucpr:0.97286
[20] validation-logloss:0.42026 validation-auc:0.96900 validation-aucpr:0.97317
[21] validation-logloss:0.41331 validation-auc:0.96903 validation-aucpr:0.97313
[22] validation-logloss:0.40523 validation-auc:0.96916 validation-aucpr:0.97326
[23] validation-logloss:0.39886 validation-auc:0.96910 validation-aucpr:0.97324
[24] validation-logloss:0.39269 validation-auc:0.96909 validation-aucpr:0.97323
[25] validation-logloss:0.38665 validation-auc:0.96909 validation-aucpr:0.97321
[26] validation-logloss:0.38110 validation-auc:0.96892 validation-aucpr:0.97302
[27] validation-logloss:0.37410 validation-auc:0.96911 validation-aucpr:0.97323
[28] validation-logloss:0.36727 validation-auc:0.96925 validation-aucpr:0.97340
[29] validation-logloss:0.36214 validation-auc:0.96916 validation-aucpr:0.97331
[30] validation-logloss:0.35716 validation-auc:0.96916 validation-aucpr:0.97329
[31] validation-logloss:0.35248 validation-auc:0.96919 validation-aucpr:0.97330
[32] validation-logloss:0.34807 validation-auc:0.96917 validation-aucpr:0.97326
{'best_iteration': '28', 'best_score': '0.9734004595640693'}
Trial 43, Fold 3: Log loss = 0.3480675914193448, Average precision = 0.973268520424059, ROC-AUC = 0.9691660908110293, Elapsed Time = 2.6797023999970406 seconds
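Aside on the `{'best_iteration': '28', 'best_score': ...}` lines printed above: with early-stopping bookkeeping, `best_iteration` is the boosting round that maximized the last eval metric (`aucpr` here), and `best_score` is its value. A minimal pure-Python sketch of that relationship, using a few of fold 3's aucpr values (not the notebook's actual code):

```python
# Given a per-round validation-aucpr history, best_iteration is the argmax
# and best_score the corresponding value, mirroring the dict printed above.
# The history below is a hypothetical excerpt, indexed by boosting round.
aucpr_history = [0.94001, 0.95679, 0.96296, 0.97340, 0.97326]

best_iteration = max(range(len(aucpr_history)), key=aucpr_history.__getitem__)
best_score = aucpr_history[best_iteration]
print({"best_iteration": best_iteration, "best_score": best_score})
```

In fold 3 above, round 28 indeed has the highest validation-aucpr (0.97340), which is why training is reported relative to that round even though 33 rounds were run.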
Trial 43, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 43, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.67080 validation-auc:0.95951 validation-aucpr:0.95794
[1] validation-logloss:0.65173 validation-auc:0.96170 validation-aucpr:0.96288
[2] validation-logloss:0.63229 validation-auc:0.96368 validation-aucpr:0.96512
[3] validation-logloss:0.61794 validation-auc:0.96164 validation-aucpr:0.96793
[4] validation-logloss:0.60059 validation-auc:0.96336 validation-aucpr:0.96925
[5] validation-logloss:0.58404 validation-auc:0.96371 validation-aucpr:0.96966
[6] validation-logloss:0.57067 validation-auc:0.96348 validation-aucpr:0.96956
[7] validation-logloss:0.55840 validation-auc:0.96286 validation-aucpr:0.96895
[8] validation-logloss:0.54428 validation-auc:0.96392 validation-aucpr:0.96957
[9] validation-logloss:0.53052 validation-auc:0.96444 validation-aucpr:0.97017
[10] validation-logloss:0.51729 validation-auc:0.96500 validation-aucpr:0.97072
[11] validation-logloss:0.50510 validation-auc:0.96511 validation-aucpr:0.97089
[12] validation-logloss:0.49417 validation-auc:0.96554 validation-aucpr:0.97127
[13] validation-logloss:0.48409 validation-auc:0.96554 validation-aucpr:0.97131
[14] validation-logloss:0.47354 validation-auc:0.96575 validation-aucpr:0.97153
[15] validation-logloss:0.46279 validation-auc:0.96605 validation-aucpr:0.97182
[16] validation-logloss:0.45428 validation-auc:0.96597 validation-aucpr:0.97174
[17] validation-logloss:0.44603 validation-auc:0.96586 validation-aucpr:0.97161
[18] validation-logloss:0.43800 validation-auc:0.96604 validation-aucpr:0.97173
[19] validation-logloss:0.42877 validation-auc:0.96639 validation-aucpr:0.97199
[20] validation-logloss:0.42200 validation-auc:0.96606 validation-aucpr:0.97173
[21] validation-logloss:0.41340 validation-auc:0.96633 validation-aucpr:0.97199
[22] validation-logloss:0.40636 validation-auc:0.96637 validation-aucpr:0.97200
[23] validation-logloss:0.39860 validation-auc:0.96666 validation-aucpr:0.97226
[24] validation-logloss:0.39217 validation-auc:0.96662 validation-aucpr:0.97226
[25] validation-logloss:0.38671 validation-auc:0.96636 validation-aucpr:0.97206
[26] validation-logloss:0.38118 validation-auc:0.96629 validation-aucpr:0.97196
[27] validation-logloss:0.37453 validation-auc:0.96637 validation-aucpr:0.97203
[28] validation-logloss:0.36924 validation-auc:0.96648 validation-aucpr:0.97207
[29] validation-logloss:0.36295 validation-auc:0.96673 validation-aucpr:0.97228
[30] validation-logloss:0.35834 validation-auc:0.96665 validation-aucpr:0.97221
[31] validation-logloss:0.35346 validation-auc:0.96671 validation-aucpr:0.97226
[32] validation-logloss:0.34885 validation-auc:0.96674 validation-aucpr:0.97228
{'best_iteration': '32', 'best_score': '0.972280019775681'}
Trial 43, Fold 4: Log loss = 0.3488531874288383, Average precision = 0.9722768432375872, ROC-AUC = 0.966742848558982, Elapsed Time = 2.7160441000014544 seconds
Trial 43, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 43, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.67104 validation-auc:0.95828 validation-aucpr:0.95734
[1] validation-logloss:0.65267 validation-auc:0.95999 validation-aucpr:0.96523
[2] validation-logloss:0.63352 validation-auc:0.96188 validation-aucpr:0.96721
[3] validation-logloss:0.61552 validation-auc:0.96312 validation-aucpr:0.96567
[4] validation-logloss:0.60066 validation-auc:0.96277 validation-aucpr:0.96801
[5] validation-logloss:0.58378 validation-auc:0.96366 validation-aucpr:0.96886
[6] validation-logloss:0.56842 validation-auc:0.96455 validation-aucpr:0.96952
[7] validation-logloss:0.55610 validation-auc:0.96377 validation-aucpr:0.96883
[8] validation-logloss:0.54162 validation-auc:0.96411 validation-aucpr:0.96938
[9] validation-logloss:0.52970 validation-auc:0.96415 validation-aucpr:0.96936
[10] validation-logloss:0.51665 validation-auc:0.96470 validation-aucpr:0.96970
[11] validation-logloss:0.50393 validation-auc:0.96545 validation-aucpr:0.97034
[12] validation-logloss:0.49413 validation-auc:0.96491 validation-aucpr:0.97004
[13] validation-logloss:0.48452 validation-auc:0.96458 validation-aucpr:0.96967
[14] validation-logloss:0.47564 validation-auc:0.96439 validation-aucpr:0.96937
[15] validation-logloss:0.46488 validation-auc:0.96487 validation-aucpr:0.96966
[16] validation-logloss:0.45544 validation-auc:0.96509 validation-aucpr:0.96984
[17] validation-logloss:0.44590 validation-auc:0.96516 validation-aucpr:0.96980
[18] validation-logloss:0.43792 validation-auc:0.96524 validation-aucpr:0.96981
[19] validation-logloss:0.43070 validation-auc:0.96497 validation-aucpr:0.96958
[20] validation-logloss:0.42208 validation-auc:0.96522 validation-aucpr:0.96982
[21] validation-logloss:0.41548 validation-auc:0.96506 validation-aucpr:0.96963
[22] validation-logloss:0.40868 validation-auc:0.96514 validation-aucpr:0.96965
[23] validation-logloss:0.40121 validation-auc:0.96535 validation-aucpr:0.96988
[24] validation-logloss:0.39362 validation-auc:0.96578 validation-aucpr:0.97024
[25] validation-logloss:0.38656 validation-auc:0.96600 validation-aucpr:0.97043
[26] validation-logloss:0.38093 validation-auc:0.96602 validation-aucpr:0.97044
[27] validation-logloss:0.37468 validation-auc:0.96602 validation-aucpr:0.97042
[28] validation-logloss:0.36800 validation-auc:0.96647 validation-aucpr:0.97079
[29] validation-logloss:0.36191 validation-auc:0.96671 validation-aucpr:0.97097
[30] validation-logloss:0.35665 validation-auc:0.96673 validation-aucpr:0.97097
[31] validation-logloss:0.35208 validation-auc:0.96670 validation-aucpr:0.97089
[32] validation-logloss:0.34650 validation-auc:0.96692 validation-aucpr:0.97107
{'best_iteration': '32', 'best_score': '0.9710684845858037'}
Trial 43, Fold 5: Log loss = 0.34649803740362317, Average precision = 0.9710592798843336, ROC-AUC = 0.9669182319826096, Elapsed Time = 2.757409000001644 seconds
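For reference, the per-fold summary figures above (log loss, average precision, ROC-AUC) are the standard sklearn metrics evaluated on each fold's validation labels and predicted probabilities. A minimal, self-contained sketch with synthetic stand-in data (`y_val` and `proba_val` are hypothetical, not the notebook's folds):

```python
# Sketch: computing the three fold-level metrics reported in the lines above.
# y_val / proba_val are synthetic placeholders for a fold's validation labels
# and the model's predicted probability of class 1.
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

rng = np.random.default_rng(42)
y_val = rng.integers(0, 2, size=5000)  # binary validation labels
# Mock well-separated probabilities, clipped away from 0/1 for log_loss stability
proba_val = np.clip(y_val * 0.7 + rng.normal(0.15, 0.2, size=5000), 0.001, 0.999)

fold_metrics = {
    "log_loss": log_loss(y_val, proba_val),
    "avg_precision": average_precision_score(y_val, proba_val),
    "roc_auc": roc_auc_score(y_val, proba_val),
}
print(fold_metrics)
```

Log loss is threshold-free and penalizes confident mistakes, while average precision and ROC-AUC summarize ranking quality, which is why all three are tracked per fold.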
Optimization Progress: 44%|####4 | 44/100 [2:27:03<56:40, 60.72s/it]
Trial 44, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 44, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.65514 validation-auc:0.87809 validation-aucpr:0.88222
[1] validation-logloss:0.60224 validation-auc:0.94543 validation-aucpr:0.95331
[2] validation-logloss:0.56799 validation-auc:0.94776 validation-aucpr:0.95521
[3] validation-logloss:0.53744 validation-auc:0.94993 validation-aucpr:0.95683
[4] validation-logloss:0.51274 validation-auc:0.95064 validation-aucpr:0.95757
[5] validation-logloss:0.48533 validation-auc:0.95375 validation-aucpr:0.95984
[6] validation-logloss:0.46379 validation-auc:0.95533 validation-aucpr:0.96080
[7] validation-logloss:0.44329 validation-auc:0.95685 validation-aucpr:0.96192
[8] validation-logloss:0.42770 validation-auc:0.95626 validation-aucpr:0.96119
[9] validation-logloss:0.41215 validation-auc:0.95657 validation-aucpr:0.96147
[10] validation-logloss:0.40000 validation-auc:0.95614 validation-aucpr:0.96111
[11] validation-logloss:0.39012 validation-auc:0.95590 validation-aucpr:0.96112
[12] validation-logloss:0.37851 validation-auc:0.95607 validation-aucpr:0.96137
[13] validation-logloss:0.36718 validation-auc:0.95651 validation-aucpr:0.96176
[14] validation-logloss:0.35940 validation-auc:0.95671 validation-aucpr:0.96199
[15] validation-logloss:0.35371 validation-auc:0.95651 validation-aucpr:0.96170
[16] validation-logloss:0.34584 validation-auc:0.95643 validation-aucpr:0.96159
[17] validation-logloss:0.34032 validation-auc:0.95600 validation-aucpr:0.96112
[18] validation-logloss:0.33289 validation-auc:0.95639 validation-aucpr:0.96159
[19] validation-logloss:0.32676 validation-auc:0.95654 validation-aucpr:0.96176
[20] validation-logloss:0.32132 validation-auc:0.95697 validation-aucpr:0.96216
[21] validation-logloss:0.31601 validation-auc:0.95704 validation-aucpr:0.96237
[22] validation-logloss:0.30578 validation-auc:0.95890 validation-aucpr:0.96452
[23] validation-logloss:0.30272 validation-auc:0.95888 validation-aucpr:0.96454
[24] validation-logloss:0.29897 validation-auc:0.95901 validation-aucpr:0.96454
[25] validation-logloss:0.29444 validation-auc:0.95938 validation-aucpr:0.96489
[26] validation-logloss:0.29176 validation-auc:0.95927 validation-aucpr:0.96479
[27] validation-logloss:0.28948 validation-auc:0.95927 validation-aucpr:0.96480
[28] validation-logloss:0.28755 validation-auc:0.95930 validation-aucpr:0.96483
[29] validation-logloss:0.28515 validation-auc:0.95939 validation-aucpr:0.96491
[30] validation-logloss:0.28262 validation-auc:0.95965 validation-aucpr:0.96513
[31] validation-logloss:0.27951 validation-auc:0.95994 validation-aucpr:0.96538
[32] validation-logloss:0.27777 validation-auc:0.95987 validation-aucpr:0.96537
[33] validation-logloss:0.27504 validation-auc:0.96012 validation-aucpr:0.96555
[34] validation-logloss:0.27266 validation-auc:0.96042 validation-aucpr:0.96575
[35] validation-logloss:0.27056 validation-auc:0.96057 validation-aucpr:0.96588
[36] validation-logloss:0.27007 validation-auc:0.96028 validation-aucpr:0.96565
[37] validation-logloss:0.26836 validation-auc:0.96044 validation-aucpr:0.96582
[38] validation-logloss:0.26680 validation-auc:0.96056 validation-aucpr:0.96594
[39] validation-logloss:0.26384 validation-auc:0.96112 validation-aucpr:0.96636
[40] validation-logloss:0.26152 validation-auc:0.96136 validation-aucpr:0.96654
[41] validation-logloss:0.26021 validation-auc:0.96138 validation-aucpr:0.96652
[42] validation-logloss:0.25329 validation-auc:0.96281 validation-aucpr:0.96807
[43] validation-logloss:0.25231 validation-auc:0.96287 validation-aucpr:0.96812
[44] validation-logloss:0.25102 validation-auc:0.96296 validation-aucpr:0.96821
[45] validation-logloss:0.25024 validation-auc:0.96307 validation-aucpr:0.96830
[46] validation-logloss:0.24901 validation-auc:0.96344 validation-aucpr:0.96858
[47] validation-logloss:0.24842 validation-auc:0.96339 validation-aucpr:0.96853
[48] validation-logloss:0.24651 validation-auc:0.96377 validation-aucpr:0.96885
[49] validation-logloss:0.24573 validation-auc:0.96376 validation-aucpr:0.96888
[50] validation-logloss:0.24435 validation-auc:0.96404 validation-aucpr:0.96907
[51] validation-logloss:0.24356 validation-auc:0.96421 validation-aucpr:0.96914
[52] validation-logloss:0.24305 validation-auc:0.96417 validation-aucpr:0.96909
[53] validation-logloss:0.24296 validation-auc:0.96410 validation-aucpr:0.96904
[54] validation-logloss:0.23998 validation-auc:0.96458 validation-aucpr:0.96954
[55] validation-logloss:0.23959 validation-auc:0.96468 validation-aucpr:0.96961
[56] validation-logloss:0.23537 validation-auc:0.96527 validation-aucpr:0.97022
[57] validation-logloss:0.23459 validation-auc:0.96540 validation-aucpr:0.97032
[58] validation-logloss:0.23420 validation-auc:0.96547 validation-aucpr:0.97031
[59] validation-logloss:0.23391 validation-auc:0.96545 validation-aucpr:0.97027
[60] validation-logloss:0.23326 validation-auc:0.96554 validation-aucpr:0.97034
[61] validation-logloss:0.23270 validation-auc:0.96562 validation-aucpr:0.97041
[62] validation-logloss:0.22946 validation-auc:0.96605 validation-aucpr:0.97088
[63] validation-logloss:0.22924 validation-auc:0.96605 validation-aucpr:0.97088
... [per-round eval log condensed: rounds 64–95 omitted] ...
[96] validation-logloss:0.21179 validation-auc:0.96849 validation-aucpr:0.97299
{'best_iteration': '96', 'best_score': '0.9729902603003616'}
Trial 44, Fold 1: Log loss = 0.2117866213049766, Average precision = 0.9729949916357183, ROC-AUC = 0.9684905579214522, Elapsed Time = 68.23660120000204 seconds
Trial 44, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 44, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.65052 validation-auc:0.89657 validation-aucpr:0.89769
... [per-round eval log condensed: rounds 1–95 omitted] ...
[96] validation-logloss:0.20096 validation-auc:0.97125 validation-aucpr:0.97411
{'best_iteration': '95', 'best_score': '0.9741451280839953'}
Trial 44, Fold 2: Log loss = 0.20095771260405446, Average precision = 0.9741141990732255, ROC-AUC = 0.9712527477730253, Elapsed Time = 68.58425809999972 seconds
Trial 44, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 44, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.65141 validation-auc:0.90045 validation-aucpr:0.90638
... [per-round eval log condensed: rounds 1–95 omitted] ...
[96] validation-logloss:0.19974 validation-auc:0.97089 validation-aucpr:0.97486
{'best_iteration': '96', 'best_score': '0.9748559120999989'}
Trial 44, Fold 3: Log loss = 0.19974281941264244, Average precision = 0.974860322917372, ROC-AUC = 0.9708878640531249, Elapsed Time = 66.81937679999828 seconds
Trial 44, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 44, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.65307 validation-auc:0.88711 validation-aucpr:0.88252
... [per-round eval log condensed: rounds 1–95 omitted] ...
[96] validation-logloss:0.20383 validation-auc:0.96818 validation-aucpr:0.97345
{'best_iteration': '96', 'best_score': '0.9734476786511739'}
Trial 44, Fold 4: Log loss = 0.20383334708858364, Average precision = 0.9734519949775687, ROC-AUC = 0.9681788593460516, Elapsed Time = 64.23538910000207 seconds
Trial 44, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 44, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.65330 validation-auc:0.88770 validation-aucpr:0.88733
... [per-round eval log condensed: rounds 1–95 omitted] ...
[96] validation-logloss:0.20552 validation-auc:0.96945 validation-aucpr:0.97329
{'best_iteration': '93', 'best_score': '0.9733290849490748'}
Trial 44, Fold 5: Log loss = 0.20551938913953613, Average precision = 0.9732923313730766, ROC-AUC = 0.969452910465786, Elapsed Time = 65.30914010000197 seconds
Optimization Progress: 45%|####5 | 45/100 [2:32:45<2:12:49, 144.91s/it]
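Each per-fold summary line above reports log loss, average precision (area under the PR curve), ROC-AUC, and elapsed time on the held-out fold. A minimal sketch of how such a line can be produced with `sklearn.metrics`, assuming hypothetical stand-ins `y_val` (fold labels) and `p_val` (predicted probabilities for class 1) in place of the notebook's actual fold data:

```python
import time
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

# Hypothetical fold data: binary labels and correlated class-1 probabilities.
rng = np.random.default_rng(0)
y_val = rng.integers(0, 2, size=1000)
p_val = np.clip(y_val * 0.7 + rng.normal(0.15, 0.2, size=1000), 0.001, 0.999)

start = time.perf_counter()
lloss = log_loss(y_val, p_val)              # validation log loss
ap = average_precision_score(y_val, p_val)  # PR-curve area ("aucpr")
auc = roc_auc_score(y_val, p_val)           # ROC-AUC
elapsed = time.perf_counter() - start

print(f"Log loss = {lloss}, Average precision = {ap}, "
      f"ROC-AUC = {auc}, Elapsed Time = {elapsed} seconds")
```

Note that average precision computed on the full validation predictions can differ slightly from the `validation-aucpr` reported by the booster at its best iteration, which is why the summary lines and the `best_score` entries do not match to full precision.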
Trial 45, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 45, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.67020 validation-auc:0.93525 validation-aucpr:0.92509
[1] validation-logloss:0.64890 validation-auc:0.94891 validation-aucpr:0.95024
[2] validation-logloss:0.62574 validation-auc:0.96047 validation-aucpr:0.96201
[3] validation-logloss:0.60522 validation-auc:0.96166 validation-aucpr:0.96670
[4] validation-logloss:0.58834 validation-auc:0.96130 validation-aucpr:0.96647
[5] validation-logloss:0.57257 validation-auc:0.96252 validation-aucpr:0.96778
[6] validation-logloss:0.55801 validation-auc:0.96231 validation-aucpr:0.96771
[7] validation-logloss:0.54446 validation-auc:0.96181 validation-aucpr:0.96734
[8] validation-logloss:0.53092 validation-auc:0.96239 validation-aucpr:0.96754
[9] validation-logloss:0.51747 validation-auc:0.96278 validation-aucpr:0.96804
[10] validation-logloss:0.50541 validation-auc:0.96268 validation-aucpr:0.96677
[11] validation-logloss:0.49394 validation-auc:0.96286 validation-aucpr:0.96687
[12] validation-logloss:0.48301 validation-auc:0.96275 validation-aucpr:0.96681
[13] validation-logloss:0.46951 validation-auc:0.96341 validation-aucpr:0.96766
[14] validation-logloss:0.45861 validation-auc:0.96354 validation-aucpr:0.96788
[15] validation-logloss:0.44905 validation-auc:0.96362 validation-aucpr:0.96787
[16] validation-logloss:0.44024 validation-auc:0.96366 validation-aucpr:0.96771
[17] validation-logloss:0.43190 validation-auc:0.96368 validation-aucpr:0.96899
[18] validation-logloss:0.42381 validation-auc:0.96371 validation-aucpr:0.96903
[19] validation-logloss:0.41327 validation-auc:0.96422 validation-aucpr:0.96947
[20] validation-logloss:0.40348 validation-auc:0.96464 validation-aucpr:0.96986
[21] validation-logloss:0.39450 validation-auc:0.96480 validation-aucpr:0.97009
[22] validation-logloss:0.38759 validation-auc:0.96487 validation-aucpr:0.97012
[23] validation-logloss:0.37931 validation-auc:0.96516 validation-aucpr:0.97045
[24] validation-logloss:0.37362 validation-auc:0.96516 validation-aucpr:0.97042
[25] validation-logloss:0.36794 validation-auc:0.96515 validation-aucpr:0.97042
[26] validation-logloss:0.36058 validation-auc:0.96552 validation-aucpr:0.97078
[27] validation-logloss:0.35558 validation-auc:0.96537 validation-aucpr:0.97062
[28] validation-logloss:0.34877 validation-auc:0.96564 validation-aucpr:0.97095
[29] validation-logloss:0.34394 validation-auc:0.96560 validation-aucpr:0.97090
[30] validation-logloss:0.33938 validation-auc:0.96543 validation-aucpr:0.97071
[31] validation-logloss:0.33510 validation-auc:0.96545 validation-aucpr:0.97072
[32] validation-logloss:0.33073 validation-auc:0.96548 validation-aucpr:0.97074
[33] validation-logloss:0.32505 validation-auc:0.96566 validation-aucpr:0.97090
[34] validation-logloss:0.31949 validation-auc:0.96586 validation-aucpr:0.97106
[35] validation-logloss:0.31457 validation-auc:0.96591 validation-aucpr:0.97123
[36] validation-logloss:0.30962 validation-auc:0.96606 validation-aucpr:0.97142
[37] validation-logloss:0.30635 validation-auc:0.96609 validation-aucpr:0.97142
[38] validation-logloss:0.30181 validation-auc:0.96616 validation-aucpr:0.97150
[39] validation-logloss:0.29718 validation-auc:0.96631 validation-aucpr:0.97165
[40] validation-logloss:0.29408 validation-auc:0.96633 validation-aucpr:0.97165
[41] validation-logloss:0.29102 validation-auc:0.96640 validation-aucpr:0.97167
[42] validation-logloss:0.28729 validation-auc:0.96656 validation-aucpr:0.97183
[43] validation-logloss:0.28366 validation-auc:0.96661 validation-aucpr:0.97188
[44] validation-logloss:0.28029 validation-auc:0.96670 validation-aucpr:0.97201
{'best_iteration': '44', 'best_score': '0.9720059714082437'}
Trial 45, Fold 1: Log loss = 0.2802868031056375, Average precision = 0.972009096896103, ROC-AUC = 0.966703439631398, Elapsed Time = 0.9623223999988113 seconds
Trial 45, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 45, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.67120 validation-auc:0.92551 validation-aucpr:0.90220
[1] validation-logloss:0.65013 validation-auc:0.94577 validation-aucpr:0.93924
[2] validation-logloss:0.62772 validation-auc:0.95785 validation-aucpr:0.95836
[3] validation-logloss:0.60896 validation-auc:0.95986 validation-aucpr:0.96349
[4] validation-logloss:0.58978 validation-auc:0.96129 validation-aucpr:0.96573
[5] validation-logloss:0.57354 validation-auc:0.96166 validation-aucpr:0.96584
[6] validation-logloss:0.55841 validation-auc:0.96182 validation-aucpr:0.96615
[7] validation-logloss:0.54384 validation-auc:0.96260 validation-aucpr:0.96672
[8] validation-logloss:0.53068 validation-auc:0.96251 validation-aucpr:0.96635
[9] validation-logloss:0.51821 validation-auc:0.96181 validation-aucpr:0.96542
[10] validation-logloss:0.50572 validation-auc:0.96173 validation-aucpr:0.96542
[11] validation-logloss:0.49384 validation-auc:0.96165 validation-aucpr:0.96526
[12] validation-logloss:0.48007 validation-auc:0.96377 validation-aucpr:0.96774
[13] validation-logloss:0.46869 validation-auc:0.96384 validation-aucpr:0.96790
[14] validation-logloss:0.45694 validation-auc:0.96446 validation-aucpr:0.96851
[15] validation-logloss:0.44759 validation-auc:0.96452 validation-aucpr:0.96851
[16] validation-logloss:0.43819 validation-auc:0.96469 validation-aucpr:0.96864
[17] validation-logloss:0.42772 validation-auc:0.96503 validation-aucpr:0.96880
[18] validation-logloss:0.41973 validation-auc:0.96499 validation-aucpr:0.96877
[19] validation-logloss:0.41073 validation-auc:0.96548 validation-aucpr:0.96922
[20] validation-logloss:0.40113 validation-auc:0.96572 validation-aucpr:0.96943
[21] validation-logloss:0.39249 validation-auc:0.96593 validation-aucpr:0.96965
[22] validation-logloss:0.38644 validation-auc:0.96569 validation-aucpr:0.96941
[23] validation-logloss:0.37810 validation-auc:0.96584 validation-aucpr:0.96961
[24] validation-logloss:0.37187 validation-auc:0.96605 validation-aucpr:0.96978
[25] validation-logloss:0.36653 validation-auc:0.96593 validation-aucpr:0.96995
[26] validation-logloss:0.35877 validation-auc:0.96624 validation-aucpr:0.97023
[27] validation-logloss:0.35337 validation-auc:0.96644 validation-aucpr:0.97041
[28] validation-logloss:0.34821 validation-auc:0.96639 validation-aucpr:0.97032
[29] validation-logloss:0.34229 validation-auc:0.96645 validation-aucpr:0.97056
[30] validation-logloss:0.33685 validation-auc:0.96649 validation-aucpr:0.97059
[31] validation-logloss:0.33111 validation-auc:0.96664 validation-aucpr:0.97073
[32] validation-logloss:0.32660 validation-auc:0.96678 validation-aucpr:0.97087
[33] validation-logloss:0.32262 validation-auc:0.96681 validation-aucpr:0.97087
[34] validation-logloss:0.31837 validation-auc:0.96692 validation-aucpr:0.97089
[35] validation-logloss:0.31465 validation-auc:0.96697 validation-aucpr:0.97091
[36] validation-logloss:0.31079 validation-auc:0.96716 validation-aucpr:0.97102
[37] validation-logloss:0.30724 validation-auc:0.96706 validation-aucpr:0.97092
[38] validation-logloss:0.30404 validation-auc:0.96713 validation-aucpr:0.97097
[39] validation-logloss:0.29927 validation-auc:0.96735 validation-aucpr:0.97109
[40] validation-logloss:0.29478 validation-auc:0.96752 validation-aucpr:0.97123
[41] validation-logloss:0.29058 validation-auc:0.96759 validation-aucpr:0.97132
[42] validation-logloss:0.28669 validation-auc:0.96765 validation-aucpr:0.97135
[43] validation-logloss:0.28378 validation-auc:0.96779 validation-aucpr:0.97141
[44] validation-logloss:0.28087 validation-auc:0.96790 validation-aucpr:0.97148
{'best_iteration': '44', 'best_score': '0.9714752468667572'}
Trial 45, Fold 2: Log loss = 0.2808740290736027, Average precision = 0.9714337620041505, ROC-AUC = 0.9678955556412089, Elapsed Time = 1.1687717999993765 seconds
Trial 45, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 45, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.67066 validation-auc:0.93383 validation-aucpr:0.93355
[1] validation-logloss:0.64961 validation-auc:0.94795 validation-aucpr:0.94906
[2] validation-logloss:0.63024 validation-auc:0.95066 validation-aucpr:0.95428
[3] validation-logloss:0.61199 validation-auc:0.95437 validation-aucpr:0.95822
[4] validation-logloss:0.59271 validation-auc:0.96009 validation-aucpr:0.96488
[5] validation-logloss:0.57738 validation-auc:0.96006 validation-aucpr:0.96502
[6] validation-logloss:0.56230 validation-auc:0.96039 validation-aucpr:0.96532
[7] validation-logloss:0.54728 validation-auc:0.96137 validation-aucpr:0.96604
[8] validation-logloss:0.53354 validation-auc:0.96152 validation-aucpr:0.96613
[9] validation-logloss:0.52071 validation-auc:0.96188 validation-aucpr:0.96639
[10] validation-logloss:0.50877 validation-auc:0.96184 validation-aucpr:0.96685
[11] validation-logloss:0.49462 validation-auc:0.96298 validation-aucpr:0.96799
[12] validation-logloss:0.48383 validation-auc:0.96290 validation-aucpr:0.96783
[13] validation-logloss:0.47382 validation-auc:0.96301 validation-aucpr:0.96807
[14] validation-logloss:0.46076 validation-auc:0.96418 validation-aucpr:0.96921
[15] validation-logloss:0.44905 validation-auc:0.96465 validation-aucpr:0.96967
[16] validation-logloss:0.43784 validation-auc:0.96502 validation-aucpr:0.97002
[17] validation-logloss:0.42992 validation-auc:0.96474 validation-aucpr:0.96988
[18] validation-logloss:0.42194 validation-auc:0.96493 validation-aucpr:0.96999
[19] validation-logloss:0.41404 validation-auc:0.96517 validation-aucpr:0.97013
[20] validation-logloss:0.40541 validation-auc:0.96555 validation-aucpr:0.97042
[21] validation-logloss:0.39608 validation-auc:0.96585 validation-aucpr:0.97076
[22] validation-logloss:0.38740 validation-auc:0.96601 validation-aucpr:0.97090
[23] validation-logloss:0.38071 validation-auc:0.96612 validation-aucpr:0.97102
[24] validation-logloss:0.37481 validation-auc:0.96585 validation-aucpr:0.97079
[25] validation-logloss:0.36868 validation-auc:0.96594 validation-aucpr:0.97086
[26] validation-logloss:0.36332 validation-auc:0.96602 validation-aucpr:0.97093
[27] validation-logloss:0.35840 validation-auc:0.96593 validation-aucpr:0.97082
[28] validation-logloss:0.35333 validation-auc:0.96595 validation-aucpr:0.97082
[29] validation-logloss:0.34877 validation-auc:0.96586 validation-aucpr:0.97072
[30] validation-logloss:0.34366 validation-auc:0.96592 validation-aucpr:0.97076
[31] validation-logloss:0.33903 validation-auc:0.96602 validation-aucpr:0.97082
[32] validation-logloss:0.33293 validation-auc:0.96616 validation-aucpr:0.97099
[33] validation-logloss:0.32746 validation-auc:0.96625 validation-aucpr:0.97117
[34] validation-logloss:0.32185 validation-auc:0.96631 validation-aucpr:0.97126
[35] validation-logloss:0.31656 validation-auc:0.96643 validation-aucpr:0.97135
[36] validation-logloss:0.31280 validation-auc:0.96645 validation-aucpr:0.97135
[37] validation-logloss:0.30812 validation-auc:0.96665 validation-aucpr:0.97152
[38] validation-logloss:0.30497 validation-auc:0.96667 validation-aucpr:0.97154
[39] validation-logloss:0.30146 validation-auc:0.96678 validation-aucpr:0.97164
[40] validation-logloss:0.29820 validation-auc:0.96683 validation-aucpr:0.97169
[41] validation-logloss:0.29515 validation-auc:0.96694 validation-aucpr:0.97175
[42] validation-logloss:0.29065 validation-auc:0.96715 validation-aucpr:0.97192
[43] validation-logloss:0.28642 validation-auc:0.96734 validation-aucpr:0.97209
[44] validation-logloss:0.28359 validation-auc:0.96745 validation-aucpr:0.97217
{'best_iteration': '44', 'best_score': '0.972170186636452'}
Trial 45, Fold 3: Log loss = 0.28359285391712913, Average precision = 0.9721740803263408, ROC-AUC = 0.9674528199670618, Elapsed Time = 1.1913502999996126 seconds
Trial 45, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 45, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.67095 validation-auc:0.92539 validation-aucpr:0.92800
[1] validation-logloss:0.64635 validation-auc:0.95647 validation-aucpr:0.96293
[2] validation-logloss:0.62755 validation-auc:0.95729 validation-aucpr:0.96357
[3] validation-logloss:0.61132 validation-auc:0.95610 validation-aucpr:0.96227
[4] validation-logloss:0.59410 validation-auc:0.95678 validation-aucpr:0.96299
[5] validation-logloss:0.57847 validation-auc:0.95727 validation-aucpr:0.96324
[6] validation-logloss:0.56368 validation-auc:0.95713 validation-aucpr:0.96299
[7] validation-logloss:0.54591 validation-auc:0.96012 validation-aucpr:0.96585
[8] validation-logloss:0.53017 validation-auc:0.96098 validation-aucpr:0.96673
[9] validation-logloss:0.51480 validation-auc:0.96181 validation-aucpr:0.96761
[10] validation-logloss:0.50292 validation-auc:0.96198 validation-aucpr:0.96776
[11] validation-logloss:0.49151 validation-auc:0.96186 validation-aucpr:0.96757
[12] validation-logloss:0.47862 validation-auc:0.96248 validation-aucpr:0.96818
[13] validation-logloss:0.46816 validation-auc:0.96272 validation-aucpr:0.96838
[14] validation-logloss:0.45841 validation-auc:0.96269 validation-aucpr:0.96834
[15] validation-logloss:0.44648 validation-auc:0.96319 validation-aucpr:0.96888
[16] validation-logloss:0.43797 validation-auc:0.96296 validation-aucpr:0.96868
[17] validation-logloss:0.42744 validation-auc:0.96355 validation-aucpr:0.96927
[18] validation-logloss:0.41951 validation-auc:0.96369 validation-aucpr:0.96928
[19] validation-logloss:0.40988 validation-auc:0.96406 validation-aucpr:0.96972
[20] validation-logloss:0.40058 validation-auc:0.96437 validation-aucpr:0.97006
[21] validation-logloss:0.39199 validation-auc:0.96456 validation-aucpr:0.97033
[22] validation-logloss:0.38535 validation-auc:0.96445 validation-aucpr:0.97026
[23] validation-logloss:0.37912 validation-auc:0.96447 validation-aucpr:0.97030
[24] validation-logloss:0.37130 validation-auc:0.96468 validation-aucpr:0.97046
[25] validation-logloss:0.36608 validation-auc:0.96452 validation-aucpr:0.97040
[26] validation-logloss:0.36027 validation-auc:0.96464 validation-aucpr:0.97043
[27] validation-logloss:0.35325 validation-auc:0.96480 validation-aucpr:0.97063
[28] validation-logloss:0.34832 validation-auc:0.96467 validation-aucpr:0.97049
[29] validation-logloss:0.34190 validation-auc:0.96479 validation-aucpr:0.97066
[30] validation-logloss:0.33736 validation-auc:0.96480 validation-aucpr:0.97063
[31] validation-logloss:0.33247 validation-auc:0.96498 validation-aucpr:0.97072
[32] validation-logloss:0.32649 validation-auc:0.96544 validation-aucpr:0.97106
[33] validation-logloss:0.32104 validation-auc:0.96576 validation-aucpr:0.97129
[34] validation-logloss:0.31617 validation-auc:0.96588 validation-aucpr:0.97142
[35] validation-logloss:0.31271 validation-auc:0.96578 validation-aucpr:0.97134
[36] validation-logloss:0.30906 validation-auc:0.96584 validation-aucpr:0.97137
[37] validation-logloss:0.30541 validation-auc:0.96599 validation-aucpr:0.97146
[38] validation-logloss:0.30187 validation-auc:0.96613 validation-aucpr:0.97161
[39] validation-logloss:0.29898 validation-auc:0.96615 validation-aucpr:0.97159
[40] validation-logloss:0.29469 validation-auc:0.96630 validation-aucpr:0.97177
[41] validation-logloss:0.29206 validation-auc:0.96616 validation-aucpr:0.97167
[42] validation-logloss:0.28781 validation-auc:0.96636 validation-aucpr:0.97186
[43] validation-logloss:0.28481 validation-auc:0.96649 validation-aucpr:0.97195
[44] validation-logloss:0.28226 validation-auc:0.96658 validation-aucpr:0.97202
{'best_iteration': '44', 'best_score': '0.9720178657714623'}
Trial 45, Fold 4: Log loss = 0.28226231217644576, Average precision = 0.9720161104064858, ROC-AUC = 0.9665810812583008, Elapsed Time = 1.1915703999984544 seconds
Trial 45, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 45, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.67074 validation-auc:0.93172 validation-aucpr:0.93217
[1] validation-logloss:0.64738 validation-auc:0.95505 validation-aucpr:0.96156
[2] validation-logloss:0.62848 validation-auc:0.95691 validation-aucpr:0.96271
[3] validation-logloss:0.61110 validation-auc:0.95738 validation-aucpr:0.96289
[4] validation-logloss:0.59377 validation-auc:0.95839 validation-aucpr:0.96339
[5] validation-logloss:0.57738 validation-auc:0.95814 validation-aucpr:0.96248
[6] validation-logloss:0.55930 validation-auc:0.96079 validation-aucpr:0.96557
[7] validation-logloss:0.54477 validation-auc:0.96096 validation-aucpr:0.96582
[8] validation-logloss:0.52923 validation-auc:0.96120 validation-aucpr:0.96612
[9] validation-logloss:0.51663 validation-auc:0.96147 validation-aucpr:0.96653
[10] validation-logloss:0.50514 validation-auc:0.96139 validation-aucpr:0.96640
[11] validation-logloss:0.49352 validation-auc:0.96158 validation-aucpr:0.96658
[12] validation-logloss:0.48050 validation-auc:0.96185 validation-aucpr:0.96682
[13] validation-logloss:0.47081 validation-auc:0.96159 validation-aucpr:0.96659
[14] validation-logloss:0.45957 validation-auc:0.96133 validation-aucpr:0.96646
[15] validation-logloss:0.44817 validation-auc:0.96174 validation-aucpr:0.96712
[16] validation-logloss:0.43897 validation-auc:0.96192 validation-aucpr:0.96726
[17] validation-logloss:0.42869 validation-auc:0.96204 validation-aucpr:0.96742
[18] validation-logloss:0.42061 validation-auc:0.96223 validation-aucpr:0.96751
[19] validation-logloss:0.41209 validation-auc:0.96231 validation-aucpr:0.96756
[20] validation-logloss:0.40513 validation-auc:0.96254 validation-aucpr:0.96770
[21] validation-logloss:0.39835 validation-auc:0.96270 validation-aucpr:0.96781
[22] validation-logloss:0.39211 validation-auc:0.96270 validation-aucpr:0.96775
[23] validation-logloss:0.38608 validation-auc:0.96288 validation-aucpr:0.96793
[24] validation-logloss:0.38035 validation-auc:0.96286 validation-aucpr:0.96793
[25] validation-logloss:0.37272 validation-auc:0.96335 validation-aucpr:0.96845
[26] validation-logloss:0.36503 validation-auc:0.96364 validation-aucpr:0.96871
[27] validation-logloss:0.35780 validation-auc:0.96378 validation-aucpr:0.96885
[28] validation-logloss:0.35277 validation-auc:0.96379 validation-aucpr:0.96883
[29] validation-logloss:0.34634 validation-auc:0.96394 validation-aucpr:0.96896
[30] validation-logloss:0.34001 validation-auc:0.96417 validation-aucpr:0.96921
[31] validation-logloss:0.33597 validation-auc:0.96415 validation-aucpr:0.96925
[32] validation-logloss:0.32987 validation-auc:0.96441 validation-aucpr:0.96951
[33] validation-logloss:0.32435 validation-auc:0.96462 validation-aucpr:0.96970
[34] validation-logloss:0.31901 validation-auc:0.96482 validation-aucpr:0.96989
[35] validation-logloss:0.31552 validation-auc:0.96469 validation-aucpr:0.96975
[36] validation-logloss:0.31089 validation-auc:0.96485 validation-aucpr:0.96985
[37] validation-logloss:0.30631 validation-auc:0.96493 validation-aucpr:0.96997
[38] validation-logloss:0.30311 validation-auc:0.96494 validation-aucpr:0.96993
[39] validation-logloss:0.29898 validation-auc:0.96502 validation-aucpr:0.96999
[40] validation-logloss:0.29579 validation-auc:0.96499 validation-aucpr:0.96995
[41] validation-logloss:0.29176 validation-auc:0.96518 validation-aucpr:0.97011
[42] validation-logloss:0.28916 validation-auc:0.96517 validation-aucpr:0.97008
[43] validation-logloss:0.28549 validation-auc:0.96526 validation-aucpr:0.97021
[44] validation-logloss:0.28291 validation-auc:0.96528 validation-aucpr:0.97024
{'best_iteration': '44', 'best_score': '0.9702411674009223'}
Trial 45, Fold 5: Log loss = 0.2829139369086157, Average precision = 0.9702407906841284, ROC-AUC = 0.9652755141853854, Elapsed Time = 1.2029946000002383 seconds
Optimization Progress: 46%|####6 | 46/100 [2:32:58<1:34:55, 105.48s/it]
Trial 46, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 46, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[20:31:56] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[0] validation-logloss:0.64076 validation-auc:0.93869 validation-aucpr:0.93708
[1] validation-logloss:0.60029 validation-auc:0.95067 validation-aucpr:0.94280
[2] validation-logloss:0.56030 validation-auc:0.95506 validation-aucpr:0.95160
[3] validation-logloss:0.53027 validation-auc:0.95681 validation-aucpr:0.95480
[4] validation-logloss:0.49789 validation-auc:0.96018 validation-aucpr:0.96071
[5] validation-logloss:0.47328 validation-auc:0.96102 validation-aucpr:0.96032
[6] validation-logloss:0.44763 validation-auc:0.96347 validation-aucpr:0.96383
[7] validation-logloss:0.42532 validation-auc:0.96390 validation-aucpr:0.96473
[8] validation-logloss:0.40458 validation-auc:0.96433 validation-aucpr:0.96536
[9] validation-logloss:0.38660 validation-auc:0.96466 validation-aucpr:0.96732
[10] validation-logloss:0.37016 validation-auc:0.96508 validation-aucpr:0.96784
[11] validation-logloss:0.35489 validation-auc:0.96557 validation-aucpr:0.96902
[12] validation-logloss:0.34342 validation-auc:0.96582 validation-aucpr:0.96914
[13] validation-logloss:0.33001 validation-auc:0.96666 validation-aucpr:0.96975
[14] validation-logloss:0.31871 validation-auc:0.96695 validation-aucpr:0.96999
[15] validation-logloss:0.30794 validation-auc:0.96728 validation-aucpr:0.97033
[16] validation-logloss:0.29834 validation-auc:0.96733 validation-aucpr:0.97032
[17] validation-logloss:0.28978 validation-auc:0.96753 validation-aucpr:0.97046
[18] validation-logloss:0.28196 validation-auc:0.96785 validation-aucpr:0.97066
[19] validation-logloss:0.27476 validation-auc:0.96808 validation-aucpr:0.96989
[20] validation-logloss:0.26881 validation-auc:0.96804 validation-aucpr:0.97005
[21] validation-logloss:0.26402 validation-auc:0.96782 validation-aucpr:0.96950
[22] validation-logloss:0.25898 validation-auc:0.96819 validation-aucpr:0.97107
[23] validation-logloss:0.25395 validation-auc:0.96831 validation-aucpr:0.97135
[24] validation-logloss:0.24992 validation-auc:0.96829 validation-aucpr:0.97190
[25] validation-logloss:0.24638 validation-auc:0.96804 validation-aucpr:0.97170
[26] validation-logloss:0.24278 validation-auc:0.96821 validation-aucpr:0.97175
[27] validation-logloss:0.23971 validation-auc:0.96849 validation-aucpr:0.97269
[28] validation-logloss:0.23655 validation-auc:0.96847 validation-aucpr:0.97272
[29] validation-logloss:0.23392 validation-auc:0.96854 validation-aucpr:0.97277
[30] validation-logloss:0.23152 validation-auc:0.96869 validation-aucpr:0.97284
[31] validation-logloss:0.22930 validation-auc:0.96867 validation-aucpr:0.97268
[32] validation-logloss:0.22672 validation-auc:0.96867 validation-aucpr:0.97288
[33] validation-logloss:0.22411 validation-auc:0.96881 validation-aucpr:0.97280
[34] validation-logloss:0.22201 validation-auc:0.96879 validation-aucpr:0.97284
[35] validation-logloss:0.22027 validation-auc:0.96875 validation-aucpr:0.97290
[36] validation-logloss:0.21837 validation-auc:0.96890 validation-aucpr:0.97326
[37] validation-logloss:0.21717 validation-auc:0.96899 validation-aucpr:0.97409
[38] validation-logloss:0.21562 validation-auc:0.96918 validation-aucpr:0.97427
[39] validation-logloss:0.21412 validation-auc:0.96935 validation-aucpr:0.97437
[40] validation-logloss:0.21328 validation-auc:0.96935 validation-aucpr:0.97441
[41] validation-logloss:0.21210 validation-auc:0.96949 validation-aucpr:0.97450
[42] validation-logloss:0.21082 validation-auc:0.96957 validation-aucpr:0.97456
[43] validation-logloss:0.20935 validation-auc:0.96982 validation-aucpr:0.97474
[44] validation-logloss:0.20814 validation-auc:0.97003 validation-aucpr:0.97491
[45] validation-logloss:0.20765 validation-auc:0.96996 validation-aucpr:0.97483
[46] validation-logloss:0.20746 validation-auc:0.96991 validation-aucpr:0.97476
[47] validation-logloss:0.20698 validation-auc:0.96998 validation-aucpr:0.97478
[20:34:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[48] validation-logloss:0.20669 validation-auc:0.96991 validation-aucpr:0.97471
[20:34:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[49] validation-logloss:0.20629 validation-auc:0.96990 validation-aucpr:0.97475
[20:34:21] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[50] validation-logloss:0.20609 validation-auc:0.96986 validation-aucpr:0.97469
[20:34:23] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[51] validation-logloss:0.20626 validation-auc:0.96972 validation-aucpr:0.97459
[20:34:24] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[52] validation-logloss:0.20545 validation-auc:0.96984 validation-aucpr:0.97462
[20:34:25] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[53] validation-logloss:0.20534 validation-auc:0.96973 validation-aucpr:0.97461
[20:34:27] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[54] validation-logloss:0.20525 validation-auc:0.96974 validation-aucpr:0.97457
[20:34:28] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[55] validation-logloss:0.20504 validation-auc:0.96975 validation-aucpr:0.97454
[20:34:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[56] validation-logloss:0.20480 validation-auc:0.96963 validation-aucpr:0.97446
[20:34:30] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[57] validation-logloss:0.20434 validation-auc:0.96968 validation-aucpr:0.97449
[20:34:31] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[58] validation-logloss:0.20389 validation-auc:0.96981 validation-aucpr:0.97455
[20:34:32] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[59] validation-logloss:0.20355 validation-auc:0.96992 validation-aucpr:0.97461
[20:34:33] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[60] validation-logloss:0.20304 validation-auc:0.97009 validation-aucpr:0.97491
[20:34:34] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[61] validation-logloss:0.20314 validation-auc:0.97001 validation-aucpr:0.97488
[20:34:35] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[62] validation-logloss:0.20279 validation-auc:0.97005 validation-aucpr:0.97492
[20:34:36] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[63] validation-logloss:0.20264 validation-auc:0.97006 validation-aucpr:0.97495
[20:34:37] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[64] validation-logloss:0.20293 validation-auc:0.96995 validation-aucpr:0.97489
[20:34:38] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[65] validation-logloss:0.20292 validation-auc:0.96999 validation-aucpr:0.97489
[20:34:39] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[66] validation-logloss:0.20284 validation-auc:0.97001 validation-aucpr:0.97490
[20:34:39] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[67] validation-logloss:0.20285 validation-auc:0.97003 validation-aucpr:0.97489
[20:34:40] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[68] validation-logloss:0.20308 validation-auc:0.96998 validation-aucpr:0.97488
[20:34:42] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[69] validation-logloss:0.20314 validation-auc:0.96999 validation-aucpr:0.97484
[20:34:43] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[70] validation-logloss:0.20319 validation-auc:0.97004 validation-aucpr:0.97492
[20:34:44] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[71] validation-logloss:0.20310 validation-auc:0.97012 validation-aucpr:0.97498
[20:34:44] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[72] validation-logloss:0.20294 validation-auc:0.97020 validation-aucpr:0.97501
[20:34:45] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[73] validation-logloss:0.20297 validation-auc:0.97025 validation-aucpr:0.97501
{'best_iteration': '72', 'best_score': '0.9750078737434187'}
Trial 46, Fold 1: Log loss = 0.20296792392887414, Average precision = 0.9750121228461307, ROC-AUC = 0.9702506416598559, Elapsed Time = 173.2442964999973 seconds
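The fold summary above reports log loss, average precision, and ROC-AUC on the held-out fold. A minimal sketch of how such per-fold metrics could be computed with the `sklearn.metrics` functions imported at the top of this notebook — the labels and predicted probabilities below are synthetic stand-ins, not the notebook's actual validation data:

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

rng = np.random.default_rng(42)
y_val = rng.integers(0, 2, size=1000)  # synthetic binary labels

# Synthetic predicted probabilities, deliberately correlated with the labels
# so the metrics land in a plausible range.
y_prob = np.clip(y_val * 0.7 + rng.normal(0.15, 0.2, size=1000), 0.0, 1.0)

lloss = log_loss(y_val, y_prob)               # lower is better
ap = average_precision_score(y_val, y_prob)   # area under the PR curve
auc = roc_auc_score(y_val, y_prob)            # area under the ROC curve
print(f"Log loss = {lloss}, Average precision = {ap}, ROC-AUC = {auc}")
```

All three functions accept the raw predicted probability of the positive class, so a single `predict_proba(...)[:, 1]` call per fold would feed them directly.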
Trial 46, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 46, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.64530 validation-auc:0.92763 validation-aucpr:0.91331
[1] validation-logloss:0.59987 validation-auc:0.95017 validation-aucpr:0.94426
[2] validation-logloss:0.56506 validation-auc:0.95467 validation-aucpr:0.95053
[3] validation-logloss:0.53379 validation-auc:0.95752 validation-aucpr:0.96177
[4] validation-logloss:0.50317 validation-auc:0.96085 validation-aucpr:0.96533
[5] validation-logloss:0.47577 validation-auc:0.96249 validation-aucpr:0.96685
[6] validation-logloss:0.45064 validation-auc:0.96373 validation-aucpr:0.96797
[7] validation-logloss:0.43124 validation-auc:0.96391 validation-aucpr:0.96800
[8] validation-logloss:0.41027 validation-auc:0.96509 validation-aucpr:0.96904
[9] validation-logloss:0.39076 validation-auc:0.96650 validation-aucpr:0.97012
[10] validation-logloss:0.37329 validation-auc:0.96726 validation-aucpr:0.97089
[11] validation-logloss:0.35690 validation-auc:0.96831 validation-aucpr:0.97173
[12] validation-logloss:0.34278 validation-auc:0.96884 validation-aucpr:0.97255
[13] validation-logloss:0.32986 validation-auc:0.96911 validation-aucpr:0.97285
[14] validation-logloss:0.32078 validation-auc:0.96899 validation-aucpr:0.97258
[15] validation-logloss:0.31055 validation-auc:0.96885 validation-aucpr:0.97248
[16] validation-logloss:0.30204 validation-auc:0.96888 validation-aucpr:0.97237
[17] validation-logloss:0.29528 validation-auc:0.96852 validation-aucpr:0.97090
[18] validation-logloss:0.28725 validation-auc:0.96844 validation-aucpr:0.97097
[19] validation-logloss:0.27942 validation-auc:0.96904 validation-aucpr:0.97276
[20] validation-logloss:0.27311 validation-auc:0.96918 validation-aucpr:0.97279
[21] validation-logloss:0.26682 validation-auc:0.96922 validation-aucpr:0.97281
[22] validation-logloss:0.26013 validation-auc:0.96966 validation-aucpr:0.97233
[23] validation-logloss:0.25512 validation-auc:0.97008 validation-aucpr:0.97375
[24] validation-logloss:0.24913 validation-auc:0.97060 validation-aucpr:0.97416
[25] validation-logloss:0.24414 validation-auc:0.97055 validation-aucpr:0.97419
[26] validation-logloss:0.23976 validation-auc:0.97085 validation-aucpr:0.97441
[27] validation-logloss:0.23633 validation-auc:0.97094 validation-aucpr:0.97448
[28] validation-logloss:0.23270 validation-auc:0.97092 validation-aucpr:0.97445
[29] validation-logloss:0.22912 validation-auc:0.97128 validation-aucpr:0.97472
[30] validation-logloss:0.22581 validation-auc:0.97129 validation-aucpr:0.97473
[31] validation-logloss:0.22264 validation-auc:0.97138 validation-aucpr:0.97480
[32] validation-logloss:0.21926 validation-auc:0.97160 validation-aucpr:0.97507
[33] validation-logloss:0.21690 validation-auc:0.97169 validation-aucpr:0.97508
[34] validation-logloss:0.21436 validation-auc:0.97200 validation-aucpr:0.97532
[35] validation-logloss:0.21231 validation-auc:0.97184 validation-aucpr:0.97525
[36] validation-logloss:0.21087 validation-auc:0.97170 validation-aucpr:0.97519
[37] validation-logloss:0.20879 validation-auc:0.97181 validation-aucpr:0.97527
[38] validation-logloss:0.20737 validation-auc:0.97193 validation-aucpr:0.97533
[39] validation-logloss:0.20593 validation-auc:0.97188 validation-aucpr:0.97529
[40] validation-logloss:0.20459 validation-auc:0.97188 validation-aucpr:0.97529
[41] validation-logloss:0.20355 validation-auc:0.97188 validation-aucpr:0.97523
[42] validation-logloss:0.20274 validation-auc:0.97178 validation-aucpr:0.97511
[43] validation-logloss:0.20221 validation-auc:0.97158 validation-aucpr:0.97477
[44] validation-logloss:0.20085 validation-auc:0.97174 validation-aucpr:0.97497
[45] validation-logloss:0.19980 validation-auc:0.97177 validation-aucpr:0.97501
[46] validation-logloss:0.19873 validation-auc:0.97188 validation-aucpr:0.97520
[47] validation-logloss:0.19776 validation-auc:0.97204 validation-aucpr:0.97529
[48] validation-logloss:0.19678 validation-auc:0.97221 validation-aucpr:0.97534
[49] validation-logloss:0.19596 validation-auc:0.97231 validation-aucpr:0.97541
[50] validation-logloss:0.19549 validation-auc:0.97226 validation-aucpr:0.97520
[51] validation-logloss:0.19499 validation-auc:0.97227 validation-aucpr:0.97502
[52] validation-logloss:0.19379 validation-auc:0.97246 validation-aucpr:0.97513
[53] validation-logloss:0.19348 validation-auc:0.97242 validation-aucpr:0.97502
[54] validation-logloss:0.19293 validation-auc:0.97246 validation-aucpr:0.97508
[55] validation-logloss:0.19233 validation-auc:0.97253 validation-aucpr:0.97504
[56] validation-logloss:0.19182 validation-auc:0.97260 validation-aucpr:0.97508
[57] validation-logloss:0.19149 validation-auc:0.97259 validation-aucpr:0.97490
[58] validation-logloss:0.19056 validation-auc:0.97277 validation-aucpr:0.97499
[59] validation-logloss:0.19033 validation-auc:0.97272 validation-aucpr:0.97473
[60] validation-logloss:0.18994 validation-auc:0.97281 validation-aucpr:0.97445
[61] validation-logloss:0.18946 validation-auc:0.97291 validation-aucpr:0.97488
[62] validation-logloss:0.18955 validation-auc:0.97286 validation-aucpr:0.97481
[63] validation-logloss:0.18912 validation-auc:0.97297 validation-aucpr:0.97430
[64] validation-logloss:0.18896 validation-auc:0.97299 validation-aucpr:0.97452
[65] validation-logloss:0.18846 validation-auc:0.97315 validation-aucpr:0.97447
[66] validation-logloss:0.18846 validation-auc:0.97314 validation-aucpr:0.97469
[67] validation-logloss:0.18827 validation-auc:0.97317 validation-aucpr:0.97435
[68] validation-logloss:0.18866 validation-auc:0.97304 validation-aucpr:0.97393
[69] validation-logloss:0.18839 validation-auc:0.97311 validation-aucpr:0.97416
[70] validation-logloss:0.18845 validation-auc:0.97306 validation-aucpr:0.97396
[71] validation-logloss:0.18825 validation-auc:0.97312 validation-aucpr:0.97427
[72] validation-logloss:0.18809 validation-auc:0.97320 validation-aucpr:0.97449
[73] validation-logloss:0.18801 validation-auc:0.97327 validation-aucpr:0.97446
{'best_iteration': '49', 'best_score': '0.9754094178962217'}
Trial 46, Fold 2: Log loss = 0.1880081891241934, Average precision = 0.9745046069561031, ROC-AUC = 0.9732694758419043, Elapsed Time = 168.89245759999903 seconds
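The `{'best_iteration': '49', ...}` line shows early stopping at work: this fold trains through round 73 but keeps round 49, where validation `aucpr` peaked — consistent with a patience of roughly 24 rounds without improvement. A hedged, pure-Python sketch of that bookkeeping for a maximized metric (the score sequence and patience value below are illustrative, not taken from the notebook):

```python
def early_stop(scores, patience=24):
    """Return (best_iteration, best_score) for a maximized metric,
    stopping once `patience` rounds pass with no new best."""
    best_i, best_s = 0, float("-inf")
    for i, s in enumerate(scores):
        if s > best_s:
            best_i, best_s = i, s           # new best: reset the counter
        elif i - best_i >= patience:        # stalled for `patience` rounds
            break
    return best_i, best_s

scores = [0.90, 0.93, 0.95, 0.949, 0.948, 0.947]
print(early_stop(scores, patience=3))  # → (2, 0.95)
```

In XGBoost itself this is handled by the early-stopping machinery rather than hand-rolled code; the booster then exposes the winning round and score, which is what the log prints here.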
Trial 46, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 46, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.64015 validation-auc:0.94881 validation-aucpr:0.95418
[1] validation-logloss:0.59977 validation-auc:0.95293 validation-aucpr:0.94017
[2] validation-logloss:0.56469 validation-auc:0.95817 validation-aucpr:0.95175
[3] validation-logloss:0.53286 validation-auc:0.96073 validation-aucpr:0.96001
[4] validation-logloss:0.50034 validation-auc:0.96359 validation-aucpr:0.96391
[5] validation-logloss:0.47538 validation-auc:0.96345 validation-aucpr:0.96154
[6] validation-logloss:0.45279 validation-auc:0.96437 validation-aucpr:0.96486
[7] validation-logloss:0.42941 validation-auc:0.96566 validation-aucpr:0.96793
[8] validation-logloss:0.40882 validation-auc:0.96647 validation-aucpr:0.96871
[9] validation-logloss:0.39185 validation-auc:0.96710 validation-aucpr:0.96990
[10] validation-logloss:0.37413 validation-auc:0.96786 validation-aucpr:0.97070
[11] validation-logloss:0.36034 validation-auc:0.96843 validation-aucpr:0.97064
[12] validation-logloss:0.34531 validation-auc:0.96876 validation-aucpr:0.97029
[13] validation-logloss:0.33452 validation-auc:0.96855 validation-aucpr:0.97022
[14] validation-logloss:0.32420 validation-auc:0.96910 validation-aucpr:0.97121
[15] validation-logloss:0.31556 validation-auc:0.96891 validation-aucpr:0.97098
[16] validation-logloss:0.30545 validation-auc:0.96907 validation-aucpr:0.97119
[17] validation-logloss:0.29540 validation-auc:0.96967 validation-aucpr:0.97154
[18] validation-logloss:0.28594 validation-auc:0.97012 validation-aucpr:0.97209
[19] validation-logloss:0.27938 validation-auc:0.97005 validation-aucpr:0.97208
[20:38:50] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[20] validation-logloss:0.27369 validation-auc:0.97003 validation-aucpr:0.97217
[20:38:53] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[21] validation-logloss:0.26886 validation-auc:0.96957 validation-aucpr:0.97014
[20:38:57] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[22] validation-logloss:0.26349 validation-auc:0.96971 validation-aucpr:0.97073
[20:39:01] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[23] validation-logloss:0.25954 validation-auc:0.96968 validation-aucpr:0.97049
[20:39:04] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[24] validation-logloss:0.25570 validation-auc:0.96954 validation-aucpr:0.97038
[20:39:08] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[25] validation-logloss:0.25105 validation-auc:0.96924 validation-aucpr:0.97015
[20:39:11] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[26] validation-logloss:0.24661 validation-auc:0.96921 validation-aucpr:0.97030
[20:39:15] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[27] validation-logloss:0.24222 validation-auc:0.96924 validation-aucpr:0.97005
[20:39:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[28] validation-logloss:0.23821 validation-auc:0.96936 validation-aucpr:0.97119
[20:39:21] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[29] validation-logloss:0.23455 validation-auc:0.96946 validation-aucpr:0.97130
[20:39:24] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[30] validation-logloss:0.23209 validation-auc:0.96950 validation-aucpr:0.97099
[20:39:27] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[31] validation-logloss:0.22917 validation-auc:0.96952 validation-aucpr:0.97093
[20:39:30] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[32] validation-logloss:0.22587 validation-auc:0.96968 validation-aucpr:0.97098
[20:39:32] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[33] validation-logloss:0.22377 validation-auc:0.96961 validation-aucpr:0.97030
[20:39:34] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[34] validation-logloss:0.22120 validation-auc:0.96963 validation-aucpr:0.97057
[20:39:36] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[35] validation-logloss:0.21861 validation-auc:0.96989 validation-aucpr:0.97091
[20:39:39] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[36] validation-logloss:0.21659 validation-auc:0.97000 validation-aucpr:0.97153
[20:39:41] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[37] validation-logloss:0.21427 validation-auc:0.97031 validation-aucpr:0.97178
[20:39:43] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[38] validation-logloss:0.21225 validation-auc:0.97058 validation-aucpr:0.97201
[20:39:45] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[39] validation-logloss:0.21127 validation-auc:0.97052 validation-aucpr:0.97195
[20:39:48] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[40] validation-logloss:0.20980 validation-auc:0.97073 validation-aucpr:0.97271
[20:39:50] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[41] validation-logloss:0.20801 validation-auc:0.97094 validation-aucpr:0.97286
[20:39:51] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[42] validation-logloss:0.20659 validation-auc:0.97101 validation-aucpr:0.97449
[20:39:53] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[43] validation-logloss:0.20589 validation-auc:0.97113 validation-aucpr:0.97506
[20:39:54] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[44] validation-logloss:0.20486 validation-auc:0.97125 validation-aucpr:0.97524
[20:39:56] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[45] validation-logloss:0.20417 validation-auc:0.97121 validation-aucpr:0.97525
[20:39:58] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[46] validation-logloss:0.20347 validation-auc:0.97124 validation-aucpr:0.97527
[20:40:00] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[47] validation-logloss:0.20292 validation-auc:0.97125 validation-aucpr:0.97523
[20:40:01] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[48] validation-logloss:0.20223 validation-auc:0.97127 validation-aucpr:0.97523
[20:40:02] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[49] validation-logloss:0.20142 validation-auc:0.97132 validation-aucpr:0.97528
[20:40:04] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[50] validation-logloss:0.20047 validation-auc:0.97147 validation-aucpr:0.97539
[20:40:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[51] validation-logloss:0.19951 validation-auc:0.97170 validation-aucpr:0.97563
[20:40:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[52] validation-logloss:0.19904 validation-auc:0.97170 validation-aucpr:0.97557
[20:40:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[53] validation-logloss:0.19862 validation-auc:0.97192 validation-aucpr:0.97572
[20:40:09] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[54] validation-logloss:0.19795 validation-auc:0.97204 validation-aucpr:0.97564
[20:40:10] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[55] validation-logloss:0.19772 validation-auc:0.97210 validation-aucpr:0.97574
[20:40:12] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[56] validation-logloss:0.19823 validation-auc:0.97185 validation-aucpr:0.97563
[20:40:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[57] validation-logloss:0.19753 validation-auc:0.97205 validation-aucpr:0.97573
[20:40:14] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[58] validation-logloss:0.19714 validation-auc:0.97210 validation-aucpr:0.97574
[20:40:15] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[59] validation-logloss:0.19676 validation-auc:0.97217 validation-aucpr:0.97582
[20:40:16] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[60] validation-logloss:0.19674 validation-auc:0.97208 validation-aucpr:0.97547
[20:40:17] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[61] validation-logloss:0.19676 validation-auc:0.97211 validation-aucpr:0.97558
[20:40:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[62] validation-logloss:0.19740 validation-auc:0.97180 validation-aucpr:0.97515
[20:40:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[63] validation-logloss:0.19699 validation-auc:0.97192 validation-aucpr:0.97497
[20:40:21] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[64] validation-logloss:0.19663 validation-auc:0.97202 validation-aucpr:0.97504
[20:40:21] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[65] validation-logloss:0.19680 validation-auc:0.97200 validation-aucpr:0.97542
[20:40:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[66] validation-logloss:0.19671 validation-auc:0.97207 validation-aucpr:0.97565
[20:40:23] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[67] validation-logloss:0.19708 validation-auc:0.97192 validation-aucpr:0.97547
[20:40:24] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[68] validation-logloss:0.19685 validation-auc:0.97197 validation-aucpr:0.97546
[20:40:24] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[69] validation-logloss:0.19651 validation-auc:0.97209 validation-aucpr:0.97548
[20:40:25] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[70] validation-logloss:0.19653 validation-auc:0.97211 validation-aucpr:0.97533
[20:40:26] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[71] validation-logloss:0.19678 validation-auc:0.97207 validation-aucpr:0.97539
[20:40:27] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[72] validation-logloss:0.19691 validation-auc:0.97210 validation-aucpr:0.97562
[20:40:27] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[73] validation-logloss:0.19689 validation-auc:0.97210 validation-aucpr:0.97545
{'best_iteration': '59', 'best_score': '0.9758164074598226'}
Trial 46, Fold 3: Log loss = 0.19688780501448533, Average precision = 0.9754565903667709, ROC-AUC = 0.9721035565004672, Elapsed Time = 172.1257302999984 seconds
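The `{'best_iteration': '59', 'best_score': ...}` record above corresponds to the iteration with the highest `validation-aucpr` in the eval log. As a minimal sketch (the helper `best_iteration` is hypothetical, not part of this notebook), the same selection can be recovered by parsing the logged metric lines:

```python
import re

def best_iteration(log_lines, metric="aucpr"):
    """Parse XGBoost eval-log lines such as
    '[59] validation-logloss:0.19676 validation-auc:0.97217 validation-aucpr:0.97582'
    and return (iteration, score) for the highest value of `metric`."""
    pattern = re.compile(r"\[(\d+)\]\s+(.*)")
    best = (None, float("-inf"))
    for line in log_lines:
        m = pattern.match(line.strip())
        if not m:
            continue
        iteration = int(m.group(1))
        # split 'validation-logloss:0.19676 ...' into a name -> value dict
        metrics = dict(kv.split(":") for kv in m.group(2).split())
        score = float(metrics[f"validation-{metric}"])
        if score > best[1]:
            best = (iteration, score)
    return best

lines = [
    "[58] validation-logloss:0.19714 validation-auc:0.97210 validation-aucpr:0.97574",
    "[59] validation-logloss:0.19676 validation-auc:0.97217 validation-aucpr:0.97582",
    "[60] validation-logloss:0.19674 validation-auc:0.97208 validation-aucpr:0.97547",
]
print(best_iteration(lines))  # → (59, 0.97582)
```

This matches the reported best iteration 59, whose rounded AUC-PR (0.97582) agrees with the full-precision `best_score` of 0.9758164.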
Trial 46, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 46, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
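The fold-size lines above report the class-0-to-class-1 ratio for each split; as a quick check under the counts taken from the log (the helper name `class_ratio` is hypothetical):

```python
def class_ratio(n_neg, n_pos):
    # ratio of class 0 to class 1, as printed in the per-fold log lines
    return n_neg / n_pos

# Trial 46, Fold 4 counts from the log above
print(class_ratio(10479, 10177))  # train:      → 1.0296747568045592
print(class_ratio(2646, 2536))    # validation: → 1.0433753943217665
```

Both ratios stay close to 1, confirming that the stratified grouping keeps the classes balanced within each fold.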
[0] validation-logloss:0.64068 validation-auc:0.94491 validation-aucpr:0.95001
[1] validation-logloss:0.59933 validation-auc:0.95102 validation-aucpr:0.94377
[2] validation-logloss:0.55913 validation-auc:0.95819 validation-aucpr:0.95572
[3] validation-logloss:0.52414 validation-auc:0.96134 validation-aucpr:0.96267
[4] validation-logloss:0.49257 validation-auc:0.96277 validation-aucpr:0.96354
[5] validation-logloss:0.46471 validation-auc:0.96381 validation-aucpr:0.96777
[6] validation-logloss:0.43999 validation-auc:0.96407 validation-aucpr:0.97089
[7] validation-logloss:0.41826 validation-auc:0.96471 validation-aucpr:0.97126
[8] validation-logloss:0.40089 validation-auc:0.96550 validation-aucpr:0.97169
[9] validation-logloss:0.38277 validation-auc:0.96595 validation-aucpr:0.97206
[10] validation-logloss:0.36836 validation-auc:0.96560 validation-aucpr:0.97193
[11] validation-logloss:0.35533 validation-auc:0.96565 validation-aucpr:0.97187
[12] validation-logloss:0.34177 validation-auc:0.96552 validation-aucpr:0.97187
[13] validation-logloss:0.33001 validation-auc:0.96565 validation-aucpr:0.97186
[14] validation-logloss:0.32085 validation-auc:0.96577 validation-aucpr:0.97197
[15] validation-logloss:0.31007 validation-auc:0.96623 validation-aucpr:0.97228
[16] validation-logloss:0.30043 validation-auc:0.96667 validation-aucpr:0.97255
[17] validation-logloss:0.29244 validation-auc:0.96723 validation-aucpr:0.97278
[18] validation-logloss:0.28510 validation-auc:0.96693 validation-aucpr:0.97254
[19] validation-logloss:0.27865 validation-auc:0.96748 validation-aucpr:0.97297
[20] validation-logloss:0.27151 validation-auc:0.96796 validation-aucpr:0.97326
[21] validation-logloss:0.26551 validation-auc:0.96801 validation-aucpr:0.97332
[22] validation-logloss:0.26032 validation-auc:0.96824 validation-aucpr:0.97347
[23] validation-logloss:0.25524 validation-auc:0.96813 validation-aucpr:0.97343
[24] validation-logloss:0.25017 validation-auc:0.96847 validation-aucpr:0.97366
[25] validation-logloss:0.24620 validation-auc:0.96823 validation-aucpr:0.97347
[26] validation-logloss:0.24141 validation-auc:0.96856 validation-aucpr:0.97376
[27] validation-logloss:0.23745 validation-auc:0.96881 validation-aucpr:0.97392
[28] validation-logloss:0.23466 validation-auc:0.96866 validation-aucpr:0.97378
[29] validation-logloss:0.23143 validation-auc:0.96882 validation-aucpr:0.97388
[30] validation-logloss:0.22848 validation-auc:0.96898 validation-aucpr:0.97397
[31] validation-logloss:0.22675 validation-auc:0.96881 validation-aucpr:0.97386
[32] validation-logloss:0.22512 validation-auc:0.96855 validation-aucpr:0.97362
[33] validation-logloss:0.22291 validation-auc:0.96850 validation-aucpr:0.97358
[34] validation-logloss:0.22065 validation-auc:0.96868 validation-aucpr:0.97372
[35] validation-logloss:0.21871 validation-auc:0.96874 validation-aucpr:0.97379
[36] validation-logloss:0.21751 validation-auc:0.96865 validation-aucpr:0.97372
[37] validation-logloss:0.21534 validation-auc:0.96895 validation-aucpr:0.97391
[38] validation-logloss:0.21412 validation-auc:0.96899 validation-aucpr:0.97392
[39] validation-logloss:0.21309 validation-auc:0.96903 validation-aucpr:0.97394
[40] validation-logloss:0.21181 validation-auc:0.96897 validation-aucpr:0.97391
[41] validation-logloss:0.21112 validation-auc:0.96891 validation-aucpr:0.97395
[42] validation-logloss:0.21006 validation-auc:0.96900 validation-aucpr:0.97401
[43] validation-logloss:0.20858 validation-auc:0.96924 validation-aucpr:0.97417
[44] validation-logloss:0.20823 validation-auc:0.96918 validation-aucpr:0.97410
[45] validation-logloss:0.20789 validation-auc:0.96916 validation-aucpr:0.97412
[46] validation-logloss:0.20689 validation-auc:0.96925 validation-aucpr:0.97420
[47] validation-logloss:0.20608 validation-auc:0.96930 validation-aucpr:0.97422
[48] validation-logloss:0.20534 validation-auc:0.96934 validation-aucpr:0.97429
[49] validation-logloss:0.20515 validation-auc:0.96924 validation-aucpr:0.97422
[50] validation-logloss:0.20486 validation-auc:0.96919 validation-aucpr:0.97419
[51] validation-logloss:0.20453 validation-auc:0.96917 validation-aucpr:0.97416
[52] validation-logloss:0.20432 validation-auc:0.96919 validation-aucpr:0.97417
[53] validation-logloss:0.20364 validation-auc:0.96936 validation-aucpr:0.97428
[54] validation-logloss:0.20361 validation-auc:0.96930 validation-aucpr:0.97425
[55] validation-logloss:0.20292 validation-auc:0.96949 validation-aucpr:0.97436
[56] validation-logloss:0.20261 validation-auc:0.96944 validation-aucpr:0.97435
[57] validation-logloss:0.20233 validation-auc:0.96950 validation-aucpr:0.97441
[58] validation-logloss:0.20216 validation-auc:0.96960 validation-aucpr:0.97448
[59] validation-logloss:0.20197 validation-auc:0.96958 validation-aucpr:0.97447
[60] validation-logloss:0.20186 validation-auc:0.96955 validation-aucpr:0.97447
[61] validation-logloss:0.20170 validation-auc:0.96953 validation-aucpr:0.97445
[62] validation-logloss:0.20162 validation-auc:0.96957 validation-aucpr:0.97445
[63] validation-logloss:0.20171 validation-auc:0.96954 validation-aucpr:0.97443
[64] validation-logloss:0.20123 validation-auc:0.96966 validation-aucpr:0.97453
[65] validation-logloss:0.20144 validation-auc:0.96963 validation-aucpr:0.97449
[66] validation-logloss:0.20147 validation-auc:0.96962 validation-aucpr:0.97448
[67] validation-logloss:0.20166 validation-auc:0.96959 validation-aucpr:0.97446
[20:43:12] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[68] validation-logloss:0.20177 validation-auc:0.96958 validation-aucpr:0.97441
[20:43:12] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[69] validation-logloss:0.20188 validation-auc:0.96963 validation-aucpr:0.97443
[20:43:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[70] validation-logloss:0.20178 validation-auc:0.96965 validation-aucpr:0.97445
[20:43:14] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[71] validation-logloss:0.20162 validation-auc:0.96971 validation-aucpr:0.97450
[20:43:15] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[72] validation-logloss:0.20162 validation-auc:0.96977 validation-aucpr:0.97454
[20:43:16] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[73] validation-logloss:0.20176 validation-auc:0.96973 validation-aucpr:0.97450
{'best_iteration': '72', 'best_score': '0.974535678305386'}
Trial 46, Fold 4: Log loss = 0.20176173603682535, Average precision = 0.9745087340972265, ROC-AUC = 0.9697287256998838, Elapsed Time = 167.63923059999797 seconds
Trial 46, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 46, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.64095 validation-auc:0.94365 validation-aucpr:0.95270
[1] validation-logloss:0.60106 validation-auc:0.94824 validation-aucpr:0.93268
[2] validation-logloss:0.56080 validation-auc:0.95714 validation-aucpr:0.95764
[3] validation-logloss:0.53025 validation-auc:0.95956 validation-aucpr:0.96236
[4] validation-logloss:0.50057 validation-auc:0.96036 validation-aucpr:0.96652
[5] validation-logloss:0.47220 validation-auc:0.96245 validation-aucpr:0.96836
[6] validation-logloss:0.44705 validation-auc:0.96355 validation-aucpr:0.96933
[7] validation-logloss:0.42671 validation-auc:0.96425 validation-aucpr:0.96971
[8] validation-logloss:0.40976 validation-auc:0.96418 validation-aucpr:0.96938
[9] validation-logloss:0.39223 validation-auc:0.96447 validation-aucpr:0.96969
[10] validation-logloss:0.37842 validation-auc:0.96468 validation-aucpr:0.96979
[11] validation-logloss:0.36427 validation-auc:0.96431 validation-aucpr:0.96953
[12] validation-logloss:0.35133 validation-auc:0.96507 validation-aucpr:0.97002
[13] validation-logloss:0.34098 validation-auc:0.96486 validation-aucpr:0.96966
[14] validation-logloss:0.33112 validation-auc:0.96476 validation-aucpr:0.96954
[15] validation-logloss:0.32005 validation-auc:0.96486 validation-aucpr:0.96962
[16] validation-logloss:0.31020 validation-auc:0.96497 validation-aucpr:0.96989
[17] validation-logloss:0.30115 validation-auc:0.96523 validation-aucpr:0.97010
[18] validation-logloss:0.29316 validation-auc:0.96527 validation-aucpr:0.97018
[19] validation-logloss:0.28683 validation-auc:0.96546 validation-aucpr:0.97038
[20] validation-logloss:0.27941 validation-auc:0.96576 validation-aucpr:0.97062
[21] validation-logloss:0.27290 validation-auc:0.96592 validation-aucpr:0.97082
[22] validation-logloss:0.26819 validation-auc:0.96583 validation-aucpr:0.97061
[23] validation-logloss:0.26251 validation-auc:0.96603 validation-aucpr:0.97082
[24] validation-logloss:0.25721 validation-auc:0.96638 validation-aucpr:0.97110
[25] validation-logloss:0.25214 validation-auc:0.96667 validation-aucpr:0.97130
[26] validation-logloss:0.24734 validation-auc:0.96707 validation-aucpr:0.97181
[27] validation-logloss:0.24338 validation-auc:0.96733 validation-aucpr:0.97203
[28] validation-logloss:0.24089 validation-auc:0.96726 validation-aucpr:0.97196
[29] validation-logloss:0.23739 validation-auc:0.96751 validation-aucpr:0.97246
[30] validation-logloss:0.23433 validation-auc:0.96770 validation-aucpr:0.97272
[31] validation-logloss:0.23142 validation-auc:0.96788 validation-aucpr:0.97285
[32] validation-logloss:0.22975 validation-auc:0.96775 validation-aucpr:0.97269
[33] validation-logloss:0.22759 validation-auc:0.96774 validation-aucpr:0.97269
[34] validation-logloss:0.22535 validation-auc:0.96789 validation-aucpr:0.97277
[35] validation-logloss:0.22429 validation-auc:0.96757 validation-aucpr:0.97254
[36] validation-logloss:0.22317 validation-auc:0.96743 validation-aucpr:0.97238
[37] validation-logloss:0.22150 validation-auc:0.96740 validation-aucpr:0.97240
[38] validation-logloss:0.22027 validation-auc:0.96744 validation-aucpr:0.97241
[39] validation-logloss:0.21889 validation-auc:0.96751 validation-aucpr:0.97247
[40] validation-logloss:0.21713 validation-auc:0.96773 validation-aucpr:0.97252
[41] validation-logloss:0.21589 validation-auc:0.96786 validation-aucpr:0.97250
[42] validation-logloss:0.21502 validation-auc:0.96787 validation-aucpr:0.97250
[43] validation-logloss:0.21363 validation-auc:0.96800 validation-aucpr:0.97260
[44] validation-logloss:0.21269 validation-auc:0.96818 validation-aucpr:0.97270
[45] validation-logloss:0.21192 validation-auc:0.96821 validation-aucpr:0.97268
[46] validation-logloss:0.21177 validation-auc:0.96800 validation-aucpr:0.97250
[47] validation-logloss:0.21135 validation-auc:0.96796 validation-aucpr:0.97244
[48] validation-logloss:0.21064 validation-auc:0.96802 validation-aucpr:0.97254
[49] validation-logloss:0.21044 validation-auc:0.96793 validation-aucpr:0.97246
[50] validation-logloss:0.21026 validation-auc:0.96789 validation-aucpr:0.97237
[51] validation-logloss:0.20991 validation-auc:0.96790 validation-aucpr:0.97240
[52] validation-logloss:0.20947 validation-auc:0.96794 validation-aucpr:0.97232
[53] validation-logloss:0.20873 validation-auc:0.96811 validation-aucpr:0.97236
[54] validation-logloss:0.20815 validation-auc:0.96826 validation-aucpr:0.97252
[55] validation-logloss:0.20810 validation-auc:0.96830 validation-aucpr:0.97274
[56] validation-logloss:0.20774 validation-auc:0.96835 validation-aucpr:0.97275
[57] validation-logloss:0.20711 validation-auc:0.96848 validation-aucpr:0.97279
[58] validation-logloss:0.20706 validation-auc:0.96840 validation-aucpr:0.97270
[59] validation-logloss:0.20705 validation-auc:0.96846 validation-aucpr:0.97294
[60] validation-logloss:0.20714 validation-auc:0.96844 validation-aucpr:0.97287
[61] validation-logloss:0.20705 validation-auc:0.96854 validation-aucpr:0.97293
[62] validation-logloss:0.20691 validation-auc:0.96854 validation-aucpr:0.97288
[63] validation-logloss:0.20695 validation-auc:0.96852 validation-aucpr:0.97278
[64] validation-logloss:0.20666 validation-auc:0.96865 validation-aucpr:0.97295
[65] validation-logloss:0.20673 validation-auc:0.96860 validation-aucpr:0.97283
[66] validation-logloss:0.20700 validation-auc:0.96851 validation-aucpr:0.97272
[67] validation-logloss:0.20678 validation-auc:0.96859 validation-aucpr:0.97262
[68] validation-logloss:0.20698 validation-auc:0.96853 validation-aucpr:0.97255
[69] validation-logloss:0.20728 validation-auc:0.96848 validation-aucpr:0.97250
[70] validation-logloss:0.20717 validation-auc:0.96851 validation-aucpr:0.97241
[71] validation-logloss:0.20694 validation-auc:0.96862 validation-aucpr:0.97248
[72] validation-logloss:0.20677 validation-auc:0.96869 validation-aucpr:0.97249
[73] validation-logloss:0.20641 validation-auc:0.96886 validation-aucpr:0.97271
{'best_iteration': '64', 'best_score': '0.9729466973423092'}
Trial 46, Fold 5: Log loss = 0.20640855234259178, Average precision = 0.9727156435063508, ROC-AUC = 0.9688557679802315, Elapsed Time = 176.15985360000195 seconds
Optimization Progress: 47%|####6 | 47/100 [2:47:24<4:54:45, 333.68s/it]
Trial 47, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 47, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.68214 validation-auc:0.93801 validation-aucpr:0.91337
[1] validation-logloss:0.67009 validation-auc:0.95823 validation-aucpr:0.95300
[2] validation-logloss:0.65805 validation-auc:0.96329 validation-aucpr:0.96345
[3] validation-logloss:0.64657 validation-auc:0.96524 validation-aucpr:0.96771
[4] validation-logloss:0.63605 validation-auc:0.96552 validation-aucpr:0.97081
[5] validation-logloss:0.62582 validation-auc:0.96558 validation-aucpr:0.97099
{'best_iteration': '5', 'best_score': '0.9709855396319533'}
Trial 47, Fold 1: Log loss = 0.6258185969607232, Average precision = 0.9708432228852684, ROC-AUC = 0.965581356375733, Elapsed Time = 0.40443649999724585 seconds
Trial 47, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 47, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.68157 validation-auc:0.94237 validation-aucpr:0.93199
[1] validation-logloss:0.67085 validation-auc:0.95490 validation-aucpr:0.95445
[2] validation-logloss:0.66064 validation-auc:0.95773 validation-aucpr:0.95933
[3] validation-logloss:0.64926 validation-auc:0.96375 validation-aucpr:0.96673
[4] validation-logloss:0.63908 validation-auc:0.96489 validation-aucpr:0.96765
[5] validation-logloss:0.62971 validation-auc:0.96467 validation-aucpr:0.96709
{'best_iteration': '4', 'best_score': '0.967654452445068'}
Trial 47, Fold 2: Log loss = 0.6297064625081882, Average precision = 0.9674295405735265, ROC-AUC = 0.9646678802442896, Elapsed Time = 0.47734399999899324 seconds
Trial 47, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 47, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.68186 validation-auc:0.94238 validation-aucpr:0.94465
[1] validation-logloss:0.67043 validation-auc:0.95718 validation-aucpr:0.95483
[2] validation-logloss:0.65883 validation-auc:0.96507 validation-aucpr:0.96715
[3] validation-logloss:0.64858 validation-auc:0.96542 validation-aucpr:0.96956
[4] validation-logloss:0.63734 validation-auc:0.96742 validation-aucpr:0.97159
[5] validation-logloss:0.62655 validation-auc:0.96825 validation-aucpr:0.97230
{'best_iteration': '5', 'best_score': '0.9722959147201717'}
Trial 47, Fold 3: Log loss = 0.6265516179895246, Average precision = 0.9721495798330522, ROC-AUC = 0.9682459507517023, Elapsed Time = 0.4703719999997702 seconds
Trial 47, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 47, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.68187 validation-auc:0.93803 validation-aucpr:0.92342
[1] validation-logloss:0.67100 validation-auc:0.95101 validation-aucpr:0.94634
[2] validation-logloss:0.65905 validation-auc:0.96171 validation-aucpr:0.96040
[3] validation-logloss:0.64763 validation-auc:0.96442 validation-aucpr:0.96716
[4] validation-logloss:0.63786 validation-auc:0.96423 validation-aucpr:0.96973
[5] validation-logloss:0.62843 validation-auc:0.96482 validation-aucpr:0.96998
{'best_iteration': '5', 'best_score': '0.9699807104101339'}
Trial 47, Fold 4: Log loss = 0.6284295527765685, Average precision = 0.9698986077541827, ROC-AUC = 0.9648172141271512, Elapsed Time = 0.46956090000094264 seconds
Trial 47, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 47, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[20:46:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[0] validation-logloss:0.68206 validation-auc:0.93532 validation-aucpr:0.91851
[20:46:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[1] validation-logloss:0.67146 validation-auc:0.94912 validation-aucpr:0.94584
[2] validation-logloss:0.66166 validation-auc:0.95150 validation-aucpr:0.95554
[3] validation-logloss:0.65157 validation-auc:0.95365 validation-aucpr:0.95748
[4] validation-logloss:0.64238 validation-auc:0.95407 validation-aucpr:0.95810
[5] validation-logloss:0.63275 validation-auc:0.95544 validation-aucpr:0.95924
{'best_iteration': '5', 'best_score': '0.959242797316479'}
Trial 47, Fold 5: Log loss = 0.6327488828734219, Average precision = 0.9587947031588553, ROC-AUC = 0.9554357059249763, Elapsed Time = 0.46724950000134413 seconds
Optimization Progress: 48%|####8 | 48/100 [2:47:34<3:25:00, 236.55s/it]
Trial 48, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 48, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.65265 validation-auc:0.92815 validation-aucpr:0.91366
[1] validation-logloss:0.61911 validation-auc:0.95030 validation-aucpr:0.94360
[2] validation-logloss:0.58864 validation-auc:0.95679 validation-aucpr:0.95677
[3] validation-logloss:0.56412 validation-auc:0.95688 validation-aucpr:0.95950
[4] validation-logloss:0.54044 validation-auc:0.95838 validation-aucpr:0.96405
[5] validation-logloss:0.52041 validation-auc:0.95850 validation-aucpr:0.96382
[6] validation-logloss:0.50156 validation-auc:0.95872 validation-aucpr:0.96450
[7] validation-logloss:0.48352 validation-auc:0.95926 validation-aucpr:0.96478
[8] validation-logloss:0.46713 validation-auc:0.95954 validation-aucpr:0.96497
{'best_iteration': '8', 'best_score': '0.9649673171907855'}
Trial 48, Fold 1: Log loss = 0.4671281350911924, Average precision = 0.9649679365351161, ROC-AUC = 0.9595362215196226, Elapsed Time = 5.979565199999342 seconds
Trial 48, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 48, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.65169 validation-auc:0.93400 validation-aucpr:0.89597
[1] validation-logloss:0.61991 validation-auc:0.95580 validation-aucpr:0.94681
[2] validation-logloss:0.58676 validation-auc:0.96162 validation-aucpr:0.96618
[3] validation-logloss:0.55719 validation-auc:0.96371 validation-aucpr:0.96789
[4] validation-logloss:0.53294 validation-auc:0.96504 validation-aucpr:0.96922
[5] validation-logloss:0.51166 validation-auc:0.96557 validation-aucpr:0.96951
[6] validation-logloss:0.49281 validation-auc:0.96552 validation-aucpr:0.96931
[7] validation-logloss:0.47469 validation-auc:0.96570 validation-aucpr:0.96899
[8] validation-logloss:0.45966 validation-auc:0.96478 validation-aucpr:0.96808
{'best_iteration': '5', 'best_score': '0.9695143141055345'}
Trial 48, Fold 2: Log loss = 0.4596563851648362, Average precision = 0.9680996079646447, ROC-AUC = 0.9647844984174224, Elapsed Time = 6.274692999999388 seconds
Trial 48, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 48, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.65214 validation-auc:0.93166 validation-aucpr:0.91066
[1] validation-logloss:0.62242 validation-auc:0.94954 validation-aucpr:0.94639
[2] validation-logloss:0.59343 validation-auc:0.95684 validation-aucpr:0.95985
[3] validation-logloss:0.56865 validation-auc:0.95702 validation-aucpr:0.96193
[4] validation-logloss:0.54572 validation-auc:0.95796 validation-aucpr:0.96284
[5] validation-logloss:0.52381 validation-auc:0.95933 validation-aucpr:0.96349
[6] validation-logloss:0.49920 validation-auc:0.96212 validation-aucpr:0.96605
[7] validation-logloss:0.48134 validation-auc:0.96264 validation-aucpr:0.96614
[8] validation-logloss:0.46541 validation-auc:0.96263 validation-aucpr:0.96602
{'best_iteration': '7', 'best_score': '0.9661439859958221'}
Trial 48, Fold 3: Log loss = 0.4654080460313595, Average precision = 0.9660221162005403, ROC-AUC = 0.9626300039246468, Elapsed Time = 6.240940799998498 seconds
Trial 48, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 48, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.65142 validation-auc:0.93416 validation-aucpr:0.90515
[1] validation-logloss:0.62067 validation-auc:0.95532 validation-aucpr:0.95617
[2] validation-logloss:0.58966 validation-auc:0.95845 validation-aucpr:0.96334
[3] validation-logloss:0.56525 validation-auc:0.95871 validation-aucpr:0.96323
[4] validation-logloss:0.54214 validation-auc:0.95918 validation-aucpr:0.96428
[5] validation-logloss:0.52043 validation-auc:0.95991 validation-aucpr:0.96552
[6] validation-logloss:0.50021 validation-auc:0.96001 validation-aucpr:0.96560
[7] validation-logloss:0.48187 validation-auc:0.96050 validation-aucpr:0.96581
[8] validation-logloss:0.46433 validation-auc:0.96139 validation-aucpr:0.96660
{'best_iteration': '8', 'best_score': '0.9666011794865117'}
Trial 48, Fold 4: Log loss = 0.4643289259501296, Average precision = 0.9665960273611527, ROC-AUC = 0.9613865700503825, Elapsed Time = 5.843712499998219 seconds
Trial 48, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 48, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.65256 validation-auc:0.92726 validation-aucpr:0.90348
[1] validation-logloss:0.62107 validation-auc:0.95214 validation-aucpr:0.95102
[2] validation-logloss:0.59356 validation-auc:0.95333 validation-aucpr:0.95624
[3] validation-logloss:0.56813 validation-auc:0.95497 validation-aucpr:0.95773
[4] validation-logloss:0.54440 validation-auc:0.95656 validation-aucpr:0.96067
[5] validation-logloss:0.52202 validation-auc:0.95745 validation-aucpr:0.96157
[6] validation-logloss:0.50376 validation-auc:0.95800 validation-aucpr:0.96312
[7] validation-logloss:0.48722 validation-auc:0.95832 validation-aucpr:0.96341
[8] validation-logloss:0.47188 validation-auc:0.95810 validation-aucpr:0.96326
{'best_iteration': '7', 'best_score': '0.9634119132513211'}
Trial 48, Fold 5: Log loss = 0.4718818396789914, Average precision = 0.9632099260839972, ROC-AUC = 0.9580976905783771, Elapsed Time = 6.20024960000228 seconds
Optimization Progress: 49%|####9 | 49/100 [2:48:13<2:30:30, 177.07s/it]
Trial 49, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 49, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.66453 validation-auc:0.94548 validation-aucpr:0.90619
[1] validation-logloss:0.63846 validation-auc:0.96347 validation-aucpr:0.96426
[2] validation-logloss:0.61384 validation-auc:0.96495 validation-aucpr:0.96393
[3] validation-logloss:0.59333 validation-auc:0.96549 validation-aucpr:0.96700
[4] validation-logloss:0.57190 validation-auc:0.96666 validation-aucpr:0.96992
[5] validation-logloss:0.55194 validation-auc:0.96736 validation-aucpr:0.97049
[6] validation-logloss:0.53343 validation-auc:0.96833 validation-aucpr:0.97261
[7] validation-logloss:0.51641 validation-auc:0.96822 validation-aucpr:0.97019
[8] validation-logloss:0.50181 validation-auc:0.96823 validation-aucpr:0.97071
[9] validation-logloss:0.48792 validation-auc:0.96822 validation-aucpr:0.97064
[10] validation-logloss:0.47476 validation-auc:0.96831 validation-aucpr:0.97069
[11] validation-logloss:0.46081 validation-auc:0.96830 validation-aucpr:0.96846
[12] validation-logloss:0.44897 validation-auc:0.96835 validation-aucpr:0.96840
[13] validation-logloss:0.43635 validation-auc:0.96838 validation-aucpr:0.96849
[14] validation-logloss:0.42460 validation-auc:0.96836 validation-aucpr:0.96852
[15] validation-logloss:0.41479 validation-auc:0.96824 validation-aucpr:0.96834
[16] validation-logloss:0.40398 validation-auc:0.96846 validation-aucpr:0.96852
[17] validation-logloss:0.39389 validation-auc:0.96855 validation-aucpr:0.96865
[18] validation-logloss:0.38439 validation-auc:0.96863 validation-aucpr:0.96871
[19] validation-logloss:0.37546 validation-auc:0.96865 validation-aucpr:0.96896
[20] validation-logloss:0.36716 validation-auc:0.96878 validation-aucpr:0.97138
[21] validation-logloss:0.35910 validation-auc:0.96883 validation-aucpr:0.97146
[22] validation-logloss:0.35146 validation-auc:0.96919 validation-aucpr:0.97166
[23] validation-logloss:0.34518 validation-auc:0.96911 validation-aucpr:0.97155
[24] validation-logloss:0.33804 validation-auc:0.96935 validation-aucpr:0.97176
[25] validation-logloss:0.33202 validation-auc:0.96953 validation-aucpr:0.97183
[26] validation-logloss:0.32584 validation-auc:0.96964 validation-aucpr:0.97226
[27] validation-logloss:0.31963 validation-auc:0.96987 validation-aucpr:0.97235
[28] validation-logloss:0.31398 validation-auc:0.96979 validation-aucpr:0.97237
[29] validation-logloss:0.30868 validation-auc:0.96990 validation-aucpr:0.97232
[30] validation-logloss:0.30407 validation-auc:0.96997 validation-aucpr:0.97233
[31] validation-logloss:0.29888 validation-auc:0.97009 validation-aucpr:0.97243
[32] validation-logloss:0.29415 validation-auc:0.97022 validation-aucpr:0.97256
[33] validation-logloss:0.28981 validation-auc:0.97021 validation-aucpr:0.97256
[34] validation-logloss:0.28535 validation-auc:0.97049 validation-aucpr:0.97264
[35] validation-logloss:0.28124 validation-auc:0.97047 validation-aucpr:0.97263
[36] validation-logloss:0.27775 validation-auc:0.97049 validation-aucpr:0.97275
[37] validation-logloss:0.27371 validation-auc:0.97074 validation-aucpr:0.97295
[38] validation-logloss:0.27022 validation-auc:0.97075 validation-aucpr:0.97296
[39] validation-logloss:0.26658 validation-auc:0.97092 validation-aucpr:0.97305
[40] validation-logloss:0.26301 validation-auc:0.97114 validation-aucpr:0.97322
[41] validation-logloss:0.25987 validation-auc:0.97124 validation-aucpr:0.97332
[42] validation-logloss:0.25688 validation-auc:0.97134 validation-aucpr:0.97376
[43] validation-logloss:0.25389 validation-auc:0.97142 validation-aucpr:0.97385
[44] validation-logloss:0.25148 validation-auc:0.97139 validation-aucpr:0.97383
[45] validation-logloss:0.24887 validation-auc:0.97143 validation-aucpr:0.97384
[46] validation-logloss:0.24633 validation-auc:0.97145 validation-aucpr:0.97390
[47] validation-logloss:0.24432 validation-auc:0.97144 validation-aucpr:0.97412
[48] validation-logloss:0.24187 validation-auc:0.97150 validation-aucpr:0.97417
[49] validation-logloss:0.23993 validation-auc:0.97152 validation-aucpr:0.97423
[50] validation-logloss:0.23788 validation-auc:0.97155 validation-aucpr:0.97482
[51] validation-logloss:0.23574 validation-auc:0.97160 validation-aucpr:0.97484
[52] validation-logloss:0.23416 validation-auc:0.97156 validation-aucpr:0.97481
[53] validation-logloss:0.23247 validation-auc:0.97167 validation-aucpr:0.97487
[54] validation-logloss:0.23067 validation-auc:0.97169 validation-aucpr:0.97487
[55] validation-logloss:0.22873 validation-auc:0.97180 validation-aucpr:0.97498
[56] validation-logloss:0.22675 validation-auc:0.97196 validation-aucpr:0.97511
[57] validation-logloss:0.22537 validation-auc:0.97193 validation-aucpr:0.97509
[58] validation-logloss:0.22397 validation-auc:0.97194 validation-aucpr:0.97509
[59] validation-logloss:0.22242 validation-auc:0.97196 validation-aucpr:0.97507
[60] validation-logloss:0.22126 validation-auc:0.97204 validation-aucpr:0.97590
[61] validation-logloss:0.21994 validation-auc:0.97203 validation-aucpr:0.97589
[62] validation-logloss:0.21884 validation-auc:0.97206 validation-aucpr:0.97591
[63] validation-logloss:0.21750 validation-auc:0.97211 validation-aucpr:0.97595
[64] validation-logloss:0.21645 validation-auc:0.97221 validation-aucpr:0.97600
[65] validation-logloss:0.21559 validation-auc:0.97219 validation-aucpr:0.97595
[66] validation-logloss:0.21449 validation-auc:0.97221 validation-aucpr:0.97589
[67] validation-logloss:0.21330 validation-auc:0.97230 validation-aucpr:0.97596
[68] validation-logloss:0.21223 validation-auc:0.97233 validation-aucpr:0.97599
[69] validation-logloss:0.21095 validation-auc:0.97251 validation-aucpr:0.97611
[70] validation-logloss:0.21009 validation-auc:0.97254 validation-aucpr:0.97612
[71] validation-logloss:0.20926 validation-auc:0.97255 validation-aucpr:0.97610
[72] validation-logloss:0.20827 validation-auc:0.97261 validation-aucpr:0.97614
[73] validation-logloss:0.20744 validation-auc:0.97265 validation-aucpr:0.97617
[74] validation-logloss:0.20664 validation-auc:0.97267 validation-aucpr:0.97620
[75] validation-logloss:0.20606 validation-auc:0.97264 validation-aucpr:0.97614
[76] validation-logloss:0.20532 validation-auc:0.97265 validation-aucpr:0.97615
[77] validation-logloss:0.20469 validation-auc:0.97269 validation-aucpr:0.97617
[78] validation-logloss:0.20417 validation-auc:0.97268 validation-aucpr:0.97617
[79] validation-logloss:0.20354 validation-auc:0.97267 validation-aucpr:0.97616
[80] validation-logloss:0.20307 validation-auc:0.97264 validation-aucpr:0.97617
[81] validation-logloss:0.20250 validation-auc:0.97267 validation-aucpr:0.97617
[82] validation-logloss:0.20198 validation-auc:0.97270 validation-aucpr:0.97576
[83] validation-logloss:0.20150 validation-auc:0.97269 validation-aucpr:0.97578
{'best_iteration': '74', 'best_score': '0.9761984583995469'}
Trial 49, Fold 1: Log loss = 0.20149546090569612, Average precision = 0.9759896129523458, ROC-AUC = 0.972685767697624, Elapsed Time = 7.393411099998048 seconds
Trial 49, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 49, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.66469 validation-auc:0.94495 validation-aucpr:0.91806
[1] validation-logloss:0.64048 validation-auc:0.96350 validation-aucpr:0.96717
[2] validation-logloss:0.61606 validation-auc:0.96530 validation-aucpr:0.96965
[3] validation-logloss:0.59336 validation-auc:0.96765 validation-aucpr:0.97149
[4] validation-logloss:0.57192 validation-auc:0.96836 validation-aucpr:0.97218
[5] validation-logloss:0.55201 validation-auc:0.96915 validation-aucpr:0.97281
[6] validation-logloss:0.53392 validation-auc:0.97004 validation-aucpr:0.97336
[7] validation-logloss:0.51795 validation-auc:0.97064 validation-aucpr:0.97374
[8] validation-logloss:0.50353 validation-auc:0.97049 validation-aucpr:0.97369
[9] validation-logloss:0.48934 validation-auc:0.97049 validation-aucpr:0.97373
[10] validation-logloss:0.47447 validation-auc:0.97074 validation-aucpr:0.97395
[11] validation-logloss:0.46054 validation-auc:0.97078 validation-aucpr:0.97402
[12] validation-logloss:0.44724 validation-auc:0.97098 validation-aucpr:0.97418
[13] validation-logloss:0.43486 validation-auc:0.97107 validation-aucpr:0.97427
[14] validation-logloss:0.42315 validation-auc:0.97115 validation-aucpr:0.97430
[15] validation-logloss:0.41165 validation-auc:0.97136 validation-aucpr:0.97448
[16] validation-logloss:0.40212 validation-auc:0.97140 validation-aucpr:0.97449
[17] validation-logloss:0.39299 validation-auc:0.97143 validation-aucpr:0.97450
[18] validation-logloss:0.38339 validation-auc:0.97157 validation-aucpr:0.97465
[19] validation-logloss:0.37416 validation-auc:0.97175 validation-aucpr:0.97480
[20] validation-logloss:0.36546 validation-auc:0.97189 validation-aucpr:0.97490
[21] validation-logloss:0.35821 validation-auc:0.97185 validation-aucpr:0.97483
[22] validation-logloss:0.35027 validation-auc:0.97218 validation-aucpr:0.97508
[23] validation-logloss:0.34347 validation-auc:0.97245 validation-aucpr:0.97527
[24] validation-logloss:0.33639 validation-auc:0.97226 validation-aucpr:0.97514
[25] validation-logloss:0.32974 validation-auc:0.97219 validation-aucpr:0.97506
[26] validation-logloss:0.32435 validation-auc:0.97202 validation-aucpr:0.97490
[27] validation-logloss:0.31810 validation-auc:0.97216 validation-aucpr:0.97501
[28] validation-logloss:0.31207 validation-auc:0.97215 validation-aucpr:0.97512
[29] validation-logloss:0.30722 validation-auc:0.97206 validation-aucpr:0.97491
[30] validation-logloss:0.30186 validation-auc:0.97202 validation-aucpr:0.97490
[31] validation-logloss:0.29683 validation-auc:0.97206 validation-aucpr:0.97505
[32] validation-logloss:0.29197 validation-auc:0.97213 validation-aucpr:0.97500
[33] validation-logloss:0.28718 validation-auc:0.97218 validation-aucpr:0.97514
[34] validation-logloss:0.28337 validation-auc:0.97213 validation-aucpr:0.97508
[35] validation-logloss:0.27893 validation-auc:0.97232 validation-aucpr:0.97522
[36] validation-logloss:0.27576 validation-auc:0.97204 validation-aucpr:0.97490
[37] validation-logloss:0.27232 validation-auc:0.97201 validation-aucpr:0.97486
[38] validation-logloss:0.26826 validation-auc:0.97235 validation-aucpr:0.97509
[39] validation-logloss:0.26477 validation-auc:0.97229 validation-aucpr:0.97505
[40] validation-logloss:0.26137 validation-auc:0.97225 validation-aucpr:0.97501
[41] validation-logloss:0.25850 validation-auc:0.97230 validation-aucpr:0.97533
[42] validation-logloss:0.25534 validation-auc:0.97222 validation-aucpr:0.97525
[43] validation-logloss:0.25261 validation-auc:0.97244 validation-aucpr:0.97531
[44] validation-logloss:0.24964 validation-auc:0.97252 validation-aucpr:0.97535
[45] validation-logloss:0.24715 validation-auc:0.97260 validation-aucpr:0.97539
[46] validation-logloss:0.24437 validation-auc:0.97266 validation-aucpr:0.97542
[47] validation-logloss:0.24170 validation-auc:0.97271 validation-aucpr:0.97546
[48] validation-logloss:0.23923 validation-auc:0.97277 validation-aucpr:0.97550
[49] validation-logloss:0.23661 validation-auc:0.97287 validation-aucpr:0.97558
[50] validation-logloss:0.23423 validation-auc:0.97295 validation-aucpr:0.97565
[51] validation-logloss:0.23200 validation-auc:0.97296 validation-aucpr:0.97566
[52] validation-logloss:0.22986 validation-auc:0.97298 validation-aucpr:0.97569
[53] validation-logloss:0.22772 validation-auc:0.97311 validation-aucpr:0.97579
[54] validation-logloss:0.22592 validation-auc:0.97291 validation-aucpr:0.97439
[55] validation-logloss:0.22410 validation-auc:0.97291 validation-aucpr:0.97440
[56] validation-logloss:0.22216 validation-auc:0.97301 validation-aucpr:0.97446
[57] validation-logloss:0.22034 validation-auc:0.97308 validation-aucpr:0.97456
[58] validation-logloss:0.21911 validation-auc:0.97297 validation-aucpr:0.97459
[59] validation-logloss:0.21766 validation-auc:0.97294 validation-aucpr:0.97455
[60] validation-logloss:0.21610 validation-auc:0.97299 validation-aucpr:0.97459
[61] validation-logloss:0.21459 validation-auc:0.97302 validation-aucpr:0.97461
[62] validation-logloss:0.21329 validation-auc:0.97300 validation-aucpr:0.97458
[63] validation-logloss:0.21215 validation-auc:0.97298 validation-aucpr:0.97460
[64] validation-logloss:0.21070 validation-auc:0.97299 validation-aucpr:0.97460
[65] validation-logloss:0.20968 validation-auc:0.97287 validation-aucpr:0.97451
[66] validation-logloss:0.20846 validation-auc:0.97294 validation-aucpr:0.97452
[67] validation-logloss:0.20720 validation-auc:0.97310 validation-aucpr:0.97464
[68] validation-logloss:0.20651 validation-auc:0.97303 validation-aucpr:0.97456
[69] validation-logloss:0.20533 validation-auc:0.97317 validation-aucpr:0.97468
[70] validation-logloss:0.20462 validation-auc:0.97303 validation-aucpr:0.97456
[71] validation-logloss:0.20364 validation-auc:0.97302 validation-aucpr:0.97467
[72] validation-logloss:0.20257 validation-auc:0.97305 validation-aucpr:0.97470
[73] validation-logloss:0.20159 validation-auc:0.97308 validation-aucpr:0.97474
[74] validation-logloss:0.20088 validation-auc:0.97320 validation-aucpr:0.97576
[75] validation-logloss:0.20011 validation-auc:0.97315 validation-aucpr:0.97574
[76] validation-logloss:0.19959 validation-auc:0.97305 validation-aucpr:0.97567
[77] validation-logloss:0.19863 validation-auc:0.97317 validation-aucpr:0.97578
[78] validation-logloss:0.19782 validation-auc:0.97322 validation-aucpr:0.97585
[79] validation-logloss:0.19710 validation-auc:0.97322 validation-aucpr:0.97585
[80] validation-logloss:0.19641 validation-auc:0.97325 validation-aucpr:0.97588
[81] validation-logloss:0.19587 validation-auc:0.97324 validation-aucpr:0.97586
[82] validation-logloss:0.19528 validation-auc:0.97319 validation-aucpr:0.97464
[83] validation-logloss:0.19465 validation-auc:0.97325 validation-aucpr:0.97475
{'best_iteration': '80', 'best_score': '0.9758752127330358'}
Trial 49, Fold 2: Log loss = 0.19464862815883344, Average precision = 0.9751688581640191, ROC-AUC = 0.9732532388288591, Elapsed Time = 7.5820448999984364 seconds
Trial 49, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 49, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.66473 validation-auc:0.94799 validation-aucpr:0.91996
[1] validation-logloss:0.64044 validation-auc:0.96164 validation-aucpr:0.96550
[2] validation-logloss:0.61552 validation-auc:0.96729 validation-aucpr:0.97098
[3] validation-logloss:0.59323 validation-auc:0.96741 validation-aucpr:0.97069
[4] validation-logloss:0.57184 validation-auc:0.96851 validation-aucpr:0.97192
[5] validation-logloss:0.55153 validation-auc:0.96931 validation-aucpr:0.97253
[6] validation-logloss:0.53282 validation-auc:0.96956 validation-aucpr:0.97266
[7] validation-logloss:0.51507 validation-auc:0.97025 validation-aucpr:0.97338
[8] validation-logloss:0.49851 validation-auc:0.97034 validation-aucpr:0.97347
[9] validation-logloss:0.48326 validation-auc:0.97059 validation-aucpr:0.97357
[10] validation-logloss:0.46862 validation-auc:0.97051 validation-aucpr:0.97349
[11] validation-logloss:0.45619 validation-auc:0.97081 validation-aucpr:0.97432
[12] validation-logloss:0.44435 validation-auc:0.97097 validation-aucpr:0.97446
[13] validation-logloss:0.43209 validation-auc:0.97087 validation-aucpr:0.97436
[14] validation-logloss:0.42036 validation-auc:0.97080 validation-aucpr:0.97431
[15] validation-logloss:0.40910 validation-auc:0.97107 validation-aucpr:0.97471
[16] validation-logloss:0.39861 validation-auc:0.97104 validation-aucpr:0.97456
[17] validation-logloss:0.38888 validation-auc:0.97090 validation-aucpr:0.97449
[18] validation-logloss:0.37944 validation-auc:0.97097 validation-aucpr:0.97373
[19] validation-logloss:0.37030 validation-auc:0.97136 validation-aucpr:0.97508
[20] validation-logloss:0.36171 validation-auc:0.97164 validation-aucpr:0.97524
[21] validation-logloss:0.35391 validation-auc:0.97156 validation-aucpr:0.97520
[22] validation-logloss:0.34620 validation-auc:0.97154 validation-aucpr:0.97519
[23] validation-logloss:0.33934 validation-auc:0.97155 validation-aucpr:0.97519
[24] validation-logloss:0.33324 validation-auc:0.97150 validation-aucpr:0.97514
[25] validation-logloss:0.32647 validation-auc:0.97155 validation-aucpr:0.97518
[26] validation-logloss:0.32064 validation-auc:0.97159 validation-aucpr:0.97520
[27] validation-logloss:0.31523 validation-auc:0.97156 validation-aucpr:0.97520
[28] validation-logloss:0.30954 validation-auc:0.97175 validation-aucpr:0.97537
[29] validation-logloss:0.30419 validation-auc:0.97176 validation-aucpr:0.97538
[30] validation-logloss:0.29969 validation-auc:0.97174 validation-aucpr:0.97536
[31] validation-logloss:0.29535 validation-auc:0.97159 validation-aucpr:0.97528
[32] validation-logloss:0.29055 validation-auc:0.97152 validation-aucpr:0.97523
[33] validation-logloss:0.28586 validation-auc:0.97168 validation-aucpr:0.97531
[34] validation-logloss:0.28148 validation-auc:0.97175 validation-aucpr:0.97536
[35] validation-logloss:0.27728 validation-auc:0.97190 validation-aucpr:0.97545
[36] validation-logloss:0.27319 validation-auc:0.97200 validation-aucpr:0.97552
[37] validation-logloss:0.26951 validation-auc:0.97186 validation-aucpr:0.97541
[38] validation-logloss:0.26591 validation-auc:0.97179 validation-aucpr:0.97538
[39] validation-logloss:0.26255 validation-auc:0.97180 validation-aucpr:0.97538
[40] validation-logloss:0.25936 validation-auc:0.97183 validation-aucpr:0.97541
[41] validation-logloss:0.25663 validation-auc:0.97173 validation-aucpr:0.97534
[42] validation-logloss:0.25353 validation-auc:0.97177 validation-aucpr:0.97537
[43] validation-logloss:0.25076 validation-auc:0.97192 validation-aucpr:0.97553
[44] validation-logloss:0.24793 validation-auc:0.97198 validation-aucpr:0.97556
[45] validation-logloss:0.24514 validation-auc:0.97206 validation-aucpr:0.97564
[46] validation-logloss:0.24297 validation-auc:0.97201 validation-aucpr:0.97560
[47] validation-logloss:0.24092 validation-auc:0.97202 validation-aucpr:0.97590
[48] validation-logloss:0.23839 validation-auc:0.97212 validation-aucpr:0.97595
[49] validation-logloss:0.23595 validation-auc:0.97220 validation-aucpr:0.97601
[50] validation-logloss:0.23414 validation-auc:0.97214 validation-aucpr:0.97595
[51] validation-logloss:0.23191 validation-auc:0.97229 validation-aucpr:0.97607
[52] validation-logloss:0.22990 validation-auc:0.97231 validation-aucpr:0.97609
[53] validation-logloss:0.22810 validation-auc:0.97241 validation-aucpr:0.97616
[54] validation-logloss:0.22626 validation-auc:0.97242 validation-aucpr:0.97617
[55] validation-logloss:0.22476 validation-auc:0.97245 validation-aucpr:0.97617
[56] validation-logloss:0.22322 validation-auc:0.97235 validation-aucpr:0.97611
[57] validation-logloss:0.22155 validation-auc:0.97235 validation-aucpr:0.97608
[58] validation-logloss:0.21989 validation-auc:0.97241 validation-aucpr:0.97611
[59] validation-logloss:0.21833 validation-auc:0.97246 validation-aucpr:0.97616
[60] validation-logloss:0.21703 validation-auc:0.97246 validation-aucpr:0.97616
[61] validation-logloss:0.21587 validation-auc:0.97239 validation-aucpr:0.97611
[62] validation-logloss:0.21443 validation-auc:0.97249 validation-aucpr:0.97618
[63] validation-logloss:0.21324 validation-auc:0.97248 validation-aucpr:0.97618
[64] validation-logloss:0.21222 validation-auc:0.97244 validation-aucpr:0.97615
[65] validation-logloss:0.21109 validation-auc:0.97242 validation-aucpr:0.97613
[66] validation-logloss:0.20992 validation-auc:0.97245 validation-aucpr:0.97614
[67] validation-logloss:0.20904 validation-auc:0.97248 validation-aucpr:0.97616
[68] validation-logloss:0.20811 validation-auc:0.97252 validation-aucpr:0.97620
[69] validation-logloss:0.20722 validation-auc:0.97228 validation-aucpr:0.97391
[70] validation-logloss:0.20659 validation-auc:0.97220 validation-aucpr:0.97385
[71] validation-logloss:0.20563 validation-auc:0.97224 validation-aucpr:0.97387
[72] validation-logloss:0.20485 validation-auc:0.97217 validation-aucpr:0.97384
[73] validation-logloss:0.20396 validation-auc:0.97224 validation-aucpr:0.97378
[74] validation-logloss:0.20321 validation-auc:0.97225 validation-aucpr:0.97376
[75] validation-logloss:0.20227 validation-auc:0.97231 validation-aucpr:0.97356
[76] validation-logloss:0.20151 validation-auc:0.97232 validation-aucpr:0.97353
[77] validation-logloss:0.20097 validation-auc:0.97228 validation-aucpr:0.97344
[78] validation-logloss:0.20040 validation-auc:0.97230 validation-aucpr:0.97335
[79] validation-logloss:0.19978 validation-auc:0.97238 validation-aucpr:0.97335
[80] validation-logloss:0.19924 validation-auc:0.97240 validation-aucpr:0.97319
[81] validation-logloss:0.19861 validation-auc:0.97247 validation-aucpr:0.97325
[82] validation-logloss:0.19802 validation-auc:0.97250 validation-aucpr:0.97313
[83] validation-logloss:0.19761 validation-auc:0.97245 validation-aucpr:0.97292
{'best_iteration': '68', 'best_score': '0.9762039347909636'}
Trial 49, Fold 3: Log loss = 0.1976074858911532, Average precision = 0.9739148147201059, ROC-AUC = 0.9724530577332395, Elapsed Time = 8.235687200001848 seconds
Trial 49, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 49, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.66450 validation-auc:0.94787 validation-aucpr:0.92504
[1] validation-logloss:0.64030 validation-auc:0.96410 validation-aucpr:0.96521
[2] validation-logloss:0.61847 validation-auc:0.96429 validation-aucpr:0.96687
[3] validation-logloss:0.59758 validation-auc:0.96413 validation-aucpr:0.96693
[4] validation-logloss:0.57829 validation-auc:0.96494 validation-aucpr:0.97019
[5] validation-logloss:0.55767 validation-auc:0.96672 validation-aucpr:0.97209
[6] validation-logloss:0.53851 validation-auc:0.96766 validation-aucpr:0.97298
[7] validation-logloss:0.52259 validation-auc:0.96743 validation-aucpr:0.97273
[8] validation-logloss:0.50543 validation-auc:0.96843 validation-aucpr:0.97349
[9] validation-logloss:0.49155 validation-auc:0.96834 validation-aucpr:0.97336
[10] validation-logloss:0.47649 validation-auc:0.96867 validation-aucpr:0.97366
[11] validation-logloss:0.46245 validation-auc:0.96901 validation-aucpr:0.97399
[12] validation-logloss:0.44894 validation-auc:0.96920 validation-aucpr:0.97419
[13] validation-logloss:0.43639 validation-auc:0.96964 validation-aucpr:0.97433
[14] validation-logloss:0.42439 validation-auc:0.96974 validation-aucpr:0.97441
[15] validation-logloss:0.41287 validation-auc:0.97028 validation-aucpr:0.97481
[16] validation-logloss:0.40223 validation-auc:0.97024 validation-aucpr:0.97485
[17] validation-logloss:0.39248 validation-auc:0.97046 validation-aucpr:0.97505
[18] validation-logloss:0.38419 validation-auc:0.97051 validation-aucpr:0.97507
[19] validation-logloss:0.37493 validation-auc:0.97055 validation-aucpr:0.97513
[20] validation-logloss:0.36746 validation-auc:0.97038 validation-aucpr:0.97499
[21] validation-logloss:0.35885 validation-auc:0.97067 validation-aucpr:0.97525
[22] validation-logloss:0.35120 validation-auc:0.97065 validation-aucpr:0.97524
[23] validation-logloss:0.34383 validation-auc:0.97055 validation-aucpr:0.97516
[24] validation-logloss:0.33664 validation-auc:0.97070 validation-aucpr:0.97529
[25] validation-logloss:0.32991 validation-auc:0.97062 validation-aucpr:0.97525
[26] validation-logloss:0.32332 validation-auc:0.97084 validation-aucpr:0.97541
[27] validation-logloss:0.31794 validation-auc:0.97084 validation-aucpr:0.97540
[28] validation-logloss:0.31266 validation-auc:0.97100 validation-aucpr:0.97550
[29] validation-logloss:0.30693 validation-auc:0.97122 validation-aucpr:0.97571
[30] validation-logloss:0.30216 validation-auc:0.97128 validation-aucpr:0.97573
[31] validation-logloss:0.29692 validation-auc:0.97142 validation-aucpr:0.97584
[32] validation-logloss:0.29218 validation-auc:0.97133 validation-aucpr:0.97579
[33] validation-logloss:0.28736 validation-auc:0.97155 validation-aucpr:0.97594
[34] validation-logloss:0.28389 validation-auc:0.97134 validation-aucpr:0.97578
[35] validation-logloss:0.27964 validation-auc:0.97131 validation-aucpr:0.97575
[36] validation-logloss:0.27595 validation-auc:0.97141 validation-aucpr:0.97578
[37] validation-logloss:0.27177 validation-auc:0.97166 validation-aucpr:0.97596
[38] validation-logloss:0.26848 validation-auc:0.97164 validation-aucpr:0.97594
[39] validation-logloss:0.26539 validation-auc:0.97163 validation-aucpr:0.97592
[40] validation-logloss:0.26187 validation-auc:0.97174 validation-aucpr:0.97601
[41] validation-logloss:0.25872 validation-auc:0.97183 validation-aucpr:0.97608
[42] validation-logloss:0.25598 validation-auc:0.97186 validation-aucpr:0.97608
[43] validation-logloss:0.25280 validation-auc:0.97194 validation-aucpr:0.97614
[44] validation-logloss:0.24978 validation-auc:0.97213 validation-aucpr:0.97632
[45] validation-logloss:0.24708 validation-auc:0.97206 validation-aucpr:0.97626
[46] validation-logloss:0.24506 validation-auc:0.97194 validation-aucpr:0.97618
[47] validation-logloss:0.24307 validation-auc:0.97183 validation-aucpr:0.97611
[48] validation-logloss:0.24100 validation-auc:0.97189 validation-aucpr:0.97615
[49] validation-logloss:0.23856 validation-auc:0.97193 validation-aucpr:0.97619
[50] validation-logloss:0.23615 validation-auc:0.97206 validation-aucpr:0.97630
[51] validation-logloss:0.23406 validation-auc:0.97202 validation-aucpr:0.97628
[52] validation-logloss:0.23197 validation-auc:0.97204 validation-aucpr:0.97630
[53] validation-logloss:0.23027 validation-auc:0.97196 validation-aucpr:0.97625
[54] validation-logloss:0.22830 validation-auc:0.97199 validation-aucpr:0.97629
[55] validation-logloss:0.22682 validation-auc:0.97198 validation-aucpr:0.97627
[56] validation-logloss:0.22536 validation-auc:0.97201 validation-aucpr:0.97628
[57] validation-logloss:0.22346 validation-auc:0.97208 validation-aucpr:0.97634
[58] validation-logloss:0.22200 validation-auc:0.97203 validation-aucpr:0.97631
[59] validation-logloss:0.22063 validation-auc:0.97216 validation-aucpr:0.97639
[60] validation-logloss:0.21925 validation-auc:0.97209 validation-aucpr:0.97635
[61] validation-logloss:0.21776 validation-auc:0.97211 validation-aucpr:0.97637
[62] validation-logloss:0.21623 validation-auc:0.97217 validation-aucpr:0.97642
[63] validation-logloss:0.21505 validation-auc:0.97215 validation-aucpr:0.97638
[64] validation-logloss:0.21339 validation-auc:0.97236 validation-aucpr:0.97653
[65] validation-logloss:0.21206 validation-auc:0.97236 validation-aucpr:0.97654
[66] validation-logloss:0.21094 validation-auc:0.97230 validation-aucpr:0.97652
[67] validation-logloss:0.21018 validation-auc:0.97223 validation-aucpr:0.97645
[68] validation-logloss:0.20932 validation-auc:0.97222 validation-aucpr:0.97644
[69] validation-logloss:0.20809 validation-auc:0.97231 validation-aucpr:0.97652
[70] validation-logloss:0.20704 validation-auc:0.97233 validation-aucpr:0.97655
[71] validation-logloss:0.20588 validation-auc:0.97242 validation-aucpr:0.97663
[72] validation-logloss:0.20499 validation-auc:0.97242 validation-aucpr:0.97663
[73] validation-logloss:0.20408 validation-auc:0.97249 validation-aucpr:0.97668
[74] validation-logloss:0.20309 validation-auc:0.97256 validation-aucpr:0.97673
[75] validation-logloss:0.20230 validation-auc:0.97255 validation-aucpr:0.97672
[76] validation-logloss:0.20167 validation-auc:0.97255 validation-aucpr:0.97671
[77] validation-logloss:0.20097 validation-auc:0.97255 validation-aucpr:0.97671
[78] validation-logloss:0.20038 validation-auc:0.97247 validation-aucpr:0.97667
[79] validation-logloss:0.19976 validation-auc:0.97254 validation-aucpr:0.97671
[80] validation-logloss:0.19908 validation-auc:0.97254 validation-aucpr:0.97672
[81] validation-logloss:0.19843 validation-auc:0.97257 validation-aucpr:0.97673
[82] validation-logloss:0.19805 validation-auc:0.97244 validation-aucpr:0.97667
[83] validation-logloss:0.19745 validation-auc:0.97252 validation-aucpr:0.97671
{'best_iteration': '81', 'best_score': '0.9767312128066848'}
Trial 49, Fold 4: Log loss = 0.19745498544430348, Average precision = 0.9767147911153203, ROC-AUC = 0.9725193792904473, Elapsed Time = 8.099118099999032 seconds
Trial 49, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 49, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.66484 validation-auc:0.94332 validation-aucpr:0.91325
[1] validation-logloss:0.63930 validation-auc:0.95927 validation-aucpr:0.96318
[2] validation-logloss:0.61462 validation-auc:0.96318 validation-aucpr:0.96803
[3] validation-logloss:0.59367 validation-auc:0.96464 validation-aucpr:0.96923
[4] validation-logloss:0.57256 validation-auc:0.96532 validation-aucpr:0.96981
[5] validation-logloss:0.55438 validation-auc:0.96604 validation-aucpr:0.97025
[6] validation-logloss:0.53571 validation-auc:0.96682 validation-aucpr:0.97086
[7] validation-logloss:0.51811 validation-auc:0.96739 validation-aucpr:0.97119
[8] validation-logloss:0.50145 validation-auc:0.96787 validation-aucpr:0.97186
[9] validation-logloss:0.48783 validation-auc:0.96762 validation-aucpr:0.97145
[10] validation-logloss:0.47329 validation-auc:0.96773 validation-aucpr:0.97156
[11] validation-logloss:0.45964 validation-auc:0.96757 validation-aucpr:0.97145
[12] validation-logloss:0.44821 validation-auc:0.96761 validation-aucpr:0.97135
[13] validation-logloss:0.43732 validation-auc:0.96783 validation-aucpr:0.97144
[14] validation-logloss:0.42563 validation-auc:0.96808 validation-aucpr:0.97167
[15] validation-logloss:0.41466 validation-auc:0.96823 validation-aucpr:0.97181
[16] validation-logloss:0.40545 validation-auc:0.96825 validation-aucpr:0.97180
[17] validation-logloss:0.39554 validation-auc:0.96871 validation-aucpr:0.97213
[18] validation-logloss:0.38599 validation-auc:0.96895 validation-aucpr:0.97231
[19] validation-logloss:0.37701 validation-auc:0.96897 validation-aucpr:0.97248
[20] validation-logloss:0.36948 validation-auc:0.96908 validation-aucpr:0.97252
[21] validation-logloss:0.36152 validation-auc:0.96906 validation-aucpr:0.97249
[22] validation-logloss:0.35370 validation-auc:0.96919 validation-aucpr:0.97262
[23] validation-logloss:0.34637 validation-auc:0.96925 validation-aucpr:0.97303
[24] validation-logloss:0.33926 validation-auc:0.96946 validation-aucpr:0.97326
[25] validation-logloss:0.33366 validation-auc:0.96929 validation-aucpr:0.97284
[26] validation-logloss:0.32724 validation-auc:0.96945 validation-aucpr:0.97297
[27] validation-logloss:0.32095 validation-auc:0.96956 validation-aucpr:0.97307
[28] validation-logloss:0.31532 validation-auc:0.96953 validation-aucpr:0.97324
[29] validation-logloss:0.31058 validation-auc:0.96935 validation-aucpr:0.97304
[30] validation-logloss:0.30617 validation-auc:0.96924 validation-aucpr:0.97293
[31] validation-logloss:0.30100 validation-auc:0.96937 validation-aucpr:0.97300
[32] validation-logloss:0.29605 validation-auc:0.96958 validation-aucpr:0.97320
[33] validation-logloss:0.29211 validation-auc:0.96940 validation-aucpr:0.97284
[34] validation-logloss:0.28756 validation-auc:0.96952 validation-aucpr:0.97295
[35] validation-logloss:0.28323 validation-auc:0.96963 validation-aucpr:0.97304
[36] validation-logloss:0.27916 validation-auc:0.96970 validation-aucpr:0.97307
[37] validation-logloss:0.27527 validation-auc:0.96973 validation-aucpr:0.97305
[38] validation-logloss:0.27152 validation-auc:0.96991 validation-aucpr:0.97318
[39] validation-logloss:0.26807 validation-auc:0.96988 validation-aucpr:0.97321
[40] validation-logloss:0.26474 validation-auc:0.96998 validation-aucpr:0.97327
[41] validation-logloss:0.26199 validation-auc:0.97015 validation-aucpr:0.97410
[42] validation-logloss:0.25931 validation-auc:0.97018 validation-aucpr:0.97408
[43] validation-logloss:0.25634 validation-auc:0.97023 validation-aucpr:0.97418
[44] validation-logloss:0.25392 validation-auc:0.97021 validation-aucpr:0.97414
[45] validation-logloss:0.25112 validation-auc:0.97029 validation-aucpr:0.97421
[46] validation-logloss:0.24827 validation-auc:0.97065 validation-aucpr:0.97442
[47] validation-logloss:0.24585 validation-auc:0.97070 validation-aucpr:0.97444
[48] validation-logloss:0.24341 validation-auc:0.97076 validation-aucpr:0.97446
[49] validation-logloss:0.24108 validation-auc:0.97083 validation-aucpr:0.97453
[50] validation-logloss:0.23895 validation-auc:0.97087 validation-aucpr:0.97452
[51] validation-logloss:0.23704 validation-auc:0.97095 validation-aucpr:0.97457
[52] validation-logloss:0.23507 validation-auc:0.97096 validation-aucpr:0.97458
[53] validation-logloss:0.23329 validation-auc:0.97105 validation-aucpr:0.97467
[54] validation-logloss:0.23179 validation-auc:0.97105 validation-aucpr:0.97464
[55] validation-logloss:0.23006 validation-auc:0.97101 validation-aucpr:0.97460
[56] validation-logloss:0.22855 validation-auc:0.97106 validation-aucpr:0.97463
[57] validation-logloss:0.22672 validation-auc:0.97126 validation-aucpr:0.97478
[58] validation-logloss:0.22527 validation-auc:0.97126 validation-aucpr:0.97479
[59] validation-logloss:0.22382 validation-auc:0.97128 validation-aucpr:0.97484
[60] validation-logloss:0.22257 validation-auc:0.97128 validation-aucpr:0.97481
[61] validation-logloss:0.22140 validation-auc:0.97133 validation-aucpr:0.97484
[62] validation-logloss:0.22029 validation-auc:0.97123 validation-aucpr:0.97477
[63] validation-logloss:0.21911 validation-auc:0.97118 validation-aucpr:0.97474
[64] validation-logloss:0.21802 validation-auc:0.97116 validation-aucpr:0.97472
[65] validation-logloss:0.21715 validation-auc:0.97105 validation-aucpr:0.97464
[66] validation-logloss:0.21604 validation-auc:0.97114 validation-aucpr:0.97471
[67] validation-logloss:0.21519 validation-auc:0.97109 validation-aucpr:0.97465
[68] validation-logloss:0.21439 validation-auc:0.97102 validation-aucpr:0.97461
[69] validation-logloss:0.21353 validation-auc:0.97097 validation-aucpr:0.97458
[70] validation-logloss:0.21268 validation-auc:0.97091 validation-aucpr:0.97453
[71] validation-logloss:0.21194 validation-auc:0.97086 validation-aucpr:0.97449
[72] validation-logloss:0.21111 validation-auc:0.97088 validation-aucpr:0.97452
[73] validation-logloss:0.21016 validation-auc:0.97097 validation-aucpr:0.97460
[74] validation-logloss:0.20952 validation-auc:0.97096 validation-aucpr:0.97457
[75] validation-logloss:0.20900 validation-auc:0.97092 validation-aucpr:0.97453
[76] validation-logloss:0.20829 validation-auc:0.97089 validation-aucpr:0.97451
[77] validation-logloss:0.20762 validation-auc:0.97092 validation-aucpr:0.97452
[78] validation-logloss:0.20690 validation-auc:0.97091 validation-aucpr:0.97449
[79] validation-logloss:0.20638 validation-auc:0.97091 validation-aucpr:0.97446
[80] validation-logloss:0.20580 validation-auc:0.97099 validation-aucpr:0.97450
[81] validation-logloss:0.20520 validation-auc:0.97107 validation-aucpr:0.97455
[82] validation-logloss:0.20476 validation-auc:0.97105 validation-aucpr:0.97456
[83] validation-logloss:0.20413 validation-auc:0.97117 validation-aucpr:0.97475
{'best_iteration': '61', 'best_score': '0.9748406069880208'}
Trial 49, Fold 5: Log loss = 0.20412901026625851, Average precision = 0.9746657449146889, ROC-AUC = 0.9711680136744514, Elapsed Time = 7.984125700000732 seconds
Optimization Progress: 50%|##### | 50/100 [2:49:00<1:55:06, 138.13s/it]
Trial 50, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 50, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.68362 validation-auc:0.93883 validation-aucpr:0.94319
[1] validation-logloss:0.67417 validation-auc:0.94860 validation-aucpr:0.95309
[2] validation-logloss:0.66495 validation-auc:0.95193 validation-aucpr:0.95658
[3] validation-logloss:0.65565 validation-auc:0.95530 validation-aucpr:0.95966
[4] validation-logloss:0.64702 validation-auc:0.95608 validation-aucpr:0.96053
[5] validation-logloss:0.63888 validation-auc:0.95603 validation-aucpr:0.96063
[6] validation-logloss:0.62932 validation-auc:0.96132 validation-aucpr:0.96664
[7] validation-logloss:0.62109 validation-auc:0.96145 validation-aucpr:0.96677
[8] validation-logloss:0.61321 validation-auc:0.96188 validation-aucpr:0.96707
[9] validation-logloss:0.60432 validation-auc:0.96388 validation-aucpr:0.96926
[10] validation-logloss:0.59572 validation-auc:0.96492 validation-aucpr:0.97040
[11] validation-logloss:0.58729 validation-auc:0.96544 validation-aucpr:0.97095
[12] validation-logloss:0.58058 validation-auc:0.96508 validation-aucpr:0.97066
[13] validation-logloss:0.57338 validation-auc:0.96524 validation-aucpr:0.97076
[14] validation-logloss:0.56675 validation-auc:0.96502 validation-aucpr:0.97055
[15] validation-logloss:0.56027 validation-auc:0.96487 validation-aucpr:0.97036
[16] validation-logloss:0.55254 validation-auc:0.96550 validation-aucpr:0.97099
[17] validation-logloss:0.54638 validation-auc:0.96553 validation-aucpr:0.97097
[18] validation-logloss:0.54026 validation-auc:0.96552 validation-aucpr:0.97092
[19] validation-logloss:0.53432 validation-auc:0.96545 validation-aucpr:0.97080
[20] validation-logloss:0.52749 validation-auc:0.96584 validation-aucpr:0.97120
[21] validation-logloss:0.52085 validation-auc:0.96597 validation-aucpr:0.97139
[22] validation-logloss:0.51411 validation-auc:0.96626 validation-aucpr:0.97166
[23] validation-logloss:0.50896 validation-auc:0.96620 validation-aucpr:0.97165
[24] validation-logloss:0.50357 validation-auc:0.96622 validation-aucpr:0.97163
[25] validation-logloss:0.49860 validation-auc:0.96617 validation-aucpr:0.97159
[26] validation-logloss:0.49245 validation-auc:0.96651 validation-aucpr:0.97189
[27] validation-logloss:0.48636 validation-auc:0.96688 validation-aucpr:0.97221
[28] validation-logloss:0.48143 validation-auc:0.96701 validation-aucpr:0.97229
[29] validation-logloss:0.47686 validation-auc:0.96695 validation-aucpr:0.97219
[30] validation-logloss:0.47238 validation-auc:0.96684 validation-aucpr:0.97211
[31] validation-logloss:0.46785 validation-auc:0.96676 validation-aucpr:0.97202
[32] validation-logloss:0.46337 validation-auc:0.96672 validation-aucpr:0.97198
[33] validation-logloss:0.45831 validation-auc:0.96681 validation-aucpr:0.97213
[34] validation-logloss:0.45392 validation-auc:0.96685 validation-aucpr:0.97214
[35] validation-logloss:0.44982 validation-auc:0.96680 validation-aucpr:0.97208
[36] validation-logloss:0.44570 validation-auc:0.96680 validation-aucpr:0.97207
[37] validation-logloss:0.44085 validation-auc:0.96691 validation-aucpr:0.97218
[38] validation-logloss:0.43690 validation-auc:0.96695 validation-aucpr:0.97218
[39] validation-logloss:0.43313 validation-auc:0.96692 validation-aucpr:0.97214
[40] validation-logloss:0.42954 validation-auc:0.96692 validation-aucpr:0.97211
[41] validation-logloss:0.42593 validation-auc:0.96691 validation-aucpr:0.97208
[42] validation-logloss:0.42142 validation-auc:0.96706 validation-aucpr:0.97224
[43] validation-logloss:0.41807 validation-auc:0.96707 validation-aucpr:0.97226
[44] validation-logloss:0.41456 validation-auc:0.96712 validation-aucpr:0.97228
[45] validation-logloss:0.41026 validation-auc:0.96734 validation-aucpr:0.97248
[46] validation-logloss:0.40607 validation-auc:0.96749 validation-aucpr:0.97262
{'best_iteration': '46', 'best_score': '0.9726204836170593'}
Trial 50, Fold 1: Log loss = 0.40606661302723657, Average precision = 0.9726251530010052, ROC-AUC = 0.9674923705806724, Elapsed Time = 1.0990195000013046 seconds
Trial 50, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 50, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.68375 validation-auc:0.94128 validation-aucpr:0.94405
[1] validation-logloss:0.67298 validation-auc:0.96227 validation-aucpr:0.96690
[2] validation-logloss:0.66346 validation-auc:0.96311 validation-aucpr:0.96744
[3] validation-logloss:0.65439 validation-auc:0.96359 validation-aucpr:0.96757
[4] validation-logloss:0.64435 validation-auc:0.96496 validation-aucpr:0.96920
[5] validation-logloss:0.63600 validation-auc:0.96431 validation-aucpr:0.96830
[6] validation-logloss:0.62626 validation-auc:0.96564 validation-aucpr:0.96950
[7] validation-logloss:0.61849 validation-auc:0.96518 validation-aucpr:0.96898
[8] validation-logloss:0.60944 validation-auc:0.96597 validation-aucpr:0.96974
[9] validation-logloss:0.60186 validation-auc:0.96600 validation-aucpr:0.96967
[10] validation-logloss:0.59421 validation-auc:0.96610 validation-aucpr:0.96971
[11] validation-logloss:0.58574 validation-auc:0.96672 validation-aucpr:0.97059
[12] validation-logloss:0.57885 validation-auc:0.96666 validation-aucpr:0.97068
[13] validation-logloss:0.57186 validation-auc:0.96654 validation-aucpr:0.97047
[14] validation-logloss:0.56505 validation-auc:0.96636 validation-aucpr:0.97024
[15] validation-logloss:0.55775 validation-auc:0.96646 validation-aucpr:0.97038
[16] validation-logloss:0.55150 validation-auc:0.96624 validation-aucpr:0.97013
[17] validation-logloss:0.54409 validation-auc:0.96670 validation-aucpr:0.97060
[18] validation-logloss:0.53691 validation-auc:0.96691 validation-aucpr:0.97087
[19] validation-logloss:0.53141 validation-auc:0.96665 validation-aucpr:0.97057
[20] validation-logloss:0.52586 validation-auc:0.96657 validation-aucpr:0.97045
[21] validation-logloss:0.51949 validation-auc:0.96680 validation-aucpr:0.97061
[22] validation-logloss:0.51407 validation-auc:0.96682 validation-aucpr:0.97062
[23] validation-logloss:0.50860 validation-auc:0.96686 validation-aucpr:0.97066
[24] validation-logloss:0.50329 validation-auc:0.96672 validation-aucpr:0.97051
[25] validation-logloss:0.49796 validation-auc:0.96679 validation-aucpr:0.97053
[26] validation-logloss:0.49300 validation-auc:0.96673 validation-aucpr:0.97041
[27] validation-logloss:0.48795 validation-auc:0.96676 validation-aucpr:0.97041
[28] validation-logloss:0.48193 validation-auc:0.96711 validation-aucpr:0.97075
[29] validation-logloss:0.47721 validation-auc:0.96714 validation-aucpr:0.97078
[30] validation-logloss:0.47292 validation-auc:0.96706 validation-aucpr:0.97067
[31] validation-logloss:0.46739 validation-auc:0.96732 validation-aucpr:0.97091
[32] validation-logloss:0.46292 validation-auc:0.96740 validation-aucpr:0.97094
[33] validation-logloss:0.45841 validation-auc:0.96740 validation-aucpr:0.97094
[34] validation-logloss:0.45420 validation-auc:0.96744 validation-aucpr:0.97094
[35] validation-logloss:0.44991 validation-auc:0.96746 validation-aucpr:0.97095
[36] validation-logloss:0.44494 validation-auc:0.96769 validation-aucpr:0.97118
[37] validation-logloss:0.44081 validation-auc:0.96772 validation-aucpr:0.97117
[38] validation-logloss:0.43691 validation-auc:0.96770 validation-aucpr:0.97104
[39] validation-logloss:0.43224 validation-auc:0.96781 validation-aucpr:0.97119
[40] validation-logloss:0.42867 validation-auc:0.96763 validation-aucpr:0.97103
[41] validation-logloss:0.42505 validation-auc:0.96758 validation-aucpr:0.97095
[42] validation-logloss:0.42140 validation-auc:0.96760 validation-aucpr:0.97094
[43] validation-logloss:0.41730 validation-auc:0.96764 validation-aucpr:0.97094
[44] validation-logloss:0.41384 validation-auc:0.96764 validation-aucpr:0.97089
[45] validation-logloss:0.40966 validation-auc:0.96779 validation-aucpr:0.97104
[46] validation-logloss:0.40630 validation-auc:0.96780 validation-aucpr:0.97104
{'best_iteration': '39', 'best_score': '0.9711898911116464'}
Trial 50, Fold 2: Log loss = 0.406295763397056, Average precision = 0.9710449059893823, ROC-AUC = 0.9677950986072286, Elapsed Time = 1.3552650999990874 seconds
Trial 50, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 50, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.68338 validation-auc:0.94309 validation-aucpr:0.94693
[1] validation-logloss:0.67353 validation-auc:0.95311 validation-aucpr:0.95467
[2] validation-logloss:0.66305 validation-auc:0.96262 validation-aucpr:0.96503
[3] validation-logloss:0.65411 validation-auc:0.96252 validation-aucpr:0.96496
[4] validation-logloss:0.64533 validation-auc:0.96305 validation-aucpr:0.96633
[5] validation-logloss:0.63660 validation-auc:0.96315 validation-aucpr:0.96738
[6] validation-logloss:0.62847 validation-auc:0.96301 validation-aucpr:0.96636
[7] validation-logloss:0.62023 validation-auc:0.96287 validation-aucpr:0.96619
[8] validation-logloss:0.61093 validation-auc:0.96504 validation-aucpr:0.96851
[9] validation-logloss:0.60211 validation-auc:0.96602 validation-aucpr:0.96951
[10] validation-logloss:0.59335 validation-auc:0.96682 validation-aucpr:0.97029
[11] validation-logloss:0.58544 validation-auc:0.96738 validation-aucpr:0.97222
[12] validation-logloss:0.57889 validation-auc:0.96685 validation-aucpr:0.97185
[13] validation-logloss:0.57071 validation-auc:0.96745 validation-aucpr:0.97238
[14] validation-logloss:0.56278 validation-auc:0.96783 validation-aucpr:0.97274
[15] validation-logloss:0.55631 validation-auc:0.96772 validation-aucpr:0.97257
[16] validation-logloss:0.55018 validation-auc:0.96750 validation-aucpr:0.97238
[17] validation-logloss:0.54415 validation-auc:0.96723 validation-aucpr:0.97220
[18] validation-logloss:0.53795 validation-auc:0.96714 validation-aucpr:0.97210
[19] validation-logloss:0.53206 validation-auc:0.96702 validation-aucpr:0.97203
[20] validation-logloss:0.52632 validation-auc:0.96695 validation-aucpr:0.97197
[21] validation-logloss:0.52062 validation-auc:0.96692 validation-aucpr:0.97189
[22] validation-logloss:0.51494 validation-auc:0.96685 validation-aucpr:0.97179
[23] validation-logloss:0.50943 validation-auc:0.96691 validation-aucpr:0.97182
[24] validation-logloss:0.50296 validation-auc:0.96728 validation-aucpr:0.97217
[25] validation-logloss:0.49677 validation-auc:0.96751 validation-aucpr:0.97241
[26] validation-logloss:0.49086 validation-auc:0.96764 validation-aucpr:0.97253
[27] validation-logloss:0.48601 validation-auc:0.96761 validation-aucpr:0.97253
[28] validation-logloss:0.48126 validation-auc:0.96759 validation-aucpr:0.97250
[29] validation-logloss:0.47562 validation-auc:0.96779 validation-aucpr:0.97267
[30] validation-logloss:0.47005 validation-auc:0.96800 validation-aucpr:0.97281
[31] validation-logloss:0.46550 validation-auc:0.96798 validation-aucpr:0.97279
[32] validation-logloss:0.46007 validation-auc:0.96819 validation-aucpr:0.97298
[33] validation-logloss:0.45563 validation-auc:0.96824 validation-aucpr:0.97299
[34] validation-logloss:0.45150 validation-auc:0.96825 validation-aucpr:0.97298
[35] validation-logloss:0.44719 validation-auc:0.96832 validation-aucpr:0.97304
[36] validation-logloss:0.44297 validation-auc:0.96832 validation-aucpr:0.97304
[37] validation-logloss:0.43794 validation-auc:0.96859 validation-aucpr:0.97327
[38] validation-logloss:0.43416 validation-auc:0.96852 validation-aucpr:0.97319
[39] validation-logloss:0.43048 validation-auc:0.96852 validation-aucpr:0.97318
[40] validation-logloss:0.42678 validation-auc:0.96847 validation-aucpr:0.97317
[41] validation-logloss:0.42216 validation-auc:0.96866 validation-aucpr:0.97334
[42] validation-logloss:0.41780 validation-auc:0.96875 validation-aucpr:0.97343
[43] validation-logloss:0.41427 validation-auc:0.96871 validation-aucpr:0.97341
[44] validation-logloss:0.41021 validation-auc:0.96879 validation-aucpr:0.97348
[45] validation-logloss:0.40656 validation-auc:0.96889 validation-aucpr:0.97354
[46] validation-logloss:0.40306 validation-auc:0.96890 validation-aucpr:0.97355
{'best_iteration': '46', 'best_score': '0.97354798914545'}
Trial 50, Fold 3: Log loss = 0.4030647629357849, Average precision = 0.9735525545776742, ROC-AUC = 0.9688970237693946, Elapsed Time = 1.3142243000002054 seconds
Trial 50, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 50, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.68349 validation-auc:0.94204 validation-aucpr:0.94628
[1] validation-logloss:0.67442 validation-auc:0.94667 validation-aucpr:0.95214
[2] validation-logloss:0.66581 validation-auc:0.94897 validation-aucpr:0.95443
[3] validation-logloss:0.65630 validation-auc:0.95477 validation-aucpr:0.95994
[4] validation-logloss:0.64760 validation-auc:0.95499 validation-aucpr:0.96006
[5] validation-logloss:0.63920 validation-auc:0.95576 validation-aucpr:0.96085
[6] validation-logloss:0.63087 validation-auc:0.95639 validation-aucpr:0.96143
[7] validation-logloss:0.62133 validation-auc:0.96113 validation-aucpr:0.96650
[8] validation-logloss:0.61210 validation-auc:0.96282 validation-aucpr:0.96853
[9] validation-logloss:0.60443 validation-auc:0.96271 validation-aucpr:0.96834
[10] validation-logloss:0.59695 validation-auc:0.96290 validation-aucpr:0.96838
[11] validation-logloss:0.58957 validation-auc:0.96281 validation-aucpr:0.96822
[12] validation-logloss:0.58228 validation-auc:0.96297 validation-aucpr:0.96828
[13] validation-logloss:0.57531 validation-auc:0.96273 validation-aucpr:0.96805
[14] validation-logloss:0.56864 validation-auc:0.96263 validation-aucpr:0.96788
[15] validation-logloss:0.56211 validation-auc:0.96253 validation-aucpr:0.96772
[16] validation-logloss:0.55470 validation-auc:0.96339 validation-aucpr:0.96874
[17] validation-logloss:0.54744 validation-auc:0.96393 validation-aucpr:0.96936
[18] validation-logloss:0.54134 validation-auc:0.96368 validation-aucpr:0.96908
[19] validation-logloss:0.53522 validation-auc:0.96368 validation-aucpr:0.96904
[20] validation-logloss:0.52952 validation-auc:0.96350 validation-aucpr:0.96889
[21] validation-logloss:0.52278 validation-auc:0.96394 validation-aucpr:0.96944
[22] validation-logloss:0.51618 validation-auc:0.96442 validation-aucpr:0.96994
[23] validation-logloss:0.50987 validation-auc:0.96469 validation-aucpr:0.97023
[24] validation-logloss:0.50464 validation-auc:0.96461 validation-aucpr:0.97012
[25] validation-logloss:0.49952 validation-auc:0.96451 validation-aucpr:0.97003
[26] validation-logloss:0.49459 validation-auc:0.96447 validation-aucpr:0.97000
[27] validation-logloss:0.48864 validation-auc:0.96485 validation-aucpr:0.97037
[28] validation-logloss:0.48291 validation-auc:0.96507 validation-aucpr:0.97060
[29] validation-logloss:0.47819 validation-auc:0.96504 validation-aucpr:0.97057
[30] validation-logloss:0.47277 validation-auc:0.96517 validation-aucpr:0.97073
[31] validation-logloss:0.46834 validation-auc:0.96514 validation-aucpr:0.97071
[32] validation-logloss:0.46308 validation-auc:0.96535 validation-aucpr:0.97091
[33] validation-logloss:0.45862 validation-auc:0.96543 validation-aucpr:0.97097
[34] validation-logloss:0.45433 validation-auc:0.96547 validation-aucpr:0.97097
[35] validation-logloss:0.45025 validation-auc:0.96535 validation-aucpr:0.97087
[36] validation-logloss:0.44597 validation-auc:0.96545 validation-aucpr:0.97096
[37] validation-logloss:0.44207 validation-auc:0.96550 validation-aucpr:0.97099
[38] validation-logloss:0.43728 validation-auc:0.96570 validation-aucpr:0.97118
[39] validation-logloss:0.43352 validation-auc:0.96569 validation-aucpr:0.97115
[40] validation-logloss:0.42890 validation-auc:0.96584 validation-aucpr:0.97132
[41] validation-logloss:0.42518 validation-auc:0.96589 validation-aucpr:0.97135
[42] validation-logloss:0.42163 validation-auc:0.96593 validation-aucpr:0.97133
[43] validation-logloss:0.41817 validation-auc:0.96585 validation-aucpr:0.97127
[44] validation-logloss:0.41389 validation-auc:0.96603 validation-aucpr:0.97144
[45] validation-logloss:0.41068 validation-auc:0.96599 validation-aucpr:0.97141
[46] validation-logloss:0.40753 validation-auc:0.96593 validation-aucpr:0.97135
{'best_iteration': '44', 'best_score': '0.9714400273714907'}
Trial 50, Fold 4: Log loss = 0.4075332225921297, Average precision = 0.9713529212958817, ROC-AUC = 0.9659268588262505, Elapsed Time = 1.3637277999987418 seconds
Trial 50, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 50, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.68359 validation-auc:0.93834 validation-aucpr:0.93939
[1] validation-logloss:0.67398 validation-auc:0.95067 validation-aucpr:0.95333
[2] validation-logloss:0.66444 validation-auc:0.95413 validation-aucpr:0.95852
[3] validation-logloss:0.65589 validation-auc:0.95368 validation-aucpr:0.95783
[4] validation-logloss:0.64699 validation-auc:0.95520 validation-aucpr:0.95874
[5] validation-logloss:0.63728 validation-auc:0.96073 validation-aucpr:0.96504
[6] validation-logloss:0.62902 validation-auc:0.96101 validation-aucpr:0.96520
[7] validation-logloss:0.62113 validation-auc:0.96095 validation-aucpr:0.96444
[8] validation-logloss:0.61208 validation-auc:0.96247 validation-aucpr:0.96613
[9] validation-logloss:0.60432 validation-auc:0.96253 validation-aucpr:0.96615
[10] validation-logloss:0.59560 validation-auc:0.96339 validation-aucpr:0.96714
[11] validation-logloss:0.58859 validation-auc:0.96317 validation-aucpr:0.96684
[12] validation-logloss:0.58046 validation-auc:0.96357 validation-aucpr:0.96729
[13] validation-logloss:0.57249 validation-auc:0.96398 validation-aucpr:0.96733
[14] validation-logloss:0.56565 validation-auc:0.96402 validation-aucpr:0.96732
[15] validation-logloss:0.55915 validation-auc:0.96402 validation-aucpr:0.96690
[16] validation-logloss:0.55277 validation-auc:0.96398 validation-aucpr:0.96717
[17] validation-logloss:0.54631 validation-auc:0.96412 validation-aucpr:0.96793
[18] validation-logloss:0.54023 validation-auc:0.96403 validation-aucpr:0.96858
[19] validation-logloss:0.53332 validation-auc:0.96420 validation-aucpr:0.96882
[20] validation-logloss:0.52745 validation-auc:0.96416 validation-aucpr:0.96879
[21] validation-logloss:0.52059 validation-auc:0.96457 validation-aucpr:0.96896
[22] validation-logloss:0.51528 validation-auc:0.96441 validation-aucpr:0.96837
[23] validation-logloss:0.51020 validation-auc:0.96422 validation-aucpr:0.96811
[24] validation-logloss:0.50399 validation-auc:0.96439 validation-aucpr:0.96827
[25] validation-logloss:0.49908 validation-auc:0.96432 validation-aucpr:0.96823
[26] validation-logloss:0.49399 validation-auc:0.96433 validation-aucpr:0.96829
[27] validation-logloss:0.48912 validation-auc:0.96428 validation-aucpr:0.96819
[28] validation-logloss:0.48433 validation-auc:0.96433 validation-aucpr:0.96815
[29] validation-logloss:0.47871 validation-auc:0.96444 validation-aucpr:0.96830
[30] validation-logloss:0.47438 validation-auc:0.96426 validation-aucpr:0.96814
[31] validation-logloss:0.46987 validation-auc:0.96431 validation-aucpr:0.96809
[32] validation-logloss:0.46460 validation-auc:0.96447 validation-aucpr:0.96824
[33] validation-logloss:0.46030 validation-auc:0.96446 validation-aucpr:0.96817
[34] validation-logloss:0.45616 validation-auc:0.96437 validation-aucpr:0.96810
[35] validation-logloss:0.45114 validation-auc:0.96450 validation-aucpr:0.96821
[36] validation-logloss:0.44640 validation-auc:0.96463 validation-aucpr:0.96812
[37] validation-logloss:0.44244 validation-auc:0.96471 validation-aucpr:0.96938
[38] validation-logloss:0.43856 validation-auc:0.96466 validation-aucpr:0.96935
[39] validation-logloss:0.43402 validation-auc:0.96478 validation-aucpr:0.96942
[40] validation-logloss:0.42934 validation-auc:0.96501 validation-aucpr:0.96962
[41] validation-logloss:0.42498 validation-auc:0.96525 validation-aucpr:0.96996
[42] validation-logloss:0.42063 validation-auc:0.96545 validation-aucpr:0.97023
[43] validation-logloss:0.41623 validation-auc:0.96564 validation-aucpr:0.97038
[44] validation-logloss:0.41290 validation-auc:0.96559 validation-aucpr:0.97034
[45] validation-logloss:0.40871 validation-auc:0.96574 validation-aucpr:0.97047
[46] validation-logloss:0.40533 validation-auc:0.96573 validation-aucpr:0.97042
{'best_iteration': '45', 'best_score': '0.9704737172269167'}
Trial 50, Fold 5: Log loss = 0.405331832251526, Average precision = 0.9704301972342224, ROC-AUC = 0.9657291492484625, Elapsed Time = 1.3081268000023556 seconds
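The per-round lines above follow XGBoost's evaluation-log format, and the `{'best_iteration': ..., 'best_score': ...}` dict reported after each fold corresponds to the round that maximizes the last metric in `eval_metric` (here `aucpr`). As a quick sanity check on these logs, a small stdlib-only helper (hypothetical, not part of the notebook's pipeline) can recover that best round directly from the printed lines:

```python
import re

# Matches XGBoost eval-log lines like:
# "[45] validation-logloss:0.40871 validation-auc:0.96574 validation-aucpr:0.97047"
LINE_RE = re.compile(
    r"\[(?P<it>\d+)\]\s+"
    r"validation-logloss:(?P<logloss>[\d.]+)\s+"
    r"validation-auc:(?P<auc>[\d.]+)\s+"
    r"validation-aucpr:(?P<aucpr>[\d.]+)"
)

def best_iteration(log_lines):
    """Return (iteration, aucpr) for the round with the highest validation-aucpr.

    XGBoost's early stopping monitors the last metric in eval_metric, which in
    these logs is aucpr (maximized); ties keep the earliest round.
    """
    best = None
    for line in log_lines:
        m = LINE_RE.search(line)
        if not m:
            continue  # skip non-metric lines (fold summaries, progress bars)
        it, aucpr = int(m.group("it")), float(m.group("aucpr"))
        if best is None or aucpr > best[1]:
            best = (it, aucpr)
    return best

# Sample lines taken from Trial 50, Fold 5 above:
sample = [
    "[44] validation-logloss:0.41290 validation-auc:0.96559 validation-aucpr:0.97034",
    "[45] validation-logloss:0.40871 validation-auc:0.96574 validation-aucpr:0.97047",
    "[46] validation-logloss:0.40533 validation-auc:0.96573 validation-aucpr:0.97042",
]
print(best_iteration(sample))  # (45, 0.97047) — matches best_iteration '45' reported for that fold
```

This is only a log-parsing sketch; in the actual training loop the same information comes from the booster's `best_iteration` and `best_score` attributes after fitting with early stopping.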
Optimization Progress: 51%|#####1 | 51/100 [2:49:14<1:22:31, 101.05s/it]
Trial 51, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 51, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.68291 validation-auc:0.95205 validation-aucpr:0.95616
[1] validation-logloss:0.67265 validation-auc:0.95976 validation-aucpr:0.96540
[2] validation-logloss:0.66447 validation-auc:0.95993 validation-aucpr:0.96584
[3] validation-logloss:0.65478 validation-auc:0.96073 validation-aucpr:0.96659
[4] validation-logloss:0.64542 validation-auc:0.96077 validation-aucpr:0.96667
[5] validation-logloss:0.63731 validation-auc:0.96148 validation-aucpr:0.96697
[6] validation-logloss:0.62850 validation-auc:0.96148 validation-aucpr:0.96707
[7] validation-logloss:0.62023 validation-auc:0.96169 validation-aucpr:0.96733
[8] validation-logloss:0.61176 validation-auc:0.96215 validation-aucpr:0.96777
[9] validation-logloss:0.60441 validation-auc:0.96229 validation-aucpr:0.96770
[10] validation-logloss:0.59730 validation-auc:0.96171 validation-aucpr:0.96716
[11] validation-logloss:0.59040 validation-auc:0.96166 validation-aucpr:0.96700
[12] validation-logloss:0.58286 validation-auc:0.96205 validation-aucpr:0.96727
[13] validation-logloss:0.57540 validation-auc:0.96195 validation-aucpr:0.96468
[14] validation-logloss:0.56796 validation-auc:0.96199 validation-aucpr:0.96482
[15] validation-logloss:0.56073 validation-auc:0.96219 validation-aucpr:0.96497
[16] validation-logloss:0.55482 validation-auc:0.96246 validation-aucpr:0.96850
[17] validation-logloss:0.54798 validation-auc:0.96249 validation-aucpr:0.96856
[18] validation-logloss:0.54217 validation-auc:0.96246 validation-aucpr:0.96847
[19] validation-logloss:0.53597 validation-auc:0.96236 validation-aucpr:0.96843
[20] validation-logloss:0.52981 validation-auc:0.96217 validation-aucpr:0.96834
[21] validation-logloss:0.52451 validation-auc:0.96204 validation-aucpr:0.96820
[22] validation-logloss:0.51833 validation-auc:0.96214 validation-aucpr:0.96829
[23] validation-logloss:0.51302 validation-auc:0.96236 validation-aucpr:0.96843
[24] validation-logloss:0.50816 validation-auc:0.96214 validation-aucpr:0.96821
[25] validation-logloss:0.50237 validation-auc:0.96227 validation-aucpr:0.96830
[26] validation-logloss:0.49685 validation-auc:0.96239 validation-aucpr:0.96841
[27] validation-logloss:0.49238 validation-auc:0.96240 validation-aucpr:0.96847
[28] validation-logloss:0.48712 validation-auc:0.96238 validation-aucpr:0.96850
[29] validation-logloss:0.48192 validation-auc:0.96254 validation-aucpr:0.96865
[30] validation-logloss:0.47689 validation-auc:0.96261 validation-aucpr:0.96874
[31] validation-logloss:0.47189 validation-auc:0.96265 validation-aucpr:0.96880
[32] validation-logloss:0.46754 validation-auc:0.96266 validation-aucpr:0.96880
[33] validation-logloss:0.46335 validation-auc:0.96261 validation-aucpr:0.96874
[34] validation-logloss:0.45853 validation-auc:0.96268 validation-aucpr:0.96881
[35] validation-logloss:0.45444 validation-auc:0.96273 validation-aucpr:0.96882
[36] validation-logloss:0.44987 validation-auc:0.96281 validation-aucpr:0.96892
[37] validation-logloss:0.44598 validation-auc:0.96273 validation-aucpr:0.96882
[38] validation-logloss:0.44168 validation-auc:0.96280 validation-aucpr:0.96890
[39] validation-logloss:0.43754 validation-auc:0.96311 validation-aucpr:0.96955
[40] validation-logloss:0.43336 validation-auc:0.96317 validation-aucpr:0.96960
[41] validation-logloss:0.42915 validation-auc:0.96335 validation-aucpr:0.96973
[42] validation-logloss:0.42502 validation-auc:0.96348 validation-aucpr:0.96986
[43] validation-logloss:0.42167 validation-auc:0.96351 validation-aucpr:0.96986
[44] validation-logloss:0.41769 validation-auc:0.96366 validation-aucpr:0.96997
[45] validation-logloss:0.41441 validation-auc:0.96381 validation-aucpr:0.97007
[46] validation-logloss:0.41057 validation-auc:0.96390 validation-aucpr:0.97017
[47] validation-logloss:0.40694 validation-auc:0.96395 validation-aucpr:0.97025
[48] validation-logloss:0.40335 validation-auc:0.96406 validation-aucpr:0.97035
[49] validation-logloss:0.40034 validation-auc:0.96402 validation-aucpr:0.97030
[50] validation-logloss:0.39691 validation-auc:0.96415 validation-aucpr:0.97040
[51] validation-logloss:0.39351 validation-auc:0.96430 validation-aucpr:0.97055
[52] validation-logloss:0.39025 validation-auc:0.96427 validation-aucpr:0.97053
[53] validation-logloss:0.38691 validation-auc:0.96435 validation-aucpr:0.97061
[54] validation-logloss:0.38378 validation-auc:0.96436 validation-aucpr:0.97069
[55] validation-logloss:0.38122 validation-auc:0.96428 validation-aucpr:0.97060
{'best_iteration': '54', 'best_score': '0.97069156848798'}
Trial 51, Fold 1: Log loss = 0.3812171714360218, Average precision = 0.9703871434420608, ROC-AUC = 0.9642796651180798, Elapsed Time = 0.7973453000013251 seconds
Trial 51, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 51, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.68236 validation-auc:0.95602 validation-aucpr:0.95911
[1] validation-logloss:0.67211 validation-auc:0.96113 validation-aucpr:0.96672
[2] validation-logloss:0.66210 validation-auc:0.96291 validation-aucpr:0.96763
[3] validation-logloss:0.65293 validation-auc:0.96248 validation-aucpr:0.96502
[4] validation-logloss:0.64365 validation-auc:0.96223 validation-aucpr:0.96440
[5] validation-logloss:0.63554 validation-auc:0.96194 validation-aucpr:0.96443
[6] validation-logloss:0.62720 validation-auc:0.96301 validation-aucpr:0.96739
[7] validation-logloss:0.61892 validation-auc:0.96373 validation-aucpr:0.96800
[8] validation-logloss:0.61041 validation-auc:0.96439 validation-aucpr:0.96851
[9] validation-logloss:0.60304 validation-auc:0.96383 validation-aucpr:0.96794
[10] validation-logloss:0.59576 validation-auc:0.96307 validation-aucpr:0.96719
[11] validation-logloss:0.58901 validation-auc:0.96296 validation-aucpr:0.96705
[12] validation-logloss:0.58139 validation-auc:0.96323 validation-aucpr:0.96729
[13] validation-logloss:0.57480 validation-auc:0.96339 validation-aucpr:0.96749
[14] validation-logloss:0.56744 validation-auc:0.96348 validation-aucpr:0.96758
[15] validation-logloss:0.56120 validation-auc:0.96355 validation-aucpr:0.96760
[16] validation-logloss:0.55417 validation-auc:0.96355 validation-aucpr:0.96761
[17] validation-logloss:0.54723 validation-auc:0.96359 validation-aucpr:0.96768
[18] validation-logloss:0.54042 validation-auc:0.96374 validation-aucpr:0.96781
[19] validation-logloss:0.53391 validation-auc:0.96378 validation-aucpr:0.96790
[20] validation-logloss:0.52767 validation-auc:0.96392 validation-aucpr:0.96793
[21] validation-logloss:0.52233 validation-auc:0.96374 validation-aucpr:0.96768
[22] validation-logloss:0.51623 validation-auc:0.96373 validation-aucpr:0.96774
[23] validation-logloss:0.51016 validation-auc:0.96397 validation-aucpr:0.96794
[24] validation-logloss:0.50455 validation-auc:0.96423 validation-aucpr:0.96835
[25] validation-logloss:0.49884 validation-auc:0.96430 validation-aucpr:0.96842
[26] validation-logloss:0.49312 validation-auc:0.96447 validation-aucpr:0.96859
[27] validation-logloss:0.48843 validation-auc:0.96418 validation-aucpr:0.96829
[28] validation-logloss:0.48317 validation-auc:0.96429 validation-aucpr:0.96836
[29] validation-logloss:0.47862 validation-auc:0.96412 validation-aucpr:0.96814
[30] validation-logloss:0.47347 validation-auc:0.96409 validation-aucpr:0.96812
[31] validation-logloss:0.46914 validation-auc:0.96403 validation-aucpr:0.96803
[32] validation-logloss:0.46449 validation-auc:0.96407 validation-aucpr:0.96810
[33] validation-logloss:0.46030 validation-auc:0.96410 validation-aucpr:0.96811
[34] validation-logloss:0.45558 validation-auc:0.96409 validation-aucpr:0.96800
[35] validation-logloss:0.45151 validation-auc:0.96422 validation-aucpr:0.96810
[36] validation-logloss:0.44692 validation-auc:0.96426 validation-aucpr:0.96816
[37] validation-logloss:0.44305 validation-auc:0.96428 validation-aucpr:0.96813
[38] validation-logloss:0.43880 validation-auc:0.96441 validation-aucpr:0.96820
[39] validation-logloss:0.43448 validation-auc:0.96444 validation-aucpr:0.96824
[40] validation-logloss:0.43039 validation-auc:0.96455 validation-aucpr:0.96835
[41] validation-logloss:0.42628 validation-auc:0.96470 validation-aucpr:0.96847
[42] validation-logloss:0.42219 validation-auc:0.96486 validation-aucpr:0.96869
[43] validation-logloss:0.41821 validation-auc:0.96496 validation-aucpr:0.96879
[44] validation-logloss:0.41431 validation-auc:0.96508 validation-aucpr:0.96891
[45] validation-logloss:0.41078 validation-auc:0.96518 validation-aucpr:0.96900
[46] validation-logloss:0.40702 validation-auc:0.96526 validation-aucpr:0.96907
[47] validation-logloss:0.40409 validation-auc:0.96517 validation-aucpr:0.96898
[48] validation-logloss:0.40063 validation-auc:0.96524 validation-aucpr:0.96905
[49] validation-logloss:0.39711 validation-auc:0.96535 validation-aucpr:0.96914
[50] validation-logloss:0.39358 validation-auc:0.96558 validation-aucpr:0.96935
[51] validation-logloss:0.39066 validation-auc:0.96546 validation-aucpr:0.96922
[52] validation-logloss:0.38725 validation-auc:0.96566 validation-aucpr:0.96938
[53] validation-logloss:0.38446 validation-auc:0.96561 validation-aucpr:0.96935
[54] validation-logloss:0.38115 validation-auc:0.96573 validation-aucpr:0.96945
[55] validation-logloss:0.37860 validation-auc:0.96575 validation-aucpr:0.96945
{'best_iteration': '54', 'best_score': '0.9694506681393924'}
Trial 51, Fold 2: Log loss = 0.37859739271133297, Average precision = 0.9692820372580925, ROC-AUC = 0.9657487797201831, Elapsed Time = 1.055651200000284 seconds
Trial 51, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 51, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.68267 validation-auc:0.95690 validation-aucpr:0.96154
[1] validation-logloss:0.67228 validation-auc:0.96240 validation-aucpr:0.96740
[2] validation-logloss:0.66229 validation-auc:0.96191 validation-aucpr:0.96718
[3] validation-logloss:0.65256 validation-auc:0.96203 validation-aucpr:0.96732
[4] validation-logloss:0.64333 validation-auc:0.96423 validation-aucpr:0.96932
[5] validation-logloss:0.63419 validation-auc:0.96451 validation-aucpr:0.96946
[6] validation-logloss:0.62519 validation-auc:0.96492 validation-aucpr:0.96969
[7] validation-logloss:0.61746 validation-auc:0.96523 validation-aucpr:0.96998
[8] validation-logloss:0.60900 validation-auc:0.96533 validation-aucpr:0.97014
[9] validation-logloss:0.60075 validation-auc:0.96528 validation-aucpr:0.97072
[10] validation-logloss:0.59267 validation-auc:0.96608 validation-aucpr:0.97123
[11] validation-logloss:0.58475 validation-auc:0.96581 validation-aucpr:0.97107
[12] validation-logloss:0.57724 validation-auc:0.96559 validation-aucpr:0.97085
[13] validation-logloss:0.57071 validation-auc:0.96565 validation-aucpr:0.97093
[14] validation-logloss:0.56427 validation-auc:0.96531 validation-aucpr:0.97059
[15] validation-logloss:0.55749 validation-auc:0.96514 validation-aucpr:0.97048
[16] validation-logloss:0.55162 validation-auc:0.96466 validation-aucpr:0.96997
[17] validation-logloss:0.54552 validation-auc:0.96477 validation-aucpr:0.97001
[18] validation-logloss:0.53961 validation-auc:0.96468 validation-aucpr:0.96984
[19] validation-logloss:0.53316 validation-auc:0.96482 validation-aucpr:0.96995
[20] validation-logloss:0.52671 validation-auc:0.96489 validation-aucpr:0.97002
[21] validation-logloss:0.52069 validation-auc:0.96481 validation-aucpr:0.96998
[22] validation-logloss:0.51523 validation-auc:0.96477 validation-aucpr:0.96992
[23] validation-logloss:0.50917 validation-auc:0.96486 validation-aucpr:0.97001
[24] validation-logloss:0.50329 validation-auc:0.96469 validation-aucpr:0.96987
[25] validation-logloss:0.49831 validation-auc:0.96480 validation-aucpr:0.97004
[26] validation-logloss:0.49268 validation-auc:0.96485 validation-aucpr:0.97005
[27] validation-logloss:0.48801 validation-auc:0.96496 validation-aucpr:0.97016
[28] validation-logloss:0.48255 validation-auc:0.96501 validation-aucpr:0.97020
[29] validation-logloss:0.47805 validation-auc:0.96495 validation-aucpr:0.97014
[30] validation-logloss:0.47341 validation-auc:0.96515 validation-aucpr:0.97024
[31] validation-logloss:0.46852 validation-auc:0.96527 validation-aucpr:0.97038
[32] validation-logloss:0.46348 validation-auc:0.96542 validation-aucpr:0.97048
[33] validation-logloss:0.45852 validation-auc:0.96565 validation-aucpr:0.97062
[34] validation-logloss:0.45391 validation-auc:0.96567 validation-aucpr:0.97063
[35] validation-logloss:0.44926 validation-auc:0.96584 validation-aucpr:0.97080
[36] validation-logloss:0.44528 validation-auc:0.96576 validation-aucpr:0.97067
[37] validation-logloss:0.44103 validation-auc:0.96583 validation-aucpr:0.97075
[38] validation-logloss:0.43685 validation-auc:0.96585 validation-aucpr:0.97077
[39] validation-logloss:0.43245 validation-auc:0.96594 validation-aucpr:0.97085
[40] validation-logloss:0.42887 validation-auc:0.96590 validation-aucpr:0.97076
[41] validation-logloss:0.42482 validation-auc:0.96600 validation-aucpr:0.97084
[42] validation-logloss:0.42162 validation-auc:0.96591 validation-aucpr:0.97074
[43] validation-logloss:0.41758 validation-auc:0.96594 validation-aucpr:0.97076
[44] validation-logloss:0.41360 validation-auc:0.96599 validation-aucpr:0.97082
[45] validation-logloss:0.40968 validation-auc:0.96609 validation-aucpr:0.97090
[46] validation-logloss:0.40640 validation-auc:0.96603 validation-aucpr:0.97085
[47] validation-logloss:0.40267 validation-auc:0.96616 validation-aucpr:0.97095
[48] validation-logloss:0.39905 validation-auc:0.96617 validation-aucpr:0.97098
[49] validation-logloss:0.39541 validation-auc:0.96620 validation-aucpr:0.97102
[50] validation-logloss:0.39227 validation-auc:0.96629 validation-aucpr:0.97106
[51] validation-logloss:0.38923 validation-auc:0.96637 validation-aucpr:0.97110
[52] validation-logloss:0.38594 validation-auc:0.96636 validation-aucpr:0.97112
[53] validation-logloss:0.38261 validation-auc:0.96644 validation-aucpr:0.97119
[54] validation-logloss:0.37937 validation-auc:0.96650 validation-aucpr:0.97120
[55] validation-logloss:0.37612 validation-auc:0.96660 validation-aucpr:0.97130
{'best_iteration': '55', 'best_score': '0.9712978997806914'}
Trial 51, Fold 3: Log loss = 0.37611588215876013, Average precision = 0.9711011629204486, ROC-AUC = 0.9665986675463012, Elapsed Time = 1.0364141000027303 seconds
Trial 51, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 51, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.68242 validation-auc:0.95750 validation-aucpr:0.96134
[1] validation-logloss:0.67212 validation-auc:0.96272 validation-aucpr:0.96928
[2] validation-logloss:0.66378 validation-auc:0.96068 validation-aucpr:0.96725
[3] validation-logloss:0.65554 validation-auc:0.96049 validation-aucpr:0.96693
[4] validation-logloss:0.64616 validation-auc:0.95988 validation-aucpr:0.96668
[5] validation-logloss:0.63799 validation-auc:0.96061 validation-aucpr:0.96709
[6] validation-logloss:0.62904 validation-auc:0.96097 validation-aucpr:0.96739
[7] validation-logloss:0.62072 validation-auc:0.96165 validation-aucpr:0.96801
[8] validation-logloss:0.61223 validation-auc:0.96286 validation-aucpr:0.96881
[9] validation-logloss:0.60501 validation-auc:0.96256 validation-aucpr:0.96841
[10] validation-logloss:0.59690 validation-auc:0.96274 validation-aucpr:0.96859
[11] validation-logloss:0.59020 validation-auc:0.96272 validation-aucpr:0.96847
[12] validation-logloss:0.58233 validation-auc:0.96297 validation-aucpr:0.96880
[13] validation-logloss:0.57495 validation-auc:0.96308 validation-aucpr:0.96898
[14] validation-logloss:0.56853 validation-auc:0.96319 validation-aucpr:0.96894
[15] validation-logloss:0.56241 validation-auc:0.96297 validation-aucpr:0.96865
[16] validation-logloss:0.55551 validation-auc:0.96300 validation-aucpr:0.96867
[17] validation-logloss:0.54863 validation-auc:0.96290 validation-aucpr:0.96866
[18] validation-logloss:0.54290 validation-auc:0.96291 validation-aucpr:0.96867
[19] validation-logloss:0.53635 validation-auc:0.96288 validation-aucpr:0.96867
[20] validation-logloss:0.53085 validation-auc:0.96263 validation-aucpr:0.96844
[21] validation-logloss:0.52526 validation-auc:0.96281 validation-aucpr:0.96857
[22] validation-logloss:0.51906 validation-auc:0.96322 validation-aucpr:0.96891
[23] validation-logloss:0.51378 validation-auc:0.96307 validation-aucpr:0.96880
[24] validation-logloss:0.50888 validation-auc:0.96311 validation-aucpr:0.96885
[25] validation-logloss:0.50293 validation-auc:0.96340 validation-aucpr:0.96906
[26] validation-logloss:0.49720 validation-auc:0.96340 validation-aucpr:0.96909
[27] validation-logloss:0.49165 validation-auc:0.96361 validation-aucpr:0.96928
[28] validation-logloss:0.48646 validation-auc:0.96357 validation-aucpr:0.96932
[29] validation-logloss:0.48104 validation-auc:0.96369 validation-aucpr:0.96943
[30] validation-logloss:0.47652 validation-auc:0.96371 validation-aucpr:0.96941
[31] validation-logloss:0.47240 validation-auc:0.96366 validation-aucpr:0.96932
[32] validation-logloss:0.46735 validation-auc:0.96379 validation-aucpr:0.96948
[33] validation-logloss:0.46241 validation-auc:0.96388 validation-aucpr:0.96961
[34] validation-logloss:0.45836 validation-auc:0.96370 validation-aucpr:0.96947
[35] validation-logloss:0.45359 validation-auc:0.96383 validation-aucpr:0.96960
[36] validation-logloss:0.44967 validation-auc:0.96381 validation-aucpr:0.96954
[37] validation-logloss:0.44507 validation-auc:0.96404 validation-aucpr:0.96973
[38] validation-logloss:0.44146 validation-auc:0.96396 validation-aucpr:0.96968
[39] validation-logloss:0.43728 validation-auc:0.96402 validation-aucpr:0.96979
[40] validation-logloss:0.43312 validation-auc:0.96408 validation-aucpr:0.96984
[41] validation-logloss:0.42903 validation-auc:0.96415 validation-aucpr:0.96992
[42] validation-logloss:0.42565 validation-auc:0.96407 validation-aucpr:0.96984
[43] validation-logloss:0.42224 validation-auc:0.96398 validation-aucpr:0.96979
[44] validation-logloss:0.41820 validation-auc:0.96410 validation-aucpr:0.96987
[45] validation-logloss:0.41428 validation-auc:0.96415 validation-aucpr:0.96993
[46] validation-logloss:0.41119 validation-auc:0.96409 validation-aucpr:0.96986
[47] validation-logloss:0.40746 validation-auc:0.96413 validation-aucpr:0.96991
[48] validation-logloss:0.40442 validation-auc:0.96402 validation-aucpr:0.96984
[49] validation-logloss:0.40079 validation-auc:0.96421 validation-aucpr:0.96998
[50] validation-logloss:0.39723 validation-auc:0.96430 validation-aucpr:0.97005
[51] validation-logloss:0.39386 validation-auc:0.96430 validation-aucpr:0.97007
[52] validation-logloss:0.39118 validation-auc:0.96415 validation-aucpr:0.96996
[53] validation-logloss:0.38790 validation-auc:0.96424 validation-aucpr:0.97004
[54] validation-logloss:0.38449 validation-auc:0.96435 validation-aucpr:0.97015
[55] validation-logloss:0.38126 validation-auc:0.96433 validation-aucpr:0.97017
{'best_iteration': '55', 'best_score': '0.970168208440948'}
Trial 51, Fold 4: Log loss = 0.3812629807783591, Average precision = 0.9699755871299118, ROC-AUC = 0.9643334024812169, Elapsed Time = 1.0350413000014669 seconds
Trial 51, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 51, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.68251 validation-auc:0.95788 validation-aucpr:0.96177
[1] validation-logloss:0.67234 validation-auc:0.96178 validation-aucpr:0.96709
[2] validation-logloss:0.66242 validation-auc:0.96310 validation-aucpr:0.96796
[3] validation-logloss:0.65281 validation-auc:0.96369 validation-aucpr:0.96845
[4] validation-logloss:0.64371 validation-auc:0.96500 validation-aucpr:0.96907
[5] validation-logloss:0.63572 validation-auc:0.96209 validation-aucpr:0.96708
[6] validation-logloss:0.62687 validation-auc:0.96195 validation-aucpr:0.96705
[7] validation-logloss:0.61931 validation-auc:0.96172 validation-aucpr:0.96669
[8] validation-logloss:0.61088 validation-auc:0.96260 validation-aucpr:0.96728
[9] validation-logloss:0.60271 validation-auc:0.96276 validation-aucpr:0.96771
[10] validation-logloss:0.59472 validation-auc:0.96272 validation-aucpr:0.96756
[11] validation-logloss:0.58807 validation-auc:0.96221 validation-aucpr:0.96717
[12] validation-logloss:0.58043 validation-auc:0.96238 validation-aucpr:0.96735
[13] validation-logloss:0.57322 validation-auc:0.96219 validation-aucpr:0.96718
[14] validation-logloss:0.56688 validation-auc:0.96230 validation-aucpr:0.96720
[15] validation-logloss:0.56067 validation-auc:0.96199 validation-aucpr:0.96684
[16] validation-logloss:0.55497 validation-auc:0.96153 validation-aucpr:0.96673
[17] validation-logloss:0.54901 validation-auc:0.96144 validation-aucpr:0.96670
[18] validation-logloss:0.54327 validation-auc:0.96128 validation-aucpr:0.96658
[19] validation-logloss:0.53793 validation-auc:0.96111 validation-aucpr:0.96643
[20] validation-logloss:0.53258 validation-auc:0.96094 validation-aucpr:0.96630
[21] validation-logloss:0.52627 validation-auc:0.96101 validation-aucpr:0.96640
[22] validation-logloss:0.52022 validation-auc:0.96110 validation-aucpr:0.96646
[23] validation-logloss:0.51437 validation-auc:0.96113 validation-aucpr:0.96651
[24] validation-logloss:0.50879 validation-auc:0.96123 validation-aucpr:0.96657
[25] validation-logloss:0.50309 validation-auc:0.96124 validation-aucpr:0.96661
[26] validation-logloss:0.49759 validation-auc:0.96134 validation-aucpr:0.96668
[27] validation-logloss:0.49289 validation-auc:0.96146 validation-aucpr:0.96676
[28] validation-logloss:0.48741 validation-auc:0.96165 validation-aucpr:0.96695
[29] validation-logloss:0.48290 validation-auc:0.96159 validation-aucpr:0.96686
[30] validation-logloss:0.47775 validation-auc:0.96164 validation-aucpr:0.96690
[31] validation-logloss:0.47264 validation-auc:0.96177 validation-aucpr:0.96701
[32] validation-logloss:0.46776 validation-auc:0.96185 validation-aucpr:0.96710
[33] validation-logloss:0.46295 validation-auc:0.96176 validation-aucpr:0.96687
[34] validation-logloss:0.45841 validation-auc:0.96175 validation-aucpr:0.96687
[35] validation-logloss:0.45460 validation-auc:0.96168 validation-aucpr:0.96716
[36] validation-logloss:0.45027 validation-auc:0.96170 validation-aucpr:0.96723
[37] validation-logloss:0.44655 validation-auc:0.96169 validation-aucpr:0.96718
[38] validation-logloss:0.44286 validation-auc:0.96183 validation-aucpr:0.96764
[39] validation-logloss:0.43850 validation-auc:0.96206 validation-aucpr:0.96783
[40] validation-logloss:0.43495 validation-auc:0.96194 validation-aucpr:0.96773
[41] validation-logloss:0.43155 validation-auc:0.96189 validation-aucpr:0.96772
[42] validation-logloss:0.42830 validation-auc:0.96183 validation-aucpr:0.96766
[43] validation-logloss:0.42497 validation-auc:0.96186 validation-aucpr:0.96768
[44] validation-logloss:0.42099 validation-auc:0.96196 validation-aucpr:0.96778
[45] validation-logloss:0.41785 validation-auc:0.96195 validation-aucpr:0.96777
[46] validation-logloss:0.41407 validation-auc:0.96195 validation-aucpr:0.96777
[47] validation-logloss:0.41027 validation-auc:0.96208 validation-aucpr:0.96791
[48] validation-logloss:0.40660 validation-auc:0.96218 validation-aucpr:0.96799
[49] validation-logloss:0.40378 validation-auc:0.96215 validation-aucpr:0.96797
[50] validation-logloss:0.40025 validation-auc:0.96218 validation-aucpr:0.96802
[51] validation-logloss:0.39694 validation-auc:0.96223 validation-aucpr:0.96808
[52] validation-logloss:0.39348 validation-auc:0.96226 validation-aucpr:0.96811
[53] validation-logloss:0.39074 validation-auc:0.96231 validation-aucpr:0.96812
{'best_iteration': '4', 'best_score': '0.9690651836602294'}
Trial 51, Fold 5: Log loss = 0.38745012429890546, Average precision = 0.9680018802991691, ROC-AUC = 0.9624831763372536, Elapsed Time = 1.0867137000022922 seconds
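Note the fold 5 summary above: `best_iteration` is 4 while the last logged round is 53, i.e. training continued until enough rounds passed without the validation aucpr improving on its early peak, then stopped. A minimal pure-Python re-implementation of that early-stopping rule (the actual bookkeeping lives inside XGBoost's early-stopping callback, and the exact `early_stopping_rounds` value used in the notebook is not shown in these logs):

```python
def early_stop(scores, rounds, maximize=True):
    """Return (best_iteration, best_score, last_iteration_run) for a metric
    trace, stopping once `rounds` iterations pass without improvement.
    Sketch of XGBoost-style early stopping, not its actual implementation."""
    best_i, best_s = 0, scores[0]
    for i, s in enumerate(scores):
        if (s > best_s) if maximize else (s < best_s):
            best_i, best_s = i, s          # new best: reset the patience window
        elif i - best_i >= rounds:
            return best_i, best_s, i       # patience exhausted: stop early
    return best_i, best_s, len(scores) - 1  # ran to the boosting-round cap

# Toy aucpr trace: peaks at iteration 2, then plateaus below the peak,
# so with patience 3 training halts at iteration 5.
aucpr = [0.960, 0.965, 0.971, 0.968, 0.969, 0.970, 0.9705, 0.9708]
print(early_stop(aucpr, rounds=3))  # -> (2, 0.971, 5)
```

This is why a fold can log dozens of rounds after its best iteration: the final model is still taken at `best_iteration`, not at the last round trained.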
Optimization Progress: 52%|#####2 | 52/100 [2:49:27<59:41, 74.61s/it]
Trial 52, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 52, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[20:48:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[0]	validation-logloss:0.66455	validation-auc:0.95712	validation-aucpr:0.96324
[1]	validation-logloss:0.63859	validation-auc:0.96221	validation-aucpr:0.96793
[2]	validation-logloss:0.61703	validation-auc:0.96202	validation-aucpr:0.96628
[3]	validation-logloss:0.59423	validation-auc:0.96319	validation-aucpr:0.96731
[4]	validation-logloss:0.57256	validation-auc:0.96511	validation-aucpr:0.97001
[5]	validation-logloss:0.55286	validation-auc:0.96614	validation-aucpr:0.97147
[6]	validation-logloss:0.53601	validation-auc:0.96659	validation-aucpr:0.97172
[7]	validation-logloss:0.52017	validation-auc:0.96653	validation-aucpr:0.97175
[8]	validation-logloss:0.50599	validation-auc:0.96628	validation-aucpr:0.97166
[9]	validation-logloss:0.49097	validation-auc:0.96631	validation-aucpr:0.97181
[10]	validation-logloss:0.47623	validation-auc:0.96653	validation-aucpr:0.97199
[11]	validation-logloss:0.46420	validation-auc:0.96635	validation-aucpr:0.97175
[12]	validation-logloss:0.45319	validation-auc:0.96608	validation-aucpr:0.97150
[13]	validation-logloss:0.44194	validation-auc:0.96659	validation-aucpr:0.97185
[14]	validation-logloss:0.42956	validation-auc:0.96706	validation-aucpr:0.97223
[15]	validation-logloss:0.41983	validation-auc:0.96708	validation-aucpr:0.97230
[16]	validation-logloss:0.41093	validation-auc:0.96676	validation-aucpr:0.97202
[17]	validation-logloss:0.40043	validation-auc:0.96689	validation-aucpr:0.97215
[18]	validation-logloss:0.39095	validation-auc:0.96714	validation-aucpr:0.97234
[19]	validation-logloss:0.38330	validation-auc:0.96717	validation-aucpr:0.97242
[20]	validation-logloss:0.37393	validation-auc:0.96760	validation-aucpr:0.97272
[21]	validation-logloss:0.36515	validation-auc:0.96792	validation-aucpr:0.97303
[22]	validation-logloss:0.35697	validation-auc:0.96828	validation-aucpr:0.97333
[23]	validation-logloss:0.34930	validation-auc:0.96854	validation-aucpr:0.97354
[24]	validation-logloss:0.34198	validation-auc:0.96874	validation-aucpr:0.97372
[25]	validation-logloss:0.33513	validation-auc:0.96906	validation-aucpr:0.97398
[26]	validation-logloss:0.32848	validation-auc:0.96919	validation-aucpr:0.97405
[27]	validation-logloss:0.32323	validation-auc:0.96923	validation-aucpr:0.97406
[28]	validation-logloss:0.31842	validation-auc:0.96919	validation-aucpr:0.97404
[29]	validation-logloss:0.31239	validation-auc:0.96954	validation-aucpr:0.97431
[30]	validation-logloss:0.30660	validation-auc:0.96976	validation-aucpr:0.97452
[31]	validation-logloss:0.30184	validation-auc:0.96971	validation-aucpr:0.97451
[32]	validation-logloss:0.29659	validation-auc:0.96978	validation-aucpr:0.97460
[33]	validation-logloss:0.29264	validation-auc:0.96975	validation-aucpr:0.97457
[34]	validation-logloss:0.28819	validation-auc:0.96983	validation-aucpr:0.97464
[35]	validation-logloss:0.28483	validation-auc:0.96974	validation-aucpr:0.97451
[36]	validation-logloss:0.28054	validation-auc:0.96987	validation-aucpr:0.97462
[37]	validation-logloss:0.27721	validation-auc:0.96980	validation-aucpr:0.97460
[38]	validation-logloss:0.27423	validation-auc:0.96974	validation-aucpr:0.97455
[39] validation-logloss:0.27034 validation-auc:0.96996 validation-aucpr:0.97473
[20:48:36] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[40] validation-logloss:0.26749 validation-auc:0.96986 validation-aucpr:0.97467
[20:48:37] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[41] validation-logloss:0.26407 validation-auc:0.96987 validation-aucpr:0.97469
[20:48:37] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[42] validation-logloss:0.26086 validation-auc:0.96992 validation-aucpr:0.97475
[20:48:37] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[43] validation-logloss:0.25847 validation-auc:0.96989 validation-aucpr:0.97467
[20:48:38] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[44] validation-logloss:0.25559 validation-auc:0.96991 validation-aucpr:0.97470
[20:48:38] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[45] validation-logloss:0.25274 validation-auc:0.97005 validation-aucpr:0.97481
[20:48:39] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[46] validation-logloss:0.24985 validation-auc:0.97018 validation-aucpr:0.97490
[20:48:39] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[47] validation-logloss:0.24777 validation-auc:0.97025 validation-aucpr:0.97494
[20:48:40] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[48] validation-logloss:0.24531 validation-auc:0.97038 validation-aucpr:0.97502
[20:48:40] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[49] validation-logloss:0.24299 validation-auc:0.97038 validation-aucpr:0.97501
[20:48:41] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[50] validation-logloss:0.24067 validation-auc:0.97042 validation-aucpr:0.97505
[20:48:41] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[51] validation-logloss:0.23903 validation-auc:0.97029 validation-aucpr:0.97495
[20:48:42] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[52] validation-logloss:0.23716 validation-auc:0.97038 validation-aucpr:0.97501
[20:48:42] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[53] validation-logloss:0.23498 validation-auc:0.97043 validation-aucpr:0.97505
[20:48:42] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[54] validation-logloss:0.23335 validation-auc:0.97046 validation-aucpr:0.97507
[20:48:43] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[55] validation-logloss:0.23191 validation-auc:0.97042 validation-aucpr:0.97504
[20:48:43] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[56] validation-logloss:0.23048 validation-auc:0.97050 validation-aucpr:0.97509
{'best_iteration': '56', 'best_score': '0.9750851646174706'}
Trial 52, Fold 1: Log loss = 0.23047562636104363, Average precision = 0.9750894141029817, ROC-AUC = 0.9705026156302128, Elapsed Time = 22.141144799999893 seconds
Trial 52, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 52, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[verbose training log condensed: iterations 0–56, each interleaved with a repeated XGBoost INFO line ("gbtree.cc:887: drop 0 trees, weight = 1"); validation-logloss fell from 0.66495 to 0.22601, validation-auc rose from 0.95334 to 0.97213, validation-aucpr from 0.95421 to 0.97478]
{'best_iteration': '56', 'best_score': '0.9747847674977682'}
Trial 52, Fold 2: Log loss = 0.22601226216187362, Average precision = 0.9747905876487796, ROC-AUC = 0.9721342506588131, Elapsed Time = 21.478866199999175 seconds
Trial 52, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 52, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[verbose training log condensed: iterations 0–44 shown, each interleaved with a repeated XGBoost INFO line ("gbtree.cc:887: drop 0 trees, weight = 1"); validation-logloss fell from 0.66451 to 0.25138, validation-auc rose from 0.95725 to 0.97146, validation-aucpr from 0.96033 to 0.97557]
[45] validation-logloss:0.24817 validation-auc:0.97170 validation-aucpr:0.97576
[20:49:23] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[46] validation-logloss:0.24577 validation-auc:0.97185 validation-aucpr:0.97585
[20:49:24] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[47] validation-logloss:0.24313 validation-auc:0.97173 validation-aucpr:0.97576
[20:49:24] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[48] validation-logloss:0.24058 validation-auc:0.97174 validation-aucpr:0.97577
[20:49:25] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[49] validation-logloss:0.23829 validation-auc:0.97173 validation-aucpr:0.97575
[20:49:25] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[50] validation-logloss:0.23573 validation-auc:0.97191 validation-aucpr:0.97586
[20:49:26] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[51] validation-logloss:0.23354 validation-auc:0.97192 validation-aucpr:0.97586
[20:49:26] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[52] validation-logloss:0.23201 validation-auc:0.97182 validation-aucpr:0.97576
[20:49:27] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[53] validation-logloss:0.23051 validation-auc:0.97183 validation-aucpr:0.97581
[20:49:27] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[54] validation-logloss:0.22883 validation-auc:0.97183 validation-aucpr:0.97583
[20:49:28] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[55] validation-logloss:0.22679 validation-auc:0.97198 validation-aucpr:0.97596
[20:49:28] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[56] validation-logloss:0.22491 validation-auc:0.97200 validation-aucpr:0.97595
{'best_iteration': '55', 'best_score': '0.975958283888624'}
Trial 52, Fold 3: Log loss = 0.22490842763663127, Average precision = 0.9759580309108026, ROC-AUC = 0.972004386936464, Elapsed Time = 22.18626099999892 seconds
Trial 52, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 52, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.66434 validation-auc:0.95254 validation-aucpr:0.95559
[1] validation-logloss:0.63778 validation-auc:0.96034 validation-aucpr:0.96585
[2] validation-logloss:0.61346 validation-auc:0.96308 validation-aucpr:0.96743
[3] validation-logloss:0.59333 validation-auc:0.96395 validation-aucpr:0.96976
[4] validation-logloss:0.57452 validation-auc:0.96464 validation-aucpr:0.97037
[5] validation-logloss:0.55681 validation-auc:0.96457 validation-aucpr:0.97030
[6] validation-logloss:0.54007 validation-auc:0.96426 validation-aucpr:0.97006
[7] validation-logloss:0.52215 validation-auc:0.96498 validation-aucpr:0.97080
[8] validation-logloss:0.50782 validation-auc:0.96467 validation-aucpr:0.97045
[9] validation-logloss:0.49152 validation-auc:0.96591 validation-aucpr:0.97136
[10] validation-logloss:0.47869 validation-auc:0.96568 validation-aucpr:0.97119
[11] validation-logloss:0.46647 validation-auc:0.96533 validation-aucpr:0.97093
[12] validation-logloss:0.45316 validation-auc:0.96580 validation-aucpr:0.97131
[13] validation-logloss:0.44224 validation-auc:0.96565 validation-aucpr:0.97119
[14] validation-logloss:0.43202 validation-auc:0.96578 validation-aucpr:0.97123
[15] validation-logloss:0.42012 validation-auc:0.96630 validation-aucpr:0.97174
[16] validation-logloss:0.41092 validation-auc:0.96645 validation-aucpr:0.97184
[17] validation-logloss:0.40176 validation-auc:0.96687 validation-aucpr:0.97215
[18] validation-logloss:0.39333 validation-auc:0.96701 validation-aucpr:0.97221
[19] validation-logloss:0.38392 validation-auc:0.96733 validation-aucpr:0.97246
[20] validation-logloss:0.37640 validation-auc:0.96711 validation-aucpr:0.97227
[21] validation-logloss:0.36941 validation-auc:0.96707 validation-aucpr:0.97227
[22] validation-logloss:0.36244 validation-auc:0.96725 validation-aucpr:0.97239
[23] validation-logloss:0.35444 validation-auc:0.96740 validation-aucpr:0.97258
[24] validation-logloss:0.34802 validation-auc:0.96748 validation-aucpr:0.97261
[25] validation-logloss:0.34056 validation-auc:0.96777 validation-aucpr:0.97284
[26] validation-logloss:0.33341 validation-auc:0.96801 validation-aucpr:0.97304
[27] validation-logloss:0.32681 validation-auc:0.96820 validation-aucpr:0.97319
[28] validation-logloss:0.32083 validation-auc:0.96824 validation-aucpr:0.97326
[29] validation-logloss:0.31482 validation-auc:0.96831 validation-aucpr:0.97336
[30] validation-logloss:0.31021 validation-auc:0.96825 validation-aucpr:0.97331
[31] validation-logloss:0.30433 validation-auc:0.96852 validation-aucpr:0.97356
[32] validation-logloss:0.29905 validation-auc:0.96876 validation-aucpr:0.97377
[33] validation-logloss:0.29422 validation-auc:0.96890 validation-aucpr:0.97387
[34] validation-logloss:0.29028 validation-auc:0.96893 validation-aucpr:0.97387
[35] validation-logloss:0.28677 validation-auc:0.96889 validation-aucpr:0.97384
[36] validation-logloss:0.28257 validation-auc:0.96871 validation-aucpr:0.97376
[37] validation-logloss:0.27835 validation-auc:0.96890 validation-aucpr:0.97390
[38] validation-logloss:0.27537 validation-auc:0.96880 validation-aucpr:0.97384
[39] validation-logloss:0.27150 validation-auc:0.96897 validation-aucpr:0.97395
[40] validation-logloss:0.26857 validation-auc:0.96906 validation-aucpr:0.97398
[41] validation-logloss:0.26477 validation-auc:0.96918 validation-aucpr:0.97409
[42] validation-logloss:0.26116 validation-auc:0.96949 validation-aucpr:0.97432
[43] validation-logloss:0.25856 validation-auc:0.96953 validation-aucpr:0.97434
[44] validation-logloss:0.25539 validation-auc:0.96966 validation-aucpr:0.97445
[45] validation-logloss:0.25305 validation-auc:0.96973 validation-aucpr:0.97451
[46] validation-logloss:0.25004 validation-auc:0.96990 validation-aucpr:0.97462
[47] validation-logloss:0.24715 validation-auc:0.97012 validation-aucpr:0.97476
[48] validation-logloss:0.24458 validation-auc:0.97015 validation-aucpr:0.97480
[49] validation-logloss:0.24250 validation-auc:0.97010 validation-aucpr:0.97478
[50] validation-logloss:0.24074 validation-auc:0.97002 validation-aucpr:0.97469
[51] validation-logloss:0.23841 validation-auc:0.97007 validation-aucpr:0.97472
[52] validation-logloss:0.23591 validation-auc:0.97021 validation-aucpr:0.97483
[53] validation-logloss:0.23431 validation-auc:0.97027 validation-aucpr:0.97488
[54] validation-logloss:0.23217 validation-auc:0.97039 validation-aucpr:0.97498
[55] validation-logloss:0.23003 validation-auc:0.97055 validation-aucpr:0.97509
[56] validation-logloss:0.22803 validation-auc:0.97062 validation-aucpr:0.97514
{'best_iteration': '56', 'best_score': '0.9751414398190122'}
Trial 52, Fold 4: Log loss = 0.22802817105373885, Average precision = 0.9751453337884511, ROC-AUC = 0.9706172163923403, Elapsed Time = 26.99545769999895 seconds
Trial 52, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 52, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.66470 validation-auc:0.95401 validation-aucpr:0.95795
[1] validation-logloss:0.64216 validation-auc:0.95653 validation-aucpr:0.96240
[2] validation-logloss:0.61735 validation-auc:0.96173 validation-aucpr:0.96699
[3] validation-logloss:0.59767 validation-auc:0.96169 validation-aucpr:0.96656
[4] validation-logloss:0.57658 validation-auc:0.96331 validation-aucpr:0.96798
[5] validation-logloss:0.55899 validation-auc:0.96339 validation-aucpr:0.96800
[6] validation-logloss:0.53999 validation-auc:0.96417 validation-aucpr:0.96870
[7] validation-logloss:0.52215 validation-auc:0.96490 validation-aucpr:0.96959
[8] validation-logloss:0.50791 validation-auc:0.96415 validation-aucpr:0.96899
[9] validation-logloss:0.49176 validation-auc:0.96486 validation-aucpr:0.96963
[10] validation-logloss:0.47686 validation-auc:0.96543 validation-aucpr:0.97003
[11] validation-logloss:0.46490 validation-auc:0.96474 validation-aucpr:0.96944
[12] validation-logloss:0.45146 validation-auc:0.96512 validation-aucpr:0.96976
[13] validation-logloss:0.43874 validation-auc:0.96543 validation-aucpr:0.97001
[14] validation-logloss:0.42667 validation-auc:0.96567 validation-aucpr:0.97030
[15] validation-logloss:0.41711 validation-auc:0.96563 validation-aucpr:0.97005
[16] validation-logloss:0.40640 validation-auc:0.96605 validation-aucpr:0.97044
[17] validation-logloss:0.39647 validation-auc:0.96621 validation-aucpr:0.97053
[18] validation-logloss:0.38828 validation-auc:0.96607 validation-aucpr:0.97040
[19] validation-logloss:0.37899 validation-auc:0.96642 validation-aucpr:0.97066
[20] validation-logloss:0.37165 validation-auc:0.96624 validation-aucpr:0.97054
[21] validation-logloss:0.36370 validation-auc:0.96633 validation-aucpr:0.97063
[22] validation-logloss:0.35726 validation-auc:0.96638 validation-aucpr:0.97059
[23] validation-logloss:0.35071 validation-auc:0.96662 validation-aucpr:0.97068
[24] validation-logloss:0.34446 validation-auc:0.96705 validation-aucpr:0.97098
[25] validation-logloss:0.33908 validation-auc:0.96713 validation-aucpr:0.97165
[26] validation-logloss:0.33343 validation-auc:0.96719 validation-aucpr:0.97167
[27] validation-logloss:0.32804 validation-auc:0.96735 validation-aucpr:0.97180
[28] validation-logloss:0.32201 validation-auc:0.96756 validation-aucpr:0.97206
[29] validation-logloss:0.31724 validation-auc:0.96751 validation-aucpr:0.97206
[30] validation-logloss:0.31139 validation-auc:0.96777 validation-aucpr:0.97230
[31] validation-logloss:0.30604 validation-auc:0.96790 validation-aucpr:0.97239
[32] validation-logloss:0.30090 validation-auc:0.96817 validation-aucpr:0.97256
[33] validation-logloss:0.29658 validation-auc:0.96820 validation-aucpr:0.97259
[34] validation-logloss:0.29278 validation-auc:0.96830 validation-aucpr:0.97267
[35] validation-logloss:0.28913 validation-auc:0.96824 validation-aucpr:0.97263
[36] validation-logloss:0.28489 validation-auc:0.96828 validation-aucpr:0.97269
[37] validation-logloss:0.28071 validation-auc:0.96844 validation-aucpr:0.97281
[38] validation-logloss:0.27669 validation-auc:0.96870 validation-aucpr:0.97299
[39] validation-logloss:0.27340 validation-auc:0.96879 validation-aucpr:0.97303
[40] validation-logloss:0.27059 validation-auc:0.96874 validation-aucpr:0.97298
[41] validation-logloss:0.26703 validation-auc:0.96889 validation-aucpr:0.97308
[42] validation-logloss:0.26359 validation-auc:0.96904 validation-aucpr:0.97318
[43] validation-logloss:0.26025 validation-auc:0.96928 validation-aucpr:0.97338
[44] validation-logloss:0.25720 validation-auc:0.96940 validation-aucpr:0.97345
[45] validation-logloss:0.25407 validation-auc:0.96970 validation-aucpr:0.97368
[46] validation-logloss:0.25126 validation-auc:0.96990 validation-aucpr:0.97384
[47] validation-logloss:0.24862 validation-auc:0.97006 validation-aucpr:0.97393
[48] validation-logloss:0.24605 validation-auc:0.97017 validation-aucpr:0.97399
[49] validation-logloss:0.24363 validation-auc:0.97026 validation-aucpr:0.97403
[50] validation-logloss:0.24133 validation-auc:0.97037 validation-aucpr:0.97409
[51] validation-logloss:0.23914 validation-auc:0.97045 validation-aucpr:0.97413
[52] validation-logloss:0.23750 validation-auc:0.97051 validation-aucpr:0.97417
[20:50:17] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[53] validation-logloss:0.23558 validation-auc:0.97052 validation-aucpr:0.97418
[20:50:17] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[54] validation-logloss:0.23412 validation-auc:0.97055 validation-aucpr:0.97421
[20:50:17] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[55] validation-logloss:0.23225 validation-auc:0.97068 validation-aucpr:0.97429
[20:50:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[56] validation-logloss:0.23030 validation-auc:0.97080 validation-aucpr:0.97440
{'best_iteration': '56', 'best_score': '0.9744025313585415'}
Trial 52, Fold 5: Log loss = 0.23029926692140149, Average precision = 0.9744071415557151, ROC-AUC = 0.9707958307786633, Elapsed Time = 21.70401050000146 seconds
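The per-fold summary lines above report log loss, average precision (the PR-curve area, matching `validation-aucpr`), and ROC-AUC on the held-out fold. A minimal sketch of how such values can be computed from validation predictions with scikit-learn — the tiny `y_val`/`p_val` arrays below are illustrative toys, not data from this run:

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

# Toy validation fold: true labels and predicted P(class=1)
y_val = np.array([0, 0, 1, 1])
p_val = np.array([0.1, 0.4, 0.35, 0.8])

lloss = log_loss(y_val, p_val)              # penalizes confident wrong predictions
ap = average_precision_score(y_val, p_val)  # area under the precision-recall curve
auc = roc_auc_score(y_val, p_val)           # area under the ROC curve

print(f"Log loss = {lloss:.4f}, Average precision = {ap:.4f}, ROC-AUC = {auc:.4f}")
```

Tracking all three per fold, as the log does, guards against a model that ranks well (high AUC) but is poorly calibrated (high log loss).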
Optimization Progress: 53% | 53/100 [2:51:30<1:09:43, 89.01s/it]
Trial 53, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 53, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.67632 validation-auc:0.94589 validation-aucpr:0.94841
...
[46] validation-logloss:0.33733 validation-auc:0.96554 validation-aucpr:0.96927
{'best_iteration': '29', 'best_score': '0.9700769090840297'}
Trial 53, Fold 1: Log loss = 0.33733213328401895, Average precision = 0.9693284708448289, ROC-AUC = 0.9655407298671752, Elapsed Time = 14.781939100001182 seconds
Trial 53, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 53, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.67573 validation-auc:0.94920 validation-aucpr:0.95315
...
[46] validation-logloss:0.33439 validation-auc:0.96850 validation-aucpr:0.97188
{'best_iteration': '46', 'best_score': '0.9718843925984001'}
Trial 53, Fold 2: Log loss = 0.3343866096678563, Average precision = 0.971889660200405, ROC-AUC = 0.9685029261525473, Elapsed Time = 14.036364999999932 seconds
Trial 53, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 53, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[20:51:02] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[0] validation-logloss:0.67775 validation-auc:0.92623 validation-aucpr:0.93190
[20:51:02] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[1] validation-logloss:0.66221 validation-auc:0.94945 validation-aucpr:0.95479
[20:51:02] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[2] validation-logloss:0.64725 validation-auc:0.95836 validation-aucpr:0.96343
[20:51:02] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[3] validation-logloss:0.63406 validation-auc:0.96058 validation-aucpr:0.96602
[20:51:02] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[4] validation-logloss:0.62068 validation-auc:0.96274 validation-aucpr:0.96742
[5]	validation-logloss:0.60889	validation-auc:0.96230	validation-aucpr:0.96737
[6]	validation-logloss:0.59709	validation-auc:0.96258	validation-aucpr:0.96765
[7]	validation-logloss:0.58609	validation-auc:0.96274	validation-aucpr:0.96778
[8]	validation-logloss:0.57535	validation-auc:0.96206	validation-aucpr:0.96721
[9]	validation-logloss:0.56527	validation-auc:0.96183	validation-aucpr:0.96717
[10]	validation-logloss:0.55486	validation-auc:0.96244	validation-aucpr:0.96783
[11]	validation-logloss:0.54568	validation-auc:0.96230	validation-aucpr:0.96775
[12]	validation-logloss:0.53458	validation-auc:0.96357	validation-aucpr:0.96891
[13]	validation-logloss:0.52590	validation-auc:0.96373	validation-aucpr:0.96895
[14]	validation-logloss:0.51759	validation-auc:0.96366	validation-aucpr:0.96882
[15]	validation-logloss:0.50950	validation-auc:0.96373	validation-aucpr:0.96875
[16]	validation-logloss:0.50155	validation-auc:0.96383	validation-aucpr:0.96887
[17]	validation-logloss:0.49388	validation-auc:0.96388	validation-aucpr:0.96882
[18]	validation-logloss:0.48680	validation-auc:0.96368	validation-aucpr:0.96865
[19]	validation-logloss:0.47834	validation-auc:0.96425	validation-aucpr:0.96919
[20]	validation-logloss:0.47048	validation-auc:0.96460	validation-aucpr:0.96943
[21]	validation-logloss:0.46376	validation-auc:0.96452	validation-aucpr:0.96929
[22]	validation-logloss:0.45724	validation-auc:0.96463	validation-aucpr:0.96936
[23]	validation-logloss:0.44999	validation-auc:0.96484	validation-aucpr:0.96961
[24]	validation-logloss:0.44236	validation-auc:0.96514	validation-aucpr:0.96994
[25]	validation-logloss:0.43507	validation-auc:0.96544	validation-aucpr:0.97022
[26]	validation-logloss:0.42793	validation-auc:0.96577	validation-aucpr:0.97052
[27]	validation-logloss:0.42094	validation-auc:0.96605	validation-aucpr:0.97075
[28]	validation-logloss:0.41535	validation-auc:0.96615	validation-aucpr:0.97086
[29]	validation-logloss:0.40925	validation-auc:0.96639	validation-aucpr:0.97108
[30]	validation-logloss:0.40426	validation-auc:0.96649	validation-aucpr:0.97109
[31]	validation-logloss:0.39912	validation-auc:0.96651	validation-aucpr:0.97110
[32]	validation-logloss:0.39422	validation-auc:0.96650	validation-aucpr:0.97107
[33]	validation-logloss:0.38946	validation-auc:0.96661	validation-aucpr:0.97114
[34]	validation-logloss:0.38462	validation-auc:0.96681	validation-aucpr:0.97127
[35]	validation-logloss:0.38028	validation-auc:0.96675	validation-aucpr:0.97125
[36]	validation-logloss:0.37623	validation-auc:0.96664	validation-aucpr:0.97110
[37]	validation-logloss:0.37126	validation-auc:0.96679	validation-aucpr:0.97125
[38]	validation-logloss:0.36700	validation-auc:0.96691	validation-aucpr:0.97134
[39]	validation-logloss:0.36305	validation-auc:0.96708	validation-aucpr:0.97148
[40]	validation-logloss:0.35924	validation-auc:0.96716	validation-aucpr:0.97152
[41]	validation-logloss:0.35566	validation-auc:0.96724	validation-aucpr:0.97160
[42]	validation-logloss:0.35080	validation-auc:0.96744	validation-aucpr:0.97175
[43]	validation-logloss:0.34738	validation-auc:0.96753	validation-aucpr:0.97191
[44]	validation-logloss:0.34311	validation-auc:0.96775	validation-aucpr:0.97211
[45]	validation-logloss:0.33961	validation-auc:0.96791	validation-aucpr:0.97223
[46]	validation-logloss:0.33617	validation-auc:0.96799	validation-aucpr:0.97231
{'best_iteration': '46', 'best_score': '0.9723141068097029'}
Trial 53, Fold 3: Log loss = 0.3361675740453154, Average precision = 0.9723190719744764, ROC-AUC = 0.9679867404724987, Elapsed Time = 14.098342500001309 seconds
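The per-fold summary lines above report log loss, average precision, and ROC-AUC on the fold's validation split. As a minimal sketch of how such numbers are typically derived from predicted probabilities with scikit-learn (toy labels and probabilities here are illustrative, not from the actual run):

```python
# Hedged sketch: computing fold-level metrics from out-of-fold predicted
# probabilities, using the same sklearn.metrics functions imported by this
# notebook. y_val / p_val are toy placeholders, not real fold data.
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

y_val = np.array([0, 1, 1, 0])           # fold validation labels (toy)
p_val = np.array([0.1, 0.9, 0.8, 0.3])   # predicted P(y = 1) from the booster

ll = log_loss(y_val, p_val)              # penalizes confident wrong predictions
ap = average_precision_score(y_val, p_val)  # area under the PR curve
auc = roc_auc_score(y_val, p_val)        # ranking quality of the scores

print(f"Log loss = {ll:.4f}, Average precision = {ap:.4f}, ROC-AUC = {auc:.4f}")
```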
Trial 53, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 53, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0]	validation-logloss:0.67702	validation-auc:0.93491	validation-aucpr:0.93034
[1]	validation-logloss:0.66280	validation-auc:0.94355	validation-aucpr:0.95187
[2]	validation-logloss:0.64932	validation-auc:0.95139	validation-aucpr:0.95804
[3]	validation-logloss:0.63610	validation-auc:0.95323	validation-aucpr:0.95974
[4]	validation-logloss:0.62162	validation-auc:0.95866	validation-aucpr:0.96565
[5]	validation-logloss:0.60771	validation-auc:0.96075	validation-aucpr:0.96751
[6]	validation-logloss:0.59461	validation-auc:0.96233	validation-aucpr:0.96874
[7]	validation-logloss:0.58410	validation-auc:0.96215	validation-aucpr:0.96858
[8]	validation-logloss:0.57353	validation-auc:0.96217	validation-aucpr:0.96850
[9]	validation-logloss:0.56189	validation-auc:0.96257	validation-aucpr:0.96891
[10]	validation-logloss:0.55199	validation-auc:0.96250	validation-aucpr:0.96883
[11]	validation-logloss:0.54260	validation-auc:0.96269	validation-aucpr:0.96889
[12]	validation-logloss:0.53381	validation-auc:0.96283	validation-aucpr:0.96897
[13]	validation-logloss:0.52492	validation-auc:0.96280	validation-aucpr:0.96891
[14]	validation-logloss:0.51647	validation-auc:0.96279	validation-aucpr:0.96880
[15]	validation-logloss:0.50796	validation-auc:0.96282	validation-aucpr:0.96876
[16]	validation-logloss:0.49884	validation-auc:0.96301	validation-aucpr:0.96916
[17]	validation-logloss:0.49076	validation-auc:0.96281	validation-aucpr:0.96905
[18]	validation-logloss:0.48326	validation-auc:0.96272	validation-aucpr:0.96896
[19]	validation-logloss:0.47504	validation-auc:0.96276	validation-aucpr:0.96906
[20]	validation-logloss:0.46706	validation-auc:0.96269	validation-aucpr:0.96905
[21]	validation-logloss:0.45952	validation-auc:0.96323	validation-aucpr:0.96948
[22]	validation-logloss:0.45246	validation-auc:0.96361	validation-aucpr:0.96978
[23]	validation-logloss:0.44570	validation-auc:0.96380	validation-aucpr:0.96992
[24]	validation-logloss:0.43919	validation-auc:0.96385	validation-aucpr:0.96995
[25]	validation-logloss:0.43360	validation-auc:0.96398	validation-aucpr:0.97000
[26]	validation-logloss:0.42658	validation-auc:0.96427	validation-aucpr:0.97027
[27]	validation-logloss:0.42000	validation-auc:0.96432	validation-aucpr:0.97036
[28]	validation-logloss:0.41452	validation-auc:0.96465	validation-aucpr:0.97057
[29]	validation-logloss:0.40892	validation-auc:0.96460	validation-aucpr:0.97052
[30]	validation-logloss:0.40397	validation-auc:0.96454	validation-aucpr:0.97046
[31]	validation-logloss:0.39767	validation-auc:0.96497	validation-aucpr:0.97086
[32]	validation-logloss:0.39332	validation-auc:0.96462	validation-aucpr:0.97059
[33]	validation-logloss:0.38872	validation-auc:0.96480	validation-aucpr:0.97067
[34]	validation-logloss:0.38339	validation-auc:0.96496	validation-aucpr:0.97081
[35]	validation-logloss:0.37866	validation-auc:0.96503	validation-aucpr:0.97092
[36]	validation-logloss:0.37347	validation-auc:0.96516	validation-aucpr:0.97106
[37]	validation-logloss:0.36946	validation-auc:0.96513	validation-aucpr:0.97101
[38]	validation-logloss:0.36529	validation-auc:0.96510	validation-aucpr:0.97099
[39]	validation-logloss:0.36120	validation-auc:0.96525	validation-aucpr:0.97108
[40]	validation-logloss:0.35756	validation-auc:0.96525	validation-aucpr:0.97106
[41]	validation-logloss:0.35406	validation-auc:0.96516	validation-aucpr:0.97098
[42]	validation-logloss:0.35051	validation-auc:0.96523	validation-aucpr:0.97104
[43]	validation-logloss:0.34581	validation-auc:0.96560	validation-aucpr:0.97133
[44]	validation-logloss:0.34210	validation-auc:0.96570	validation-aucpr:0.97138
[45]	validation-logloss:0.33803	validation-auc:0.96585	validation-aucpr:0.97154
[46]	validation-logloss:0.33491	validation-auc:0.96588	validation-aucpr:0.97157
{'best_iteration': '46', 'best_score': '0.9715733178238387'}
Trial 53, Fold 4: Log loss = 0.3349134832867913, Average precision = 0.9715754250844988, ROC-AUC = 0.9658764136569453, Elapsed Time = 14.17093129999921 seconds
Trial 53, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 53, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[20:51:32] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[0] validation-logloss:0.67761 validation-auc:0.92559 validation-aucpr:0.93521
[20:51:32] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[1] validation-logloss:0.66160 validation-auc:0.94911 validation-aucpr:0.95521
[20:51:32] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[2] validation-logloss:0.64580 validation-auc:0.95579 validation-aucpr:0.96039
[20:51:32] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[3] validation-logloss:0.63119 validation-auc:0.95911 validation-aucpr:0.96446
[20:51:32] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[4] validation-logloss:0.61837 validation-auc:0.96096 validation-aucpr:0.96666
[20:51:32] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[5] validation-logloss:0.60470 validation-auc:0.96196 validation-aucpr:0.96778
[20:51:32] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[6] validation-logloss:0.59281 validation-auc:0.96181 validation-aucpr:0.96750
[20:51:32] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[7] validation-logloss:0.58173 validation-auc:0.96163 validation-aucpr:0.96734
[20:51:32] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[8] validation-logloss:0.57113 validation-auc:0.96146 validation-aucpr:0.96730
[20:51:32] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[9] validation-logloss:0.56089 validation-auc:0.96106 validation-aucpr:0.96693
[20:51:32] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[10] validation-logloss:0.55140 validation-auc:0.96111 validation-aucpr:0.96690
[20:51:33] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[11] validation-logloss:0.54228 validation-auc:0.96119 validation-aucpr:0.96697
[20:51:33] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[12] validation-logloss:0.53211 validation-auc:0.96151 validation-aucpr:0.96735
[20:51:33] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[13] validation-logloss:0.52333 validation-auc:0.96161 validation-aucpr:0.96735
[20:51:33] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[14] validation-logloss:0.51356 validation-auc:0.96157 validation-aucpr:0.96731
[20:51:33] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[15] validation-logloss:0.50541 validation-auc:0.96162 validation-aucpr:0.96732
[20:51:33] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[16] validation-logloss:0.49766 validation-auc:0.96148 validation-aucpr:0.96727
[20:51:33] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[17] validation-logloss:0.48856 validation-auc:0.96187 validation-aucpr:0.96758
[20:51:34] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[18] validation-logloss:0.48063 validation-auc:0.96190 validation-aucpr:0.96770
[20:51:34] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[19] validation-logloss:0.47375 validation-auc:0.96196 validation-aucpr:0.96765
[20:51:34] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[20] validation-logloss:0.46592 validation-auc:0.96221 validation-aucpr:0.96809
[20:51:34] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[21] validation-logloss:0.45840 validation-auc:0.96218 validation-aucpr:0.96814
[20:51:34] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[22] validation-logloss:0.45195 validation-auc:0.96233 validation-aucpr:0.96821
[20:51:34] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[23] validation-logloss:0.44622 validation-auc:0.96239 validation-aucpr:0.96822
[20:51:35] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[24] validation-logloss:0.44048 validation-auc:0.96234 validation-aucpr:0.96815
[20:51:35] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[25] validation-logloss:0.43470 validation-auc:0.96242 validation-aucpr:0.96817
[20:51:35] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[26] validation-logloss:0.42863 validation-auc:0.96268 validation-aucpr:0.96836
[20:51:35] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[27] validation-logloss:0.42240 validation-auc:0.96286 validation-aucpr:0.96852
[20:51:36] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[28] validation-logloss:0.41708 validation-auc:0.96300 validation-aucpr:0.96869
[20:51:36] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[29] validation-logloss:0.41209 validation-auc:0.96303 validation-aucpr:0.96867
[20:51:36] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[30] validation-logloss:0.40701 validation-auc:0.96324 validation-aucpr:0.96880
[20:51:36] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[31] validation-logloss:0.40188 validation-auc:0.96314 validation-aucpr:0.96867
[20:51:37] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[32] validation-logloss:0.39720 validation-auc:0.96308 validation-aucpr:0.96862
[20:51:37] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[33] validation-logloss:0.39304 validation-auc:0.96298 validation-aucpr:0.96856
[20:51:37] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[34] validation-logloss:0.38861 validation-auc:0.96301 validation-aucpr:0.96856
[20:51:37] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[35] validation-logloss:0.38322 validation-auc:0.96321 validation-aucpr:0.96870
[20:51:37] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[36] validation-logloss:0.37904 validation-auc:0.96332 validation-aucpr:0.96877
[20:51:38] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[37] validation-logloss:0.37502 validation-auc:0.96350 validation-aucpr:0.96890
[20:51:38] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[38] validation-logloss:0.37055 validation-auc:0.96355 validation-aucpr:0.96892
[39] validation-logloss:0.36638 validation-auc:0.96373 validation-aucpr:0.96907
[40] validation-logloss:0.36252 validation-auc:0.96369 validation-aucpr:0.96903
[41] validation-logloss:0.35847 validation-auc:0.96383 validation-aucpr:0.96910
[42] validation-logloss:0.35410 validation-auc:0.96389 validation-aucpr:0.96917
[43] validation-logloss:0.35066 validation-auc:0.96391 validation-aucpr:0.96918
[44] validation-logloss:0.34746 validation-auc:0.96392 validation-aucpr:0.96917
[45] validation-logloss:0.34420 validation-auc:0.96404 validation-aucpr:0.96918
[46] validation-logloss:0.34149 validation-auc:0.96385 validation-aucpr:0.96885
{'best_iteration': '43', 'best_score': '0.9691820538568151'}
Trial 53, Fold 5: Log loss = 0.34148940877067663, Average precision = 0.9688621797793956, ROC-AUC = 0.9638521821526114, Elapsed Time = 16.684623099998134 seconds
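Each fold summary line above reports three scores on the held-out fold: log loss, average precision, and ROC-AUC. A minimal sketch of how such a line can be produced with `sklearn.metrics` (already imported in this notebook); the `y_true`/`y_prob` values here are illustrative toy data, not the notebook's predictions:

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

# Toy fold labels and predicted P(class=1) -- illustrative only.
y_true = np.array([0, 0, 1, 1])
y_prob = np.array([0.1, 0.4, 0.35, 0.8])

# The three metrics reported per fold in the log above.
lloss = log_loss(y_true, y_prob)
ap = average_precision_score(y_true, y_prob)
auc = roc_auc_score(y_true, y_prob)
print(f"Log loss = {lloss}, Average precision = {ap}, ROC-AUC = {auc}")
```

All three functions take the positive-class probability, not a hard label, which is why the logged values track the `validation-logloss` / `validation-aucpr` / `validation-auc` columns of the XGBoost eval output.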
Optimization Progress: 54%|#####4 | 54/100 [2:52:52<1:06:36, 86.88s/it]
Trial 54, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 54, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
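The fold header lines above log the split size, per-class counts, and the 0/1 imbalance ratio. A stdlib sketch of producing such a line with `collections.Counter` (the label list below is reconstructed from the logged counts, purely for illustration):

```python
from collections import Counter

def fold_summary(name, y):
    """Format split size, class counts, and the 0/1 imbalance ratio."""
    counts = Counter(y)
    ratio = counts[0] / counts[1]
    return (f"{name} size = {len(y)} where 0 = {counts[0]}, "
            f"1 = {counts[1]}, 0/1 = {ratio}")

# Labels reconstructed from the logged counts (illustrative).
y_val = [0] * 2592 + [1] * 2583
print(fold_summary("Trial 54, Fold 1: Validation", y_val))
```

A ratio close to 1.0 on every fold indicates the stratified splitter is keeping the classes balanced within each train/validation partition.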
[0] validation-logloss:0.67402 validation-auc:0.92686 validation-aucpr:0.90773
[1] validation-logloss:0.65665 validation-auc:0.95221 validation-aucpr:0.95385
[2] validation-logloss:0.63894 validation-auc:0.95776 validation-aucpr:0.95910
[3] validation-logloss:0.62399 validation-auc:0.95890 validation-aucpr:0.96099
[4] validation-logloss:0.60903 validation-auc:0.96046 validation-aucpr:0.96456
[5] validation-logloss:0.59579 validation-auc:0.96061 validation-aucpr:0.96533
[6] validation-logloss:0.58195 validation-auc:0.96187 validation-aucpr:0.96646
[7] validation-logloss:0.56959 validation-auc:0.96171 validation-aucpr:0.96615
[8] validation-logloss:0.55747 validation-auc:0.96237 validation-aucpr:0.96731
[9] validation-logloss:0.54655 validation-auc:0.96174 validation-aucpr:0.96680
[10] validation-logloss:0.53579 validation-auc:0.96194 validation-aucpr:0.96694
[11] validation-logloss:0.52545 validation-auc:0.96189 validation-aucpr:0.96689
[12] validation-logloss:0.51497 validation-auc:0.96227 validation-aucpr:0.96743
[13] validation-logloss:0.50479 validation-auc:0.96266 validation-aucpr:0.96770
[14] validation-logloss:0.49390 validation-auc:0.96352 validation-aucpr:0.96877
[15] validation-logloss:0.48327 validation-auc:0.96425 validation-aucpr:0.96962
[16] validation-logloss:0.47353 validation-auc:0.96441 validation-aucpr:0.96985
[17] validation-logloss:0.46411 validation-auc:0.96471 validation-aucpr:0.97013
[18] validation-logloss:0.45530 validation-auc:0.96477 validation-aucpr:0.97030
[19] validation-logloss:0.44707 validation-auc:0.96487 validation-aucpr:0.97041
[20] validation-logloss:0.43945 validation-auc:0.96502 validation-aucpr:0.97049
[21] validation-logloss:0.43232 validation-auc:0.96496 validation-aucpr:0.97043
[22] validation-logloss:0.42529 validation-auc:0.96526 validation-aucpr:0.97059
[23] validation-logloss:0.41860 validation-auc:0.96543 validation-aucpr:0.97067
[24] validation-logloss:0.41243 validation-auc:0.96546 validation-aucpr:0.97066
[25] validation-logloss:0.40589 validation-auc:0.96556 validation-aucpr:0.97073
[26] validation-logloss:0.39891 validation-auc:0.96580 validation-aucpr:0.97098
[27] validation-logloss:0.39224 validation-auc:0.96599 validation-aucpr:0.97113
[28] validation-logloss:0.38693 validation-auc:0.96606 validation-aucpr:0.97118
[29] validation-logloss:0.38079 validation-auc:0.96613 validation-aucpr:0.97126
[30] validation-logloss:0.37564 validation-auc:0.96623 validation-aucpr:0.97135
[31] validation-logloss:0.36995 validation-auc:0.96627 validation-aucpr:0.97141
[32] validation-logloss:0.36535 validation-auc:0.96634 validation-aucpr:0.97147
[33] validation-logloss:0.36060 validation-auc:0.96638 validation-aucpr:0.97150
[34] validation-logloss:0.35643 validation-auc:0.96647 validation-aucpr:0.97163
[35] validation-logloss:0.35224 validation-auc:0.96636 validation-aucpr:0.97161
[36] validation-logloss:0.34808 validation-auc:0.96641 validation-aucpr:0.97160
[37] validation-logloss:0.34454 validation-auc:0.96621 validation-aucpr:0.97141
[38] validation-logloss:0.34096 validation-auc:0.96613 validation-aucpr:0.97133
[39] validation-logloss:0.33738 validation-auc:0.96614 validation-aucpr:0.97133
[40] validation-logloss:0.33319 validation-auc:0.96621 validation-aucpr:0.97140
[41] validation-logloss:0.32894 validation-auc:0.96638 validation-aucpr:0.97152
[42] validation-logloss:0.32551 validation-auc:0.96655 validation-aucpr:0.97162
[43] validation-logloss:0.32204 validation-auc:0.96669 validation-aucpr:0.97171
[44] validation-logloss:0.31897 validation-auc:0.96662 validation-aucpr:0.97167
[45] validation-logloss:0.31496 validation-auc:0.96672 validation-aucpr:0.97174
[46] validation-logloss:0.31183 validation-auc:0.96677 validation-aucpr:0.97180
[47] validation-logloss:0.30899 validation-auc:0.96687 validation-aucpr:0.97186
[48] validation-logloss:0.30575 validation-auc:0.96704 validation-aucpr:0.97200
[49] validation-logloss:0.30211 validation-auc:0.96718 validation-aucpr:0.97211
[50] validation-logloss:0.29988 validation-auc:0.96712 validation-aucpr:0.97195
[51] validation-logloss:0.29751 validation-auc:0.96704 validation-aucpr:0.97184
[52] validation-logloss:0.29482 validation-auc:0.96716 validation-aucpr:0.97199
[53] validation-logloss:0.29237 validation-auc:0.96727 validation-aucpr:0.97207
[54] validation-logloss:0.28972 validation-auc:0.96727 validation-aucpr:0.97208
[55] validation-logloss:0.28748 validation-auc:0.96730 validation-aucpr:0.97232
[56] validation-logloss:0.28528 validation-auc:0.96724 validation-aucpr:0.97229
[57] validation-logloss:0.28231 validation-auc:0.96741 validation-aucpr:0.97244
[58] validation-logloss:0.28016 validation-auc:0.96743 validation-aucpr:0.97246
[59] validation-logloss:0.27825 validation-auc:0.96747 validation-aucpr:0.97248
[60] validation-logloss:0.27554 validation-auc:0.96757 validation-aucpr:0.97258
[61] validation-logloss:0.27311 validation-auc:0.96760 validation-aucpr:0.97263
[62] validation-logloss:0.27145 validation-auc:0.96753 validation-aucpr:0.97261
[63] validation-logloss:0.26973 validation-auc:0.96753 validation-aucpr:0.97260
[64] validation-logloss:0.26822 validation-auc:0.96754 validation-aucpr:0.97259
[65] validation-logloss:0.26658 validation-auc:0.96749 validation-aucpr:0.97254
[66] validation-logloss:0.26507 validation-auc:0.96743 validation-aucpr:0.97249
[67] validation-logloss:0.26318 validation-auc:0.96752 validation-aucpr:0.97259
[68] validation-logloss:0.26098 validation-auc:0.96762 validation-aucpr:0.97268
[69] validation-logloss:0.25954 validation-auc:0.96758 validation-aucpr:0.97265
[70] validation-logloss:0.25808 validation-auc:0.96765 validation-aucpr:0.97267
[71] validation-logloss:0.25661 validation-auc:0.96774 validation-aucpr:0.97271
[72] validation-logloss:0.25534 validation-auc:0.96774 validation-aucpr:0.97271
[73] validation-logloss:0.25401 validation-auc:0.96777 validation-aucpr:0.97273
[74] validation-logloss:0.25269 validation-auc:0.96784 validation-aucpr:0.97278
[75] validation-logloss:0.25129 validation-auc:0.96791 validation-aucpr:0.97282
[76] validation-logloss:0.24953 validation-auc:0.96804 validation-aucpr:0.97296
[77] validation-logloss:0.24759 validation-auc:0.96817 validation-aucpr:0.97308
[78] validation-logloss:0.24646 validation-auc:0.96813 validation-aucpr:0.97309
[79] validation-logloss:0.24471 validation-auc:0.96824 validation-aucpr:0.97319
[80] validation-logloss:0.24334 validation-auc:0.96824 validation-aucpr:0.97320
[81] validation-logloss:0.24208 validation-auc:0.96828 validation-aucpr:0.97325
[82] validation-logloss:0.24058 validation-auc:0.96835 validation-aucpr:0.97332
[83] validation-logloss:0.23912 validation-auc:0.96839 validation-aucpr:0.97337
[84] validation-logloss:0.23751 validation-auc:0.96854 validation-aucpr:0.97350
[85] validation-logloss:0.23629 validation-auc:0.96856 validation-aucpr:0.97352
[86] validation-logloss:0.23534 validation-auc:0.96866 validation-aucpr:0.97359
[87] validation-logloss:0.23437 validation-auc:0.96866 validation-aucpr:0.97361
[88] validation-logloss:0.23289 validation-auc:0.96879 validation-aucpr:0.97375
[89] validation-logloss:0.23202 validation-auc:0.96887 validation-aucpr:0.97380
[90] validation-logloss:0.23120 validation-auc:0.96888 validation-aucpr:0.97379
[91] validation-logloss:0.23058 validation-auc:0.96886 validation-aucpr:0.97378
[92] validation-logloss:0.22989 validation-auc:0.96886 validation-aucpr:0.97376
[93] validation-logloss:0.22904 validation-auc:0.96883 validation-aucpr:0.97373
[94] validation-logloss:0.22831 validation-auc:0.96885 validation-aucpr:0.97375
[95] validation-logloss:0.22726 validation-auc:0.96888 validation-aucpr:0.97380
[96] validation-logloss:0.22649 validation-auc:0.96887 validation-aucpr:0.97379
[97] validation-logloss:0.22585 validation-auc:0.96889 validation-aucpr:0.97380
{'best_iteration': '89', 'best_score': '0.9738037843926355'}
Trial 54, Fold 1: Log loss = 0.22584576867637168, Average precision = 0.973802441655735, ROC-AUC = 0.9688949559799831, Elapsed Time = 16.8418285000007 seconds
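The `{'best_iteration': '89', 'best_score': '0.9738…'}` dicts printed above have string values, because XGBoost stores booster attributes as strings. A minimal sketch of casting them before reuse (the dict literal below copies the fold-1 values from the log; the variable names are my own):

```python
# Booster attributes arrive as strings; cast before arithmetic use.
attrs = {'best_iteration': '89', 'best_score': '0.9738037843926355'}

best_iteration = int(attrs['best_iteration'])
best_score = float(attrs['best_score'])
print(best_iteration, best_score)
```

Casting up front avoids subtle bugs later, e.g. comparing `'89' < '100'` lexicographically when ranking folds.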
Trial 54, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 54, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.67522 validation-auc:0.92656 validation-aucpr:0.89492
[1] validation-logloss:0.65811 validation-auc:0.94887 validation-aucpr:0.94476
[2] validation-logloss:0.64196 validation-auc:0.95633 validation-aucpr:0.95859
[3] validation-logloss:0.62605 validation-auc:0.95986 validation-aucpr:0.96390
[4] validation-logloss:0.61026 validation-auc:0.96282 validation-aucpr:0.96695
[5] validation-logloss:0.59439 validation-auc:0.96479 validation-aucpr:0.96927
[6] validation-logloss:0.58141 validation-auc:0.96488 validation-aucpr:0.96929
[7] validation-logloss:0.56873 validation-auc:0.96495 validation-aucpr:0.96938
[8] validation-logloss:0.55504 validation-auc:0.96617 validation-aucpr:0.97030
[9] validation-logloss:0.54171 validation-auc:0.96669 validation-aucpr:0.97081
[10] validation-logloss:0.53095 validation-auc:0.96673 validation-aucpr:0.97080
[11] validation-logloss:0.52067 validation-auc:0.96666 validation-aucpr:0.97057
[12] validation-logloss:0.50936 validation-auc:0.96696 validation-aucpr:0.97090
[13] validation-logloss:0.49810 validation-auc:0.96694 validation-aucpr:0.97101
[14] validation-logloss:0.48886 validation-auc:0.96691 validation-aucpr:0.97103
[15] validation-logloss:0.47836 validation-auc:0.96717 validation-aucpr:0.97131
[16] validation-logloss:0.46830 validation-auc:0.96758 validation-aucpr:0.97168
[17] validation-logloss:0.45953 validation-auc:0.96769 validation-aucpr:0.97175
[18] validation-logloss:0.45163 validation-auc:0.96769 validation-aucpr:0.97171
[19] validation-logloss:0.44421 validation-auc:0.96763 validation-aucpr:0.97164
[20] validation-logloss:0.43654 validation-auc:0.96762 validation-aucpr:0.97168
[21] validation-logloss:0.42960 validation-auc:0.96755 validation-aucpr:0.97158
[22] validation-logloss:0.42133 validation-auc:0.96785 validation-aucpr:0.97188
[23] validation-logloss:0.41472 validation-auc:0.96794 validation-aucpr:0.97190
[24] validation-logloss:0.40851 validation-auc:0.96808 validation-aucpr:0.97198
[25] validation-logloss:0.40124 validation-auc:0.96843 validation-aucpr:0.97231
[26] validation-logloss:0.39539 validation-auc:0.96838 validation-aucpr:0.97225
[27] validation-logloss:0.38944 validation-auc:0.96842 validation-aucpr:0.97226
[28] validation-logloss:0.38287 validation-auc:0.96860 validation-aucpr:0.97215
[29] validation-logloss:0.37775 validation-auc:0.96844 validation-aucpr:0.97207
[30] validation-logloss:0.37171 validation-auc:0.96849 validation-aucpr:0.97210
[31] validation-logloss:0.36712 validation-auc:0.96843 validation-aucpr:0.97201
[32] validation-logloss:0.36136 validation-auc:0.96856 validation-aucpr:0.97214
[33] validation-logloss:0.35663 validation-auc:0.96851 validation-aucpr:0.97209
[34] validation-logloss:0.35220 validation-auc:0.96842 validation-aucpr:0.97202
[35] validation-logloss:0.34799 validation-auc:0.96845 validation-aucpr:0.97200
[36] validation-logloss:0.34329 validation-auc:0.96840 validation-aucpr:0.97186
[37] validation-logloss:0.33920 validation-auc:0.96850 validation-aucpr:0.97192
[38] validation-logloss:0.33552 validation-auc:0.96848 validation-aucpr:0.97188
[39] validation-logloss:0.33156 validation-auc:0.96849 validation-aucpr:0.97185
[40] validation-logloss:0.32808 validation-auc:0.96847 validation-aucpr:0.97180
[41] validation-logloss:0.32406 validation-auc:0.96856 validation-aucpr:0.97228
[42] validation-logloss:0.32084 validation-auc:0.96859 validation-aucpr:0.97228
[43] validation-logloss:0.31754 validation-auc:0.96860 validation-aucpr:0.97232
[44] validation-logloss:0.31378 validation-auc:0.96866 validation-aucpr:0.97232
[45] validation-logloss:0.31037 validation-auc:0.96885 validation-aucpr:0.97248
[46] validation-logloss:0.30661 validation-auc:0.96894 validation-aucpr:0.97256
[47] validation-logloss:0.30283 validation-auc:0.96909 validation-aucpr:0.97268
[48] validation-logloss:0.29991 validation-auc:0.96918 validation-aucpr:0.97277
[49] validation-logloss:0.29734 validation-auc:0.96921 validation-aucpr:0.97279
[50] validation-logloss:0.29465 validation-auc:0.96927 validation-aucpr:0.97283
[51] validation-logloss:0.29206 validation-auc:0.96932 validation-aucpr:0.97286
[52] validation-logloss:0.28873 validation-auc:0.96945 validation-aucpr:0.97297
[53] validation-logloss:0.28627 validation-auc:0.96947 validation-aucpr:0.97302
[54] validation-logloss:0.28399 validation-auc:0.96945 validation-aucpr:0.97302
[55] validation-logloss:0.28163 validation-auc:0.96953 validation-aucpr:0.97309
[56] validation-logloss:0.27917 validation-auc:0.96958 validation-aucpr:0.97314
[57] validation-logloss:0.27709 validation-auc:0.96955 validation-aucpr:0.97314
[58] validation-logloss:0.27493 validation-auc:0.96962 validation-aucpr:0.97319
[59] validation-logloss:0.27278 validation-auc:0.96964 validation-aucpr:0.97317
[60] validation-logloss:0.27013 validation-auc:0.96970 validation-aucpr:0.97324
[61] validation-logloss:0.26728 validation-auc:0.96988 validation-aucpr:0.97340
[62] validation-logloss:0.26483 validation-auc:0.96992 validation-aucpr:0.97343
[63] validation-logloss:0.26308 validation-auc:0.96991 validation-aucpr:0.97342
[64] validation-logloss:0.26133 validation-auc:0.97007 validation-aucpr:0.97356
[65] validation-logloss:0.25988 validation-auc:0.97002 validation-aucpr:0.97349
[66] validation-logloss:0.25756 validation-auc:0.97008 validation-aucpr:0.97356
[67] validation-logloss:0.25538 validation-auc:0.97010 validation-aucpr:0.97358
[68] validation-logloss:0.25331 validation-auc:0.97014 validation-aucpr:0.97360
[69] validation-logloss:0.25175 validation-auc:0.97017 validation-aucpr:0.97362
[70] validation-logloss:0.25011 validation-auc:0.97017 validation-aucpr:0.97368
[71] validation-logloss:0.24876 validation-auc:0.97017 validation-aucpr:0.97372
[72] validation-logloss:0.24721 validation-auc:0.97027 validation-aucpr:0.97380
[73] validation-logloss:0.24586 validation-auc:0.97027 validation-aucpr:0.97381
[74] validation-logloss:0.24462 validation-auc:0.97024 validation-aucpr:0.97379
[75] validation-logloss:0.24310 validation-auc:0.97024 validation-aucpr:0.97379
[76] validation-logloss:0.24121 validation-auc:0.97038 validation-aucpr:0.97387
[77] validation-logloss:0.24006 validation-auc:0.97034 validation-aucpr:0.97384
[78] validation-logloss:0.23882 validation-auc:0.97043 validation-aucpr:0.97388
[79] validation-logloss:0.23816 validation-auc:0.97034 validation-aucpr:0.97379
[80] validation-logloss:0.23679 validation-auc:0.97042 validation-aucpr:0.97384
[81] validation-logloss:0.23571 validation-auc:0.97042 validation-aucpr:0.97385
[82] validation-logloss:0.23462 validation-auc:0.97046 validation-aucpr:0.97388
[83] validation-logloss:0.23341 validation-auc:0.97048 validation-aucpr:0.97389
[84] validation-logloss:0.23235 validation-auc:0.97050 validation-aucpr:0.97391
[85] validation-logloss:0.23139 validation-auc:0.97053 validation-aucpr:0.97393
[86] validation-logloss:0.22999 validation-auc:0.97057 validation-aucpr:0.97394
[87] validation-logloss:0.22901 validation-auc:0.97061 validation-aucpr:0.97397
[88] validation-logloss:0.22774 validation-auc:0.97067 validation-aucpr:0.97401
[89] validation-logloss:0.22632 validation-auc:0.97074 validation-aucpr:0.97407
[90] validation-logloss:0.22496 validation-auc:0.97078 validation-aucpr:0.97411
[91] validation-logloss:0.22376 validation-auc:0.97082 validation-aucpr:0.97413
[92] validation-logloss:0.22266 validation-auc:0.97078 validation-aucpr:0.97411
[93] validation-logloss:0.22168 validation-auc:0.97082 validation-aucpr:0.97415
[94] validation-logloss:0.22109 validation-auc:0.97083 validation-aucpr:0.97416
[95] validation-logloss:0.22027 validation-auc:0.97093 validation-aucpr:0.97423
[96] validation-logloss:0.21954 validation-auc:0.97097 validation-aucpr:0.97425
[97] validation-logloss:0.21868 validation-auc:0.97099 validation-aucpr:0.97426
{'best_iteration': '97', 'best_score': '0.9742619592192957'}
Trial 54, Fold 2: Log loss = 0.21867937219408498, Average precision = 0.9742667819552169, ROC-AUC = 0.9709876443918116, Elapsed Time = 16.60435570000118 seconds
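Each completed fold contributes one set of scores to the trial, which Optuna then aggregates into the trial objective. A stdlib sketch of one plausible aggregation, the mean fold log loss; the exact aggregation used by this notebook is not shown in this log excerpt, and the list below holds only the two fold values logged so far:

```python
from statistics import mean

# Per-fold log losses for Trial 54 (folds 1 and 2 from the log above).
fold_log_losses = [0.22584576867637168, 0.21867937219408498]

cv_log_loss = mean(fold_log_losses)
print(f"Trial mean log loss over {len(fold_log_losses)} folds = {cv_log_loss}")
```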
Trial 54, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 54, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.67524 validation-auc:0.92396 validation-aucpr:0.89424
[1] validation-logloss:0.65846 validation-auc:0.94801 validation-aucpr:0.95016
[2] validation-logloss:0.64193 validation-auc:0.95385 validation-aucpr:0.95537
[3] validation-logloss:0.62614 validation-auc:0.95819 validation-aucpr:0.96162
[4] validation-logloss:0.61126 validation-auc:0.96074 validation-aucpr:0.96425
[5] validation-logloss:0.59561 validation-auc:0.96404 validation-aucpr:0.96802
[6] validation-logloss:0.58261 validation-auc:0.96449 validation-aucpr:0.96775
[7] validation-logloss:0.56775 validation-auc:0.96627 validation-aucpr:0.96947
[8] validation-logloss:0.55604 validation-auc:0.96620 validation-aucpr:0.96949
[9] validation-logloss:0.54348 validation-auc:0.96647 validation-aucpr:0.96967
[10] validation-logloss:0.53034 validation-auc:0.96721 validation-aucpr:0.97044
[11] validation-logloss:0.51804 validation-auc:0.96784 validation-aucpr:0.97103
[12] validation-logloss:0.50665 validation-auc:0.96822 validation-aucpr:0.97146
[13] validation-logloss:0.49634 validation-auc:0.96878 validation-aucpr:0.97188
[14] validation-logloss:0.48723 validation-auc:0.96862 validation-aucpr:0.97179
[15] validation-logloss:0.47821 validation-auc:0.96857 validation-aucpr:0.97157
[16] validation-logloss:0.46797 validation-auc:0.96877 validation-aucpr:0.97190
[17] validation-logloss:0.45799 validation-auc:0.96896 validation-aucpr:0.97213
[18] validation-logloss:0.44867 validation-auc:0.96926 validation-aucpr:0.97285
[19] validation-logloss:0.44134 validation-auc:0.96921 validation-aucpr:0.97274
[20] validation-logloss:0.43326 validation-auc:0.96935 validation-aucpr:0.97289
[21] validation-logloss:0.42513 validation-auc:0.96945 validation-aucpr:0.97304
[22] validation-logloss:0.41825 validation-auc:0.96936 validation-aucpr:0.97293
[23] validation-logloss:0.41166 validation-auc:0.96957 validation-aucpr:0.97316
[24] validation-logloss:0.40562 validation-auc:0.96947 validation-aucpr:0.97295
[25] validation-logloss:0.39947 validation-auc:0.96961 validation-aucpr:0.97331
[26] validation-logloss:0.39265 validation-auc:0.96962 validation-aucpr:0.97337
[27] validation-logloss:0.38686 validation-auc:0.96956 validation-aucpr:0.97332
[28] validation-logloss:0.38123 validation-auc:0.96966 validation-aucpr:0.97335
[29] validation-logloss:0.37598 validation-auc:0.96964 validation-aucpr:0.97347
[30] validation-logloss:0.37020 validation-auc:0.96959 validation-aucpr:0.97339
[31] validation-logloss:0.36485 validation-auc:0.96962 validation-aucpr:0.97339
[32] validation-logloss:0.36029 validation-auc:0.96958 validation-aucpr:0.97331
[33] validation-logloss:0.35552 validation-auc:0.96968 validation-aucpr:0.97340
[34] validation-logloss:0.35135 validation-auc:0.96977 validation-aucpr:0.97338
[35] validation-logloss:0.34602 validation-auc:0.96982 validation-aucpr:0.97342
[36] validation-logloss:0.34194 validation-auc:0.96983 validation-aucpr:0.97350
[37] validation-logloss:0.33722 validation-auc:0.96978 validation-aucpr:0.97354
[38] validation-logloss:0.33216 validation-auc:0.96997 validation-aucpr:0.97371
[39] validation-logloss:0.32811 validation-auc:0.97007 validation-aucpr:0.97379
[40] validation-logloss:0.32419 validation-auc:0.97007 validation-aucpr:0.97380
[41] validation-logloss:0.32110 validation-auc:0.96992 validation-aucpr:0.97362
[42] validation-logloss:0.31704 validation-auc:0.97005 validation-aucpr:0.97382
[43] validation-logloss:0.31408 validation-auc:0.96999 validation-aucpr:0.97369
[44] validation-logloss:0.31104 validation-auc:0.96992 validation-aucpr:0.97365
[45] validation-logloss:0.30818 validation-auc:0.96974 validation-aucpr:0.97349
[46] validation-logloss:0.30430 validation-auc:0.96977 validation-aucpr:0.97357
[47] validation-logloss:0.30137 validation-auc:0.96981 validation-aucpr:0.97358
[48] validation-logloss:0.29852 validation-auc:0.96992 validation-aucpr:0.97391
[49] validation-logloss:0.29589 validation-auc:0.96989 validation-aucpr:0.97391
[50] validation-logloss:0.29319 validation-auc:0.96993 validation-aucpr:0.97394
[51] validation-logloss:0.29075 validation-auc:0.96994 validation-aucpr:0.97396
[52] validation-logloss:0.28804 validation-auc:0.96994 validation-aucpr:0.97397
[53] validation-logloss:0.28492 validation-auc:0.97003 validation-aucpr:0.97407
[54] validation-logloss:0.28183 validation-auc:0.97005 validation-aucpr:0.97413
[55] validation-logloss:0.27964 validation-auc:0.97006 validation-aucpr:0.97415
[56] validation-logloss:0.27751 validation-auc:0.97008 validation-aucpr:0.97416
[57] validation-logloss:0.27550 validation-auc:0.97004 validation-aucpr:0.97412
[58] validation-logloss:0.27279 validation-auc:0.97012 validation-aucpr:0.97419
[59] validation-logloss:0.27051 validation-auc:0.97017 validation-aucpr:0.97422
[60] validation-logloss:0.26818 validation-auc:0.97017 validation-aucpr:0.97420
[61] validation-logloss:0.26628 validation-auc:0.97020 validation-aucpr:0.97422
[62] validation-logloss:0.26480 validation-auc:0.97017 validation-aucpr:0.97424
[63] validation-logloss:0.26320 validation-auc:0.97019 validation-aucpr:0.97423
[64] validation-logloss:0.26143 validation-auc:0.97022 validation-aucpr:0.97423
[65] validation-logloss:0.25902 validation-auc:0.97030 validation-aucpr:0.97431
[66] validation-logloss:0.25731 validation-auc:0.97036 validation-aucpr:0.97434
[67] validation-logloss:0.25585 validation-auc:0.97030 validation-aucpr:0.97426
[68] validation-logloss:0.25424 validation-auc:0.97026 validation-aucpr:0.97423
[69] validation-logloss:0.25269 validation-auc:0.97030 validation-aucpr:0.97426
[70] validation-logloss:0.25107 validation-auc:0.97032 validation-aucpr:0.97429
[71] validation-logloss:0.24924 validation-auc:0.97039 validation-aucpr:0.97437
[72] validation-logloss:0.24783 validation-auc:0.97040 validation-aucpr:0.97437
[73] validation-logloss:0.24584 validation-auc:0.97048 validation-aucpr:0.97444
[74] validation-logloss:0.24439 validation-auc:0.97049 validation-aucpr:0.97445
[75] validation-logloss:0.24255 validation-auc:0.97059 validation-aucpr:0.97452
[76] validation-logloss:0.24085 validation-auc:0.97065 validation-aucpr:0.97456
[77] validation-logloss:0.23904 validation-auc:0.97080 validation-aucpr:0.97478
[78] validation-logloss:0.23760 validation-auc:0.97091 validation-aucpr:0.97487
[79] validation-logloss:0.23592 validation-auc:0.97098 validation-aucpr:0.97497
[80] validation-logloss:0.23427 validation-auc:0.97103 validation-aucpr:0.97501
[81] validation-logloss:0.23324 validation-auc:0.97102 validation-aucpr:0.97499
[82] validation-logloss:0.23173 validation-auc:0.97103 validation-aucpr:0.97499
[83] validation-logloss:0.23059 validation-auc:0.97106 validation-aucpr:0.97501
[84] validation-logloss:0.22960 validation-auc:0.97111 validation-aucpr:0.97509
[85] validation-logloss:0.22811 validation-auc:0.97117 validation-aucpr:0.97516
[86] validation-logloss:0.22683 validation-auc:0.97122 validation-aucpr:0.97520
[87] validation-logloss:0.22602 validation-auc:0.97124 validation-aucpr:0.97518
[88] validation-logloss:0.22458 validation-auc:0.97134 validation-aucpr:0.97526
[89] validation-logloss:0.22331 validation-auc:0.97138 validation-aucpr:0.97530
[90] validation-logloss:0.22259 validation-auc:0.97134 validation-aucpr:0.97526
[91] validation-logloss:0.22159 validation-auc:0.97143 validation-aucpr:0.97533
[92] validation-logloss:0.22106 validation-auc:0.97136 validation-aucpr:0.97530
[93] validation-logloss:0.21988 validation-auc:0.97144 validation-aucpr:0.97535
[94] validation-logloss:0.21889 validation-auc:0.97146 validation-aucpr:0.97535
[95] validation-logloss:0.21819 validation-auc:0.97144 validation-aucpr:0.97532
[96] validation-logloss:0.21744 validation-auc:0.97146 validation-aucpr:0.97533
[97] validation-logloss:0.21686 validation-auc:0.97142 validation-aucpr:0.97529
{'best_iteration': '93', 'best_score': '0.9753493181523499'}
Trial 54, Fold 3: Log loss = 0.216855898710009, Average precision = 0.9752910810816002, ROC-AUC = 0.97141651758627, Elapsed Time = 16.7057109999987 seconds
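In the fold above, training ran to iteration 97 but `best_iteration` is 93, where validation aucpr peaked. The selection early stopping performs can be sketched as an argmax over the eval history; the list below is a short slice of the fold-3 `validation-aucpr` column (iterations 92 to 97), rounded as logged:

```python
# Slice of the per-iteration validation aucpr history (iterations 92..97).
aucpr_history = [0.97530, 0.97535, 0.97535, 0.97532, 0.97533, 0.97529]
offset = 92  # iteration number of the first entry

# Early stopping keeps the first iteration where the metric peaked.
best_idx = max(range(len(aucpr_history)), key=aucpr_history.__getitem__)
best_iteration = offset + best_idx
best_score = aucpr_history[best_idx]
print({'best_iteration': best_iteration, 'best_score': best_score})
```

Note the tie between iterations 93 and 94 resolves to the earlier iteration, matching the logged `best_iteration` of 93.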
Trial 54, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 54, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.67554 validation-auc:0.91778 validation-aucpr:0.88855
[1] validation-logloss:0.65872 validation-auc:0.94242 validation-aucpr:0.93678
[2] validation-logloss:0.64088 validation-auc:0.95546 validation-aucpr:0.95899
[3] validation-logloss:0.62504 validation-auc:0.95806 validation-aucpr:0.96390
[4] validation-logloss:0.60880 validation-auc:0.96062 validation-aucpr:0.96698
[5] validation-logloss:0.59336 validation-auc:0.96235 validation-aucpr:0.96854
[6] validation-logloss:0.57992 validation-auc:0.96340 validation-aucpr:0.96906
[7] validation-logloss:0.56731 validation-auc:0.96386 validation-aucpr:0.96927
[8] validation-logloss:0.55493 validation-auc:0.96421 validation-aucpr:0.96956
[9] validation-logloss:0.54190 validation-auc:0.96486 validation-aucpr:0.97034
[10] validation-logloss:0.53016 validation-auc:0.96501 validation-aucpr:0.97046
[11] validation-logloss:0.52017 validation-auc:0.96464 validation-aucpr:0.97019
[12] validation-logloss:0.51020 validation-auc:0.96484 validation-aucpr:0.97031
[13] validation-logloss:0.50048 validation-auc:0.96491 validation-aucpr:0.97030
[14] validation-logloss:0.49158 validation-auc:0.96464 validation-aucpr:0.97017
[15] validation-logloss:0.48089 validation-auc:0.96528 validation-aucpr:0.97076
[16] validation-logloss:0.47242 validation-auc:0.96512 validation-aucpr:0.97063
[17] validation-logloss:0.46419 validation-auc:0.96507 validation-aucpr:0.97056
[18] validation-logloss:0.45609 validation-auc:0.96506 validation-aucpr:0.97059
[19] validation-logloss:0.44869 validation-auc:0.96526 validation-aucpr:0.97067
[20] validation-logloss:0.44161 validation-auc:0.96540 validation-aucpr:0.97076
[21] validation-logloss:0.43445 validation-auc:0.96546 validation-aucpr:0.97082
[22] validation-logloss:0.42596 validation-auc:0.96595 validation-aucpr:0.97127
[23] validation-logloss:0.41926 validation-auc:0.96579 validation-aucpr:0.97114
[24] validation-logloss:0.41342 validation-auc:0.96569 validation-aucpr:0.97106
[25] validation-logloss:0.40593 validation-auc:0.96594 validation-aucpr:0.97135
[26] validation-logloss:0.40029 validation-auc:0.96594 validation-aucpr:0.97134
[27] validation-logloss:0.39387 validation-auc:0.96610 validation-aucpr:0.97152
[28] validation-logloss:0.38821 validation-auc:0.96621 validation-aucpr:0.97162
[29] validation-logloss:0.38323 validation-auc:0.96608 validation-aucpr:0.97149
[30] validation-logloss:0.37689 validation-auc:0.96615 validation-aucpr:0.97160
[31] validation-logloss:0.37241 validation-auc:0.96604 validation-aucpr:0.97153
[32] validation-logloss:0.36732 validation-auc:0.96623 validation-aucpr:0.97167
[33] validation-logloss:0.36288 validation-auc:0.96619 validation-aucpr:0.97164
[34] validation-logloss:0.35861 validation-auc:0.96615 validation-aucpr:0.97163
... iterations [35]–[96] elided: validation-logloss decreases steadily from 0.35427 to 0.22324 while validation-auc and validation-aucpr improve gradually with minor fluctuations ...
[97] validation-logloss:0.22246 validation-auc:0.96912 validation-aucpr:0.97414
{'best_iteration': '97', 'best_score': '0.9741356325292141'}
Trial 54, Fold 4: Log loss = 0.22246427317922213, Average precision = 0.9741398406367417, ROC-AUC = 0.9691153362852326, Elapsed Time = 16.831737099997554 seconds
Trial 54, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 54, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.67576 validation-auc:0.92141 validation-aucpr:0.90680
... iterations [1]–[96] elided: validation-logloss decreases steadily from 0.65868 to 0.22720 while validation-auc and validation-aucpr improve gradually with minor fluctuations ...
[97] validation-logloss:0.22612 validation-auc:0.96799 validation-aucpr:0.97200
{'best_iteration': '97', 'best_score': '0.9719967325717643'}
Trial 54, Fold 5: Log loss = 0.2261180343303365, Average precision = 0.9720025766905903, ROC-AUC = 0.9679936086803038, Elapsed Time = 16.96568880000268 seconds
Optimization Progress: 55%|#####5 | 55/100 [2:54:24<1:06:21, 88.47s/it]
Trial 55, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 55, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.68388 validation-auc:0.94851 validation-aucpr:0.94833
... iterations [1]–[64] elided: validation-logloss decreases steadily from 0.67489 to 0.36677 while validation-auc and validation-aucpr improve gradually with minor fluctuations ...
[65] validation-logloss:0.36418 validation-auc:0.97020 validation-aucpr:0.97490
{'best_iteration': '65', 'best_score': '0.9748960585897559'}
Trial 55, Fold 1: Log loss = 0.3641809402106429, Average precision = 0.9748999805952879, ROC-AUC = 0.9701964231943906, Elapsed Time = 2.353945900002145 seconds
Trial 55, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 55, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.68338 validation-auc:0.95430 validation-aucpr:0.95662
... iterations [1]–[64] elided: validation-logloss decreases steadily from 0.67423 to 0.36537 while validation-auc and validation-aucpr improve gradually with minor fluctuations ...
[65] validation-logloss:0.36256 validation-auc:0.97181 validation-aucpr:0.97447
{'best_iteration': '63', 'best_score': '0.9747379643077329'}
Trial 55, Fold 2: Log loss = 0.36256205802843033, Average precision = 0.9745009545862788, ROC-AUC = 0.9718087516589826, Elapsed Time = 2.5463577000009536 seconds
Trial 55, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 55, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.68367 validation-auc:0.94844 validation-aucpr:0.95205
... iterations [1]–[64] elided: validation-logloss decreases steadily from 0.67453 to 0.36271 while validation-auc and validation-aucpr improve gradually with minor fluctuations ...
[65] validation-logloss:0.36009 validation-auc:0.97215 validation-aucpr:0.97580
{'best_iteration': '64', 'best_score': '0.9758630569815583'}
Trial 55, Fold 3: Log loss = 0.3600912741941957, Average precision = 0.9758044092545942, ROC-AUC = 0.9721529155550848, Elapsed Time = 2.635833000000275 seconds
Trial 55, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 55, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.68357 validation-auc:0.94645 validation-aucpr:0.93736
... iterations [1]–[63] elided: validation-logloss decreases steadily from 0.67441 to 0.36810 while validation-auc and validation-aucpr improve gradually with minor fluctuations ...
[64] validation-logloss:0.36534 validation-auc:0.97065 validation-aucpr:0.97545
[65] validation-logloss:0.36298 validation-auc:0.97060 validation-aucpr:0.97541
{'best_iteration': '64', 'best_score': '0.9754461952539737'}
Trial 55, Fold 4: Log loss = 0.3629805179052722, Average precision = 0.9754157021253169, ROC-AUC = 0.9705993333190268, Elapsed Time = 2.696198199999344 seconds
Trial 55, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 55, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.68385 validation-auc:0.93955 validation-aucpr:0.93653
[1] validation-logloss:0.67499 validation-auc:0.95723 validation-aucpr:0.95397
[2] validation-logloss:0.66600 validation-auc:0.96296 validation-aucpr:0.96347
[3] validation-logloss:0.65731 validation-auc:0.96436 validation-aucpr:0.96781
[4] validation-logloss:0.64894 validation-auc:0.96514 validation-aucpr:0.97009
[5] validation-logloss:0.64068 validation-auc:0.96584 validation-aucpr:0.97050
[6] validation-logloss:0.63275 validation-auc:0.96600 validation-aucpr:0.97010
[7] validation-logloss:0.62498 validation-auc:0.96649 validation-aucpr:0.97041
[8] validation-logloss:0.61727 validation-auc:0.96682 validation-aucpr:0.97070
[9] validation-logloss:0.60973 validation-auc:0.96702 validation-aucpr:0.97098
[10] validation-logloss:0.60228 validation-auc:0.96751 validation-aucpr:0.97138
[11] validation-logloss:0.59581 validation-auc:0.96772 validation-aucpr:0.97167
[12] validation-logloss:0.58935 validation-auc:0.96767 validation-aucpr:0.97154
[13] validation-logloss:0.58243 validation-auc:0.96786 validation-aucpr:0.97167
[14] validation-logloss:0.57567 validation-auc:0.96763 validation-aucpr:0.97151
[15] validation-logloss:0.56914 validation-auc:0.96764 validation-aucpr:0.97150
[16] validation-logloss:0.56281 validation-auc:0.96753 validation-aucpr:0.97150
[17] validation-logloss:0.55710 validation-auc:0.96746 validation-aucpr:0.97147
[18] validation-logloss:0.55134 validation-auc:0.96749 validation-aucpr:0.97147
[19] validation-logloss:0.54528 validation-auc:0.96743 validation-aucpr:0.97145
[20] validation-logloss:0.53932 validation-auc:0.96776 validation-aucpr:0.97162
[21] validation-logloss:0.53347 validation-auc:0.96793 validation-aucpr:0.97171
[22] validation-logloss:0.52840 validation-auc:0.96769 validation-aucpr:0.97103
[23] validation-logloss:0.52279 validation-auc:0.96795 validation-aucpr:0.97121
[24] validation-logloss:0.51784 validation-auc:0.96799 validation-aucpr:0.97123
[25] validation-logloss:0.51243 validation-auc:0.96798 validation-aucpr:0.97122
[26] validation-logloss:0.50704 validation-auc:0.96824 validation-aucpr:0.97140
[27] validation-logloss:0.50186 validation-auc:0.96823 validation-aucpr:0.97140
[28] validation-logloss:0.49674 validation-auc:0.96847 validation-aucpr:0.97160
[29] validation-logloss:0.49173 validation-auc:0.96860 validation-aucpr:0.97176
[30] validation-logloss:0.48749 validation-auc:0.96857 validation-aucpr:0.97308
[31] validation-logloss:0.48261 validation-auc:0.96876 validation-aucpr:0.97323
[32] validation-logloss:0.47780 validation-auc:0.96896 validation-aucpr:0.97340
[33] validation-logloss:0.47318 validation-auc:0.96895 validation-aucpr:0.97340
[34] validation-logloss:0.46867 validation-auc:0.96912 validation-aucpr:0.97353
[35] validation-logloss:0.46418 validation-auc:0.96928 validation-aucpr:0.97366
[36] validation-logloss:0.45982 validation-auc:0.96938 validation-aucpr:0.97373
[37] validation-logloss:0.45602 validation-auc:0.96920 validation-aucpr:0.97359
[38] validation-logloss:0.45204 validation-auc:0.96934 validation-aucpr:0.97371
[39] validation-logloss:0.44809 validation-auc:0.96921 validation-aucpr:0.97358
[40] validation-logloss:0.44406 validation-auc:0.96921 validation-aucpr:0.97358
[41] validation-logloss:0.44004 validation-auc:0.96933 validation-aucpr:0.97367
[42] validation-logloss:0.43604 validation-auc:0.96953 validation-aucpr:0.97381
[43] validation-logloss:0.43216 validation-auc:0.96955 validation-aucpr:0.97383
[44] validation-logloss:0.42831 validation-auc:0.96962 validation-aucpr:0.97389
[45] validation-logloss:0.42495 validation-auc:0.96968 validation-aucpr:0.97390
[46] validation-logloss:0.42125 validation-auc:0.96969 validation-aucpr:0.97391
[47] validation-logloss:0.41763 validation-auc:0.96984 validation-aucpr:0.97402
[48] validation-logloss:0.41402 validation-auc:0.96994 validation-aucpr:0.97411
[49] validation-logloss:0.41055 validation-auc:0.97003 validation-aucpr:0.97417
[50] validation-logloss:0.40756 validation-auc:0.97000 validation-aucpr:0.97411
[51] validation-logloss:0.40432 validation-auc:0.96996 validation-aucpr:0.97410
[52] validation-logloss:0.40095 validation-auc:0.97006 validation-aucpr:0.97417
[53] validation-logloss:0.39771 validation-auc:0.97018 validation-aucpr:0.97425
[54] validation-logloss:0.39447 validation-auc:0.97026 validation-aucpr:0.97430
[55] validation-logloss:0.39135 validation-auc:0.97032 validation-aucpr:0.97434
[56] validation-logloss:0.38826 validation-auc:0.97033 validation-aucpr:0.97436
[57] validation-logloss:0.38565 validation-auc:0.97029 validation-aucpr:0.97432
[58] validation-logloss:0.38303 validation-auc:0.97026 validation-aucpr:0.97429
[59] validation-logloss:0.38014 validation-auc:0.97032 validation-aucpr:0.97431
[60] validation-logloss:0.37728 validation-auc:0.97039 validation-aucpr:0.97440
[61] validation-logloss:0.37442 validation-auc:0.97043 validation-aucpr:0.97444
[62] validation-logloss:0.37163 validation-auc:0.97043 validation-aucpr:0.97444
[63] validation-logloss:0.36893 validation-auc:0.97043 validation-aucpr:0.97443
[64] validation-logloss:0.36624 validation-auc:0.97043 validation-aucpr:0.97444
[65] validation-logloss:0.36350 validation-auc:0.97052 validation-aucpr:0.97451
{'best_iteration': '65', 'best_score': '0.9745086390557987'}
Trial 55, Fold 5: Log loss = 0.3635039653222269, Average precision = 0.974512820454051, ROC-AUC = 0.9705224903851514, Elapsed Time = 2.5660344999996596 seconds
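The per-fold summary lines above report three metrics on the fold's hold-out predictions: log loss, average precision (close to the `validation-aucpr` column), and ROC-AUC. A minimal sketch of how such a summary line is computed with scikit-learn is shown below, using synthetic labels and probabilities in place of a real fold's validation predictions; the names `y_val` and `p_val` are illustrative, not taken from the notebook.

```python
import numpy as np
from sklearn.metrics import log_loss, roc_auc_score, average_precision_score

# Synthetic stand-ins for one fold's validation labels and predicted
# probabilities (the notebook would use the trained booster's predictions
# on the fold's hold-out split instead).
rng = np.random.default_rng(42)
y_val = rng.integers(0, 2, size=1000)
# Probabilities correlated with the labels so the metrics are non-trivial;
# clipped away from 0/1 to keep log loss finite.
p_val = np.clip(0.7 * y_val + 0.3 * rng.random(1000), 1e-6, 1 - 1e-6)

lloss = log_loss(y_val, p_val)              # lower is better
auc = roc_auc_score(y_val, p_val)           # the validation-auc column
ap = average_precision_score(y_val, p_val)  # akin to validation-aucpr

print(f"Log loss = {lloss}, Average precision = {ap}, ROC-AUC = {auc}")
```

The same three calls, applied once per fold, yield the "Log loss = …, Average precision = …, ROC-AUC = …" lines in the output above.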
Optimization Progress: 56%|#####6 | 56/100 [2:54:45<49:55, 68.08s/it]
Trial 56, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 56, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[20:53:39] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[0] validation-logloss:0.66058 validation-auc:0.94692 validation-aucpr:0.93916
[1]	validation-logloss:0.63109	validation-auc:0.96084	validation-aucpr:0.96099
[2]	validation-logloss:0.60382	validation-auc:0.96396	validation-aucpr:0.96927
[3]	validation-logloss:0.57872	validation-auc:0.96459	validation-aucpr:0.97005
[4]	validation-logloss:0.55551	validation-auc:0.96481	validation-aucpr:0.96872
[5]	validation-logloss:0.53392	validation-auc:0.96602	validation-aucpr:0.96837
[6]	validation-logloss:0.51583	validation-auc:0.96631	validation-aucpr:0.96879
[7]	validation-logloss:0.49709	validation-auc:0.96688	validation-aucpr:0.96920
[8]	validation-logloss:0.47959	validation-auc:0.96666	validation-aucpr:0.96512
[9]	validation-logloss:0.46324	validation-auc:0.96772	validation-aucpr:0.96759
[10]	validation-logloss:0.44945	validation-auc:0.96784	validation-aucpr:0.96776
[11]	validation-logloss:0.43532	validation-auc:0.96785	validation-aucpr:0.96779
[12]	validation-logloss:0.42201	validation-auc:0.96803	validation-aucpr:0.96801
[13]	validation-logloss:0.41083	validation-auc:0.96764	validation-aucpr:0.96620
[14]	validation-logloss:0.39912	validation-auc:0.96780	validation-aucpr:0.96632
[15]	validation-logloss:0.38980	validation-auc:0.96804	validation-aucpr:0.96812
[16]	validation-logloss:0.37941	validation-auc:0.96844	validation-aucpr:0.97063
[17]	validation-logloss:0.36946	validation-auc:0.96862	validation-aucpr:0.97080
[18]	validation-logloss:0.36019	validation-auc:0.96869	validation-aucpr:0.97084
[19]	validation-logloss:0.35132	validation-auc:0.96868	validation-aucpr:0.97084
[20]	validation-logloss:0.34285	validation-auc:0.96876	validation-aucpr:0.97089
[21]	validation-logloss:0.33569	validation-auc:0.96882	validation-aucpr:0.97087
[22]	validation-logloss:0.32816	validation-auc:0.96931	validation-aucpr:0.97392
[23]	validation-logloss:0.32117	validation-auc:0.96949	validation-aucpr:0.97403
[24]	validation-logloss:0.31469	validation-auc:0.96947	validation-aucpr:0.97402
[25]	validation-logloss:0.30918	validation-auc:0.96954	validation-aucpr:0.97402
[26]	validation-logloss:0.30378	validation-auc:0.96968	validation-aucpr:0.97405
[27]	validation-logloss:0.29798	validation-auc:0.96978	validation-aucpr:0.97410
[28]	validation-logloss:0.29254	validation-auc:0.96998	validation-aucpr:0.97427
[29]	validation-logloss:0.28744	validation-auc:0.96995	validation-aucpr:0.97427
[30]	validation-logloss:0.28246	validation-auc:0.97011	validation-aucpr:0.97435
[31]	validation-logloss:0.27798	validation-auc:0.97016	validation-aucpr:0.97441
[32]	validation-logloss:0.27340	validation-auc:0.97035	validation-aucpr:0.97454
[33]	validation-logloss:0.26919	validation-auc:0.97052	validation-aucpr:0.97466
[34]	validation-logloss:0.26531	validation-auc:0.97049	validation-aucpr:0.97463
[35]	validation-logloss:0.26168	validation-auc:0.97049	validation-aucpr:0.97360
[36]	validation-logloss:0.25811	validation-auc:0.97055	validation-aucpr:0.97366
[37]	validation-logloss:0.25475	validation-auc:0.97063	validation-aucpr:0.97368
[38]	validation-logloss:0.25150	validation-auc:0.97071	validation-aucpr:0.97374
[39]	validation-logloss:0.24857	validation-auc:0.97082	validation-aucpr:0.97388
[40]	validation-logloss:0.24594	validation-auc:0.97087	validation-aucpr:0.97532
[41]	validation-logloss:0.24311	validation-auc:0.97096	validation-aucpr:0.97539
[42]	validation-logloss:0.24062	validation-auc:0.97093	validation-aucpr:0.97537
{'best_iteration': '41', 'best_score': '0.975389821361592'}
Trial 56, Fold 1: Log loss = 0.24061675051511633, Average precision = 0.9753669075707132, ROC-AUC = 0.9709288952457424, Elapsed Time = 6.691964800000278 seconds
Trial 56, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 56, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0]	validation-logloss:0.66045	validation-auc:0.95472	validation-aucpr:0.94123
[1]	validation-logloss:0.63073	validation-auc:0.96408	validation-aucpr:0.95982
[2]	validation-logloss:0.60338	validation-auc:0.96654	validation-aucpr:0.96890
[3]	validation-logloss:0.57840	validation-auc:0.96627	validation-aucpr:0.96846
[4]	validation-logloss:0.55703	validation-auc:0.96781	validation-aucpr:0.97159
[5]	validation-logloss:0.53731	validation-auc:0.96831	validation-aucpr:0.97175
[6]	validation-logloss:0.51663	validation-auc:0.96931	validation-aucpr:0.97248
[7]	validation-logloss:0.49760	validation-auc:0.96978	validation-aucpr:0.97282
[8]	validation-logloss:0.48159	validation-auc:0.97049	validation-aucpr:0.97375
[9]	validation-logloss:0.46460	validation-auc:0.97134	validation-aucpr:0.97434
[10]	validation-logloss:0.45080	validation-auc:0.97079	validation-aucpr:0.97396
[11]	validation-logloss:0.43769	validation-auc:0.97068	validation-aucpr:0.97377
[12]	validation-logloss:0.42383	validation-auc:0.97101	validation-aucpr:0.97409
[13]	validation-logloss:0.41090	validation-auc:0.97131	validation-aucpr:0.97438
[14]	validation-logloss:0.39882	validation-auc:0.97155	validation-aucpr:0.97458
[15]	validation-logloss:0.38737	validation-auc:0.97165	validation-aucpr:0.97465
[16]	validation-logloss:0.37791	validation-auc:0.97158	validation-aucpr:0.97457
[17]	validation-logloss:0.36773	validation-auc:0.97185	validation-aucpr:0.97478
[18]	validation-logloss:0.35819	validation-auc:0.97171	validation-aucpr:0.97476
[19]	validation-logloss:0.34923	validation-auc:0.97189	validation-aucpr:0.97497
[20]	validation-logloss:0.34054	validation-auc:0.97201	validation-aucpr:0.97507
[21]	validation-logloss:0.33265	validation-auc:0.97192	validation-aucpr:0.97500
[22]	validation-logloss:0.32505	validation-auc:0.97201	validation-aucpr:0.97505
[23]	validation-logloss:0.31776	validation-auc:0.97204	validation-aucpr:0.97508
[24]	validation-logloss:0.31084	validation-auc:0.97223	validation-aucpr:0.97528
[25]	validation-logloss:0.30420	validation-auc:0.97230	validation-aucpr:0.97530
[26]	validation-logloss:0.29811	validation-auc:0.97243	validation-aucpr:0.97550
[27]	validation-logloss:0.29235	validation-auc:0.97246	validation-aucpr:0.97542
[28]	validation-logloss:0.28689	validation-auc:0.97242	validation-aucpr:0.97538
[29]	validation-logloss:0.28180	validation-auc:0.97235	validation-aucpr:0.97531
[30]	validation-logloss:0.27681	validation-auc:0.97237	validation-aucpr:0.97545
[31]	validation-logloss:0.27230	validation-auc:0.97229	validation-aucpr:0.97537
[32]	validation-logloss:0.26786	validation-auc:0.97232	validation-aucpr:0.97539
[33]	validation-logloss:0.26351	validation-auc:0.97247	validation-aucpr:0.97551
[34]	validation-logloss:0.25952	validation-auc:0.97252	validation-aucpr:0.97553
[35]	validation-logloss:0.25552	validation-auc:0.97273	validation-aucpr:0.97568
[36]	validation-logloss:0.25187	validation-auc:0.97280	validation-aucpr:0.97575
[37]	validation-logloss:0.24838	validation-auc:0.97275	validation-aucpr:0.97574
[38]	validation-logloss:0.24493	validation-auc:0.97283	validation-aucpr:0.97580
[39]	validation-logloss:0.24196	validation-auc:0.97270	validation-aucpr:0.97571
[40]	validation-logloss:0.23894	validation-auc:0.97281	validation-aucpr:0.97577
[41]	validation-logloss:0.23616	validation-auc:0.97280	validation-aucpr:0.97573
[42]	validation-logloss:0.23346	validation-auc:0.97279	validation-aucpr:0.97571
{'best_iteration': '38', 'best_score': '0.9757950282631109'}
Trial 56, Fold 2: Log loss = 0.23345912524895623, Average precision = 0.9755510875905027, ROC-AUC = 0.9727877524969339, Elapsed Time = 5.648318900002778 seconds
Trial 56, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 56, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.66060 validation-auc:0.95031 validation-aucpr:0.94385
[1] validation-logloss:0.63067 validation-auc:0.95917 validation-aucpr:0.94856
[2] validation-logloss:0.60336 validation-auc:0.96377 validation-aucpr:0.95967
[3] validation-logloss:0.57806 validation-auc:0.96458 validation-aucpr:0.96060
[4] validation-logloss:0.55470 validation-auc:0.96615 validation-aucpr:0.96471
[5] validation-logloss:0.53304 validation-auc:0.96699 validation-aucpr:0.96866
[6] validation-logloss:0.51313 validation-auc:0.96758 validation-aucpr:0.97208
[7] validation-logloss:0.49459 validation-auc:0.96756 validation-aucpr:0.96980
[8] validation-logloss:0.47726 validation-auc:0.96761 validation-aucpr:0.96986
[9] validation-logloss:0.46110 validation-auc:0.96798 validation-aucpr:0.97001
[10] validation-logloss:0.44581 validation-auc:0.96831 validation-aucpr:0.97039
[11] validation-logloss:0.43170 validation-auc:0.96861 validation-aucpr:0.97061
[12] validation-logloss:0.41833 validation-auc:0.96877 validation-aucpr:0.97076
[13] validation-logloss:0.40571 validation-auc:0.96916 validation-aucpr:0.96863
[14] validation-logloss:0.39386 validation-auc:0.96949 validation-aucpr:0.97143
[15] validation-logloss:0.38283 validation-auc:0.96967 validation-aucpr:0.97122
[16] validation-logloss:0.37247 validation-auc:0.96963 validation-aucpr:0.97122
[17] validation-logloss:0.36278 validation-auc:0.96957 validation-aucpr:0.97117
[18] validation-logloss:0.35351 validation-auc:0.97014 validation-aucpr:0.97394
[19] validation-logloss:0.34494 validation-auc:0.97015 validation-aucpr:0.97395
[20] validation-logloss:0.33653 validation-auc:0.97045 validation-aucpr:0.97409
[21] validation-logloss:0.32933 validation-auc:0.97083 validation-aucpr:0.97432
[22] validation-logloss:0.32196 validation-auc:0.97096 validation-aucpr:0.97441
[23] validation-logloss:0.31517 validation-auc:0.97086 validation-aucpr:0.97435
[24] validation-logloss:0.30871 validation-auc:0.97096 validation-aucpr:0.97442
[25] validation-logloss:0.30265 validation-auc:0.97099 validation-aucpr:0.97447
[26] validation-logloss:0.29672 validation-auc:0.97122 validation-aucpr:0.97462
[27] validation-logloss:0.29108 validation-auc:0.97139 validation-aucpr:0.97475
[28] validation-logloss:0.28572 validation-auc:0.97152 validation-aucpr:0.97488
[29] validation-logloss:0.28100 validation-auc:0.97170 validation-aucpr:0.97498
[30] validation-logloss:0.27631 validation-auc:0.97161 validation-aucpr:0.97495
[31] validation-logloss:0.27196 validation-auc:0.97159 validation-aucpr:0.97493
[32] validation-logloss:0.26802 validation-auc:0.97176 validation-aucpr:0.97503
[33] validation-logloss:0.26408 validation-auc:0.97169 validation-aucpr:0.97496
[34] validation-logloss:0.26034 validation-auc:0.97168 validation-aucpr:0.97487
[35] validation-logloss:0.25657 validation-auc:0.97176 validation-aucpr:0.97500
[36] validation-logloss:0.25313 validation-auc:0.97175 validation-aucpr:0.97495
[37] validation-logloss:0.24996 validation-auc:0.97173 validation-aucpr:0.97492
[38] validation-logloss:0.24674 validation-auc:0.97183 validation-aucpr:0.97499
[39] validation-logloss:0.24363 validation-auc:0.97189 validation-aucpr:0.97503
[40] validation-logloss:0.24092 validation-auc:0.97195 validation-aucpr:0.97507
[41] validation-logloss:0.23835 validation-auc:0.97200 validation-aucpr:0.97509
[42] validation-logloss:0.23589 validation-auc:0.97197 validation-aucpr:0.97508
{'best_iteration': '41', 'best_score': '0.9750947247890471'}
Trial 56, Fold 3: Log loss = 0.2358894526760947, Average precision = 0.9749897931718626, ROC-AUC = 0.9719713554959502, Elapsed Time = 5.436695800002781 seconds
Trial 56, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 56, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.66077 validation-auc:0.94342 validation-aucpr:0.94184
[1] validation-logloss:0.63116 validation-auc:0.95862 validation-aucpr:0.95123
[2] validation-logloss:0.60602 validation-auc:0.96504 validation-aucpr:0.96958
[3] validation-logloss:0.58052 validation-auc:0.96631 validation-aucpr:0.97024
[4] validation-logloss:0.55740 validation-auc:0.96677 validation-aucpr:0.97222
[5] validation-logloss:0.53578 validation-auc:0.96742 validation-aucpr:0.97254
[6] validation-logloss:0.51761 validation-auc:0.96740 validation-aucpr:0.97239
[7] validation-logloss:0.49867 validation-auc:0.96791 validation-aucpr:0.97161
[8] validation-logloss:0.48102 validation-auc:0.96826 validation-aucpr:0.97158
[9] validation-logloss:0.46433 validation-auc:0.96889 validation-aucpr:0.97365
[10] validation-logloss:0.44888 validation-auc:0.96888 validation-aucpr:0.97377
[11] validation-logloss:0.43418 validation-auc:0.96935 validation-aucpr:0.97324
[12] validation-logloss:0.42066 validation-auc:0.96965 validation-aucpr:0.97324
[13] validation-logloss:0.40923 validation-auc:0.96938 validation-aucpr:0.97291
[14] validation-logloss:0.39731 validation-auc:0.96956 validation-aucpr:0.97429
[15] validation-logloss:0.38595 validation-auc:0.96974 validation-aucpr:0.97443
[16] validation-logloss:0.37533 validation-auc:0.97000 validation-aucpr:0.97460
[17] validation-logloss:0.36524 validation-auc:0.97021 validation-aucpr:0.97394
[18] validation-logloss:0.35583 validation-auc:0.97022 validation-aucpr:0.97395
[19] validation-logloss:0.34687 validation-auc:0.97044 validation-aucpr:0.97295
[20] validation-logloss:0.33857 validation-auc:0.97033 validation-aucpr:0.97275
[21] validation-logloss:0.33072 validation-auc:0.97044 validation-aucpr:0.97263
[22] validation-logloss:0.32319 validation-auc:0.97050 validation-aucpr:0.97290
[23] validation-logloss:0.31605 validation-auc:0.97069 validation-aucpr:0.97304
[24] validation-logloss:0.30930 validation-auc:0.97106 validation-aucpr:0.97471
[25] validation-logloss:0.30307 validation-auc:0.97121 validation-aucpr:0.97471
[26] validation-logloss:0.29700 validation-auc:0.97123 validation-aucpr:0.97472
[27] validation-logloss:0.29136 validation-auc:0.97113 validation-aucpr:0.97464
[28] validation-logloss:0.28610 validation-auc:0.97117 validation-aucpr:0.97456
[29] validation-logloss:0.28164 validation-auc:0.97111 validation-aucpr:0.97414
[30] validation-logloss:0.27684 validation-auc:0.97106 validation-aucpr:0.97412
[31] validation-logloss:0.27215 validation-auc:0.97132 validation-aucpr:0.97480
[32] validation-logloss:0.26796 validation-auc:0.97117 validation-aucpr:0.97463
[33] validation-logloss:0.26380 validation-auc:0.97129 validation-aucpr:0.97440
[34] validation-logloss:0.26036 validation-auc:0.97126 validation-aucpr:0.97406
[35] validation-logloss:0.25640 validation-auc:0.97169 validation-aucpr:0.97601
[36] validation-logloss:0.25290 validation-auc:0.97174 validation-aucpr:0.97606
[37] validation-logloss:0.25002 validation-auc:0.97170 validation-aucpr:0.97604
[38] validation-logloss:0.24672 validation-auc:0.97181 validation-aucpr:0.97614
[39] validation-logloss:0.24370 validation-auc:0.97194 validation-aucpr:0.97623
[40] validation-logloss:0.24093 validation-auc:0.97204 validation-aucpr:0.97631
[41] validation-logloss:0.23842 validation-auc:0.97192 validation-aucpr:0.97622
[42] validation-logloss:0.23582 validation-auc:0.97196 validation-aucpr:0.97626
{'best_iteration': '40', 'best_score': '0.976311607155693'}
Trial 56, Fold 4: Log loss = 0.23582202103210947, Average precision = 0.9762461607065359, ROC-AUC = 0.9719560624810738, Elapsed Time = 5.414685199997621 seconds
Trial 56, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 56, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.66050 validation-auc:0.94956 validation-aucpr:0.95245
[1] validation-logloss:0.63061 validation-auc:0.96168 validation-aucpr:0.95996
[2] validation-logloss:0.60337 validation-auc:0.96329 validation-aucpr:0.96246
[3] validation-logloss:0.57825 validation-auc:0.96536 validation-aucpr:0.96247
[4] validation-logloss:0.55729 validation-auc:0.96621 validation-aucpr:0.97090
[5] validation-logloss:0.53588 validation-auc:0.96742 validation-aucpr:0.97180
[6] validation-logloss:0.51580 validation-auc:0.96789 validation-aucpr:0.97089
[7] validation-logloss:0.49737 validation-auc:0.96792 validation-aucpr:0.97075
[8] validation-logloss:0.47988 validation-auc:0.96871 validation-aucpr:0.97288
[9] validation-logloss:0.46348 validation-auc:0.96919 validation-aucpr:0.97326
[10] validation-logloss:0.44839 validation-auc:0.96926 validation-aucpr:0.97332
[11] validation-logloss:0.43432 validation-auc:0.96961 validation-aucpr:0.97355
[12] validation-logloss:0.42096 validation-auc:0.96998 validation-aucpr:0.97396
[13] validation-logloss:0.40859 validation-auc:0.96989 validation-aucpr:0.97392
[14] validation-logloss:0.39686 validation-auc:0.96989 validation-aucpr:0.97394
[15] validation-logloss:0.38587 validation-auc:0.97000 validation-aucpr:0.97402
[16] validation-logloss:0.37564 validation-auc:0.97010 validation-aucpr:0.97415
[17] validation-logloss:0.36579 validation-auc:0.97035 validation-aucpr:0.97421
[18] validation-logloss:0.35666 validation-auc:0.97053 validation-aucpr:0.97435
[19] validation-logloss:0.34802 validation-auc:0.97072 validation-aucpr:0.97437
[20] validation-logloss:0.33994 validation-auc:0.97070 validation-aucpr:0.97434
[21] validation-logloss:0.33198 validation-auc:0.97092 validation-aucpr:0.97448
[22] validation-logloss:0.32470 validation-auc:0.97096 validation-aucpr:0.97461
[23] validation-logloss:0.31881 validation-auc:0.97090 validation-aucpr:0.97437
[24] validation-logloss:0.31242 validation-auc:0.97088 validation-aucpr:0.97436
[25] validation-logloss:0.30611 validation-auc:0.97102 validation-aucpr:0.97448
[26] validation-logloss:0.30035 validation-auc:0.97099 validation-aucpr:0.97446
[27] validation-logloss:0.29486 validation-auc:0.97104 validation-aucpr:0.97450
[28] validation-logloss:0.28960 validation-auc:0.97102 validation-aucpr:0.97447
[29] validation-logloss:0.28492 validation-auc:0.97100 validation-aucpr:0.97447
[30] validation-logloss:0.28035 validation-auc:0.97099 validation-aucpr:0.97447
[31] validation-logloss:0.27613 validation-auc:0.97095 validation-aucpr:0.97438
[32] validation-logloss:0.27192 validation-auc:0.97097 validation-aucpr:0.97441
[33] validation-logloss:0.26798 validation-auc:0.97100 validation-aucpr:0.97444
[34] validation-logloss:0.26446 validation-auc:0.97106 validation-aucpr:0.97446
[35] validation-logloss:0.26090 validation-auc:0.97103 validation-aucpr:0.97443
[36] validation-logloss:0.25730 validation-auc:0.97112 validation-aucpr:0.97453
[37] validation-logloss:0.25402 validation-auc:0.97119 validation-aucpr:0.97457
[38] validation-logloss:0.25090 validation-auc:0.97118 validation-aucpr:0.97456
[39] validation-logloss:0.24801 validation-auc:0.97112 validation-aucpr:0.97451
[40] validation-logloss:0.24531 validation-auc:0.97115 validation-aucpr:0.97454
[41] validation-logloss:0.24258 validation-auc:0.97119 validation-aucpr:0.97457
[42] validation-logloss:0.24017 validation-auc:0.97121 validation-aucpr:0.97457
{'best_iteration': '22', 'best_score': '0.9746072960651041'}
Trial 56, Fold 5: Log loss = 0.24016825303238032, Average precision = 0.9744614167127172, ROC-AUC = 0.9712105233821972, Elapsed Time = 5.48138789999939 seconds
Optimization Progress: 57%|#####6 | 57/100 [2:55:22<42:08, 58.81s/it]
Trial 57, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 57, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.67453 validation-auc:0.92966 validation-aucpr:0.92412
[1] validation-logloss:0.65734 validation-auc:0.94220 validation-aucpr:0.94079
[2] validation-logloss:0.64190 validation-auc:0.94381 validation-aucpr:0.94793
[3] validation-logloss:0.62513 validation-auc:0.95034 validation-aucpr:0.95588
[4] validation-logloss:0.61047 validation-auc:0.95093 validation-aucpr:0.95709
[5] validation-logloss:0.59542 validation-auc:0.95285 validation-aucpr:0.95884
[6] validation-logloss:0.58235 validation-auc:0.95329 validation-aucpr:0.95917
[7] validation-logloss:0.56761 validation-auc:0.95724 validation-aucpr:0.96378
[8] validation-logloss:0.55330 validation-auc:0.95864 validation-aucpr:0.96548
[9] validation-logloss:0.54248 validation-auc:0.95811 validation-aucpr:0.96489
[10] validation-logloss:0.53158 validation-auc:0.95876 validation-aucpr:0.96530
[11] validation-logloss:0.51977 validation-auc:0.95895 validation-aucpr:0.96559
[12] validation-logloss:0.50953 validation-auc:0.95900 validation-aucpr:0.96570
[13] validation-logloss:0.50093 validation-auc:0.95857 validation-aucpr:0.96533
[14] validation-logloss:0.49037 validation-auc:0.95885 validation-aucpr:0.96570
[15] validation-logloss:0.48179 validation-auc:0.95892 validation-aucpr:0.96568
[16] validation-logloss:0.47358 validation-auc:0.95888 validation-aucpr:0.96558
[17] validation-logloss:0.46601 validation-auc:0.95864 validation-aucpr:0.96541
[18] validation-logloss:0.45761 validation-auc:0.95848 validation-aucpr:0.96536
[19] validation-logloss:0.44815 validation-auc:0.95907 validation-aucpr:0.96591
[20] validation-logloss:0.43952 validation-auc:0.95952 validation-aucpr:0.96641
[21] validation-logloss:0.43300 validation-auc:0.95957 validation-aucpr:0.96643
[22] validation-logloss:0.42651 validation-auc:0.95975 validation-aucpr:0.96658
[23] validation-logloss:0.41861 validation-auc:0.95982 validation-aucpr:0.96674
[24] validation-logloss:0.41035 validation-auc:0.96038 validation-aucpr:0.96730
{'best_iteration': '24', 'best_score': '0.967297137732974'}
Trial 57, Fold 1: Log loss = 0.41034771616691573, Average precision = 0.9673015836050309, ROC-AUC = 0.9603838517992763, Elapsed Time = 1.044937900001969 seconds
Trial 57, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 57, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.67493 validation-auc:0.92659 validation-aucpr:0.91603
[1] validation-logloss:0.65440 validation-auc:0.95558 validation-aucpr:0.96034
[2] validation-logloss:0.63804 validation-auc:0.95563 validation-aucpr:0.96028
[3] validation-logloss:0.62193 validation-auc:0.95796 validation-aucpr:0.96183
[4] validation-logloss:0.60719 validation-auc:0.95712 validation-aucpr:0.96109
[5] validation-logloss:0.59324 validation-auc:0.95693 validation-aucpr:0.96071
[6] validation-logloss:0.57882 validation-auc:0.95797 validation-aucpr:0.96157
[7] validation-logloss:0.56650 validation-auc:0.95736 validation-aucpr:0.96121
[8] validation-logloss:0.55490 validation-auc:0.95750 validation-aucpr:0.96111
[9] validation-logloss:0.54254 validation-auc:0.95863 validation-aucpr:0.96241
[10] validation-logloss:0.53208 validation-auc:0.95793 validation-aucpr:0.96168
[11] validation-logloss:0.51964 validation-auc:0.95966 validation-aucpr:0.96375
[12] validation-logloss:0.50705 validation-auc:0.96118 validation-aucpr:0.96578
[13] validation-logloss:0.49555 validation-auc:0.96162 validation-aucpr:0.96616
[14] validation-logloss:0.48666 validation-auc:0.96141 validation-aucpr:0.96585
[15] validation-logloss:0.47513 validation-auc:0.96223 validation-aucpr:0.96669
[16] validation-logloss:0.46474 validation-auc:0.96290 validation-aucpr:0.96736
[17] validation-logloss:0.45657 validation-auc:0.96301 validation-aucpr:0.96737
[18] validation-logloss:0.44932 validation-auc:0.96257 validation-aucpr:0.96691
[19] validation-logloss:0.44048 validation-auc:0.96280 validation-aucpr:0.96714
[20] validation-logloss:0.43353 validation-auc:0.96272 validation-aucpr:0.96704
[21] validation-logloss:0.42738 validation-auc:0.96251 validation-aucpr:0.96683
[22] validation-logloss:0.42151 validation-auc:0.96242 validation-aucpr:0.96672
[23] validation-logloss:0.41597 validation-auc:0.96235 validation-aucpr:0.96664
[24] validation-logloss:0.41013 validation-auc:0.96245 validation-aucpr:0.96669
{'best_iteration': '17', 'best_score': '0.9673667641165824'}
Trial 57, Fold 2: Log loss = 0.4101304550323175, Average precision = 0.9666330371412952, ROC-AUC = 0.9624493276207374, Elapsed Time = 1.3475201000001107 seconds
Trial 57, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 57, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.67449 validation-auc:0.93199 validation-aucpr:0.92687
[1] validation-logloss:0.65477 validation-auc:0.95745 validation-aucpr:0.96278
[2] validation-logloss:0.63839 validation-auc:0.95725 validation-aucpr:0.96317
[3] validation-logloss:0.62300 validation-auc:0.95666 validation-aucpr:0.96261
[4] validation-logloss:0.60752 validation-auc:0.95706 validation-aucpr:0.96320
[5] validation-logloss:0.59387 validation-auc:0.95680 validation-aucpr:0.96336
[6] validation-logloss:0.58044 validation-auc:0.95810 validation-aucpr:0.96379
[7] validation-logloss:0.56757 validation-auc:0.95870 validation-aucpr:0.96417
[8] validation-logloss:0.55673 validation-auc:0.95799 validation-aucpr:0.96354
[9] validation-logloss:0.54440 validation-auc:0.95855 validation-aucpr:0.96395
[10] validation-logloss:0.53330 validation-auc:0.95814 validation-aucpr:0.96352
[11] validation-logloss:0.52290 validation-auc:0.95761 validation-aucpr:0.96295
[12] validation-logloss:0.51239 validation-auc:0.95829 validation-aucpr:0.96379
[13] validation-logloss:0.50305 validation-auc:0.95823 validation-aucpr:0.96375
[14] validation-logloss:0.49428 validation-auc:0.95778 validation-aucpr:0.96334
[15] validation-logloss:0.48541 validation-auc:0.95800 validation-aucpr:0.96341
[16] validation-logloss:0.47637 validation-auc:0.95854 validation-aucpr:0.96369
[17] validation-logloss:0.46880 validation-auc:0.95807 validation-aucpr:0.96322
[18] validation-logloss:0.46171 validation-auc:0.95788 validation-aucpr:0.96306
[19] validation-logloss:0.45486 validation-auc:0.95767 validation-aucpr:0.96290
[20] validation-logloss:0.44828 validation-auc:0.95755 validation-aucpr:0.96278
[21] validation-logloss:0.43986 validation-auc:0.95857 validation-aucpr:0.96410
[22] validation-logloss:0.43406 validation-auc:0.95835 validation-aucpr:0.96395
[23] validation-logloss:0.42816 validation-auc:0.95807 validation-aucpr:0.96371
[24] validation-logloss:0.42208 validation-auc:0.95836 validation-aucpr:0.96387
{'best_iteration': '7', 'best_score': '0.9641735778234024'}
Trial 57, Fold 3: Log loss = 0.42207755835198346, Average precision = 0.9638774698876118, ROC-AUC = 0.9583618753069894, Elapsed Time = 1.2980142000014894 seconds
Trial 57, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 57, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.67487 validation-auc:0.92848 validation-aucpr:0.92519
[1] validation-logloss:0.65758 validation-auc:0.93797 validation-aucpr:0.93847
[2] validation-logloss:0.64007 validation-auc:0.95223 validation-aucpr:0.95651
[3] validation-logloss:0.62384 validation-auc:0.95442 validation-aucpr:0.96065
[4] validation-logloss:0.60735 validation-auc:0.95637 validation-aucpr:0.96316
[5] validation-logloss:0.59402 validation-auc:0.95596 validation-aucpr:0.96283
[6] validation-logloss:0.58079 validation-auc:0.95533 validation-aucpr:0.96204
[7] validation-logloss:0.56572 validation-auc:0.95708 validation-aucpr:0.96393
[8] validation-logloss:0.55312 validation-auc:0.95708 validation-aucpr:0.96397
[9] validation-logloss:0.53975 validation-auc:0.95803 validation-aucpr:0.96481
[10] validation-logloss:0.52881 validation-auc:0.95869 validation-aucpr:0.96522
[11] validation-logloss:0.51717 validation-auc:0.95914 validation-aucpr:0.96566
[12] validation-logloss:0.50687 validation-auc:0.95932 validation-aucpr:0.96581
[13] validation-logloss:0.49786 validation-auc:0.95874 validation-aucpr:0.96533
[14] validation-logloss:0.48824 validation-auc:0.95920 validation-aucpr:0.96564
[15] validation-logloss:0.47747 validation-auc:0.96011 validation-aucpr:0.96646
[16] validation-logloss:0.46911 validation-auc:0.96021 validation-aucpr:0.96650
[17] validation-logloss:0.46103 validation-auc:0.96070 validation-aucpr:0.96690
[18] validation-logloss:0.45382 validation-auc:0.96053 validation-aucpr:0.96667
[19] validation-logloss:0.44491 validation-auc:0.96111 validation-aucpr:0.96724
[20] validation-logloss:0.43808 validation-auc:0.96107 validation-aucpr:0.96717
[21] validation-logloss:0.42941 validation-auc:0.96140 validation-aucpr:0.96752
[22] validation-logloss:0.42274 validation-auc:0.96152 validation-aucpr:0.96760
[23] validation-logloss:0.41489 validation-auc:0.96189 validation-aucpr:0.96801
[24] validation-logloss:0.40935 validation-auc:0.96188 validation-aucpr:0.96799
{'best_iteration': '23', 'best_score': '0.9680064618656231'}
Trial 57, Fold 4: Log loss = 0.409350621791954, Average precision = 0.9679871158277571, ROC-AUC = 0.9618803664122502, Elapsed Time = 1.265817600000446 seconds
Trial 57, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 57, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.67495 validation-auc:0.91969 validation-aucpr:0.90610
[1] validation-logloss:0.65813 validation-auc:0.93588 validation-aucpr:0.93828
[2] validation-logloss:0.64248 validation-auc:0.93797 validation-aucpr:0.94127
[3] validation-logloss:0.62754 validation-auc:0.93957 validation-aucpr:0.94455
[4] validation-logloss:0.61008 validation-auc:0.94994 validation-aucpr:0.95687
[5] validation-logloss:0.59344 validation-auc:0.95363 validation-aucpr:0.96018
[6] validation-logloss:0.58120 validation-auc:0.95313 validation-aucpr:0.96021
[7] validation-logloss:0.56863 validation-auc:0.95286 validation-aucpr:0.96003
[8] validation-logloss:0.55453 validation-auc:0.95443 validation-aucpr:0.96131
[9] validation-logloss:0.54319 validation-auc:0.95455 validation-aucpr:0.96132
[10] validation-logloss:0.53230 validation-auc:0.95490 validation-aucpr:0.96160
[11] validation-logloss:0.52226 validation-auc:0.95486 validation-aucpr:0.96151
[12] validation-logloss:0.51321 validation-auc:0.95474 validation-aucpr:0.96134
[13] validation-logloss:0.50445 validation-auc:0.95453 validation-aucpr:0.96117
[14] validation-logloss:0.49649 validation-auc:0.95444 validation-aucpr:0.96145
[15] validation-logloss:0.48890 validation-auc:0.95432 validation-aucpr:0.96129
[16] validation-logloss:0.48053 validation-auc:0.95437 validation-aucpr:0.96124
[17] validation-logloss:0.47301 validation-auc:0.95430 validation-aucpr:0.96114
[18] validation-logloss:0.46379 validation-auc:0.95513 validation-aucpr:0.96189
[19] validation-logloss:0.45706 validation-auc:0.95482 validation-aucpr:0.96159
[20] validation-logloss:0.44790 validation-auc:0.95561 validation-aucpr:0.96244
[21] validation-logloss:0.44038 validation-auc:0.95607 validation-aucpr:0.96272
[22] validation-logloss:0.43436 validation-auc:0.95596 validation-aucpr:0.96254
[23] validation-logloss:0.42855 validation-auc:0.95596 validation-aucpr:0.96258
[24] validation-logloss:0.42321 validation-auc:0.95581 validation-aucpr:0.96247
{'best_iteration': '21', 'best_score': '0.9627214154090976'}
Trial 57, Fold 5: Log loss = 0.42320609848369695, Average precision = 0.9624687066772707, ROC-AUC = 0.9558063281496758, Elapsed Time = 1.2785884000004444 seconds
Optimization Progress: 58%|#####8 | 58/100 [2:55:36<31:44, 45.34s/it]
Trial 58, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 58, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.68570 validation-auc:0.91254 validation-aucpr:0.88376
[1] validation-logloss:0.67832 validation-auc:0.94100 validation-aucpr:0.94213
[2] validation-logloss:0.67010 validation-auc:0.95528 validation-aucpr:0.95899
[3] validation-logloss:0.66274 validation-auc:0.95750 validation-aucpr:0.96288
[4] validation-logloss:0.65580 validation-auc:0.95827 validation-aucpr:0.96318
[5] validation-logloss:0.64802 validation-auc:0.96097 validation-aucpr:0.96472
[6] validation-logloss:0.64046 validation-auc:0.96242 validation-aucpr:0.96642
[7] validation-logloss:0.63304 validation-auc:0.96390 validation-aucpr:0.96787
[8] validation-logloss:0.62653 validation-auc:0.96393 validation-aucpr:0.96765
[9] validation-logloss:0.61939 validation-auc:0.96481 validation-aucpr:0.96922
[10] validation-logloss:0.61221 validation-auc:0.96560 validation-aucpr:0.96988
[11] validation-logloss:0.60527 validation-auc:0.96607 validation-aucpr:0.97025
[12] validation-logloss:0.59849 validation-auc:0.96652 validation-aucpr:0.97046
[13] validation-logloss:0.59273 validation-auc:0.96653 validation-aucpr:0.97034
[14] validation-logloss:0.58714 validation-auc:0.96627 validation-aucpr:0.97131
[15] validation-logloss:0.58070 validation-auc:0.96658 validation-aucpr:0.97166
[16] validation-logloss:0.57518 validation-auc:0.96624 validation-aucpr:0.97137
[17] validation-logloss:0.56915 validation-auc:0.96639 validation-aucpr:0.97148
[18] validation-logloss:0.56409 validation-auc:0.96615 validation-aucpr:0.97130
[19] validation-logloss:0.55798 validation-auc:0.96654 validation-aucpr:0.97166
[20] validation-logloss:0.55210 validation-auc:0.96683 validation-aucpr:0.97193
[21] validation-logloss:0.54720 validation-auc:0.96682 validation-aucpr:0.97183
[22] validation-logloss:0.54187 validation-auc:0.96674 validation-aucpr:0.97200
[23] validation-logloss:0.53698 validation-auc:0.96655 validation-aucpr:0.97188
[24] validation-logloss:0.53140 validation-auc:0.96689 validation-aucpr:0.97218
[25] validation-logloss:0.52682 validation-auc:0.96677 validation-aucpr:0.97205
[26] validation-logloss:0.52235 validation-auc:0.96681 validation-aucpr:0.97206
[27] validation-logloss:0.51763 validation-auc:0.96699 validation-aucpr:0.97226
[28] validation-logloss:0.51317 validation-auc:0.96691 validation-aucpr:0.97222
[29] validation-logloss:0.50894 validation-auc:0.96687 validation-aucpr:0.97215
[30] validation-logloss:0.50456 validation-auc:0.96687 validation-aucpr:0.97213
[31] validation-logloss:0.50034 validation-auc:0.96709 validation-aucpr:0.97229
[32] validation-logloss:0.49615 validation-auc:0.96708 validation-aucpr:0.97226
[33] validation-logloss:0.49220 validation-auc:0.96704 validation-aucpr:0.97221
[34] validation-logloss:0.48831 validation-auc:0.96701 validation-aucpr:0.97217
[35] validation-logloss:0.48379 validation-auc:0.96724 validation-aucpr:0.97236
[36] validation-logloss:0.47931 validation-auc:0.96735 validation-aucpr:0.97245
[37] validation-logloss:0.47550 validation-auc:0.96747 validation-aucpr:0.97251
[38] validation-logloss:0.47186 validation-auc:0.96742 validation-aucpr:0.97245
[39] validation-logloss:0.46818 validation-auc:0.96740 validation-aucpr:0.97243
[40] validation-logloss:0.46391 validation-auc:0.96767 validation-aucpr:0.97264
[41] validation-logloss:0.45976 validation-auc:0.96776 validation-aucpr:0.97275
[42] validation-logloss:0.45624 validation-auc:0.96792 validation-aucpr:0.97283
[43] validation-logloss:0.45283 validation-auc:0.96791 validation-aucpr:0.97281
[44] validation-logloss:0.44912 validation-auc:0.96793 validation-aucpr:0.97287
[45] validation-logloss:0.44534 validation-auc:0.96811 validation-aucpr:0.97302
[46] validation-logloss:0.44157 validation-auc:0.96822 validation-aucpr:0.97313
[47] validation-logloss:0.43845 validation-auc:0.96814 validation-aucpr:0.97304
[48] validation-logloss:0.43543 validation-auc:0.96803 validation-aucpr:0.97294
[49] validation-logloss:0.43252 validation-auc:0.96797 validation-aucpr:0.97286
[50] validation-logloss:0.42945 validation-auc:0.96801 validation-aucpr:0.97288
[51] validation-logloss:0.42646 validation-auc:0.96806 validation-aucpr:0.97293
{'best_iteration': '46', 'best_score': '0.9731310062419574'}
Trial 58, Fold 1: Log loss = 0.4264604090034749, Average precision = 0.9729299750562228, ROC-AUC = 0.9680568848788135, Elapsed Time = 9.525947300000553 seconds
Trial 58, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 58, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.68512 validation-auc:0.92130 validation-aucpr:0.88875
[1] validation-logloss:0.67730 validation-auc:0.95032 validation-aucpr:0.94661
[2] validation-logloss:0.66981 validation-auc:0.95501 validation-aucpr:0.95575
[3] validation-logloss:0.66258 validation-auc:0.95662 validation-aucpr:0.95999
[4] validation-logloss:0.65461 validation-auc:0.96271 validation-aucpr:0.96636
[5] validation-logloss:0.64768 validation-auc:0.96284 validation-aucpr:0.96632
[6] validation-logloss:0.64000 validation-auc:0.96516 validation-aucpr:0.96887
[7] validation-logloss:0.63255 validation-auc:0.96619 validation-aucpr:0.96992
[8] validation-logloss:0.62583 validation-auc:0.96630 validation-aucpr:0.96990
[9] validation-logloss:0.61871 validation-auc:0.96692 validation-aucpr:0.97051
[10] validation-logloss:0.61167 validation-auc:0.96710 validation-aucpr:0.97068
[11] validation-logloss:0.60537 validation-auc:0.96752 validation-aucpr:0.97093
[12] validation-logloss:0.59870 validation-auc:0.96786 validation-aucpr:0.97133
[13] validation-logloss:0.59291 validation-auc:0.96797 validation-aucpr:0.97130
[14] validation-logloss:0.58710 validation-auc:0.96825 validation-aucpr:0.97152
[15] validation-logloss:0.58073 validation-auc:0.96827 validation-aucpr:0.97163
[16] validation-logloss:0.57506 validation-auc:0.96863 validation-aucpr:0.97190
[17] validation-logloss:0.56945 validation-auc:0.96874 validation-aucpr:0.97196
[18] validation-logloss:0.56390 validation-auc:0.96882 validation-aucpr:0.97196
[19] validation-logloss:0.55861 validation-auc:0.96877 validation-aucpr:0.97190
[20] validation-logloss:0.55339 validation-auc:0.96879 validation-aucpr:0.97189
[21] validation-logloss:0.54818 validation-auc:0.96895 validation-aucpr:0.97195
[22] validation-logloss:0.54327 validation-auc:0.96897 validation-aucpr:0.97191
[23] validation-logloss:0.53834 validation-auc:0.96912 validation-aucpr:0.97197
[24] validation-logloss:0.53287 validation-auc:0.96938 validation-aucpr:0.97223
[25] validation-logloss:0.52752 validation-auc:0.96982 validation-aucpr:0.97265
[26] validation-logloss:0.52288 validation-auc:0.96971 validation-aucpr:0.97254
[27] validation-logloss:0.51762 validation-auc:0.96990 validation-aucpr:0.97270
[28] validation-logloss:0.51240 validation-auc:0.97010 validation-aucpr:0.97293
[29] validation-logloss:0.50791 validation-auc:0.97006 validation-aucpr:0.97290
[30] validation-logloss:0.50298 validation-auc:0.97012 validation-aucpr:0.97299
[31] validation-logloss:0.49875 validation-auc:0.97004 validation-aucpr:0.97289
[32] validation-logloss:0.49444 validation-auc:0.97015 validation-aucpr:0.97295
[33] validation-logloss:0.49041 validation-auc:0.97011 validation-aucpr:0.97292
[34] validation-logloss:0.48641 validation-auc:0.97007 validation-aucpr:0.97287
[35] validation-logloss:0.48178 validation-auc:0.97025 validation-aucpr:0.97306
[36] validation-logloss:0.47736 validation-auc:0.97034 validation-aucpr:0.97317
[37] validation-logloss:0.47303 validation-auc:0.97040 validation-aucpr:0.97325
[38] validation-logloss:0.46936 validation-auc:0.97053 validation-aucpr:0.97332
[39] validation-logloss:0.46493 validation-auc:0.97069 validation-aucpr:0.97348
[40] validation-logloss:0.46083 validation-auc:0.97075 validation-aucpr:0.97341
[41] validation-logloss:0.45732 validation-auc:0.97073 validation-aucpr:0.97339
[42] validation-logloss:0.45385 validation-auc:0.97080 validation-aucpr:0.97343
[43] validation-logloss:0.44977 validation-auc:0.97095 validation-aucpr:0.97355
[44] validation-logloss:0.44637 validation-auc:0.97099 validation-aucpr:0.97360
[45] validation-logloss:0.44316 validation-auc:0.97096 validation-aucpr:0.97357
[46] validation-logloss:0.43938 validation-auc:0.97102 validation-aucpr:0.97363
[47] validation-logloss:0.43614 validation-auc:0.97108 validation-aucpr:0.97366
[48] validation-logloss:0.43322 validation-auc:0.97106 validation-aucpr:0.97364
[49] validation-logloss:0.43010 validation-auc:0.97108 validation-aucpr:0.97367
[50] validation-logloss:0.42656 validation-auc:0.97108 validation-aucpr:0.97377
[51] validation-logloss:0.42303 validation-auc:0.97118 validation-aucpr:0.97386
{'best_iteration': '51', 'best_score': '0.9738633077380496'}
Trial 58, Fold 2: Log loss = 0.4230341057484318, Average precision = 0.9737971510136811, ROC-AUC = 0.9711782396103601, Elapsed Time = 9.858536900002946 seconds
Trial 58, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 58, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.68538 validation-auc:0.91583 validation-aucpr:0.87047
[1] validation-logloss:0.67757 validation-auc:0.94953 validation-aucpr:0.94006
[2] validation-logloss:0.67011 validation-auc:0.95516 validation-aucpr:0.95680
[3] validation-logloss:0.66266 validation-auc:0.95872 validation-aucpr:0.96160
[4] validation-logloss:0.65549 validation-auc:0.96005 validation-aucpr:0.96284
[5] validation-logloss:0.64748 validation-auc:0.96583 validation-aucpr:0.96858
[6] validation-logloss:0.63974 validation-auc:0.96826 validation-aucpr:0.97116
[7] validation-logloss:0.63209 validation-auc:0.96895 validation-aucpr:0.97183
[8] validation-logloss:0.62452 validation-auc:0.96928 validation-aucpr:0.97219
[9] validation-logloss:0.61846 validation-auc:0.96915 validation-aucpr:0.97346
[10] validation-logloss:0.61129 validation-auc:0.96941 validation-aucpr:0.97366
[11] validation-logloss:0.60511 validation-auc:0.96930 validation-aucpr:0.97353
[12] validation-logloss:0.59910 validation-auc:0.96904 validation-aucpr:0.97321
[13] validation-logloss:0.59231 validation-auc:0.96930 validation-aucpr:0.97346
[14] validation-logloss:0.58639 validation-auc:0.96944 validation-aucpr:0.97356
[15] validation-logloss:0.57991 validation-auc:0.96976 validation-aucpr:0.97383
[16] validation-logloss:0.57415 validation-auc:0.96973 validation-aucpr:0.97378
[17] validation-logloss:0.56786 validation-auc:0.96999 validation-aucpr:0.97409
[18] validation-logloss:0.56194 validation-auc:0.97000 validation-aucpr:0.97415
[19] validation-logloss:0.55605 validation-auc:0.97000 validation-aucpr:0.97419
[20] validation-logloss:0.55078 validation-auc:0.97018 validation-aucpr:0.97428
[21] validation-logloss:0.54561 validation-auc:0.97017 validation-aucpr:0.97424
[22] validation-logloss:0.54077 validation-auc:0.97004 validation-aucpr:0.97412
[23] validation-logloss:0.53579 validation-auc:0.97005 validation-aucpr:0.97409
[24] validation-logloss:0.53103 validation-auc:0.97013 validation-aucpr:0.97415
[25] validation-logloss:0.52574 validation-auc:0.97013 validation-aucpr:0.97417
[26] validation-logloss:0.52037 validation-auc:0.97022 validation-aucpr:0.97427
[27] validation-logloss:0.51515 validation-auc:0.97041 validation-aucpr:0.97444
[28] validation-logloss:0.51076 validation-auc:0.97038 validation-aucpr:0.97440
[29] validation-logloss:0.50583 validation-auc:0.97034 validation-aucpr:0.97438
[30] validation-logloss:0.50151 validation-auc:0.97045 validation-aucpr:0.97444
[31] validation-logloss:0.49740 validation-auc:0.97049 validation-aucpr:0.97445
[32] validation-logloss:0.49316 validation-auc:0.97048 validation-aucpr:0.97443
[33] validation-logloss:0.48905 validation-auc:0.97042 validation-aucpr:0.97438
[34] validation-logloss:0.48519 validation-auc:0.97035 validation-aucpr:0.97432
[35] validation-logloss:0.48063 validation-auc:0.97034 validation-aucpr:0.97431
[36] validation-logloss:0.47618 validation-auc:0.97038 validation-aucpr:0.97439
[37] validation-logloss:0.47175 validation-auc:0.97047 validation-aucpr:0.97446
[38] validation-logloss:0.46818 validation-auc:0.97036 validation-aucpr:0.97435
[39] validation-logloss:0.46417 validation-auc:0.97042 validation-aucpr:0.97441
[40] validation-logloss:0.45997 validation-auc:0.97047 validation-aucpr:0.97446
[41] validation-logloss:0.45604 validation-auc:0.97044 validation-aucpr:0.97448
[42] validation-logloss:0.45257 validation-auc:0.97037 validation-aucpr:0.97442
[43] validation-logloss:0.44853 validation-auc:0.97052 validation-aucpr:0.97455
[44] validation-logloss:0.44523 validation-auc:0.97049 validation-aucpr:0.97452
[45] validation-logloss:0.44188 validation-auc:0.97054 validation-aucpr:0.97461
[46] validation-logloss:0.43879 validation-auc:0.97045 validation-aucpr:0.97452
[47] validation-logloss:0.43504 validation-auc:0.97047 validation-aucpr:0.97455
[48] validation-logloss:0.43136 validation-auc:0.97057 validation-aucpr:0.97464
[49] validation-logloss:0.42786 validation-auc:0.97061 validation-aucpr:0.97467
[50] validation-logloss:0.42488 validation-auc:0.97049 validation-aucpr:0.97463
[51] validation-logloss:0.42190 validation-auc:0.97047 validation-aucpr:0.97458
{'best_iteration': '49', 'best_score': '0.9746702082923872'}
Trial 58, Fold 3: Log loss = 0.4218982546432991, Average precision = 0.9745859170622088, ROC-AUC = 0.9704738047885506, Elapsed Time = 9.829640899999504 seconds
Trial 58, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 58, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.68537 validation-auc:0.90713 validation-aucpr:0.86614
[1] validation-logloss:0.67684 validation-auc:0.94933 validation-aucpr:0.94181
[2] validation-logloss:0.66957 validation-auc:0.95704 validation-aucpr:0.95876
[3] validation-logloss:0.66130 validation-auc:0.96109 validation-aucpr:0.96457
[4] validation-logloss:0.65334 validation-auc:0.96373 validation-aucpr:0.96961
[5] validation-logloss:0.64557 validation-auc:0.96458 validation-aucpr:0.97034
[6] validation-logloss:0.63877 validation-auc:0.96470 validation-aucpr:0.97057
[7] validation-logloss:0.63130 validation-auc:0.96560 validation-aucpr:0.97136
[8] validation-logloss:0.62498 validation-auc:0.96554 validation-aucpr:0.97123
[9] validation-logloss:0.61895 validation-auc:0.96544 validation-aucpr:0.97113
[10] validation-logloss:0.61251 validation-auc:0.96604 validation-aucpr:0.97154
[11] validation-logloss:0.60555 validation-auc:0.96629 validation-aucpr:0.97177
[12] validation-logloss:0.59958 validation-auc:0.96604 validation-aucpr:0.97152
[13] validation-logloss:0.59275 validation-auc:0.96677 validation-aucpr:0.97212
[14] validation-logloss:0.58626 validation-auc:0.96713 validation-aucpr:0.97245
[15] validation-logloss:0.57995 validation-auc:0.96738 validation-aucpr:0.97267
[16] validation-logloss:0.57356 validation-auc:0.96791 validation-aucpr:0.97308
[17] validation-logloss:0.56808 validation-auc:0.96792 validation-aucpr:0.97308
[18] validation-logloss:0.56204 validation-auc:0.96814 validation-aucpr:0.97327
[19] validation-logloss:0.55603 validation-auc:0.96833 validation-aucpr:0.97345
[20] validation-logloss:0.55038 validation-auc:0.96842 validation-aucpr:0.97350
[21] validation-logloss:0.54470 validation-auc:0.96841 validation-aucpr:0.97353
[22] validation-logloss:0.53960 validation-auc:0.96851 validation-aucpr:0.97363
[23] validation-logloss:0.53410 validation-auc:0.96854 validation-aucpr:0.97370
[24] validation-logloss:0.52937 validation-auc:0.96854 validation-aucpr:0.97366
[25] validation-logloss:0.52409 validation-auc:0.96849 validation-aucpr:0.97365
[26] validation-logloss:0.51938 validation-auc:0.96853 validation-aucpr:0.97369
[27] validation-logloss:0.51480 validation-auc:0.96865 validation-aucpr:0.97377
[28] validation-logloss:0.51045 validation-auc:0.96860 validation-aucpr:0.97370
[29] validation-logloss:0.50556 validation-auc:0.96872 validation-aucpr:0.97380
[30] validation-logloss:0.50079 validation-auc:0.96884 validation-aucpr:0.97391
[31] validation-logloss:0.49589 validation-auc:0.96911 validation-aucpr:0.97413
[32] validation-logloss:0.49123 validation-auc:0.96919 validation-aucpr:0.97420
[33] validation-logloss:0.48665 validation-auc:0.96928 validation-aucpr:0.97429
[34] validation-logloss:0.48258 validation-auc:0.96937 validation-aucpr:0.97435
[35] validation-logloss:0.47814 validation-auc:0.96952 validation-aucpr:0.97444
[36] validation-logloss:0.47449 validation-auc:0.96946 validation-aucpr:0.97440
[37] validation-logloss:0.47055 validation-auc:0.96950 validation-aucpr:0.97441
[38] validation-logloss:0.46635 validation-auc:0.96958 validation-aucpr:0.97447
[39] validation-logloss:0.46288 validation-auc:0.96941 validation-aucpr:0.97435
[40] validation-logloss:0.45874 validation-auc:0.96945 validation-aucpr:0.97439
[41] validation-logloss:0.45467 validation-auc:0.96958 validation-aucpr:0.97452
[42] validation-logloss:0.45072 validation-auc:0.96965 validation-aucpr:0.97457
[43] validation-logloss:0.44691 validation-auc:0.96975 validation-aucpr:0.97463
[44] validation-logloss:0.44304 validation-auc:0.96987 validation-aucpr:0.97471
[45] validation-logloss:0.43926 validation-auc:0.96996 validation-aucpr:0.97477
[46] validation-logloss:0.43587 validation-auc:0.97009 validation-aucpr:0.97485
[47] validation-logloss:0.43221 validation-auc:0.97011 validation-aucpr:0.97486
[48] validation-logloss:0.42905 validation-auc:0.97013 validation-aucpr:0.97487
[49] validation-logloss:0.42549 validation-auc:0.97017 validation-aucpr:0.97490
[50] validation-logloss:0.42197 validation-auc:0.97032 validation-aucpr:0.97502
[51] validation-logloss:0.41858 validation-auc:0.97035 validation-aucpr:0.97505
{'best_iteration': '51', 'best_score': '0.9750538421258711'}
Trial 58, Fold 4: Log loss = 0.41858497123120914, Average precision = 0.9750528920264302, ROC-AUC = 0.9703511311639974, Elapsed Time = 10.098115500000858 seconds
Trial 58, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 58, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.68543 validation-auc:0.91494 validation-aucpr:0.88175
[1] validation-logloss:0.67780 validation-auc:0.94480 validation-aucpr:0.93673
[2] validation-logloss:0.66948 validation-auc:0.95846 validation-aucpr:0.95619
[3] validation-logloss:0.66231 validation-auc:0.95895 validation-aucpr:0.96348
[4] validation-logloss:0.65428 validation-auc:0.96190 validation-aucpr:0.96671
[5] validation-logloss:0.64748 validation-auc:0.96177 validation-aucpr:0.96650
[6] validation-logloss:0.63987 validation-auc:0.96323 validation-aucpr:0.96802
[7] validation-logloss:0.63319 validation-auc:0.96324 validation-aucpr:0.96798
[8] validation-logloss:0.62674 validation-auc:0.96331 validation-aucpr:0.96791
[9] validation-logloss:0.61949 validation-auc:0.96423 validation-aucpr:0.96881
[10] validation-logloss:0.61354 validation-auc:0.96399 validation-aucpr:0.96836
[11] validation-logloss:0.60681 validation-auc:0.96454 validation-aucpr:0.96887
[12] validation-logloss:0.60094 validation-auc:0.96416 validation-aucpr:0.96850
[13] validation-logloss:0.59417 validation-auc:0.96471 validation-aucpr:0.96909
[14] validation-logloss:0.58840 validation-auc:0.96490 validation-aucpr:0.96916
[15] validation-logloss:0.58220 validation-auc:0.96537 validation-aucpr:0.96948
[16] validation-logloss:0.57659 validation-auc:0.96532 validation-aucpr:0.96945
[17] validation-logloss:0.57114 validation-auc:0.96523 validation-aucpr:0.96981
[18] validation-logloss:0.56575 validation-auc:0.96513 validation-aucpr:0.96972
[19] validation-logloss:0.55976 validation-auc:0.96524 validation-aucpr:0.96988
[20] validation-logloss:0.55410 validation-auc:0.96527 validation-aucpr:0.96999
[21] validation-logloss:0.54920 validation-auc:0.96526 validation-aucpr:0.96989
[22] validation-logloss:0.54348 validation-auc:0.96570 validation-aucpr:0.97025
[23] validation-logloss:0.53790 validation-auc:0.96594 validation-aucpr:0.97046
[24] validation-logloss:0.53318 validation-auc:0.96590 validation-aucpr:0.97041
[25] validation-logloss:0.52790 validation-auc:0.96602 validation-aucpr:0.97055
[26] validation-logloss:0.52287 validation-auc:0.96604 validation-aucpr:0.97061
[27] validation-logloss:0.51843 validation-auc:0.96604 validation-aucpr:0.97059
[28] validation-logloss:0.51331 validation-auc:0.96634 validation-aucpr:0.97083
[29] validation-logloss:0.50931 validation-auc:0.96614 validation-aucpr:0.97066
[30] validation-logloss:0.50516 validation-auc:0.96605 validation-aucpr:0.97058
[31] validation-logloss:0.50083 validation-auc:0.96625 validation-aucpr:0.97074
[32] validation-logloss:0.49602 validation-auc:0.96653 validation-aucpr:0.97101
[33] validation-logloss:0.49224 validation-auc:0.96645 validation-aucpr:0.97093
[34] validation-logloss:0.48788 validation-auc:0.96638 validation-aucpr:0.97090
[35] validation-logloss:0.48425 validation-auc:0.96624 validation-aucpr:0.97077
[36] validation-logloss:0.47971 validation-auc:0.96647 validation-aucpr:0.97096
[37] validation-logloss:0.47543 validation-auc:0.96667 validation-aucpr:0.97138
[38] validation-logloss:0.47179 validation-auc:0.96665 validation-aucpr:0.97135
[39] validation-logloss:0.46751 validation-auc:0.96678 validation-aucpr:0.97147
[40] validation-logloss:0.46353 validation-auc:0.96667 validation-aucpr:0.97139
[41] validation-logloss:0.46004 validation-auc:0.96668 validation-aucpr:0.97141
[42] validation-logloss:0.45657 validation-auc:0.96677 validation-aucpr:0.97146
[43] validation-logloss:0.45256 validation-auc:0.96690 validation-aucpr:0.97159
[44] validation-logloss:0.44918 validation-auc:0.96696 validation-aucpr:0.97162
[45] validation-logloss:0.44598 validation-auc:0.96686 validation-aucpr:0.97155
[46] validation-logloss:0.44267 validation-auc:0.96679 validation-aucpr:0.97151
[47] validation-logloss:0.43958 validation-auc:0.96675 validation-aucpr:0.97147
[48] validation-logloss:0.43632 validation-auc:0.96680 validation-aucpr:0.97150
[49] validation-logloss:0.43273 validation-auc:0.96683 validation-aucpr:0.97155
[50] validation-logloss:0.42965 validation-auc:0.96686 validation-aucpr:0.97154
[51] validation-logloss:0.42667 validation-auc:0.96689 validation-aucpr:0.97162
{'best_iteration': '51', 'best_score': '0.9716200229220133'}
Trial 58, Fold 5: Log loss = 0.42666805150415726, Average precision = 0.9716291604384265, ROC-AUC = 0.9668851606190663, Elapsed Time = 9.674562900003366 seconds
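For reference, the `validation-logloss` and `validation-auc` columns in these logs are the standard binary log loss and ROC-AUC. A self-contained numpy-only sketch of both, on synthetic probabilities (not the notebook's data):

```python
import numpy as np

def binary_log_loss(y_true, p):
    # Clip probabilities to avoid log(0), as sklearn's log_loss does.
    p = np.clip(p, 1e-15, 1 - 1e-15)
    return float(-np.mean(y_true * np.log(p) + (1 - y_true) * np.log(1 - p)))

def roc_auc(y_true, scores):
    # Rank-based AUC: probability a random positive outranks a random negative.
    order = np.argsort(scores)
    ranks = np.empty(len(scores), dtype=float)
    ranks[order] = np.arange(1, len(scores) + 1)
    pos = y_true == 1
    n_pos, n_neg = pos.sum(), (~pos).sum()
    return float((ranks[pos].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg))

y = np.array([0, 0, 1, 1])
p = np.array([0.1, 0.4, 0.35, 0.8])
print(binary_log_loss(y, p))  # ≈ 0.4723
print(roc_auc(y, p))          # 0.75 (3 of 4 pos/neg pairs correctly ordered)
```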
Optimization Progress: 59%|#####8 | 59/100 [2:56:32<33:16, 48.70s/it]
Trial 59, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 59, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[20:55:27] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[0] validation-logloss:0.67899 validation-auc:0.92790 validation-aucpr:0.92841
[1]	validation-logloss:0.66477	validation-auc:0.94561	validation-aucpr:0.94981
[2]	validation-logloss:0.64949	validation-auc:0.95777	validation-aucpr:0.96367
[3]	validation-logloss:0.63710	validation-auc:0.95702	validation-aucpr:0.96296
[4]	validation-logloss:0.62566	validation-auc:0.95598	validation-aucpr:0.96186
[5]	validation-logloss:0.61183	validation-auc:0.95884	validation-aucpr:0.96522
[6]	validation-logloss:0.60139	validation-auc:0.95814	validation-aucpr:0.96473
[7]	validation-logloss:0.59172	validation-auc:0.95718	validation-aucpr:0.96394
[8]	validation-logloss:0.58145	validation-auc:0.95756	validation-aucpr:0.96419
[9]	validation-logloss:0.57007	validation-auc:0.95857	validation-aucpr:0.96552
[10]	validation-logloss:0.56014	validation-auc:0.95915	validation-aucpr:0.96575
[11]	validation-logloss:0.55085	validation-auc:0.95933	validation-aucpr:0.96571
[12]	validation-logloss:0.54272	validation-auc:0.95896	validation-aucpr:0.96524
[13]	validation-logloss:0.53227	validation-auc:0.95929	validation-aucpr:0.96572
[14]	validation-logloss:0.52406	validation-auc:0.95928	validation-aucpr:0.96576
[15]	validation-logloss:0.51473	validation-auc:0.95969	validation-aucpr:0.96618
[16]	validation-logloss:0.50589	validation-auc:0.95988	validation-aucpr:0.96641
[17]	validation-logloss:0.49852	validation-auc:0.95985	validation-aucpr:0.96647
[18]	validation-logloss:0.49187	validation-auc:0.95967	validation-aucpr:0.96628
[19]	validation-logloss:0.48482	validation-auc:0.95964	validation-aucpr:0.96614
[20]	validation-logloss:0.47792	validation-auc:0.96003	validation-aucpr:0.96645
[21]	validation-logloss:0.47136	validation-auc:0.96054	validation-aucpr:0.96680
[22]	validation-logloss:0.46562	validation-auc:0.96050	validation-aucpr:0.96668
[23]	validation-logloss:0.46021	validation-auc:0.96018	validation-aucpr:0.96640
[24]	validation-logloss:0.45432	validation-auc:0.96020	validation-aucpr:0.96644
[25]	validation-logloss:0.44804	validation-auc:0.96079	validation-aucpr:0.96692
[26]	validation-logloss:0.44086	validation-auc:0.96144	validation-aucpr:0.96750
[27]	validation-logloss:0.43622	validation-auc:0.96134	validation-aucpr:0.96740
[28]	validation-logloss:0.43010	validation-auc:0.96148	validation-aucpr:0.96760
[29]	validation-logloss:0.42475	validation-auc:0.96186	validation-aucpr:0.96790
[30]	validation-logloss:0.41999	validation-auc:0.96184	validation-aucpr:0.96786
[31]	validation-logloss:0.41374	validation-auc:0.96235	validation-aucpr:0.96831
[32]	validation-logloss:0.40748	validation-auc:0.96266	validation-aucpr:0.96861
[33]	validation-logloss:0.40208	validation-auc:0.96262	validation-aucpr:0.96865
[34]	validation-logloss:0.39824	validation-auc:0.96249	validation-aucpr:0.96855
[35]	validation-logloss:0.39417	validation-auc:0.96236	validation-aucpr:0.96841
[36]	validation-logloss:0.38962	validation-auc:0.96262	validation-aucpr:0.96860
[37]	validation-logloss:0.38586	validation-auc:0.96253	validation-aucpr:0.96854
[38]	validation-logloss:0.38239	validation-auc:0.96243	validation-aucpr:0.96848
[39]	validation-logloss:0.37886	validation-auc:0.96239	validation-aucpr:0.96840
[40]	validation-logloss:0.37507	validation-auc:0.96239	validation-aucpr:0.96841
[41]	validation-logloss:0.37205	validation-auc:0.96245	validation-aucpr:0.96844
[42]	validation-logloss:0.36889	validation-auc:0.96238	validation-aucpr:0.96837
[43]	validation-logloss:0.36558	validation-auc:0.96240	validation-aucpr:0.96838
[44]	validation-logloss:0.36150	validation-auc:0.96254	validation-aucpr:0.96853
[45]	validation-logloss:0.35818	validation-auc:0.96263	validation-aucpr:0.96862
[46]	validation-logloss:0.35467	validation-auc:0.96289	validation-aucpr:0.96882
[47]	validation-logloss:0.35203	validation-auc:0.96292	validation-aucpr:0.96883
[48]	validation-logloss:0.34941	validation-auc:0.96294	validation-aucpr:0.96885
[49]	validation-logloss:0.34667	validation-auc:0.96296	validation-aucpr:0.96888
[50]	validation-logloss:0.34343	validation-auc:0.96307	validation-aucpr:0.96893
[51]	validation-logloss:0.34070	validation-auc:0.96304	validation-aucpr:0.96890
[52]	validation-logloss:0.33798	validation-auc:0.96309	validation-aucpr:0.96895
[53]	validation-logloss:0.33579	validation-auc:0.96307	validation-aucpr:0.96891
[54]	validation-logloss:0.33343	validation-auc:0.96305	validation-aucpr:0.96888
[55]	validation-logloss:0.32938	validation-auc:0.96347	validation-aucpr:0.96926
[56]	validation-logloss:0.32698	validation-auc:0.96355	validation-aucpr:0.96932
[57]	validation-logloss:0.32443	validation-auc:0.96369	validation-aucpr:0.96942
[58]	validation-logloss:0.32194	validation-auc:0.96379	validation-aucpr:0.96952
[59]	validation-logloss:0.31897	validation-auc:0.96389	validation-aucpr:0.96965
[60]	validation-logloss:0.31694	validation-auc:0.96396	validation-aucpr:0.96972
[61]	validation-logloss:0.31474	validation-auc:0.96397	validation-aucpr:0.96972
[62]	validation-logloss:0.31206	validation-auc:0.96409	validation-aucpr:0.96984
[63]	validation-logloss:0.31013	validation-auc:0.96416	validation-aucpr:0.96991
[64]	validation-logloss:0.30882	validation-auc:0.96401	validation-aucpr:0.96980
[65]	validation-logloss:0.30704	validation-auc:0.96401	validation-aucpr:0.96980
[66]	validation-logloss:0.30527	validation-auc:0.96401	validation-aucpr:0.96977
[67]	validation-logloss:0.30285	validation-auc:0.96419	validation-aucpr:0.96991
[68]	validation-logloss:0.30001	validation-auc:0.96439	validation-aucpr:0.97011
[69]	validation-logloss:0.29832	validation-auc:0.96448	validation-aucpr:0.97019
[70]	validation-logloss:0.29685	validation-auc:0.96443	validation-aucpr:0.97015
[71]	validation-logloss:0.29457	validation-auc:0.96454	validation-aucpr:0.97028
[72]	validation-logloss:0.29278	validation-auc:0.96456	validation-aucpr:0.97032
[73]	validation-logloss:0.29124	validation-auc:0.96459	validation-aucpr:0.97034
[74]	validation-logloss:0.28958	validation-auc:0.96475	validation-aucpr:0.97046
[75]	validation-logloss:0.28846	validation-auc:0.96472	validation-aucpr:0.97043
[76]	validation-logloss:0.28718	validation-auc:0.96471	validation-aucpr:0.97041
[77]	validation-logloss:0.28585	validation-auc:0.96479	validation-aucpr:0.97046
[78]	validation-logloss:0.28486	validation-auc:0.96475	validation-aucpr:0.97041
[79]	validation-logloss:0.28372	validation-auc:0.96478	validation-aucpr:0.97041
[80]	validation-logloss:0.28263	validation-auc:0.96470	validation-aucpr:0.97034
[81]	validation-logloss:0.28055	validation-auc:0.96482	validation-aucpr:0.97044
[82]	validation-logloss:0.27859	validation-auc:0.96495	validation-aucpr:0.97057
[83]	validation-logloss:0.27728	validation-auc:0.96506	validation-aucpr:0.97063
[84]	validation-logloss:0.27609	validation-auc:0.96512	validation-aucpr:0.97068
[85]	validation-logloss:0.27490	validation-auc:0.96514	validation-aucpr:0.97068
[86]	validation-logloss:0.27299	validation-auc:0.96524	validation-aucpr:0.97079
[87]	validation-logloss:0.27184	validation-auc:0.96529	validation-aucpr:0.97082
[88]	validation-logloss:0.27078	validation-auc:0.96536	validation-aucpr:0.97087
[89]	validation-logloss:0.26895	validation-auc:0.96549	validation-aucpr:0.97098
[90]	validation-logloss:0.26708	validation-auc:0.96563	validation-aucpr:0.97111
[91]	validation-logloss:0.26582	validation-auc:0.96572	validation-aucpr:0.97116
[92]	validation-logloss:0.26486	validation-auc:0.96570	validation-aucpr:0.97114
[93]	validation-logloss:0.26314	validation-auc:0.96587	validation-aucpr:0.97129
[94]	validation-logloss:0.26224	validation-auc:0.96586	validation-aucpr:0.97129
{'best_iteration': '93', 'best_score': '0.9712917743104803'}
Trial 59, Fold 1: Log loss = 0.2622414109940505, Average precision = 0.9712959878669294, ROC-AUC = 0.9658647710815733, Elapsed Time = 19.233018799997808 seconds
Trial 59, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 59, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0]	validation-logloss:0.67932	validation-auc:0.92449	validation-aucpr:0.92114
[1]	validation-logloss:0.66522	validation-auc:0.94170	validation-aucpr:0.94047
[2]	validation-logloss:0.65008	validation-auc:0.95676	validation-aucpr:0.95986
[3]	validation-logloss:0.63706	validation-auc:0.95940	validation-aucpr:0.96276
[4]	validation-logloss:0.62527	validation-auc:0.95850	validation-aucpr:0.96176
[5]	validation-logloss:0.61394	validation-auc:0.95821	validation-aucpr:0.96169
[6]	validation-logloss:0.60367	validation-auc:0.95682	validation-aucpr:0.96054
[7]	validation-logloss:0.59286	validation-auc:0.95709	validation-aucpr:0.96050
[8]	validation-logloss:0.58275	validation-auc:0.95661	validation-aucpr:0.96011
[9]	validation-logloss:0.57086	validation-auc:0.95914	validation-aucpr:0.96318
[10]	validation-logloss:0.56181	validation-auc:0.95871	validation-aucpr:0.96264
[11]	validation-logloss:0.55230	validation-auc:0.95952	validation-aucpr:0.96338
[12]	validation-logloss:0.54340	validation-auc:0.95989	validation-aucpr:0.96389
[13]	validation-logloss:0.53516	validation-auc:0.96001	validation-aucpr:0.96412
[14]	validation-logloss:0.52523	validation-auc:0.96102	validation-aucpr:0.96543
[15]	validation-logloss:0.51763	validation-auc:0.96096	validation-aucpr:0.96530
[16] validation-logloss:0.50847 validation-auc:0.96172 validation-aucpr:0.96629
[20:55:48] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[17] validation-logloss:0.50124 validation-auc:0.96147 validation-aucpr:0.96608
[20:55:48] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[18] validation-logloss:0.49359 validation-auc:0.96160 validation-aucpr:0.96622
[20:55:48] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[19] validation-logloss:0.48659 validation-auc:0.96165 validation-aucpr:0.96622
[20:55:48] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[20] validation-logloss:0.47800 validation-auc:0.96218 validation-aucpr:0.96680
[20:55:48] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[21] validation-logloss:0.47133 validation-auc:0.96232 validation-aucpr:0.96684
[20:55:49] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[22] validation-logloss:0.46581 validation-auc:0.96208 validation-aucpr:0.96657
[20:55:49] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[23] validation-logloss:0.45967 validation-auc:0.96220 validation-aucpr:0.96666
[20:55:49] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[24] validation-logloss:0.45270 validation-auc:0.96242 validation-aucpr:0.96684
[20:55:49] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[25] validation-logloss:0.44727 validation-auc:0.96243 validation-aucpr:0.96687
[20:55:49] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[26] validation-logloss:0.44135 validation-auc:0.96264 validation-aucpr:0.96700
[20:55:49] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[27] validation-logloss:0.43663 validation-auc:0.96243 validation-aucpr:0.96683
[20:55:49] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[28] validation-logloss:0.43170 validation-auc:0.96261 validation-aucpr:0.96694
[20:55:49] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[29] validation-logloss:0.42722 validation-auc:0.96256 validation-aucpr:0.96684
[20:55:49] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[30] validation-logloss:0.42141 validation-auc:0.96265 validation-aucpr:0.96697
[20:55:50] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[31] validation-logloss:0.41661 validation-auc:0.96273 validation-aucpr:0.96701
[20:55:50] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[32] validation-logloss:0.41232 validation-auc:0.96262 validation-aucpr:0.96688
[20:55:50] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[33] validation-logloss:0.40617 validation-auc:0.96291 validation-aucpr:0.96717
[20:55:50] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[34] validation-logloss:0.40151 validation-auc:0.96312 validation-aucpr:0.96730
[20:55:50] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[35] validation-logloss:0.39738 validation-auc:0.96318 validation-aucpr:0.96737
[20:55:50] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[36] validation-logloss:0.39314 validation-auc:0.96315 validation-aucpr:0.96730
[20:55:51] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[37] validation-logloss:0.38893 validation-auc:0.96322 validation-aucpr:0.96736
[20:55:51] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[38] validation-logloss:0.38568 validation-auc:0.96342 validation-aucpr:0.96753
[20:55:51] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[39] validation-logloss:0.38199 validation-auc:0.96338 validation-aucpr:0.96748
[20:55:51] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[40] validation-logloss:0.37691 validation-auc:0.96354 validation-aucpr:0.96767
[20:55:51] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[41] validation-logloss:0.37309 validation-auc:0.96370 validation-aucpr:0.96777
[20:55:51] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[42] validation-logloss:0.36978 validation-auc:0.96367 validation-aucpr:0.96772
[20:55:51] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[43] validation-logloss:0.36582 validation-auc:0.96394 validation-aucpr:0.96795
[20:55:52] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[44] validation-logloss:0.36234 validation-auc:0.96402 validation-aucpr:0.96803
[20:55:52] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[45] validation-logloss:0.35938 validation-auc:0.96406 validation-aucpr:0.96794
[20:55:52] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[46] validation-logloss:0.35598 validation-auc:0.96424 validation-aucpr:0.96808
[20:55:52] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[47] validation-logloss:0.35155 validation-auc:0.96450 validation-aucpr:0.96835
[20:55:52] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[48] validation-logloss:0.34850 validation-auc:0.96460 validation-aucpr:0.96848
[20:55:53] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[49] validation-logloss:0.34544 validation-auc:0.96470 validation-aucpr:0.96855
[20:55:53] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[50] validation-logloss:0.34288 validation-auc:0.96459 validation-aucpr:0.96846
[20:55:53] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[51] validation-logloss:0.34033 validation-auc:0.96458 validation-aucpr:0.96844
[20:55:53] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[52] validation-logloss:0.33802 validation-auc:0.96446 validation-aucpr:0.96832
[20:55:53] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[53] validation-logloss:0.33430 validation-auc:0.96454 validation-aucpr:0.96842
[20:55:53] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[54] validation-logloss:0.33208 validation-auc:0.96444 validation-aucpr:0.96837
[20:55:54] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[55] validation-logloss:0.32915 validation-auc:0.96460 validation-aucpr:0.96848
[20:55:54] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[56] validation-logloss:0.32547 validation-auc:0.96487 validation-aucpr:0.96871
[20:55:54] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[57] validation-logloss:0.32312 validation-auc:0.96497 validation-aucpr:0.96877
[20:55:54] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[58] validation-logloss:0.32086 validation-auc:0.96502 validation-aucpr:0.96879
[20:55:54] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[59] validation-logloss:0.31836 validation-auc:0.96520 validation-aucpr:0.96891
[20:55:55] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[60] validation-logloss:0.31642 validation-auc:0.96522 validation-aucpr:0.96888
[20:55:55] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[61] validation-logloss:0.31444 validation-auc:0.96523 validation-aucpr:0.96886
[20:55:55] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[62] validation-logloss:0.31259 validation-auc:0.96524 validation-aucpr:0.96890
[20:55:55] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[63] validation-logloss:0.31054 validation-auc:0.96534 validation-aucpr:0.96898
[20:55:55] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[64] validation-logloss:0.30887 validation-auc:0.96530 validation-aucpr:0.96893
[20:55:56] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[65] validation-logloss:0.30671 validation-auc:0.96528 validation-aucpr:0.96892
[20:55:56] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[66] validation-logloss:0.30490 validation-auc:0.96529 validation-aucpr:0.96897
[20:55:56] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[67] validation-logloss:0.30290 validation-auc:0.96540 validation-aucpr:0.96904
[20:55:56] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[68] validation-logloss:0.30109 validation-auc:0.96537 validation-aucpr:0.96900
[20:55:57] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[69] validation-logloss:0.29946 validation-auc:0.96534 validation-aucpr:0.96895
[20:55:57] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[70] validation-logloss:0.29767 validation-auc:0.96541 validation-aucpr:0.96899
[20:55:57] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[71] validation-logloss:0.29620 validation-auc:0.96544 validation-aucpr:0.96899
[20:55:57] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[72] validation-logloss:0.29476 validation-auc:0.96547 validation-aucpr:0.96899
[20:55:58] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[73] validation-logloss:0.29331 validation-auc:0.96550 validation-aucpr:0.96904
[20:55:58] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[74] validation-logloss:0.29188 validation-auc:0.96548 validation-aucpr:0.96902
[20:55:58] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[75] validation-logloss:0.29057 validation-auc:0.96547 validation-aucpr:0.96904
[20:55:58] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[76] validation-logloss:0.28927 validation-auc:0.96538 validation-aucpr:0.96895
[20:55:59] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[77] validation-logloss:0.28675 validation-auc:0.96557 validation-aucpr:0.96916
[20:55:59] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[78] validation-logloss:0.28547 validation-auc:0.96559 validation-aucpr:0.96933
[20:55:59] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[79] validation-logloss:0.28414 validation-auc:0.96561 validation-aucpr:0.96934
[20:55:59] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[80] validation-logloss:0.28170 validation-auc:0.96580 validation-aucpr:0.96952
[20:56:00] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[81] validation-logloss:0.27919 validation-auc:0.96604 validation-aucpr:0.96975
[20:56:00] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[82] validation-logloss:0.27801 validation-auc:0.96608 validation-aucpr:0.96978
[20:56:00] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[83] validation-logloss:0.27719 validation-auc:0.96607 validation-aucpr:0.96979
[20:56:00] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[84] validation-logloss:0.27578 validation-auc:0.96623 validation-aucpr:0.96990
[20:56:01] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[85] validation-logloss:0.27470 validation-auc:0.96630 validation-aucpr:0.96994
[20:56:01] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[86] validation-logloss:0.27376 validation-auc:0.96632 validation-aucpr:0.96995
[20:56:01] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[87] validation-logloss:0.27258 validation-auc:0.96638 validation-aucpr:0.96999
[20:56:02] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[88] validation-logloss:0.27137 validation-auc:0.96647 validation-aucpr:0.97006
[20:56:02] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[89] validation-logloss:0.27048 validation-auc:0.96643 validation-aucpr:0.97002
[20:56:02] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[90] validation-logloss:0.26950 validation-auc:0.96649 validation-aucpr:0.97016
[20:56:02] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[91] validation-logloss:0.26862 validation-auc:0.96646 validation-aucpr:0.97008
[20:56:03] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[92] validation-logloss:0.26752 validation-auc:0.96652 validation-aucpr:0.97012
[20:56:03] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[93] validation-logloss:0.26556 validation-auc:0.96672 validation-aucpr:0.97032
[20:56:03] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[94] validation-logloss:0.26463 validation-auc:0.96670 validation-aucpr:0.97027
{'best_iteration': '93', 'best_score': '0.9703206595234433'}
Trial 59, Fold 2: Log loss = 0.2646334476355985, Average precision = 0.9702728597285563, ROC-AUC = 0.9666965205143279, Elapsed Time = 17.663478499998746 seconds
Trial 59, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 59, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[20:56:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[0] validation-logloss:0.67862 validation-auc:0.93487 validation-aucpr:0.93542
[20:56:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[1] validation-logloss:0.66407 validation-auc:0.95108 validation-aucpr:0.94911
[20:56:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[2] validation-logloss:0.64926 validation-auc:0.95705 validation-aucpr:0.96172
[20:56:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[3] validation-logloss:0.63720 validation-auc:0.95805 validation-aucpr:0.96305
[20:56:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[4] validation-logloss:0.62432 validation-auc:0.96126 validation-aucpr:0.96610
[20:56:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[5] validation-logloss:0.61216 validation-auc:0.96220 validation-aucpr:0.96693
[20:56:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[6] validation-logloss:0.60128 validation-auc:0.96218 validation-aucpr:0.96715
[20:56:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[7] validation-logloss:0.59030 validation-auc:0.96237 validation-aucpr:0.96722
[20:56:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[8] validation-logloss:0.57927 validation-auc:0.96268 validation-aucpr:0.96760
[20:56:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[9] validation-logloss:0.57012 validation-auc:0.96212 validation-aucpr:0.96706
[20:56:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[10] validation-logloss:0.56130 validation-auc:0.96168 validation-aucpr:0.96674
[20:56:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[11] validation-logloss:0.55239 validation-auc:0.96124 validation-aucpr:0.96628
[20:56:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[12] validation-logloss:0.54219 validation-auc:0.96202 validation-aucpr:0.96717
[20:56:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[13] validation-logloss:0.53377 validation-auc:0.96204 validation-aucpr:0.96719
[20:56:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[14] validation-logloss:0.52553 validation-auc:0.96209 validation-aucpr:0.96720
[20:56:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[15] validation-logloss:0.51737 validation-auc:0.96200 validation-aucpr:0.96710
[20:56:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[16] validation-logloss:0.50973 validation-auc:0.96191 validation-aucpr:0.96699
[20:56:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[17] validation-logloss:0.50272 validation-auc:0.96164 validation-aucpr:0.96673
[20:56:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[18] validation-logloss:0.49402 validation-auc:0.96223 validation-aucpr:0.96737
[20:56:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[19] validation-logloss:0.48741 validation-auc:0.96212 validation-aucpr:0.96720
[20:56:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[20] validation-logloss:0.47941 validation-auc:0.96253 validation-aucpr:0.96766
[20:56:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[21] validation-logloss:0.47182 validation-auc:0.96264 validation-aucpr:0.96778
[20:56:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[22] validation-logloss:0.46651 validation-auc:0.96245 validation-aucpr:0.96758
[20:56:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[23] validation-logloss:0.45853 validation-auc:0.96293 validation-aucpr:0.96807
[20:56:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[24] validation-logloss:0.45272 validation-auc:0.96297 validation-aucpr:0.96807
[20:56:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[25] validation-logloss:0.44688 validation-auc:0.96296 validation-aucpr:0.96804
[20:56:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[26] validation-logloss:0.44094 validation-auc:0.96309 validation-aucpr:0.96810
[20:56:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[27] validation-logloss:0.43581 validation-auc:0.96296 validation-aucpr:0.96796
[Per-iteration XGBoost (DART) training log for Trial 59, Fold 3 — condensed. Each boosting round also emitted a repeated INFO line from gbtree.cc ("drop 0 trees, weight = 1").]
[28] validation-logloss:0.43060 validation-auc:0.96314 validation-aucpr:0.96807
...
[94] validation-logloss:0.25507 validation-auc:0.96758 validation-aucpr:0.97225
{'best_iteration': '94', 'best_score': '0.9722450704143667'}
Trial 59, Fold 3: Log loss = 0.25506957399390734, Average precision = 0.9722498977626997, ROC-AUC = 0.9675826884552783, Elapsed Time = 16.725995500000863 seconds
Trial 59, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 59, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[Per-iteration XGBoost (DART) training log for Trial 59, Fold 4 — condensed; the repeated gbtree.cc INFO lines ("drop 0 trees, weight = 1") are omitted.]
[0] validation-logloss:0.67895 validation-auc:0.92444 validation-aucpr:0.92797
...
[69] validation-logloss:0.30008 validation-auc:0.96346 validation-aucpr:0.96927
[70] validation-logloss:0.29847 validation-auc:0.96356 validation-aucpr:0.96933
[20:56:31] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[71] validation-logloss:0.29691 validation-auc:0.96354 validation-aucpr:0.96932
[20:56:32] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[72] validation-logloss:0.29521 validation-auc:0.96357 validation-aucpr:0.96935
[20:56:32] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[73] validation-logloss:0.29410 validation-auc:0.96353 validation-aucpr:0.96933
[20:56:32] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[74] validation-logloss:0.29161 validation-auc:0.96361 validation-aucpr:0.96944
[20:56:32] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[75] validation-logloss:0.28932 validation-auc:0.96361 validation-aucpr:0.96949
[20:56:33] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[76] validation-logloss:0.28756 validation-auc:0.96375 validation-aucpr:0.96962
[20:56:33] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[77] validation-logloss:0.28611 validation-auc:0.96387 validation-aucpr:0.96970
[20:56:33] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[78] validation-logloss:0.28499 validation-auc:0.96376 validation-aucpr:0.96962
[20:56:33] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[79] validation-logloss:0.28358 validation-auc:0.96381 validation-aucpr:0.96964
[20:56:34] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[80] validation-logloss:0.28229 validation-auc:0.96380 validation-aucpr:0.96965
[20:56:34] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[81] validation-logloss:0.28117 validation-auc:0.96382 validation-aucpr:0.96966
[20:56:34] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[82] validation-logloss:0.27893 validation-auc:0.96396 validation-aucpr:0.96982
[20:56:35] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[83] validation-logloss:0.27706 validation-auc:0.96406 validation-aucpr:0.96992
[20:56:35] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[84] validation-logloss:0.27480 validation-auc:0.96423 validation-aucpr:0.97007
[20:56:35] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[85] validation-logloss:0.27302 validation-auc:0.96435 validation-aucpr:0.97018
[20:56:35] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[86] validation-logloss:0.27175 validation-auc:0.96445 validation-aucpr:0.97028
[20:56:36] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[87] validation-logloss:0.27043 validation-auc:0.96463 validation-aucpr:0.97040
[20:56:36] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[88] validation-logloss:0.26939 validation-auc:0.96461 validation-aucpr:0.97038
[20:56:36] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[89] validation-logloss:0.26818 validation-auc:0.96467 validation-aucpr:0.97044
[20:56:37] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[90] validation-logloss:0.26674 validation-auc:0.96474 validation-aucpr:0.97052
[20:56:37] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[91] validation-logloss:0.26581 validation-auc:0.96481 validation-aucpr:0.97057
[20:56:37] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[92] validation-logloss:0.26400 validation-auc:0.96486 validation-aucpr:0.97063
[20:56:38] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[93] validation-logloss:0.26308 validation-auc:0.96486 validation-aucpr:0.97064
[20:56:38] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[94] validation-logloss:0.26146 validation-auc:0.96492 validation-aucpr:0.97070
{'best_iteration': '94', 'best_score': '0.9707029113327627'}
Trial 59, Fold 4: Log loss = 0.2614591833102355, Average precision = 0.9707076760835737, ROC-AUC = 0.9649216065676183, Elapsed Time = 16.850258499998745 seconds
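The fold summary lines above report log loss, average precision, and ROC-AUC on the held-out fold. A minimal, self-contained sketch of how such a summary can be computed with scikit-learn's metric functions, on synthetic data — the `y_val`/`p_val` arrays here are illustrative stand-ins, not variables from this notebook:

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

rng = np.random.default_rng(0)

# Stand-ins for one validation fold: true binary labels and the fitted
# booster's predicted positive-class probabilities.
y_val = rng.integers(0, 2, size=500)
p_val = np.clip(y_val * 0.7 + rng.normal(0.15, 0.2, size=500), 1e-6, 1 - 1e-6)

lloss = log_loss(y_val, p_val)                # lower is better
ap = average_precision_score(y_val, p_val)    # area under the PR curve
auc = roc_auc_score(y_val, p_val)             # area under the ROC curve

print(f"Log loss = {lloss}, Average precision = {ap}, ROC-AUC = {auc}")
```

Note that XGBoost's own `validation-aucpr` and scikit-learn's `average_precision_score` both summarize the precision-recall curve but use different estimators, so small numeric differences between the two (as in the `best_score` dict versus the fold summary above) are expected.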
Trial 59, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 59, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0]	validation-logloss:0.67920	validation-auc:0.92272	validation-aucpr:0.92042
[1]	validation-logloss:0.66623	validation-auc:0.93540	validation-aucpr:0.93593
[2]	validation-logloss:0.65081	validation-auc:0.95197	validation-aucpr:0.95806
[3]	validation-logloss:0.63840	validation-auc:0.95330	validation-aucpr:0.95919
[4]	validation-logloss:0.62729	validation-auc:0.95354	validation-aucpr:0.95930
[5]	validation-logloss:0.61550	validation-auc:0.95555	validation-aucpr:0.96082
[6]	validation-logloss:0.60527	validation-auc:0.95521	validation-aucpr:0.96018
[7]	validation-logloss:0.59564	validation-auc:0.95459	validation-aucpr:0.95941
[8]	validation-logloss:0.58274	validation-auc:0.95754	validation-aucpr:0.96242
[9]	validation-logloss:0.57273	validation-auc:0.95779	validation-aucpr:0.96249
[10]	validation-logloss:0.56377	validation-auc:0.95793	validation-aucpr:0.96262
[11]	validation-logloss:0.55540	validation-auc:0.95802	validation-aucpr:0.96279
[12]	validation-logloss:0.54626	validation-auc:0.95827	validation-aucpr:0.96295
[13]	validation-logloss:0.53700	validation-auc:0.95871	validation-aucpr:0.96340
[14]	validation-logloss:0.52872	validation-auc:0.95870	validation-aucpr:0.96345
[15]	validation-logloss:0.52170	validation-auc:0.95860	validation-aucpr:0.96339
[16]	validation-logloss:0.51253	validation-auc:0.95885	validation-aucpr:0.96378
[17]	validation-logloss:0.50548	validation-auc:0.95868	validation-aucpr:0.96377
[18]	validation-logloss:0.49808	validation-auc:0.95918	validation-aucpr:0.96411
[19]	validation-logloss:0.49152	validation-auc:0.95945	validation-aucpr:0.96434
[20]	validation-logloss:0.48438	validation-auc:0.95969	validation-aucpr:0.96456
[21]	validation-logloss:0.47654	validation-auc:0.95990	validation-aucpr:0.96490
[22]	validation-logloss:0.47021	validation-auc:0.95999	validation-aucpr:0.96501
[23]	validation-logloss:0.46361	validation-auc:0.96033	validation-aucpr:0.96527
[24]	validation-logloss:0.45619	validation-auc:0.96050	validation-aucpr:0.96567
[25]	validation-logloss:0.45080	validation-auc:0.96038	validation-aucpr:0.96554
[26]	validation-logloss:0.44537	validation-auc:0.96043	validation-aucpr:0.96557
[27]	validation-logloss:0.43849	validation-auc:0.96087	validation-aucpr:0.96594
[28]	validation-logloss:0.43303	validation-auc:0.96089	validation-aucpr:0.96599
[29]	validation-logloss:0.42862	validation-auc:0.96081	validation-aucpr:0.96592
[30]	validation-logloss:0.42411	validation-auc:0.96068	validation-aucpr:0.96584
[31]	validation-logloss:0.41980	validation-auc:0.96054	validation-aucpr:0.96567
[32]	validation-logloss:0.41370	validation-auc:0.96066	validation-aucpr:0.96585
[33]	validation-logloss:0.40974	validation-auc:0.96062	validation-aucpr:0.96581
[34]	validation-logloss:0.40565	validation-auc:0.96061	validation-aucpr:0.96589
[35]	validation-logloss:0.40119	validation-auc:0.96065	validation-aucpr:0.96593
[36]	validation-logloss:0.39766	validation-auc:0.96040	validation-aucpr:0.96572
[37]	validation-logloss:0.39401	validation-auc:0.96045	validation-aucpr:0.96578
[38]	validation-logloss:0.39070	validation-auc:0.96040	validation-aucpr:0.96570
[39]	validation-logloss:0.38692	validation-auc:0.96045	validation-aucpr:0.96573
[40]	validation-logloss:0.38161	validation-auc:0.96070	validation-aucpr:0.96599
[41]	validation-logloss:0.37827	validation-auc:0.96078	validation-aucpr:0.96604
[42]	validation-logloss:0.37398	validation-auc:0.96092	validation-aucpr:0.96616
[43]	validation-logloss:0.37112	validation-auc:0.96068	validation-aucpr:0.96596
[44]	validation-logloss:0.36786	validation-auc:0.96080	validation-aucpr:0.96605
[45]	validation-logloss:0.36469	validation-auc:0.96098	validation-aucpr:0.96619
[46]	validation-logloss:0.36149	validation-auc:0.96097	validation-aucpr:0.96619
[47]	validation-logloss:0.35907	validation-auc:0.96079	validation-aucpr:0.96607
[48]	validation-logloss:0.35617	validation-auc:0.96087	validation-aucpr:0.96611
[49]	validation-logloss:0.35340	validation-auc:0.96092	validation-aucpr:0.96617
[50]	validation-logloss:0.35059	validation-auc:0.96095	validation-aucpr:0.96622
[51]	validation-logloss:0.34763	validation-auc:0.96104	validation-aucpr:0.96641
[52]	validation-logloss:0.34500	validation-auc:0.96108	validation-aucpr:0.96643
[53]	validation-logloss:0.34175	validation-auc:0.96118	validation-aucpr:0.96658
[54]	validation-logloss:0.33803	validation-auc:0.96130	validation-aucpr:0.96673
[55]	validation-logloss:0.33534	validation-auc:0.96151	validation-aucpr:0.96686
[56]	validation-logloss:0.33164	validation-auc:0.96166	validation-aucpr:0.96701
[57]	validation-logloss:0.32941	validation-auc:0.96168	validation-aucpr:0.96702
[58]	validation-logloss:0.32649	validation-auc:0.96170	validation-aucpr:0.96703
[59]	validation-logloss:0.32462	validation-auc:0.96166	validation-aucpr:0.96698
[60]	validation-logloss:0.32209	validation-auc:0.96181	validation-aucpr:0.96712
[61]	validation-logloss:0.31982	validation-auc:0.96193	validation-aucpr:0.96721
[62]	validation-logloss:0.31801	validation-auc:0.96185	validation-aucpr:0.96714
[63]	validation-logloss:0.31593	validation-auc:0.96196	validation-aucpr:0.96716
[64]	validation-logloss:0.31410	validation-auc:0.96204	validation-aucpr:0.96722
[65]	validation-logloss:0.31233	validation-auc:0.96199	validation-aucpr:0.96717
[66]	validation-logloss:0.31051	validation-auc:0.96223	validation-aucpr:0.96728
[67]	validation-logloss:0.30865	validation-auc:0.96229	validation-aucpr:0.96730
[68]	validation-logloss:0.30694	validation-auc:0.96232	validation-aucpr:0.96726
[69]	validation-logloss:0.30400	validation-auc:0.96243	validation-aucpr:0.96737
[70]	validation-logloss:0.30262	validation-auc:0.96242	validation-aucpr:0.96738
[71]	validation-logloss:0.30106	validation-auc:0.96245	validation-aucpr:0.96739
[72]	validation-logloss:0.29962	validation-auc:0.96248	validation-aucpr:0.96739
[73]	validation-logloss:0.29716	validation-auc:0.96261	validation-aucpr:0.96757
[74]	validation-logloss:0.29508	validation-auc:0.96268	validation-aucpr:0.96767
[75]	validation-logloss:0.29362	validation-auc:0.96276	validation-aucpr:0.96781
[76]	validation-logloss:0.29144	validation-auc:0.96285	validation-aucpr:0.96787
[77]	validation-logloss:0.28988	validation-auc:0.96290	validation-aucpr:0.96792
[78]	validation-logloss:0.28845	validation-auc:0.96295	validation-aucpr:0.96792
[79]	validation-logloss:0.28673	validation-auc:0.96313	validation-aucpr:0.96806
[80]	validation-logloss:0.28573	validation-auc:0.96316	validation-aucpr:0.96805
[81]	validation-logloss:0.28438	validation-auc:0.96326	validation-aucpr:0.96813
[82]	validation-logloss:0.28345	validation-auc:0.96326	validation-aucpr:0.96816
[83]	validation-logloss:0.28233	validation-auc:0.96326	validation-aucpr:0.96816
[84]	validation-logloss:0.28020	validation-auc:0.96339	validation-aucpr:0.96830
[85]	validation-logloss:0.27894	validation-auc:0.96344	validation-aucpr:0.96832
[86]	validation-logloss:0.27785	validation-auc:0.96349	validation-aucpr:0.96835
[87]	validation-logloss:0.27575	validation-auc:0.96361	validation-aucpr:0.96853
[88]	validation-logloss:0.27360	validation-auc:0.96380	validation-aucpr:0.96869
[89]	validation-logloss:0.27167	validation-auc:0.96388	validation-aucpr:0.96878
[90]	validation-logloss:0.27079	validation-auc:0.96384	validation-aucpr:0.96871
[91]	validation-logloss:0.26862	validation-auc:0.96401	validation-aucpr:0.96888
[92]	validation-logloss:0.26775	validation-auc:0.96405	validation-aucpr:0.96891
[93]	validation-logloss:0.26668	validation-auc:0.96410	validation-aucpr:0.96894
[94]	validation-logloss:0.26497	validation-auc:0.96420	validation-aucpr:0.96900
{'best_iteration': '94', 'best_score': '0.9689992016802114'}
Trial 59, Fold 5: Log loss = 0.2649650333683201, Average precision = 0.9690049278285596, ROC-AUC = 0.9642040020065772, Elapsed Time = 16.963513800001238 seconds
Optimization Progress: 60%|###### | 60/100 [2:58:08<41:47, 62.69s/it]
Trial 60, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 60, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.66602 validation-auc:0.94155 validation-aucpr:0.94613
[1] validation-logloss:0.64043 validation-auc:0.94913 validation-aucpr:0.94290
[2] validation-logloss:0.61669 validation-auc:0.95412 validation-aucpr:0.94777
[3] validation-logloss:0.59698 validation-auc:0.95686 validation-aucpr:0.95489
[4] validation-logloss:0.57847 validation-auc:0.95856 validation-aucpr:0.95584
[5] validation-logloss:0.55960 validation-auc:0.95962 validation-aucpr:0.95884
[6] validation-logloss:0.54102 validation-auc:0.96212 validation-aucpr:0.96556
[7] validation-logloss:0.52451 validation-auc:0.96234 validation-aucpr:0.96771
[8] validation-logloss:0.50834 validation-auc:0.96280 validation-aucpr:0.96736
[9] validation-logloss:0.49254 validation-auc:0.96374 validation-aucpr:0.96812
[10] validation-logloss:0.47963 validation-auc:0.96449 validation-aucpr:0.96957
[11] validation-logloss:0.46616 validation-auc:0.96517 validation-aucpr:0.97014
[12] validation-logloss:0.45302 validation-auc:0.96530 validation-aucpr:0.97030
[13] validation-logloss:0.44041 validation-auc:0.96576 validation-aucpr:0.97060
[14] validation-logloss:0.42894 validation-auc:0.96632 validation-aucpr:0.97117
[15] validation-logloss:0.41764 validation-auc:0.96641 validation-aucpr:0.97132
[16] validation-logloss:0.40736 validation-auc:0.96672 validation-aucpr:0.97152
[17] validation-logloss:0.39725 validation-auc:0.96707 validation-aucpr:0.97178
[18] validation-logloss:0.38758 validation-auc:0.96736 validation-aucpr:0.97199
[19] validation-logloss:0.37838 validation-auc:0.96769 validation-aucpr:0.97221
[20] validation-logloss:0.37076 validation-auc:0.96777 validation-aucpr:0.97163
[21] validation-logloss:0.36304 validation-auc:0.96763 validation-aucpr:0.97154
[22] validation-logloss:0.35561 validation-auc:0.96803 validation-aucpr:0.97183
[23] validation-logloss:0.34850 validation-auc:0.96802 validation-aucpr:0.97184
[24] validation-logloss:0.34157 validation-auc:0.96816 validation-aucpr:0.97192
[25] validation-logloss:0.33467 validation-auc:0.96829 validation-aucpr:0.97193
[26] validation-logloss:0.32837 validation-auc:0.96821 validation-aucpr:0.97189
[27] validation-logloss:0.32228 validation-auc:0.96836 validation-aucpr:0.97206
[28] validation-logloss:0.31662 validation-auc:0.96847 validation-aucpr:0.97213
[29] validation-logloss:0.31191 validation-auc:0.96850 validation-aucpr:0.97208
[30] validation-logloss:0.30716 validation-auc:0.96865 validation-aucpr:0.97215
[31] validation-logloss:0.30284 validation-auc:0.96878 validation-aucpr:0.97218
[32] validation-logloss:0.29827 validation-auc:0.96878 validation-aucpr:0.97183
[33] validation-logloss:0.29351 validation-auc:0.96894 validation-aucpr:0.97196
[34] validation-logloss:0.28971 validation-auc:0.96899 validation-aucpr:0.97173
[35] validation-logloss:0.28620 validation-auc:0.96904 validation-aucpr:0.97193
[36] validation-logloss:0.28261 validation-auc:0.96915 validation-aucpr:0.97210
[37] validation-logloss:0.27921 validation-auc:0.96930 validation-aucpr:0.97288
[38] validation-logloss:0.27563 validation-auc:0.96930 validation-aucpr:0.97323
[39] validation-logloss:0.27205 validation-auc:0.96935 validation-aucpr:0.97301
[40] validation-logloss:0.26890 validation-auc:0.96935 validation-aucpr:0.97301
[41] validation-logloss:0.26544 validation-auc:0.96958 validation-aucpr:0.97324
[42] validation-logloss:0.26219 validation-auc:0.96977 validation-aucpr:0.97321
[43] validation-logloss:0.25921 validation-auc:0.96988 validation-aucpr:0.97328
[44] validation-logloss:0.25691 validation-auc:0.96988 validation-aucpr:0.97326
[45] validation-logloss:0.25387 validation-auc:0.97009 validation-aucpr:0.97339
[46] validation-logloss:0.25169 validation-auc:0.97010 validation-aucpr:0.97351
[47] validation-logloss:0.24962 validation-auc:0.96992 validation-aucpr:0.97340
[48] validation-logloss:0.24742 validation-auc:0.96999 validation-aucpr:0.97342
[49] validation-logloss:0.24520 validation-auc:0.96991 validation-aucpr:0.97340
[50] validation-logloss:0.24336 validation-auc:0.96989 validation-aucpr:0.97345
[51] validation-logloss:0.24134 validation-auc:0.96989 validation-aucpr:0.97349
[52] validation-logloss:0.23971 validation-auc:0.96985 validation-aucpr:0.97343
[53] validation-logloss:0.23794 validation-auc:0.96987 validation-aucpr:0.97347
[54] validation-logloss:0.23627 validation-auc:0.96988 validation-aucpr:0.97353
[55] validation-logloss:0.23466 validation-auc:0.97000 validation-aucpr:0.97365
{'best_iteration': '55', 'best_score': '0.9736481198598432'}
Trial 60, Fold 1: Log loss = 0.23465558197815578, Average precision = 0.9737553055902464, ROC-AUC = 0.9700006093976283, Elapsed Time = 5.350716500000999 seconds
Trial 60, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 60, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.66561 validation-auc:0.93663 validation-aucpr:0.92142
[1] validation-logloss:0.64060 validation-auc:0.94660 validation-aucpr:0.92801
[2] validation-logloss:0.61747 validation-auc:0.95261 validation-aucpr:0.95056
[3] validation-logloss:0.59669 validation-auc:0.95770 validation-aucpr:0.95105
[4] validation-logloss:0.57806 validation-auc:0.96111 validation-aucpr:0.95966
[5] validation-logloss:0.55781 validation-auc:0.96360 validation-aucpr:0.96312
[6] validation-logloss:0.53901 validation-auc:0.96522 validation-aucpr:0.96826
[7] validation-logloss:0.52318 validation-auc:0.96641 validation-aucpr:0.97076
[8] validation-logloss:0.50918 validation-auc:0.96654 validation-aucpr:0.97068
[9] validation-logloss:0.49540 validation-auc:0.96657 validation-aucpr:0.97070
[10] validation-logloss:0.48099 validation-auc:0.96642 validation-aucpr:0.97062
[11] validation-logloss:0.46780 validation-auc:0.96652 validation-aucpr:0.97085
[12] validation-logloss:0.45468 validation-auc:0.96678 validation-aucpr:0.97108
[13] validation-logloss:0.44218 validation-auc:0.96716 validation-aucpr:0.97136
[14] validation-logloss:0.43172 validation-auc:0.96740 validation-aucpr:0.97122
[15] validation-logloss:0.42178 validation-auc:0.96769 validation-aucpr:0.97146
[16] validation-logloss:0.41139 validation-auc:0.96790 validation-aucpr:0.97165
[17] validation-logloss:0.40116 validation-auc:0.96811 validation-aucpr:0.97195
[18] validation-logloss:0.39158 validation-auc:0.96860 validation-aucpr:0.97234
[19] validation-logloss:0.38248 validation-auc:0.96880 validation-aucpr:0.97259
[20] validation-logloss:0.37450 validation-auc:0.96856 validation-aucpr:0.97233
[21] validation-logloss:0.36672 validation-auc:0.96828 validation-aucpr:0.97210
[22] validation-logloss:0.35888 validation-auc:0.96826 validation-aucpr:0.97216
[23] validation-logloss:0.35146 validation-auc:0.96826 validation-aucpr:0.97236
[24] validation-logloss:0.34425 validation-auc:0.96847 validation-aucpr:0.97258
[25] validation-logloss:0.33732 validation-auc:0.96882 validation-aucpr:0.97286
[26] validation-logloss:0.33080 validation-auc:0.96890 validation-aucpr:0.97297
[27] validation-logloss:0.32438 validation-auc:0.96913 validation-aucpr:0.97317
[28] validation-logloss:0.31857 validation-auc:0.96903 validation-aucpr:0.97312
[29] validation-logloss:0.31384 validation-auc:0.96905 validation-aucpr:0.97304
[30] validation-logloss:0.30892 validation-auc:0.96912 validation-aucpr:0.97303
[31] validation-logloss:0.30405 validation-auc:0.96894 validation-aucpr:0.97292
[32] validation-logloss:0.29970 validation-auc:0.96905 validation-aucpr:0.97296
[33] validation-logloss:0.29464 validation-auc:0.96927 validation-aucpr:0.97310
[34] validation-logloss:0.29041 validation-auc:0.96925 validation-aucpr:0.97309
[35] validation-logloss:0.28668 validation-auc:0.96903 validation-aucpr:0.97297
[36] validation-logloss:0.28262 validation-auc:0.96890 validation-aucpr:0.97296
[37] validation-logloss:0.27844 validation-auc:0.96903 validation-aucpr:0.97306
[38] validation-logloss:0.27492 validation-auc:0.96904 validation-aucpr:0.97290
[39] validation-logloss:0.27126 validation-auc:0.96902 validation-aucpr:0.97295
[40] validation-logloss:0.26835 validation-auc:0.96893 validation-aucpr:0.97290
[41] validation-logloss:0.26546 validation-auc:0.96898 validation-aucpr:0.97287
[42] validation-logloss:0.26294 validation-auc:0.96898 validation-aucpr:0.97285
[43] validation-logloss:0.25972 validation-auc:0.96911 validation-aucpr:0.97288
[44] validation-logloss:0.25719 validation-auc:0.96924 validation-aucpr:0.97297
[45] validation-logloss:0.25417 validation-auc:0.96933 validation-aucpr:0.97304
[46] validation-logloss:0.25177 validation-auc:0.96940 validation-aucpr:0.97303
[47] validation-logloss:0.24914 validation-auc:0.96948 validation-aucpr:0.97300
[48] validation-logloss:0.24661 validation-auc:0.96961 validation-aucpr:0.97299
[49] validation-logloss:0.24405 validation-auc:0.96974 validation-aucpr:0.97306
[50] validation-logloss:0.24190 validation-auc:0.96994 validation-aucpr:0.97412
[51] validation-logloss:0.23982 validation-auc:0.96976 validation-aucpr:0.97363
[52] validation-logloss:0.23787 validation-auc:0.96978 validation-aucpr:0.97361
[53] validation-logloss:0.23585 validation-auc:0.96969 validation-aucpr:0.97357
[54] validation-logloss:0.23375 validation-auc:0.96979 validation-aucpr:0.97366
[55] validation-logloss:0.23195 validation-auc:0.96973 validation-aucpr:0.97365
{'best_iteration': '50', 'best_score': '0.9741150097823695'}
Trial 60, Fold 2: Log loss = 0.23194843281151947, Average precision = 0.9736809321240751, ROC-AUC = 0.9697323867104145, Elapsed Time = 5.66577529999995 seconds
Trial 60, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 60, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.66522 validation-auc:0.94317 validation-aucpr:0.94647
[1] validation-logloss:0.64209 validation-auc:0.94919 validation-aucpr:0.93701
[2] validation-logloss:0.61928 validation-auc:0.95353 validation-aucpr:0.94277
[3] validation-logloss:0.59677 validation-auc:0.95891 validation-aucpr:0.95364
[4] validation-logloss:0.57658 validation-auc:0.96108 validation-aucpr:0.96035
[5] validation-logloss:0.55671 validation-auc:0.96269 validation-aucpr:0.96311
[6] validation-logloss:0.53791 validation-auc:0.96383 validation-aucpr:0.96428
[7] validation-logloss:0.52076 validation-auc:0.96461 validation-aucpr:0.96546
[8] validation-logloss:0.50462 validation-auc:0.96527 validation-aucpr:0.96433
[9] validation-logloss:0.49015 validation-auc:0.96675 validation-aucpr:0.96605
[10] validation-logloss:0.47716 validation-auc:0.96678 validation-aucpr:0.96615
[11] validation-logloss:0.46338 validation-auc:0.96728 validation-aucpr:0.96657
[12] validation-logloss:0.45089 validation-auc:0.96739 validation-aucpr:0.96646
[13] validation-logloss:0.43987 validation-auc:0.96735 validation-aucpr:0.96628
[14] validation-logloss:0.42804 validation-auc:0.96745 validation-aucpr:0.96624
[15] validation-logloss:0.41738 validation-auc:0.96739 validation-aucpr:0.96618
[16] validation-logloss:0.40822 validation-auc:0.96717 validation-aucpr:0.96592
[17] validation-logloss:0.39864 validation-auc:0.96718 validation-aucpr:0.96795
[18] validation-logloss:0.38888 validation-auc:0.96747 validation-aucpr:0.96802
[19] validation-logloss:0.37975 validation-auc:0.96756 validation-aucpr:0.96811
[20] validation-logloss:0.37138 validation-auc:0.96784 validation-aucpr:0.96829
[21] validation-logloss:0.36320 validation-auc:0.96775 validation-aucpr:0.96818
[22] validation-logloss:0.35540 validation-auc:0.96773 validation-aucpr:0.96803
[23] validation-logloss:0.34861 validation-auc:0.96787 validation-aucpr:0.96811
[24] validation-logloss:0.34227 validation-auc:0.96780 validation-aucpr:0.96819
[25] validation-logloss:0.33566 validation-auc:0.96777 validation-aucpr:0.96818
[26] validation-logloss:0.32929 validation-auc:0.96780 validation-aucpr:0.96847
[27] validation-logloss:0.32323 validation-auc:0.96849 validation-aucpr:0.97160
[28] validation-logloss:0.31742 validation-auc:0.96847 validation-aucpr:0.97161
[29] validation-logloss:0.31177 validation-auc:0.96866 validation-aucpr:0.97178
[30] validation-logloss:0.30632 validation-auc:0.96871 validation-aucpr:0.97191
[31] validation-logloss:0.30106 validation-auc:0.96897 validation-aucpr:0.97208
[32] validation-logloss:0.29623 validation-auc:0.96907 validation-aucpr:0.97211
[33] validation-logloss:0.29170 validation-auc:0.96918 validation-aucpr:0.97228
[34] validation-logloss:0.28712 validation-auc:0.96917 validation-aucpr:0.97228
[35] validation-logloss:0.28276 validation-auc:0.96925 validation-aucpr:0.97238
[36] validation-logloss:0.27926 validation-auc:0.96940 validation-aucpr:0.97262
[37] validation-logloss:0.27583 validation-auc:0.96927 validation-aucpr:0.97266
[38] validation-logloss:0.27220 validation-auc:0.96932 validation-aucpr:0.97271
[39] validation-logloss:0.26866 validation-auc:0.96953 validation-aucpr:0.97298
[40] validation-logloss:0.26518 validation-auc:0.96974 validation-aucpr:0.97316
[41] validation-logloss:0.26213 validation-auc:0.96985 validation-aucpr:0.97324
[42] validation-logloss:0.25921 validation-auc:0.96988 validation-aucpr:0.97323
[43] validation-logloss:0.25680 validation-auc:0.96988 validation-aucpr:0.97323
[44] validation-logloss:0.25459 validation-auc:0.96982 validation-aucpr:0.97309
[45] validation-logloss:0.25223 validation-auc:0.96981 validation-aucpr:0.97309
[46] validation-logloss:0.24942 validation-auc:0.97003 validation-aucpr:0.97315
[47] validation-logloss:0.24701 validation-auc:0.97010 validation-aucpr:0.97319
[48] validation-logloss:0.24445 validation-auc:0.97015 validation-aucpr:0.97331
[49] validation-logloss:0.24209 validation-auc:0.97033 validation-aucpr:0.97343
[50] validation-logloss:0.23993 validation-auc:0.97025 validation-aucpr:0.97319
[51] validation-logloss:0.23779 validation-auc:0.97033 validation-aucpr:0.97324
[52] validation-logloss:0.23548 validation-auc:0.97049 validation-aucpr:0.97339
[53] validation-logloss:0.23350 validation-auc:0.97061 validation-aucpr:0.97352
[54] validation-logloss:0.23147 validation-auc:0.97071 validation-aucpr:0.97383
[55] validation-logloss:0.22979 validation-auc:0.97069 validation-aucpr:0.97381
{'best_iteration': '54', 'best_score': '0.9738305377640103'}
Trial 60, Fold 3: Log loss = 0.22978920308210152, Average precision = 0.9740861129702912, ROC-AUC = 0.9706929860783389, Elapsed Time = 5.672290900001826 seconds
Trial 60, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 60, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.66510 validation-auc:0.94943 validation-aucpr:0.94812
[1] validation-logloss:0.63930 validation-auc:0.95330 validation-aucpr:0.94564
[2] validation-logloss:0.61620 validation-auc:0.95717 validation-aucpr:0.96030
[3] validation-logloss:0.59405 validation-auc:0.95934 validation-aucpr:0.96372
[4] validation-logloss:0.57315 validation-auc:0.96181 validation-aucpr:0.96657
[5] validation-logloss:0.55415 validation-auc:0.96287 validation-aucpr:0.96595
[6] validation-logloss:0.53756 validation-auc:0.96301 validation-aucpr:0.96733
[7] validation-logloss:0.52092 validation-auc:0.96342 validation-aucpr:0.96702
[8] validation-logloss:0.50642 validation-auc:0.96408 validation-aucpr:0.96827
[9] validation-logloss:0.49113 validation-auc:0.96434 validation-aucpr:0.96850
[10] validation-logloss:0.47701 validation-auc:0.96447 validation-aucpr:0.96860
[11] validation-logloss:0.46296 validation-auc:0.96511 validation-aucpr:0.96905
[12] validation-logloss:0.45030 validation-auc:0.96521 validation-aucpr:0.96906
[13] validation-logloss:0.43967 validation-auc:0.96509 validation-aucpr:0.96922
[14] validation-logloss:0.42836 validation-auc:0.96514 validation-aucpr:0.96914
[15] validation-logloss:0.41764 validation-auc:0.96556 validation-aucpr:0.96976
[16] validation-logloss:0.40724 validation-auc:0.96593 validation-aucpr:0.96970
[17] validation-logloss:0.39746 validation-auc:0.96622 validation-aucpr:0.97002
[18] validation-logloss:0.38772 validation-auc:0.96652 validation-aucpr:0.97002
[19] validation-logloss:0.37854 validation-auc:0.96721 validation-aucpr:0.97040
[20] validation-logloss:0.37137 validation-auc:0.96743 validation-aucpr:0.97211
[21] validation-logloss:0.36399 validation-auc:0.96763 validation-aucpr:0.97230
[22] validation-logloss:0.35625 validation-auc:0.96788 validation-aucpr:0.97319
[23] validation-logloss:0.34877 validation-auc:0.96815 validation-aucpr:0.97342
[24] validation-logloss:0.34236 validation-auc:0.96846 validation-aucpr:0.97364
[25] validation-logloss:0.33695 validation-auc:0.96832 validation-aucpr:0.97352
[26] validation-logloss:0.33104 validation-auc:0.96814 validation-aucpr:0.97338
[27] validation-logloss:0.32491 validation-auc:0.96824 validation-aucpr:0.97345
[28] validation-logloss:0.31903 validation-auc:0.96827 validation-aucpr:0.97346
[29] validation-logloss:0.31432 validation-auc:0.96816 validation-aucpr:0.97340
[30] validation-logloss:0.30910 validation-auc:0.96805 validation-aucpr:0.97339
[31] validation-logloss:0.30434 validation-auc:0.96821 validation-aucpr:0.97342
[32] validation-logloss:0.29931 validation-auc:0.96843 validation-aucpr:0.97359
[33] validation-logloss:0.29476 validation-auc:0.96838 validation-aucpr:0.97359
[34] validation-logloss:0.29066 validation-auc:0.96830 validation-aucpr:0.97355
[35] validation-logloss:0.28643 validation-auc:0.96830 validation-aucpr:0.97351
[36] validation-logloss:0.28240 validation-auc:0.96833 validation-aucpr:0.97353
[37] validation-logloss:0.27847 validation-auc:0.96831 validation-aucpr:0.97353
[38] validation-logloss:0.27511 validation-auc:0.96827 validation-aucpr:0.97350
[39] validation-logloss:0.27146 validation-auc:0.96851 validation-aucpr:0.97360
[40] validation-logloss:0.26809 validation-auc:0.96857 validation-aucpr:0.97362
[41] validation-logloss:0.26531 validation-auc:0.96828 validation-aucpr:0.97341
[42] validation-logloss:0.26203 validation-auc:0.96850 validation-aucpr:0.97351
[43] validation-logloss:0.25869 validation-auc:0.96872 validation-aucpr:0.97367
[44] validation-logloss:0.25629 validation-auc:0.96871 validation-aucpr:0.97366
[45] validation-logloss:0.25356 validation-auc:0.96881 validation-aucpr:0.97373
[46] validation-logloss:0.25121 validation-auc:0.96871 validation-aucpr:0.97366
[47] validation-logloss:0.24909 validation-auc:0.96875 validation-aucpr:0.97368
[48] validation-logloss:0.24698 validation-auc:0.96891 validation-aucpr:0.97379
[49] validation-logloss:0.24454 validation-auc:0.96897 validation-aucpr:0.97383
[50] validation-logloss:0.24210 validation-auc:0.96903 validation-aucpr:0.97391
[51] validation-logloss:0.24009 validation-auc:0.96908 validation-aucpr:0.97395
[52] validation-logloss:0.23808 validation-auc:0.96913 validation-aucpr:0.97400
[53] validation-logloss:0.23595 validation-auc:0.96924 validation-aucpr:0.97407
[54] validation-logloss:0.23471 validation-auc:0.96908 validation-aucpr:0.97394
[55] validation-logloss:0.23294 validation-auc:0.96906 validation-aucpr:0.97395
{'best_iteration': '53', 'best_score': '0.9740728410816708'}
Trial 60, Fold 4: Log loss = 0.23294060156659158, Average precision = 0.9739539172089092, ROC-AUC = 0.9690645185519003, Elapsed Time = 5.714658599998074 seconds
Trial 60, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 60, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.66639 validation-auc:0.92341 validation-aucpr:0.93476
[1] validation-logloss:0.64094 validation-auc:0.94721 validation-aucpr:0.94052
[2] validation-logloss:0.61763 validation-auc:0.95311 validation-aucpr:0.95444
[3] validation-logloss:0.59823 validation-auc:0.95373 validation-aucpr:0.95447
[4] validation-logloss:0.58049 validation-auc:0.95562 validation-aucpr:0.95451
[5] validation-logloss:0.56068 validation-auc:0.95938 validation-aucpr:0.96368
[6] validation-logloss:0.54314 validation-auc:0.95980 validation-aucpr:0.96286
[7] validation-logloss:0.52596 validation-auc:0.96113 validation-aucpr:0.96660
[8] validation-logloss:0.50941 validation-auc:0.96209 validation-aucpr:0.96764
[9] validation-logloss:0.49560 validation-auc:0.96316 validation-aucpr:0.96867
[10] validation-logloss:0.48136 validation-auc:0.96400 validation-aucpr:0.96968
[11] validation-logloss:0.46763 validation-auc:0.96486 validation-aucpr:0.97038
[12] validation-logloss:0.45464 validation-auc:0.96486 validation-aucpr:0.97032
[13] validation-logloss:0.44380 validation-auc:0.96491 validation-aucpr:0.97037
[14] validation-logloss:0.43310 validation-auc:0.96531 validation-aucpr:0.97080
[15] validation-logloss:0.42201 validation-auc:0.96621 validation-aucpr:0.97119
[16] validation-logloss:0.41174 validation-auc:0.96612 validation-aucpr:0.97053
[17] validation-logloss:0.40310 validation-auc:0.96612 validation-aucpr:0.97048
[18] validation-logloss:0.39342 validation-auc:0.96617 validation-aucpr:0.97070
[19] validation-logloss:0.38530 validation-auc:0.96630 validation-aucpr:0.97073
[20] validation-logloss:0.37844 validation-auc:0.96627 validation-aucpr:0.97064
[21] validation-logloss:0.37027 validation-auc:0.96620 validation-aucpr:0.97057
[22] validation-logloss:0.36372 validation-auc:0.96633 validation-aucpr:0.97061
[23] validation-logloss:0.35636 validation-auc:0.96636 validation-aucpr:0.97077
[24] validation-logloss:0.34939 validation-auc:0.96636 validation-aucpr:0.97074
[25] validation-logloss:0.34263 validation-auc:0.96660 validation-aucpr:0.97093
[26] validation-logloss:0.33739 validation-auc:0.96654 validation-aucpr:0.97082
[27] validation-logloss:0.33133 validation-auc:0.96664 validation-aucpr:0.97088
[28] validation-logloss:0.32646 validation-auc:0.96640 validation-aucpr:0.97073
[29] validation-logloss:0.32130 validation-auc:0.96660 validation-aucpr:0.97078
[30] validation-logloss:0.31659 validation-auc:0.96640 validation-aucpr:0.96989
[31] validation-logloss:0.31110 validation-auc:0.96663 validation-aucpr:0.97008
[32] validation-logloss:0.30572 validation-auc:0.96691 validation-aucpr:0.97030
[33] validation-logloss:0.30075 validation-auc:0.96703 validation-aucpr:0.97039
[34] validation-logloss:0.29701 validation-auc:0.96691 validation-aucpr:0.97026
[35] validation-logloss:0.29364 validation-auc:0.96689 validation-aucpr:0.97080
[36] validation-logloss:0.28939 validation-auc:0.96699 validation-aucpr:0.97109
[37] validation-logloss:0.28612 validation-auc:0.96709 validation-aucpr:0.97115
[38] validation-logloss:0.28241 validation-auc:0.96738 validation-aucpr:0.97149
[39] validation-logloss:0.27933 validation-auc:0.96736 validation-aucpr:0.97146
[40] validation-logloss:0.27617 validation-auc:0.96743 validation-aucpr:0.97151
[41] validation-logloss:0.27367 validation-auc:0.96729 validation-aucpr:0.97146
[42] validation-logloss:0.27105 validation-auc:0.96721 validation-aucpr:0.97139
[43] validation-logloss:0.26782 validation-auc:0.96721 validation-aucpr:0.97144
[44] validation-logloss:0.26475 validation-auc:0.96736 validation-aucpr:0.97153
[45] validation-logloss:0.26218 validation-auc:0.96736 validation-aucpr:0.97150
[46] validation-logloss:0.26007 validation-auc:0.96731 validation-aucpr:0.97142
[47] validation-logloss:0.25814 validation-auc:0.96726 validation-aucpr:0.97135
[48] validation-logloss:0.25536 validation-auc:0.96738 validation-aucpr:0.97151
[49] validation-logloss:0.25323 validation-auc:0.96755 validation-aucpr:0.97204
[50] validation-logloss:0.25095 validation-auc:0.96757 validation-aucpr:0.97207
[51] validation-logloss:0.24922 validation-auc:0.96766 validation-aucpr:0.97204
[52] validation-logloss:0.24653 validation-auc:0.96796 validation-aucpr:0.97225
[53] validation-logloss:0.24457 validation-auc:0.96792 validation-aucpr:0.97219
[54] validation-logloss:0.24258 validation-auc:0.96801 validation-aucpr:0.97223
[55] validation-logloss:0.24105 validation-auc:0.96801 validation-aucpr:0.97223
{'best_iteration': '52', 'best_score': '0.97225150400306'}
Trial 60, Fold 5: Log loss = 0.24104923104785034, Average precision = 0.9722273856003052, ROC-AUC = 0.9680141203574679, Elapsed Time = 5.812461699999403 seconds
Optimization Progress: 61%|######1 | 61/100 [2:58:43<35:29, 54.61s/it]
Trial 61, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 61, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.65332 validation-auc:0.91488 validation-aucpr:0.93009
[1] validation-logloss:0.61684 validation-auc:0.94953 validation-aucpr:0.93989
[2] validation-logloss:0.58403 validation-auc:0.95482 validation-aucpr:0.94498
[3] validation-logloss:0.55500 validation-auc:0.96038 validation-aucpr:0.95724
[4] validation-logloss:0.52896 validation-auc:0.96244 validation-aucpr:0.96479
[5] validation-logloss:0.50474 validation-auc:0.96346 validation-aucpr:0.96611
[6] validation-logloss:0.48500 validation-auc:0.96421 validation-aucpr:0.96559
[7] validation-logloss:0.46407 validation-auc:0.96601 validation-aucpr:0.96882
[8] validation-logloss:0.44688 validation-auc:0.96662 validation-aucpr:0.97142
[9] validation-logloss:0.43114 validation-auc:0.96671 validation-aucpr:0.97134
[10] validation-logloss:0.41494 validation-auc:0.96762 validation-aucpr:0.97197
[11] validation-logloss:0.39993 validation-auc:0.96779 validation-aucpr:0.97195
[12] validation-logloss:0.38622 validation-auc:0.96782 validation-aucpr:0.97202
[13] validation-logloss:0.37336 validation-auc:0.96786 validation-aucpr:0.97202
[14] validation-logloss:0.36163 validation-auc:0.96816 validation-aucpr:0.97230
[15] validation-logloss:0.35062 validation-auc:0.96845 validation-aucpr:0.97250
[16] validation-logloss:0.34023 validation-auc:0.96860 validation-aucpr:0.97260
[17] validation-logloss:0.33075 validation-auc:0.96864 validation-aucpr:0.97255
[18] validation-logloss:0.32176 validation-auc:0.96872 validation-aucpr:0.97263
[19] validation-logloss:0.31345 validation-auc:0.96877 validation-aucpr:0.97294
[20] validation-logloss:0.30527 validation-auc:0.96909 validation-aucpr:0.97320
[21] validation-logloss:0.29781 validation-auc:0.96925 validation-aucpr:0.97334
[22] validation-logloss:0.29128 validation-auc:0.96938 validation-aucpr:0.97340
[23] validation-logloss:0.28519 validation-auc:0.96938 validation-aucpr:0.97335
[24] validation-logloss:0.27902 validation-auc:0.96965 validation-aucpr:0.97357
[25] validation-logloss:0.27351 validation-auc:0.96976 validation-aucpr:0.97367
[26] validation-logloss:0.26846 validation-auc:0.96978 validation-aucpr:0.97341
[27] validation-logloss:0.26355 validation-auc:0.96996 validation-aucpr:0.97355
[28] validation-logloss:0.25912 validation-auc:0.96993 validation-aucpr:0.97346
[29] validation-logloss:0.25517 validation-auc:0.97000 validation-aucpr:0.97351
[30] validation-logloss:0.25173 validation-auc:0.96991 validation-aucpr:0.97309
[31] validation-logloss:0.24779 validation-auc:0.97014 validation-aucpr:0.97326
[32] validation-logloss:0.24504 validation-auc:0.96989 validation-aucpr:0.97265
[33] validation-logloss:0.24225 validation-auc:0.96982 validation-aucpr:0.97259
[34] validation-logloss:0.23934 validation-auc:0.96984 validation-aucpr:0.97261
[35] validation-logloss:0.23641 validation-auc:0.96992 validation-aucpr:0.97263
[36] validation-logloss:0.23344 validation-auc:0.97023 validation-aucpr:0.97284
[37] validation-logloss:0.23078 validation-auc:0.97031 validation-aucpr:0.97274
[38] validation-logloss:0.22854 validation-auc:0.97038 validation-aucpr:0.97277
[39] validation-logloss:0.22607 validation-auc:0.97055 validation-aucpr:0.97288
[40] validation-logloss:0.22433 validation-auc:0.97044 validation-aucpr:0.97270
[41] validation-logloss:0.22252 validation-auc:0.97045 validation-aucpr:0.97261
[42] validation-logloss:0.22057 validation-auc:0.97058 validation-aucpr:0.97274
[43] validation-logloss:0.21911 validation-auc:0.97062 validation-aucpr:0.97285
[44] validation-logloss:0.21724 validation-auc:0.97092 validation-aucpr:0.97482
[45] validation-logloss:0.21565 validation-auc:0.97105 validation-aucpr:0.97485
[46] validation-logloss:0.21436 validation-auc:0.97101 validation-aucpr:0.97481
[47] validation-logloss:0.21298 validation-auc:0.97113 validation-aucpr:0.97486
[48] validation-logloss:0.21149 validation-auc:0.97126 validation-aucpr:0.97495
[49] validation-logloss:0.21013 validation-auc:0.97136 validation-aucpr:0.97508
[50] validation-logloss:0.20874 validation-auc:0.97149 validation-aucpr:0.97515
[51] validation-logloss:0.20788 validation-auc:0.97144 validation-aucpr:0.97511
[52] validation-logloss:0.20680 validation-auc:0.97153 validation-aucpr:0.97521
[53] validation-logloss:0.20599 validation-auc:0.97154 validation-aucpr:0.97514
[54] validation-logloss:0.20541 validation-auc:0.97138 validation-aucpr:0.97499
[55] validation-logloss:0.20457 validation-auc:0.97155 validation-aucpr:0.97519
[56] validation-logloss:0.20390 validation-auc:0.97151 validation-aucpr:0.97491
[57] validation-logloss:0.20329 validation-auc:0.97155 validation-aucpr:0.97507
[58] validation-logloss:0.20266 validation-auc:0.97155 validation-aucpr:0.97506
[59] validation-logloss:0.20194 validation-auc:0.97162 validation-aucpr:0.97512
[60] validation-logloss:0.20127 validation-auc:0.97175 validation-aucpr:0.97516
[61] validation-logloss:0.20084 validation-auc:0.97183 validation-aucpr:0.97543
[62] validation-logloss:0.20066 validation-auc:0.97173 validation-aucpr:0.97534
[63] validation-logloss:0.20027 validation-auc:0.97173 validation-aucpr:0.97524
[64] validation-logloss:0.19984 validation-auc:0.97171 validation-aucpr:0.97541
[65] validation-logloss:0.19946 validation-auc:0.97175 validation-aucpr:0.97539
[66] validation-logloss:0.19906 validation-auc:0.97173 validation-aucpr:0.97537
[67] validation-logloss:0.19871 validation-auc:0.97179 validation-aucpr:0.97538
[68] validation-logloss:0.19836 validation-auc:0.97183 validation-aucpr:0.97542
[69] validation-logloss:0.19838 validation-auc:0.97169 validation-aucpr:0.97529
[70] validation-logloss:0.19822 validation-auc:0.97165 validation-aucpr:0.97527
[71] validation-logloss:0.19808 validation-auc:0.97167 validation-aucpr:0.97522
[72] validation-logloss:0.19769 validation-auc:0.97178 validation-aucpr:0.97529
[73] validation-logloss:0.19793 validation-auc:0.97157 validation-aucpr:0.97512
[74] validation-logloss:0.19803 validation-auc:0.97143 validation-aucpr:0.97485
[75] validation-logloss:0.19784 validation-auc:0.97153 validation-aucpr:0.97536
[76] validation-logloss:0.19751 validation-auc:0.97156 validation-aucpr:0.97534
[77] validation-logloss:0.19741 validation-auc:0.97152 validation-aucpr:0.97512
[78] validation-logloss:0.19689 validation-auc:0.97177 validation-aucpr:0.97515
[79] validation-logloss:0.19663 validation-auc:0.97178 validation-aucpr:0.97527
[80] validation-logloss:0.19658 validation-auc:0.97177 validation-aucpr:0.97524
[81] validation-logloss:0.19633 validation-auc:0.97182 validation-aucpr:0.97503
[82] validation-logloss:0.19641 validation-auc:0.97176 validation-aucpr:0.97485
[83] validation-logloss:0.19635 validation-auc:0.97179 validation-aucpr:0.97484
[84] validation-logloss:0.19622 validation-auc:0.97185 validation-aucpr:0.97482
[85] validation-logloss:0.19602 validation-auc:0.97198 validation-aucpr:0.97538
[86] validation-logloss:0.19597 validation-auc:0.97202 validation-aucpr:0.97567
[87] validation-logloss:0.19619 validation-auc:0.97196 validation-aucpr:0.97560
[88] validation-logloss:0.19598 validation-auc:0.97201 validation-aucpr:0.97556
{'best_iteration': '86', 'best_score': '0.9756681810404803'}
Trial 61, Fold 1: Log loss = 0.19597705106820937, Average precision = 0.975568210512143, ROC-AUC = 0.9720051989981981, Elapsed Time = 2.902240300001722 seconds
Trial 61, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 61, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.65238 validation-auc:0.94185 validation-aucpr:0.94565
[1] validation-logloss:0.61617 validation-auc:0.95448 validation-aucpr:0.94587
[2] validation-logloss:0.58301 validation-auc:0.96415 validation-aucpr:0.96661
[3] validation-logloss:0.55338 validation-auc:0.96547 validation-aucpr:0.97008
[4] validation-logloss:0.52638 validation-auc:0.96581 validation-aucpr:0.97070
[5] validation-logloss:0.50198 validation-auc:0.96644 validation-aucpr:0.97105
[6] validation-logloss:0.47883 validation-auc:0.96833 validation-aucpr:0.97262
[7] validation-logloss:0.46054 validation-auc:0.96905 validation-aucpr:0.97324
[8] validation-logloss:0.44160 validation-auc:0.96932 validation-aucpr:0.97329
[9] validation-logloss:0.42422 validation-auc:0.96970 validation-aucpr:0.97371
[10] validation-logloss:0.40843 validation-auc:0.96937 validation-aucpr:0.97342
[11] validation-logloss:0.39328 validation-auc:0.96954 validation-aucpr:0.97350
[12] validation-logloss:0.37979 validation-auc:0.96953 validation-aucpr:0.97355
[13] validation-logloss:0.36731 validation-auc:0.96924 validation-aucpr:0.97317
[14] validation-logloss:0.35570 validation-auc:0.96914 validation-aucpr:0.97303
[15] validation-logloss:0.34499 validation-auc:0.96938 validation-aucpr:0.97331
[16] validation-logloss:0.33448 validation-auc:0.96975 validation-aucpr:0.97348
[17] validation-logloss:0.32497 validation-auc:0.96981 validation-aucpr:0.97350
[18] validation-logloss:0.31608 validation-auc:0.96988 validation-aucpr:0.97357
[19] validation-logloss:0.30788 validation-auc:0.96982 validation-aucpr:0.97354
[20] validation-logloss:0.30070 validation-auc:0.97028 validation-aucpr:0.97401
[21] validation-logloss:0.29340 validation-auc:0.97054 validation-aucpr:0.97412
[22] validation-logloss:0.28669 validation-auc:0.97075 validation-aucpr:0.97412
[23] validation-logloss:0.28032 validation-auc:0.97092 validation-aucpr:0.97434
[24] validation-logloss:0.27450 validation-auc:0.97090 validation-aucpr:0.97426
[25] validation-logloss:0.26895 validation-auc:0.97110 validation-aucpr:0.97450
[26] validation-logloss:0.26377 validation-auc:0.97113 validation-aucpr:0.97472
[27] validation-logloss:0.25858 validation-auc:0.97145 validation-aucpr:0.97491
[28] validation-logloss:0.25400 validation-auc:0.97145 validation-aucpr:0.97490
[29] validation-logloss:0.24940 validation-auc:0.97162 validation-aucpr:0.97503
[30] validation-logloss:0.24517 validation-auc:0.97170 validation-aucpr:0.97506
[31] validation-logloss:0.24110 validation-auc:0.97196 validation-aucpr:0.97520
[32] validation-logloss:0.23739 validation-auc:0.97206 validation-aucpr:0.97546
[33] validation-logloss:0.23407 validation-auc:0.97208 validation-aucpr:0.97546
[34] validation-logloss:0.23090 validation-auc:0.97202 validation-aucpr:0.97543
[35] validation-logloss:0.22799 validation-auc:0.97212 validation-aucpr:0.97544
[36] validation-logloss:0.22501 validation-auc:0.97222 validation-aucpr:0.97554
[37] validation-logloss:0.22257 validation-auc:0.97221 validation-aucpr:0.97551
[38] validation-logloss:0.22025 validation-auc:0.97250 validation-aucpr:0.97573
[39] validation-logloss:0.21789 validation-auc:0.97259 validation-aucpr:0.97578
[40] validation-logloss:0.21589 validation-auc:0.97251 validation-aucpr:0.97573
[41] validation-logloss:0.21382 validation-auc:0.97255 validation-aucpr:0.97572
[42] validation-logloss:0.21200 validation-auc:0.97254 validation-aucpr:0.97572
[43] validation-logloss:0.21016 validation-auc:0.97261 validation-aucpr:0.97574
[44] validation-logloss:0.20837 validation-auc:0.97246 validation-aucpr:0.97565
[45] validation-logloss:0.20672 validation-auc:0.97267 validation-aucpr:0.97582
[46] validation-logloss:0.20511 validation-auc:0.97270 validation-aucpr:0.97584
[47] validation-logloss:0.20359 validation-auc:0.97277 validation-aucpr:0.97589
[48] validation-logloss:0.20216 validation-auc:0.97285 validation-aucpr:0.97593
[49] validation-logloss:0.20083 validation-auc:0.97294 validation-aucpr:0.97601
[50] validation-logloss:0.19981 validation-auc:0.97288 validation-aucpr:0.97595
[51] validation-logloss:0.19848 validation-auc:0.97289 validation-aucpr:0.97569
[52] validation-logloss:0.19776 validation-auc:0.97269 validation-aucpr:0.97549
[53] validation-logloss:0.19689 validation-auc:0.97277 validation-aucpr:0.97552
[54] validation-logloss:0.19584 validation-auc:0.97284 validation-aucpr:0.97553
[55] validation-logloss:0.19487 validation-auc:0.97292 validation-aucpr:0.97558
[56] validation-logloss:0.19407 validation-auc:0.97298 validation-aucpr:0.97559
[57] validation-logloss:0.19321 validation-auc:0.97304 validation-aucpr:0.97565
[58] validation-logloss:0.19217 validation-auc:0.97307 validation-aucpr:0.97555
[59] validation-logloss:0.19151 validation-auc:0.97306 validation-aucpr:0.97557
[60] validation-logloss:0.19094 validation-auc:0.97305 validation-aucpr:0.97537
[61] validation-logloss:0.19051 validation-auc:0.97296 validation-aucpr:0.97526
[62] validation-logloss:0.18976 validation-auc:0.97307 validation-aucpr:0.97543
[63] validation-logloss:0.18864 validation-auc:0.97327 validation-aucpr:0.97584
[64] validation-logloss:0.18789 validation-auc:0.97337 validation-aucpr:0.97610
[65] validation-logloss:0.18748 validation-auc:0.97335 validation-aucpr:0.97636
[66] validation-logloss:0.18714 validation-auc:0.97334 validation-aucpr:0.97610
[67] validation-logloss:0.18688 validation-auc:0.97331 validation-aucpr:0.97605
[68] validation-logloss:0.18662 validation-auc:0.97341 validation-aucpr:0.97650
[69] validation-logloss:0.18634 validation-auc:0.97340 validation-aucpr:0.97640
[70] validation-logloss:0.18595 validation-auc:0.97336 validation-aucpr:0.97635
[71] validation-logloss:0.18577 validation-auc:0.97329 validation-aucpr:0.97628
[72] validation-logloss:0.18563 validation-auc:0.97330 validation-aucpr:0.97624
[73] validation-logloss:0.18532 validation-auc:0.97334 validation-aucpr:0.97639
[74] validation-logloss:0.18528 validation-auc:0.97329 validation-aucpr:0.97635
[75] validation-logloss:0.18521 validation-auc:0.97323 validation-aucpr:0.97628
[76] validation-logloss:0.18501 validation-auc:0.97331 validation-aucpr:0.97640
[77] validation-logloss:0.18489 validation-auc:0.97328 validation-aucpr:0.97639
[78] validation-logloss:0.18469 validation-auc:0.97335 validation-aucpr:0.97644
[79] validation-logloss:0.18467 validation-auc:0.97325 validation-aucpr:0.97635
[80] validation-logloss:0.18461 validation-auc:0.97331 validation-aucpr:0.97635
[81] validation-logloss:0.18419 validation-auc:0.97345 validation-aucpr:0.97639
[82] validation-logloss:0.18396 validation-auc:0.97345 validation-aucpr:0.97638
[83] validation-logloss:0.18392 validation-auc:0.97345 validation-aucpr:0.97631
[84] validation-logloss:0.18395 validation-auc:0.97347 validation-aucpr:0.97628
[85] validation-logloss:0.18400 validation-auc:0.97345 validation-aucpr:0.97630
[86] validation-logloss:0.18371 validation-auc:0.97350 validation-aucpr:0.97629
[87] validation-logloss:0.18373 validation-auc:0.97348 validation-aucpr:0.97627
[88] validation-logloss:0.18354 validation-auc:0.97343 validation-aucpr:0.97625
{'best_iteration': '68', 'best_score': '0.9764972411007973'}
Trial 61, Fold 2: Log loss = 0.1835398357728264, Average precision = 0.9762499405488697, ROC-AUC = 0.973432908206854, Elapsed Time = 3.3019077999997535 seconds
Trial 61, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 61, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.65257 validation-auc:0.94874 validation-aucpr:0.95522
[1] validation-logloss:0.61608 validation-auc:0.95651 validation-aucpr:0.94847
[2] validation-logloss:0.58239 validation-auc:0.96551 validation-aucpr:0.96145
[3] validation-logloss:0.55286 validation-auc:0.96633 validation-aucpr:0.96346
[4] validation-logloss:0.52647 validation-auc:0.96743 validation-aucpr:0.97130
[5] validation-logloss:0.50173 validation-auc:0.96789 validation-aucpr:0.97175
[6] validation-logloss:0.47979 validation-auc:0.96831 validation-aucpr:0.97334
[7] validation-logloss:0.45929 validation-auc:0.96828 validation-aucpr:0.97278
[8] validation-logloss:0.44136 validation-auc:0.96839 validation-aucpr:0.97268
[9] validation-logloss:0.42410 validation-auc:0.96848 validation-aucpr:0.97269
[10] validation-logloss:0.40778 validation-auc:0.96858 validation-aucpr:0.97279
[11] validation-logloss:0.39318 validation-auc:0.96899 validation-aucpr:0.97324
[12] validation-logloss:0.37912 validation-auc:0.96938 validation-aucpr:0.97365
[13] validation-logloss:0.36658 validation-auc:0.96961 validation-aucpr:0.97377
[14] validation-logloss:0.35465 validation-auc:0.96963 validation-aucpr:0.97381
[15] validation-logloss:0.34340 validation-auc:0.96986 validation-aucpr:0.97370
[16] validation-logloss:0.33300 validation-auc:0.96997 validation-aucpr:0.97377
[17] validation-logloss:0.32425 validation-auc:0.97034 validation-aucpr:0.97392
[18] validation-logloss:0.31523 validation-auc:0.97055 validation-aucpr:0.97423
[19] validation-logloss:0.30669 validation-auc:0.97088 validation-aucpr:0.97513
[20] validation-logloss:0.29900 validation-auc:0.97083 validation-aucpr:0.97505
[21] validation-logloss:0.29155 validation-auc:0.97105 validation-aucpr:0.97518
[22] validation-logloss:0.28463 validation-auc:0.97113 validation-aucpr:0.97528
[23] validation-logloss:0.27818 validation-auc:0.97135 validation-aucpr:0.97539
[24] validation-logloss:0.27244 validation-auc:0.97122 validation-aucpr:0.97529
[25] validation-logloss:0.26697 validation-auc:0.97151 validation-aucpr:0.97545
[26] validation-logloss:0.26203 validation-auc:0.97135 validation-aucpr:0.97531
[27] validation-logloss:0.25734 validation-auc:0.97133 validation-aucpr:0.97525
[28] validation-logloss:0.25267 validation-auc:0.97146 validation-aucpr:0.97491
[29] validation-logloss:0.24822 validation-auc:0.97173 validation-aucpr:0.97514
[30] validation-logloss:0.24434 validation-auc:0.97163 validation-aucpr:0.97500
[31] validation-logloss:0.24081 validation-auc:0.97155 validation-aucpr:0.97492
[32] validation-logloss:0.23729 validation-auc:0.97158 validation-aucpr:0.97493
[33] validation-logloss:0.23417 validation-auc:0.97176 validation-aucpr:0.97524
[34] validation-logloss:0.23112 validation-auc:0.97172 validation-aucpr:0.97513
[35] validation-logloss:0.22804 validation-auc:0.97180 validation-aucpr:0.97518
[36] validation-logloss:0.22568 validation-auc:0.97187 validation-aucpr:0.97524
[37] validation-logloss:0.22320 validation-auc:0.97200 validation-aucpr:0.97588
[38] validation-logloss:0.22087 validation-auc:0.97192 validation-aucpr:0.97583
[39] validation-logloss:0.21860 validation-auc:0.97192 validation-aucpr:0.97575
[40] validation-logloss:0.21615 validation-auc:0.97226 validation-aucpr:0.97600
[41] validation-logloss:0.21434 validation-auc:0.97217 validation-aucpr:0.97593
[42] validation-logloss:0.21256 validation-auc:0.97212 validation-aucpr:0.97582
[43] validation-logloss:0.21087 validation-auc:0.97217 validation-aucpr:0.97580
[44] validation-logloss:0.20925 validation-auc:0.97222 validation-aucpr:0.97579
[45] validation-logloss:0.20741 validation-auc:0.97242 validation-aucpr:0.97598
[46] validation-logloss:0.20594 validation-auc:0.97244 validation-aucpr:0.97589
[47] validation-logloss:0.20450 validation-auc:0.97256 validation-aucpr:0.97594
[48] validation-logloss:0.20297 validation-auc:0.97262 validation-aucpr:0.97602
[49] validation-logloss:0.20179 validation-auc:0.97266 validation-aucpr:0.97601
[50] validation-logloss:0.20065 validation-auc:0.97272 validation-aucpr:0.97602
[51] validation-logloss:0.19954 validation-auc:0.97274 validation-aucpr:0.97600
[52] validation-logloss:0.19838 validation-auc:0.97290 validation-aucpr:0.97613
[53] validation-logloss:0.19767 validation-auc:0.97284 validation-aucpr:0.97604
[54] validation-logloss:0.19640 validation-auc:0.97305 validation-aucpr:0.97628
[55] validation-logloss:0.19558 validation-auc:0.97309 validation-aucpr:0.97629
[56] validation-logloss:0.19456 validation-auc:0.97324 validation-aucpr:0.97637
[57] validation-logloss:0.19397 validation-auc:0.97327 validation-aucpr:0.97636
[58] validation-logloss:0.19325 validation-auc:0.97329 validation-aucpr:0.97641
[59] validation-logloss:0.19279 validation-auc:0.97340 validation-aucpr:0.97653
[60] validation-logloss:0.19204 validation-auc:0.97351 validation-aucpr:0.97653
[61] validation-logloss:0.19146 validation-auc:0.97360 validation-aucpr:0.97709
[62] validation-logloss:0.19109 validation-auc:0.97357 validation-aucpr:0.97704
[63] validation-logloss:0.19060 validation-auc:0.97354 validation-aucpr:0.97704
[64] validation-logloss:0.19012 validation-auc:0.97359 validation-aucpr:0.97705
[65] validation-logloss:0.18966 validation-auc:0.97362 validation-aucpr:0.97707
[66] validation-logloss:0.18949 validation-auc:0.97356 validation-aucpr:0.97690
[67] validation-logloss:0.18917 validation-auc:0.97354 validation-aucpr:0.97684
[68] validation-logloss:0.18883 validation-auc:0.97356 validation-aucpr:0.97680
[69] validation-logloss:0.18842 validation-auc:0.97362 validation-aucpr:0.97687
[70] validation-logloss:0.18827 validation-auc:0.97361 validation-aucpr:0.97696
[71] validation-logloss:0.18832 validation-auc:0.97351 validation-aucpr:0.97683
[72] validation-logloss:0.18821 validation-auc:0.97347 validation-aucpr:0.97687
[73] validation-logloss:0.18787 validation-auc:0.97355 validation-aucpr:0.97693
[74] validation-logloss:0.18738 validation-auc:0.97365 validation-aucpr:0.97709
[75] validation-logloss:0.18723 validation-auc:0.97366 validation-aucpr:0.97708
[76] validation-logloss:0.18715 validation-auc:0.97367 validation-aucpr:0.97710
[77] validation-logloss:0.18675 validation-auc:0.97380 validation-aucpr:0.97721
[78] validation-logloss:0.18647 validation-auc:0.97394 validation-aucpr:0.97739
[79] validation-logloss:0.18630 validation-auc:0.97394 validation-aucpr:0.97736
[80] validation-logloss:0.18624 validation-auc:0.97387 validation-aucpr:0.97728
[81] validation-logloss:0.18613 validation-auc:0.97388 validation-aucpr:0.97726
[82] validation-logloss:0.18605 validation-auc:0.97395 validation-aucpr:0.97741
[83] validation-logloss:0.18584 validation-auc:0.97398 validation-aucpr:0.97743
[84] validation-logloss:0.18593 validation-auc:0.97394 validation-aucpr:0.97739
[85] validation-logloss:0.18595 validation-auc:0.97392 validation-aucpr:0.97730
[86] validation-logloss:0.18594 validation-auc:0.97392 validation-aucpr:0.97726
[87] validation-logloss:0.18599 validation-auc:0.97390 validation-aucpr:0.97735
[88] validation-logloss:0.18598 validation-auc:0.97386 validation-aucpr:0.97730
{'best_iteration': '83', 'best_score': '0.9774279649471089'}
Trial 61, Fold 3: Log loss = 0.1859758454152286, Average precision = 0.9773062201854303, ROC-AUC = 0.9738628004882935, Elapsed Time = 3.588831900000514 seconds
Trial 61, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 61, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.65218 validation-auc:0.93116 validation-aucpr:0.92204
[1] validation-logloss:0.61553 validation-auc:0.95476 validation-aucpr:0.94696
[2] validation-logloss:0.58266 validation-auc:0.96118 validation-aucpr:0.95811
[3] validation-logloss:0.55270 validation-auc:0.96466 validation-aucpr:0.96351
[4] validation-logloss:0.52872 validation-auc:0.96615 validation-aucpr:0.96626
[5] validation-logloss:0.50357 validation-auc:0.96793 validation-aucpr:0.96916
[6] validation-logloss:0.48082 validation-auc:0.96891 validation-aucpr:0.97161
[7] validation-logloss:0.46213 validation-auc:0.96899 validation-aucpr:0.97168
[8] validation-logloss:0.44325 validation-auc:0.96925 validation-aucpr:0.97187
[9] validation-logloss:0.42587 validation-auc:0.96905 validation-aucpr:0.97179
[10] validation-logloss:0.40992 validation-auc:0.96888 validation-aucpr:0.97183
[11] validation-logloss:0.39536 validation-auc:0.96865 validation-aucpr:0.97183
[12] validation-logloss:0.38131 validation-auc:0.96906 validation-aucpr:0.97213
[13] validation-logloss:0.36810 validation-auc:0.96936 validation-aucpr:0.97232
[14] validation-logloss:0.35637 validation-auc:0.96954 validation-aucpr:0.97250
[15] validation-logloss:0.34524 validation-auc:0.96998 validation-aucpr:0.97454
[16] validation-logloss:0.33494 validation-auc:0.97021 validation-aucpr:0.97474
[17] validation-logloss:0.32643 validation-auc:0.97042 validation-aucpr:0.97491
[18] validation-logloss:0.31792 validation-auc:0.97034 validation-aucpr:0.97492
[19] validation-logloss:0.30972 validation-auc:0.97027 validation-aucpr:0.97484
[20] validation-logloss:0.30190 validation-auc:0.97041 validation-aucpr:0.97494
[21] validation-logloss:0.29451 validation-auc:0.97059 validation-aucpr:0.97510
[22] validation-logloss:0.28800 validation-auc:0.97055 validation-aucpr:0.97509
[23] validation-logloss:0.28163 validation-auc:0.97063 validation-aucpr:0.97515
[24] validation-logloss:0.27599 validation-auc:0.97046 validation-aucpr:0.97507
[25] validation-logloss:0.27012 validation-auc:0.97077 validation-aucpr:0.97533
[26] validation-logloss:0.26474 validation-auc:0.97103 validation-aucpr:0.97549
[27] validation-logloss:0.26001 validation-auc:0.97105 validation-aucpr:0.97552
[28] validation-logloss:0.25557 validation-auc:0.97099 validation-aucpr:0.97550
[29] validation-logloss:0.25099 validation-auc:0.97108 validation-aucpr:0.97561
[30] validation-logloss:0.24721 validation-auc:0.97096 validation-aucpr:0.97554
[31] validation-logloss:0.24384 validation-auc:0.97087 validation-aucpr:0.97548
[32] validation-logloss:0.24032 validation-auc:0.97097 validation-aucpr:0.97559
[33] validation-logloss:0.23701 validation-auc:0.97096 validation-aucpr:0.97558
[34] validation-logloss:0.23352 validation-auc:0.97117 validation-aucpr:0.97574
[35] validation-logloss:0.23073 validation-auc:0.97109 validation-aucpr:0.97568
[36] validation-logloss:0.22826 validation-auc:0.97110 validation-aucpr:0.97569
[37] validation-logloss:0.22600 validation-auc:0.97096 validation-aucpr:0.97557
[38] validation-logloss:0.22390 validation-auc:0.97091 validation-aucpr:0.97550
[39] validation-logloss:0.22162 validation-auc:0.97099 validation-aucpr:0.97552
[40] validation-logloss:0.21943 validation-auc:0.97095 validation-aucpr:0.97549
[41] validation-logloss:0.21773 validation-auc:0.97098 validation-aucpr:0.97550
[42] validation-logloss:0.21582 validation-auc:0.97122 validation-aucpr:0.97567
[43] validation-logloss:0.21397 validation-auc:0.97115 validation-aucpr:0.97564
[44] validation-logloss:0.21243 validation-auc:0.97105 validation-aucpr:0.97558
[45] validation-logloss:0.21098 validation-auc:0.97110 validation-aucpr:0.97560
[46] validation-logloss:0.20957 validation-auc:0.97109 validation-aucpr:0.97559
[47] validation-logloss:0.20830 validation-auc:0.97104 validation-aucpr:0.97558
[48] validation-logloss:0.20688 validation-auc:0.97108 validation-aucpr:0.97560
[49] validation-logloss:0.20564 validation-auc:0.97105 validation-aucpr:0.97559
[50] validation-logloss:0.20468 validation-auc:0.97101 validation-aucpr:0.97555
[51] validation-logloss:0.20383 validation-auc:0.97095 validation-aucpr:0.97550
[52] validation-logloss:0.20329 validation-auc:0.97075 validation-aucpr:0.97533
[53] validation-logloss:0.20224 validation-auc:0.97086 validation-aucpr:0.97540
[54] validation-logloss:0.20161 validation-auc:0.97075 validation-aucpr:0.97530
[55] validation-logloss:0.20069 validation-auc:0.97089 validation-aucpr:0.97541
[56] validation-logloss:0.20013 validation-auc:0.97088 validation-aucpr:0.97544
[57] validation-logloss:0.19964 validation-auc:0.97086 validation-aucpr:0.97542
[58] validation-logloss:0.19934 validation-auc:0.97078 validation-aucpr:0.97537
[59] validation-logloss:0.19859 validation-auc:0.97074 validation-aucpr:0.97537
[60] validation-logloss:0.19859 validation-auc:0.97063 validation-aucpr:0.97531
[61] validation-logloss:0.19808 validation-auc:0.97067 validation-aucpr:0.97531
[62] validation-logloss:0.19737 validation-auc:0.97079 validation-aucpr:0.97539
[63] validation-logloss:0.19704 validation-auc:0.97077 validation-aucpr:0.97540
[64] validation-logloss:0.19647 validation-auc:0.97085 validation-aucpr:0.97546
[65] validation-logloss:0.19600 validation-auc:0.97086 validation-aucpr:0.97547
[66] validation-logloss:0.19544 validation-auc:0.97089 validation-aucpr:0.97552
[67] validation-logloss:0.19501 validation-auc:0.97087 validation-aucpr:0.97550
[68] validation-logloss:0.19443 validation-auc:0.97096 validation-aucpr:0.97556
[69] validation-logloss:0.19417 validation-auc:0.97104 validation-aucpr:0.97559
[70] validation-logloss:0.19402 validation-auc:0.97105 validation-aucpr:0.97561
[71] validation-logloss:0.19404 validation-auc:0.97091 validation-aucpr:0.97552
[72] validation-logloss:0.19376 validation-auc:0.97092 validation-aucpr:0.97555
[73] validation-logloss:0.19366 validation-auc:0.97094 validation-aucpr:0.97557
[74] validation-logloss:0.19360 validation-auc:0.97085 validation-aucpr:0.97550
[75] validation-logloss:0.19356 validation-auc:0.97084 validation-aucpr:0.97548
[76] validation-logloss:0.19329 validation-auc:0.97081 validation-aucpr:0.97547
[77] validation-logloss:0.19324 validation-auc:0.97080 validation-aucpr:0.97545
[78] validation-logloss:0.19342 validation-auc:0.97080 validation-aucpr:0.97542
[79] validation-logloss:0.19323 validation-auc:0.97085 validation-aucpr:0.97549
[80] validation-logloss:0.19299 validation-auc:0.97098 validation-aucpr:0.97555
[81] validation-logloss:0.19290 validation-auc:0.97099 validation-aucpr:0.97554
[82] validation-logloss:0.19286 validation-auc:0.97101 validation-aucpr:0.97557
[83] validation-logloss:0.19315 validation-auc:0.97088 validation-aucpr:0.97549
[84] validation-logloss:0.19323 validation-auc:0.97091 validation-aucpr:0.97550
{'best_iteration': '34', 'best_score': '0.9757358168229795'}
Trial 61, Fold 4: Log loss = 0.19322964262513223, Average precision = 0.9755070062750073, ROC-AUC = 0.9709110203843192, Elapsed Time = 3.3171443000028376 seconds
Trial 61, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 61, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.65263 validation-auc:0.94417 validation-aucpr:0.94844
[1] validation-logloss:0.62087 validation-auc:0.94755 validation-aucpr:0.93609
[2] validation-logloss:0.58831 validation-auc:0.95691 validation-aucpr:0.95541
[3] validation-logloss:0.55887 validation-auc:0.96060 validation-aucpr:0.96631
[4] validation-logloss:0.53202 validation-auc:0.96238 validation-aucpr:0.96769
[5] validation-logloss:0.50769 validation-auc:0.96314 validation-aucpr:0.96876
[6] validation-logloss:0.48687 validation-auc:0.96369 validation-aucpr:0.96956
[7] validation-logloss:0.46624 validation-auc:0.96542 validation-aucpr:0.97056
[8] validation-logloss:0.44754 validation-auc:0.96591 validation-aucpr:0.97090
[9] validation-logloss:0.42980 validation-auc:0.96646 validation-aucpr:0.97129
[10] validation-logloss:0.41362 validation-auc:0.96748 validation-aucpr:0.97209
[11] validation-logloss:0.39842 validation-auc:0.96793 validation-aucpr:0.97237
[12] validation-logloss:0.38450 validation-auc:0.96806 validation-aucpr:0.97253
[13] validation-logloss:0.37198 validation-auc:0.96805 validation-aucpr:0.97248
[14] validation-logloss:0.36010 validation-auc:0.96863 validation-aucpr:0.97285
[15] validation-logloss:0.34911 validation-auc:0.96892 validation-aucpr:0.97286
[16] validation-logloss:0.33871 validation-auc:0.96892 validation-aucpr:0.97287
[17] validation-logloss:0.32960 validation-auc:0.96872 validation-aucpr:0.97278
[18] validation-logloss:0.32082 validation-auc:0.96869 validation-aucpr:0.97280
[19] validation-logloss:0.31292 validation-auc:0.96874 validation-aucpr:0.97274
[20] validation-logloss:0.30565 validation-auc:0.96880 validation-aucpr:0.97292
[21] validation-logloss:0.29888 validation-auc:0.96875 validation-aucpr:0.97271
[22] validation-logloss:0.29226 validation-auc:0.96880 validation-aucpr:0.97264
[23] validation-logloss:0.28574 validation-auc:0.96914 validation-aucpr:0.97293
[24] validation-logloss:0.28019 validation-auc:0.96905 validation-aucpr:0.97287
[25] validation-logloss:0.27501 validation-auc:0.96914 validation-aucpr:0.97298
[26] validation-logloss:0.27026 validation-auc:0.96953 validation-aucpr:0.97322
[27] validation-logloss:0.26577 validation-auc:0.96935 validation-aucpr:0.97321
[28] validation-logloss:0.26091 validation-auc:0.96978 validation-aucpr:0.97346
[29] validation-logloss:0.25686 validation-auc:0.96972 validation-aucpr:0.97341
[30] validation-logloss:0.25283 validation-auc:0.96982 validation-aucpr:0.97345
[31] validation-logloss:0.24923 validation-auc:0.96979 validation-aucpr:0.97342
[32] validation-logloss:0.24552 validation-auc:0.96991 validation-aucpr:0.97368
[33] validation-logloss:0.24240 validation-auc:0.97004 validation-aucpr:0.97381
[34] validation-logloss:0.23950 validation-auc:0.97030 validation-aucpr:0.97400
[35] validation-logloss:0.23673 validation-auc:0.97043 validation-aucpr:0.97428
[36] validation-logloss:0.23386 validation-auc:0.97054 validation-aucpr:0.97441
[37] validation-logloss:0.23126 validation-auc:0.97067 validation-aucpr:0.97448
[38] validation-logloss:0.22877 validation-auc:0.97074 validation-aucpr:0.97451
[39] validation-logloss:0.22667 validation-auc:0.97080 validation-aucpr:0.97440
[40] validation-logloss:0.22431 validation-auc:0.97099 validation-aucpr:0.97453
[41] validation-logloss:0.22214 validation-auc:0.97114 validation-aucpr:0.97466
[42] validation-logloss:0.22032 validation-auc:0.97141 validation-aucpr:0.97532
[43] validation-logloss:0.21889 validation-auc:0.97127 validation-aucpr:0.97518
[44] validation-logloss:0.21724 validation-auc:0.97133 validation-aucpr:0.97527
[45] validation-logloss:0.21574 validation-auc:0.97134 validation-aucpr:0.97527
[46] validation-logloss:0.21417 validation-auc:0.97136 validation-aucpr:0.97529
[47] validation-logloss:0.21296 validation-auc:0.97129 validation-aucpr:0.97516
[48] validation-logloss:0.21188 validation-auc:0.97131 validation-aucpr:0.97518
[49] validation-logloss:0.21053 validation-auc:0.97133 validation-aucpr:0.97519
[50] validation-logloss:0.20928 validation-auc:0.97144 validation-aucpr:0.97525
[51] validation-logloss:0.20832 validation-auc:0.97139 validation-aucpr:0.97521
[52] validation-logloss:0.20720 validation-auc:0.97156 validation-aucpr:0.97529
[53] validation-logloss:0.20660 validation-auc:0.97146 validation-aucpr:0.97518
[54] validation-logloss:0.20595 validation-auc:0.97142 validation-aucpr:0.97507
[55] validation-logloss:0.20504 validation-auc:0.97148 validation-aucpr:0.97513
[56] validation-logloss:0.20428 validation-auc:0.97149 validation-aucpr:0.97512
[57] validation-logloss:0.20338 validation-auc:0.97159 validation-aucpr:0.97514
[58] validation-logloss:0.20274 validation-auc:0.97174 validation-aucpr:0.97520
[59] validation-logloss:0.20196 validation-auc:0.97182 validation-aucpr:0.97526
[60] validation-logloss:0.20155 validation-auc:0.97182 validation-aucpr:0.97522
[61] validation-logloss:0.20099 validation-auc:0.97179 validation-aucpr:0.97517
[62] validation-logloss:0.20043 validation-auc:0.97193 validation-aucpr:0.97538
[63] validation-logloss:0.19981 validation-auc:0.97199 validation-aucpr:0.97542
[64] validation-logloss:0.19906 validation-auc:0.97216 validation-aucpr:0.97554
[65] validation-logloss:0.19830 validation-auc:0.97232 validation-aucpr:0.97567
[66] validation-logloss:0.19782 validation-auc:0.97237 validation-aucpr:0.97559
[67] validation-logloss:0.19740 validation-auc:0.97234 validation-aucpr:0.97560
[68] validation-logloss:0.19704 validation-auc:0.97240 validation-aucpr:0.97559
[69] validation-logloss:0.19663 validation-auc:0.97246 validation-aucpr:0.97564
[70] validation-logloss:0.19641 validation-auc:0.97236 validation-aucpr:0.97554
[71] validation-logloss:0.19632 validation-auc:0.97228 validation-aucpr:0.97542
[72] validation-logloss:0.19611 validation-auc:0.97236 validation-aucpr:0.97545
[73] validation-logloss:0.19542 validation-auc:0.97250 validation-aucpr:0.97554
[74] validation-logloss:0.19538 validation-auc:0.97251 validation-aucpr:0.97540
[75] validation-logloss:0.19532 validation-auc:0.97239 validation-aucpr:0.97539
[76] validation-logloss:0.19542 validation-auc:0.97230 validation-aucpr:0.97529
[77] validation-logloss:0.19511 validation-auc:0.97235 validation-aucpr:0.97531
[78] validation-logloss:0.19495 validation-auc:0.97238 validation-aucpr:0.97552
[79] validation-logloss:0.19446 validation-auc:0.97254 validation-aucpr:0.97557
[80] validation-logloss:0.19431 validation-auc:0.97258 validation-aucpr:0.97573
[81] validation-logloss:0.19428 validation-auc:0.97258 validation-aucpr:0.97569
[82] validation-logloss:0.19407 validation-auc:0.97267 validation-aucpr:0.97596
[83] validation-logloss:0.19422 validation-auc:0.97259 validation-aucpr:0.97578
[84] validation-logloss:0.19458 validation-auc:0.97247 validation-aucpr:0.97572
[85] validation-logloss:0.19459 validation-auc:0.97254 validation-aucpr:0.97588
[86] validation-logloss:0.19454 validation-auc:0.97258 validation-aucpr:0.97586
[87] validation-logloss:0.19444 validation-auc:0.97260 validation-aucpr:0.97598
[88] validation-logloss:0.19437 validation-auc:0.97266 validation-aucpr:0.97610
{'best_iteration': '88', 'best_score': '0.9761023838603088'}
Trial 61, Fold 5: Log loss = 0.1943696178039605, Average precision = 0.9761065560095816, ROC-AUC = 0.9726645486130465, Elapsed Time = 3.414511799997854 seconds
Optimization Progress: 62%|######2 | 62/100 [2:59:08<28:50, 45.54s/it]
Trial 62, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 62, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[20:58:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[0] validation-logloss:0.68248 validation-auc:0.93953 validation-aucpr:0.94317
[1] validation-logloss:0.67097 validation-auc:0.95882 validation-aucpr:0.96404
[20:58:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[2] validation-logloss:0.66079 validation-auc:0.96151 validation-aucpr:0.96615
[20:58:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[3] validation-logloss:0.65115 validation-auc:0.96150 validation-aucpr:0.96613
[20:58:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[4] validation-logloss:0.64136 validation-auc:0.96256 validation-aucpr:0.96751
[20:58:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[5] validation-logloss:0.63242 validation-auc:0.96148 validation-aucpr:0.96649
[20:58:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[6] validation-logloss:0.62264 validation-auc:0.96329 validation-aucpr:0.96833
[20:58:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[7] validation-logloss:0.61280 validation-auc:0.96421 validation-aucpr:0.96947
[20:58:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[8] validation-logloss:0.60480 validation-auc:0.96407 validation-aucpr:0.96972
[20:58:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[9] validation-logloss:0.59684 validation-auc:0.96406 validation-aucpr:0.96953
[20:58:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[10] validation-logloss:0.58889 validation-auc:0.96398 validation-aucpr:0.96947
[20:58:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[11] validation-logloss:0.57973 validation-auc:0.96465 validation-aucpr:0.97011
[20:58:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[12] validation-logloss:0.57087 validation-auc:0.96537 validation-aucpr:0.97079
[20:58:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[13] validation-logloss:0.56328 validation-auc:0.96577 validation-aucpr:0.97109
[20:58:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[14] validation-logloss:0.55630 validation-auc:0.96551 validation-aucpr:0.97082
[20:58:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[15] validation-logloss:0.54950 validation-auc:0.96561 validation-aucpr:0.97080
[20:58:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[16] validation-logloss:0.54178 validation-auc:0.96585 validation-aucpr:0.97105
[20:58:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[17] validation-logloss:0.53503 validation-auc:0.96595 validation-aucpr:0.97107
[20:58:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[18] validation-logloss:0.52864 validation-auc:0.96609 validation-aucpr:0.97112
[20:58:08] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[19] validation-logloss:0.52218 validation-auc:0.96617 validation-aucpr:0.97112
[20:58:08] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[20] validation-logloss:0.51597 validation-auc:0.96612 validation-aucpr:0.97111
[20:58:08] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[21] validation-logloss:0.51002 validation-auc:0.96608 validation-aucpr:0.97106
[20:58:08] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[22] validation-logloss:0.50317 validation-auc:0.96633 validation-aucpr:0.97135
[20:58:08] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[23] validation-logloss:0.49738 validation-auc:0.96652 validation-aucpr:0.97145
[20:58:08] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[24] validation-logloss:0.49094 validation-auc:0.96674 validation-aucpr:0.97167
[20:58:08] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[25] validation-logloss:0.48535 validation-auc:0.96687 validation-aucpr:0.97172
[20:58:08] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[26] validation-logloss:0.47946 validation-auc:0.96701 validation-aucpr:0.97186
[20:58:09] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[27] validation-logloss:0.47364 validation-auc:0.96714 validation-aucpr:0.97202
[20:58:09] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[28] validation-logloss:0.46876 validation-auc:0.96702 validation-aucpr:0.97195
[20:58:09] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[29] validation-logloss:0.46309 validation-auc:0.96711 validation-aucpr:0.97208
[20:58:09] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[30] validation-logloss:0.45733 validation-auc:0.96723 validation-aucpr:0.97222
[20:58:09] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[31] validation-logloss:0.45260 validation-auc:0.96733 validation-aucpr:0.97227
[20:58:09] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[32] validation-logloss:0.44813 validation-auc:0.96724 validation-aucpr:0.97221
[20:58:09] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[33] validation-logloss:0.44359 validation-auc:0.96726 validation-aucpr:0.97221
[20:58:09] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[34] validation-logloss:0.43844 validation-auc:0.96733 validation-aucpr:0.97232
[20:58:10] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[35] validation-logloss:0.43332 validation-auc:0.96747 validation-aucpr:0.97247
[20:58:10] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[36] validation-logloss:0.42844 validation-auc:0.96765 validation-aucpr:0.97268
[20:58:10] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[37] validation-logloss:0.42352 validation-auc:0.96782 validation-aucpr:0.97281
[20:58:10] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[38] validation-logloss:0.41962 validation-auc:0.96784 validation-aucpr:0.97281
[20:58:10] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[39] validation-logloss:0.41585 validation-auc:0.96773 validation-aucpr:0.97271
[20:58:10] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[40] validation-logloss:0.41224 validation-auc:0.96769 validation-aucpr:0.97268
[20:58:11] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[41] validation-logloss:0.40781 validation-auc:0.96787 validation-aucpr:0.97281
{'best_iteration': '37', 'best_score': '0.972812172962304'}
Trial 62, Fold 1: Log loss = 0.40781070738928166, Average precision = 0.9728138503826426, ROC-AUC = 0.9678679417415867, Elapsed Time = 7.990083500000765 seconds
Trial 62, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 62, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0]	validation-logloss:0.68270	validation-auc:0.94135	validation-aucpr:0.94169
[1]	validation-logloss:0.67085	validation-auc:0.96259	validation-aucpr:0.96625
[2]	validation-logloss:0.66083	validation-auc:0.96391	validation-aucpr:0.96757
[3]	validation-logloss:0.65096	validation-auc:0.96396	validation-aucpr:0.96731
[4]	validation-logloss:0.64171	validation-auc:0.96327	validation-aucpr:0.96660
[5]	validation-logloss:0.63239	validation-auc:0.96365	validation-aucpr:0.96694
[6]	validation-logloss:0.62233	validation-auc:0.96613	validation-aucpr:0.96964
[7]	validation-logloss:0.61335	validation-auc:0.96625	validation-aucpr:0.96969
[8]	validation-logloss:0.60353	validation-auc:0.96734	validation-aucpr:0.97093
[9]	validation-logloss:0.59392	validation-auc:0.96815	validation-aucpr:0.97172
[10]	validation-logloss:0.58475	validation-auc:0.96896	validation-aucpr:0.97239
[11]	validation-logloss:0.57716	validation-auc:0.96863	validation-aucpr:0.97207
[12]	validation-logloss:0.56929	validation-auc:0.96871	validation-aucpr:0.97206
[13]	validation-logloss:0.56201	validation-auc:0.96871	validation-aucpr:0.97201
[14]	validation-logloss:0.55467	validation-auc:0.96861	validation-aucpr:0.97188
[15]	validation-logloss:0.54778	validation-auc:0.96849	validation-aucpr:0.97208
[16]	validation-logloss:0.54077	validation-auc:0.96853	validation-aucpr:0.97213
[17]	validation-logloss:0.53408	validation-auc:0.96833	validation-aucpr:0.97187
[18]	validation-logloss:0.52759	validation-auc:0.96835	validation-aucpr:0.97185
[19]	validation-logloss:0.52120	validation-auc:0.96850	validation-aucpr:0.97191
[20]	validation-logloss:0.51501	validation-auc:0.96850	validation-aucpr:0.97183
[21]	validation-logloss:0.50793	validation-auc:0.96881	validation-aucpr:0.97218
[22]	validation-logloss:0.50102	validation-auc:0.96931	validation-aucpr:0.97262
[23]	validation-logloss:0.49549	validation-auc:0.96919	validation-aucpr:0.97246
[24]	validation-logloss:0.48889	validation-auc:0.96945	validation-aucpr:0.97314
[25]	validation-logloss:0.48321	validation-auc:0.96956	validation-aucpr:0.97321
[26]	validation-logloss:0.47814	validation-auc:0.96943	validation-aucpr:0.97308
[27]	validation-logloss:0.47221	validation-auc:0.96960	validation-aucpr:0.97321
[28]	validation-logloss:0.46615	validation-auc:0.96989	validation-aucpr:0.97346
[29]	validation-logloss:0.46125	validation-auc:0.96987	validation-aucpr:0.97340
[30]	validation-logloss:0.45654	validation-auc:0.96982	validation-aucpr:0.97337
[31]	validation-logloss:0.45094	validation-auc:0.96992	validation-aucpr:0.97351
[32]	validation-logloss:0.44649	validation-auc:0.96983	validation-aucpr:0.97345
[33]	validation-logloss:0.44216	validation-auc:0.96969	validation-aucpr:0.97333
[34]	validation-logloss:0.43782	validation-auc:0.96959	validation-aucpr:0.97322
[35]	validation-logloss:0.43360	validation-auc:0.96959	validation-aucpr:0.97323
[36]	validation-logloss:0.42858	validation-auc:0.96986	validation-aucpr:0.97343
[37]	validation-logloss:0.42388	validation-auc:0.96998	validation-aucpr:0.97354
[38]	validation-logloss:0.41982	validation-auc:0.96993	validation-aucpr:0.97348
[39]	validation-logloss:0.41507	validation-auc:0.97026	validation-aucpr:0.97374
[40]	validation-logloss:0.41032	validation-auc:0.97051	validation-aucpr:0.97393
[41]	validation-logloss:0.40678	validation-auc:0.97042	validation-aucpr:0.97379
{'best_iteration': '40', 'best_score': '0.9739341330303594'}
Trial 62, Fold 2: Log loss = 0.40677707386682616, Average precision = 0.9737911873049275, ROC-AUC = 0.9704198041785878, Elapsed Time = 7.774721900001168 seconds
Trial 62, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 62, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0]	validation-logloss:0.68239	validation-auc:0.94132	validation-aucpr:0.94100
[1]	validation-logloss:0.67076	validation-auc:0.96059	validation-aucpr:0.96419
[2]	validation-logloss:0.66041	validation-auc:0.96454	validation-aucpr:0.96768
[3]	validation-logloss:0.65058	validation-auc:0.96531	validation-aucpr:0.96720
[4]	validation-logloss:0.64083	validation-auc:0.96618	validation-aucpr:0.96969
[5]	validation-logloss:0.63020	validation-auc:0.96822	validation-aucpr:0.97200
[6]	validation-logloss:0.62098	validation-auc:0.96790	validation-aucpr:0.97178
[7]	validation-logloss:0.61215	validation-auc:0.96784	validation-aucpr:0.97170
[8]	validation-logloss:0.60426	validation-auc:0.96772	validation-aucpr:0.97171
[9]	validation-logloss:0.59491	validation-auc:0.96818	validation-aucpr:0.97235
[10]	validation-logloss:0.58589	validation-auc:0.96839	validation-aucpr:0.97262
[11]	validation-logloss:0.57680	validation-auc:0.96896	validation-aucpr:0.97313
[12]	validation-logloss:0.56939	validation-auc:0.96891	validation-aucpr:0.97309
[13]	validation-logloss:0.56173	validation-auc:0.96904	validation-aucpr:0.97321
[14]	validation-logloss:0.55469	validation-auc:0.96877	validation-aucpr:0.97299
[15]	validation-logloss:0.54760	validation-auc:0.96892	validation-aucpr:0.97306
[16]	validation-logloss:0.53952	validation-auc:0.96927	validation-aucpr:0.97346
[17]	validation-logloss:0.53311	validation-auc:0.96917	validation-aucpr:0.97334
[18]	validation-logloss:0.52664	validation-auc:0.96916	validation-aucpr:0.97330
[19]	validation-logloss:0.51923	validation-auc:0.96941	validation-aucpr:0.97355
[20]	validation-logloss:0.51289	validation-auc:0.96945	validation-aucpr:0.97355
[21]	validation-logloss:0.50691	validation-auc:0.96940	validation-aucpr:0.97348
[22]	validation-logloss:0.50005	validation-auc:0.96965	validation-aucpr:0.97375
[23]	validation-logloss:0.49444	validation-auc:0.96958	validation-aucpr:0.97369
[24]	validation-logloss:0.48802	validation-auc:0.96963	validation-aucpr:0.97388
[25]	validation-logloss:0.48160	validation-auc:0.96981	validation-aucpr:0.97404
[26]	validation-logloss:0.47546	validation-auc:0.96993	validation-aucpr:0.97418
[27]	validation-logloss:0.47016	validation-auc:0.97000	validation-aucpr:0.97423
[28]	validation-logloss:0.46418	validation-auc:0.97016	validation-aucpr:0.97440
[29]	validation-logloss:0.45944	validation-auc:0.97003	validation-aucpr:0.97431
[30]	validation-logloss:0.45470	validation-auc:0.97000	validation-aucpr:0.97426
[31]	validation-logloss:0.44993	validation-auc:0.97003	validation-aucpr:0.97424
[32]	validation-logloss:0.44527	validation-auc:0.96999	validation-aucpr:0.97420
[33]	validation-logloss:0.44061	validation-auc:0.97005	validation-aucpr:0.97423
[34]	validation-logloss:0.43540	validation-auc:0.97031	validation-aucpr:0.97444
[35]	validation-logloss:0.43107	validation-auc:0.97027	validation-aucpr:0.97440
[36]	validation-logloss:0.42620	validation-auc:0.97027	validation-aucpr:0.97444
[37]	validation-logloss:0.42224	validation-auc:0.97028	validation-aucpr:0.97444
[38]	validation-logloss:0.41740	validation-auc:0.97041	validation-aucpr:0.97455
[39]	validation-logloss:0.41257	validation-auc:0.97057	validation-aucpr:0.97472
[40]	validation-logloss:0.40791	validation-auc:0.97076	validation-aucpr:0.97489
[41]	validation-logloss:0.40408	validation-auc:0.97074	validation-aucpr:0.97486
{'best_iteration': '40', 'best_score': '0.9748941392910312'}
Trial 62, Fold 3: Log loss = 0.40408457793104074, Average precision = 0.974869227836466, ROC-AUC = 0.9707354980689774, Elapsed Time = 7.972263800002111 seconds
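The per-fold summary lines above report three scores on the held-out fold. A minimal sketch of how they could be computed with the scikit-learn functions imported in this notebook (`y_val` and `proba` are synthetic stand-ins for a fold's labels and the booster's class-1 probabilities, not the notebook's actual data):

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

# synthetic stand-ins: binary labels and well-separated probability scores
rng = np.random.default_rng(0)
y_val = rng.integers(0, 2, size=1000)
proba = np.clip(y_val * 0.7 + rng.normal(0.15, 0.2, size=1000), 0.001, 0.999)

lloss = log_loss(y_val, proba)                 # validation-logloss
ap = average_precision_score(y_val, proba)     # validation-aucpr analogue
auc = roc_auc_score(y_val, proba)              # validation-auc
print(f"Log loss = {lloss}, Average precision = {ap}, ROC-AUC = {auc}")
```

Note that `average_precision_score` is a step-wise estimate of the area under the precision-recall curve, so it can differ slightly from XGBoost's interpolated `aucpr`, which matches the small gap between the summary line and the `best_score` dict above.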
Trial 62, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 62, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.68261 validation-auc:0.93563 validation-aucpr:0.93838
[1] validation-logloss:0.67076 validation-auc:0.96144 validation-aucpr:0.96397
[2] validation-logloss:0.66066 validation-auc:0.96174 validation-aucpr:0.96735
[3] validation-logloss:0.65087 validation-auc:0.96233 validation-aucpr:0.96758
[4] validation-logloss:0.64137 validation-auc:0.96294 validation-aucpr:0.96809
[5] validation-logloss:0.63251 validation-auc:0.96198 validation-aucpr:0.96723
[6] validation-logloss:0.62268 validation-auc:0.96329 validation-aucpr:0.96891
[7] validation-logloss:0.61389 validation-auc:0.96352 validation-aucpr:0.96902
[8] validation-logloss:0.60416 validation-auc:0.96509 validation-aucpr:0.97058
[9] validation-logloss:0.59576 validation-auc:0.96532 validation-aucpr:0.97073
[10] validation-logloss:0.58771 validation-auc:0.96526 validation-aucpr:0.97066
[11] validation-logloss:0.58023 validation-auc:0.96482 validation-aucpr:0.97030
[12] validation-logloss:0.57225 validation-auc:0.96510 validation-aucpr:0.97040
[13] validation-logloss:0.56483 validation-auc:0.96499 validation-aucpr:0.97027
[14] validation-logloss:0.55772 validation-auc:0.96503 validation-aucpr:0.97025
[15] validation-logloss:0.55052 validation-auc:0.96504 validation-aucpr:0.97026
[16] validation-logloss:0.54364 validation-auc:0.96500 validation-aucpr:0.97019
[17] validation-logloss:0.53687 validation-auc:0.96475 validation-aucpr:0.97001
[18] validation-logloss:0.52930 validation-auc:0.96538 validation-aucpr:0.97067
[19] validation-logloss:0.52310 validation-auc:0.96530 validation-aucpr:0.97059
[20] validation-logloss:0.51587 validation-auc:0.96566 validation-aucpr:0.97100
[21] validation-logloss:0.50982 validation-auc:0.96559 validation-aucpr:0.97094
[22] validation-logloss:0.50433 validation-auc:0.96556 validation-aucpr:0.97089
[23] validation-logloss:0.49876 validation-auc:0.96537 validation-aucpr:0.97068
[24] validation-logloss:0.49223 validation-auc:0.96571 validation-aucpr:0.97104
[25] validation-logloss:0.48563 validation-auc:0.96609 validation-aucpr:0.97143
[26] validation-logloss:0.48064 validation-auc:0.96604 validation-aucpr:0.97136
[27] validation-logloss:0.47573 validation-auc:0.96599 validation-aucpr:0.97131
[28] validation-logloss:0.46976 validation-auc:0.96613 validation-aucpr:0.97145
[29] validation-logloss:0.46513 validation-auc:0.96607 validation-aucpr:0.97140
[30] validation-logloss:0.45941 validation-auc:0.96629 validation-aucpr:0.97161
[31] validation-logloss:0.45462 validation-auc:0.96634 validation-aucpr:0.97163
[32] validation-logloss:0.45013 validation-auc:0.96629 validation-aucpr:0.97158
[33] validation-logloss:0.44565 validation-auc:0.96622 validation-aucpr:0.97151
[34] validation-logloss:0.44137 validation-auc:0.96624 validation-aucpr:0.97151
[35] validation-logloss:0.43612 validation-auc:0.96653 validation-aucpr:0.97179
[36] validation-logloss:0.43112 validation-auc:0.96671 validation-aucpr:0.97196
[37] validation-logloss:0.42700 validation-auc:0.96666 validation-aucpr:0.97189
[38] validation-logloss:0.42302 validation-auc:0.96668 validation-aucpr:0.97189
[39] validation-logloss:0.41922 validation-auc:0.96656 validation-aucpr:0.97182
[40] validation-logloss:0.41542 validation-auc:0.96653 validation-aucpr:0.97179
[41] validation-logloss:0.41174 validation-auc:0.96649 validation-aucpr:0.97177
{'best_iteration': '36', 'best_score': '0.9719608080914132'}
Trial 62, Fold 4: Log loss = 0.4117373799923235, Average precision = 0.9717651888435617, ROC-AUC = 0.9664940503015086, Elapsed Time = 7.8552474999996775 seconds
Trial 62, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 62, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.68271 validation-auc:0.93592 validation-aucpr:0.93858
[1] validation-logloss:0.67102 validation-auc:0.95941 validation-aucpr:0.96306
[2] validation-logloss:0.65949 validation-auc:0.96464 validation-aucpr:0.96849
[3] validation-logloss:0.64987 validation-auc:0.96459 validation-aucpr:0.96829
[4] validation-logloss:0.64025 validation-auc:0.96497 validation-aucpr:0.96857
[5] validation-logloss:0.63030 validation-auc:0.96546 validation-aucpr:0.96957
[6] validation-logloss:0.62159 validation-auc:0.96472 validation-aucpr:0.96857
[7] validation-logloss:0.61179 validation-auc:0.96538 validation-aucpr:0.96880
[8] validation-logloss:0.60329 validation-auc:0.96554 validation-aucpr:0.96941
[9] validation-logloss:0.59502 validation-auc:0.96564 validation-aucpr:0.96938
[10] validation-logloss:0.58633 validation-auc:0.96551 validation-aucpr:0.96936
[11] validation-logloss:0.57923 validation-auc:0.96502 validation-aucpr:0.96880
[12] validation-logloss:0.57144 validation-auc:0.96503 validation-aucpr:0.96884
[13] validation-logloss:0.56416 validation-auc:0.96516 validation-aucpr:0.96963
[14] validation-logloss:0.55714 validation-auc:0.96498 validation-aucpr:0.96942
[15] validation-logloss:0.54911 validation-auc:0.96528 validation-aucpr:0.96974
[16] validation-logloss:0.54262 validation-auc:0.96527 validation-aucpr:0.96988
[17] validation-logloss:0.53480 validation-auc:0.96582 validation-aucpr:0.97038
[18] validation-logloss:0.52843 validation-auc:0.96585 validation-aucpr:0.97044
[19] validation-logloss:0.52225 validation-auc:0.96588 validation-aucpr:0.97043
[20] validation-logloss:0.51624 validation-auc:0.96565 validation-aucpr:0.97026
[21] validation-logloss:0.51029 validation-auc:0.96561 validation-aucpr:0.97018
[22] validation-logloss:0.50349 validation-auc:0.96593 validation-aucpr:0.97045
[23] validation-logloss:0.49768 validation-auc:0.96596 validation-aucpr:0.97046
[24] validation-logloss:0.49146 validation-auc:0.96606 validation-aucpr:0.97057
[25] validation-logloss:0.48512 validation-auc:0.96637 validation-aucpr:0.97084
[26] validation-logloss:0.47986 validation-auc:0.96632 validation-aucpr:0.97082
[27] validation-logloss:0.47374 validation-auc:0.96656 validation-aucpr:0.97107
[28] validation-logloss:0.46888 validation-auc:0.96650 validation-aucpr:0.97099
[29] validation-logloss:0.46289 validation-auc:0.96690 validation-aucpr:0.97130
[30] validation-logloss:0.45827 validation-auc:0.96671 validation-aucpr:0.97117
[31] validation-logloss:0.45364 validation-auc:0.96674 validation-aucpr:0.97123
[32] validation-logloss:0.44826 validation-auc:0.96683 validation-aucpr:0.97135
[33] validation-logloss:0.44305 validation-auc:0.96696 validation-aucpr:0.97148
[34] validation-logloss:0.43877 validation-auc:0.96691 validation-aucpr:0.97141
[35] validation-logloss:0.43360 validation-auc:0.96731 validation-aucpr:0.97175
[36] validation-logloss:0.42864 validation-auc:0.96752 validation-aucpr:0.97193
[37] validation-logloss:0.42471 validation-auc:0.96738 validation-aucpr:0.97181
[38] validation-logloss:0.42069 validation-auc:0.96752 validation-aucpr:0.97187
[39] validation-logloss:0.41602 validation-auc:0.96763 validation-aucpr:0.97201
[40] validation-logloss:0.41150 validation-auc:0.96780 validation-aucpr:0.97217
[41] validation-logloss:0.40700 validation-auc:0.96799 validation-aucpr:0.97233
{'best_iteration': '41', 'best_score': '0.9723345784298949'}
Trial 62, Fold 5: Log loss = 0.40699792927955714, Average precision = 0.9723394565659912, ROC-AUC = 0.9679921966445572, Elapsed Time = 8.108416099999886 seconds
Optimization Progress: 63%|######3 | 63/100 [2:59:56<28:40, 46.50s/it]
Trial 63, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 63, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.67832 validation-auc:0.92353 validation-aucpr:0.92326
[1] validation-logloss:0.66323 validation-auc:0.94121 validation-aucpr:0.94346
[2] validation-logloss:0.65079 validation-auc:0.93858 validation-aucpr:0.94112
[3] validation-logloss:0.63928 validation-auc:0.93992 validation-aucpr:0.94359
[4] validation-logloss:0.62799 validation-auc:0.93817 validation-aucpr:0.94204
[5] validation-logloss:0.61553 validation-auc:0.94174 validation-aucpr:0.94691
[6] validation-logloss:0.60391 validation-auc:0.94419 validation-aucpr:0.95006
[7] validation-logloss:0.59353 validation-auc:0.94477 validation-aucpr:0.95064
[8] validation-logloss:0.58297 validation-auc:0.94535 validation-aucpr:0.95151
[9] validation-logloss:0.57233 validation-auc:0.94666 validation-aucpr:0.95298
[10] validation-logloss:0.56316 validation-auc:0.94682 validation-aucpr:0.95318
[11] validation-logloss:0.55484 validation-auc:0.94779 validation-aucpr:0.95413
[12] validation-logloss:0.54587 validation-auc:0.94813 validation-aucpr:0.95433
[13] validation-logloss:0.53970 validation-auc:0.94854 validation-aucpr:0.95472
[14] validation-logloss:0.53207 validation-auc:0.94887 validation-aucpr:0.95516
[15] validation-logloss:0.52443 validation-auc:0.94913 validation-aucpr:0.95527
[16] validation-logloss:0.51673 validation-auc:0.94898 validation-aucpr:0.95504
[17] validation-logloss:0.50996 validation-auc:0.94873 validation-aucpr:0.95471
[18] validation-logloss:0.50331 validation-auc:0.94898 validation-aucpr:0.95487
[19] validation-logloss:0.49572 validation-auc:0.94982 validation-aucpr:0.95593
[20] validation-logloss:0.48951 validation-auc:0.95006 validation-aucpr:0.95639
[21] validation-logloss:0.48352 validation-auc:0.94987 validation-aucpr:0.95621
[22] validation-logloss:0.47802 validation-auc:0.94975 validation-aucpr:0.95600
[23] validation-logloss:0.47252 validation-auc:0.94986 validation-aucpr:0.95602
[24] validation-logloss:0.46703 validation-auc:0.94978 validation-aucpr:0.95587
[25] validation-logloss:0.46219 validation-auc:0.94964 validation-aucpr:0.95571
[26] validation-logloss:0.45673 validation-auc:0.95018 validation-aucpr:0.95624
{'best_iteration': '20', 'best_score': '0.9563913163849957'}
Trial 63, Fold 1: Log loss = 0.45672835759562763, Average precision = 0.9562278721538539, ROC-AUC = 0.9501815198376852, Elapsed Time = 0.45719150000149966 seconds
Trial 63, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 63, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.67858 validation-auc:0.91963 validation-aucpr:0.91597
[1] validation-logloss:0.66428 validation-auc:0.93085 validation-aucpr:0.93238
[2] validation-logloss:0.65073 validation-auc:0.93382 validation-aucpr:0.93639
[3] validation-logloss:0.63874 validation-auc:0.93356 validation-aucpr:0.93664
[4] validation-logloss:0.62726 validation-auc:0.93563 validation-aucpr:0.93603
[5] validation-logloss:0.61539 validation-auc:0.93695 validation-aucpr:0.93969
[6] validation-logloss:0.60487 validation-auc:0.93791 validation-aucpr:0.94148
[7] validation-logloss:0.59497 validation-auc:0.93961 validation-aucpr:0.94268
[8] validation-logloss:0.58472 validation-auc:0.94065 validation-aucpr:0.94421
[9] validation-logloss:0.57620 validation-auc:0.94037 validation-aucpr:0.94378
[10] validation-logloss:0.56508 validation-auc:0.95059 validation-aucpr:0.95576
[11] validation-logloss:0.55497 validation-auc:0.95074 validation-aucpr:0.95535
[12] validation-logloss:0.54663 validation-auc:0.95085 validation-aucpr:0.95556
[13] validation-logloss:0.53993 validation-auc:0.95195 validation-aucpr:0.95678
[14] validation-logloss:0.53338 validation-auc:0.95228 validation-aucpr:0.95693
[15] validation-logloss:0.52579 validation-auc:0.95219 validation-aucpr:0.95670
[16] validation-logloss:0.51818 validation-auc:0.95189 validation-aucpr:0.95659
[17] validation-logloss:0.51148 validation-auc:0.95148 validation-aucpr:0.95608
[18] validation-logloss:0.50464 validation-auc:0.95173 validation-aucpr:0.95635
[19] validation-logloss:0.49716 validation-auc:0.95292 validation-aucpr:0.95752
[20] validation-logloss:0.48818 validation-auc:0.95498 validation-aucpr:0.96019
[21] validation-logloss:0.48203 validation-auc:0.95477 validation-aucpr:0.95997
[22] validation-logloss:0.47622 validation-auc:0.95474 validation-aucpr:0.95981
[23] validation-logloss:0.47082 validation-auc:0.95453 validation-aucpr:0.95949
[24] validation-logloss:0.46308 validation-auc:0.95553 validation-aucpr:0.96049
[25] validation-logloss:0.45774 validation-auc:0.95569 validation-aucpr:0.96049
[26] validation-logloss:0.45242 validation-auc:0.95583 validation-aucpr:0.96054
{'best_iteration': '26', 'best_score': '0.9605407520010589'}
Trial 63, Fold 2: Log loss = 0.4524220961549884, Average precision = 0.9604365271065654, ROC-AUC = 0.9558308479575204, Elapsed Time = 0.4858105000021169 seconds
Trial 63, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 63, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.67808 validation-auc:0.92802 validation-aucpr:0.91740
[1] validation-logloss:0.66489 validation-auc:0.93125 validation-aucpr:0.92869
[2] validation-logloss:0.65182 validation-auc:0.93639 validation-aucpr:0.93817
[3] validation-logloss:0.63689 validation-auc:0.94992 validation-aucpr:0.95469
[4] validation-logloss:0.62565 validation-auc:0.95382 validation-aucpr:0.95862
[5] validation-logloss:0.61371 validation-auc:0.95547 validation-aucpr:0.96133
[6] validation-logloss:0.60241 validation-auc:0.95594 validation-aucpr:0.96229
[7] validation-logloss:0.59327 validation-auc:0.95601 validation-aucpr:0.96199
[8] validation-logloss:0.58300 validation-auc:0.95635 validation-aucpr:0.96216
[9] validation-logloss:0.57206 validation-auc:0.95709 validation-aucpr:0.96290
[10] validation-logloss:0.56253 validation-auc:0.95677 validation-aucpr:0.96287
[11] validation-logloss:0.55348 validation-auc:0.95712 validation-aucpr:0.96283
[12] validation-logloss:0.54514 validation-auc:0.95743 validation-aucpr:0.96304
[13] validation-logloss:0.53661 validation-auc:0.95839 validation-aucpr:0.96421
[14] validation-logloss:0.52893 validation-auc:0.95898 validation-aucpr:0.96462
[15] validation-logloss:0.52120 validation-auc:0.95897 validation-aucpr:0.96466
[16] validation-logloss:0.51400 validation-auc:0.95858 validation-aucpr:0.96437
[17] validation-logloss:0.50722 validation-auc:0.95835 validation-aucpr:0.96412
[18] validation-logloss:0.50061 validation-auc:0.95839 validation-aucpr:0.96416
[19] validation-logloss:0.49371 validation-auc:0.95826 validation-aucpr:0.96400
[20] validation-logloss:0.48723 validation-auc:0.95826 validation-aucpr:0.96399
[21] validation-logloss:0.48111 validation-auc:0.95806 validation-aucpr:0.96378
[22] validation-logloss:0.47444 validation-auc:0.95856 validation-aucpr:0.96416
[23] validation-logloss:0.46832 validation-auc:0.95830 validation-aucpr:0.96392
[24] validation-logloss:0.46284 validation-auc:0.95812 validation-aucpr:0.96368
[25] validation-logloss:0.45699 validation-auc:0.95841 validation-aucpr:0.96389
[26] validation-logloss:0.45128 validation-auc:0.95816 validation-aucpr:0.96361
{'best_iteration': '15', 'best_score': '0.9646621250956404'}
Trial 63, Fold 3: Log loss = 0.4512801008490302, Average precision = 0.9636091254630796, ROC-AUC = 0.9581633856940606, Elapsed Time = 0.5337794000006397 seconds
Trial 63, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 63, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[… per-iteration logs (iterations 0–26) omitted: validation-logloss fell from 0.67815 to 0.44921; best validation-aucpr 0.96497 at iteration 15 …]
{'best_iteration': '15', 'best_score': '0.9649736225770621'}
Trial 63, Fold 4: Log loss = 0.4492070739008998, Average precision = 0.9645035016118194, ROC-AUC = 0.9577632358586617, Elapsed Time = 0.5850017999982811 seconds
Trial 63, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 63, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[… per-iteration logs (iterations 0–26) omitted: validation-logloss fell from 0.67909 to 0.45551; best validation-aucpr 0.95948 at iteration 26 …]
{'best_iteration': '26', 'best_score': '0.9594836374150355'}
Trial 63, Fold 5: Log loss = 0.4555060647716498, Average precision = 0.9594378534684262, ROC-AUC = 0.9534234806681159, Elapsed Time = 0.5480397999999695 seconds
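Each "Trial N, Fold K" summary line reports three held-out metrics per fold. A minimal sketch of how those numbers could be computed with scikit-learn (the arrays `y_val` and `p_val` are hypothetical stand-ins for a fold's validation labels and predicted probabilities):

```python
import numpy as np
from sklearn.metrics import average_precision_score, log_loss, roc_auc_score

rng = np.random.default_rng(0)
y_val = rng.integers(0, 2, size=1000)  # hypothetical fold labels (0/1)
# Hypothetical predicted probabilities, loosely correlated with the labels
p_val = np.clip(y_val * 0.7 + rng.normal(0.15, 0.2, size=1000), 0.01, 0.99)

lloss = log_loss(y_val, p_val)              # penalizes confident mistakes
ap = average_precision_score(y_val, p_val)  # area under the PR curve (aucpr)
auc = roc_auc_score(y_val, p_val)           # ranking quality across thresholds
print(f"Log loss = {lloss}, Average precision = {ap}, ROC-AUC = {auc}")
```

Because Optuna is minimizing institutional losses here, log loss (a calibration-sensitive metric) is reported alongside the ranking metrics rather than accuracy alone.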
Optimization Progress: 64%|######4 | 64/100 [3:00:09<21:43, 36.20s/it]
Trial 64, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 64, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[… per-iteration logs (iterations 0–88) omitted: validation-logloss fell from 0.65272 to 0.20275; best validation-aucpr 0.97540 at iteration 81 …]
{'best_iteration': '81', 'best_score': '0.9754009204043076'}
Trial 64, Fold 1: Log loss = 0.20275462881741438, Average precision = 0.9753897889959595, ROC-AUC = 0.9709829643490437, Elapsed Time = 16.874604999997246 seconds
Trial 64, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 64, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[… per-iteration logs (iterations 0–88) omitted: validation-logloss fell from 0.65288 to 0.19154; best validation-aucpr 0.97509 at iteration 87 …]
{'best_iteration': '87', 'best_score': '0.9750940179143601'}
Trial 64, Fold 2: Log loss = 0.19154041852961148, Average precision = 0.9750110784779089, ROC-AUC = 0.9720083758707668, Elapsed Time = 16.697156499998528 seconds
Trial 64, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 64, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[… per-iteration logs (iterations 0–87) omitted: validation-logloss fell from 0.65331 to 0.19400; best validation-aucpr 0.97566 at iteration 38 …]
{'best_iteration': '38', 'best_score': '0.9756599265861333'}
Trial 64, Fold 3: Log loss = 0.19373711385512057, Average precision = 0.9743362243542404, ROC-AUC = 0.972316567908428, Elapsed Time = 17.79897510000228 seconds
Trial 64, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 64, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[… per-iteration logs (iterations 0–82) omitted: validation-logloss fell from 0.65334 to 0.19755 …]
[83] validation-logloss:0.19776 validation-auc:0.97067 validation-aucpr:0.97519
[84] validation-logloss:0.19800 validation-auc:0.97066 validation-aucpr:0.97516
[85] validation-logloss:0.19820 validation-auc:0.97063 validation-aucpr:0.97514
[86] validation-logloss:0.19806 validation-auc:0.97069 validation-aucpr:0.97516
[87] validation-logloss:0.19797 validation-auc:0.97080 validation-aucpr:0.97524
[88] validation-logloss:0.19805 validation-auc:0.97080 validation-aucpr:0.97523
{'best_iteration': '82', 'best_score': '0.975269879696509'}
Trial 64, Fold 4: Log loss = 0.19805, Average precision = 0.97524, ROC-AUC = 0.97080, Elapsed time = 18.34 s
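Each fold summary reports the same three scores computed on the held-out fold's predicted probabilities. A minimal sketch with the sklearn functions imported at the top of this notebook (the arrays here are hypothetical stand-ins, not fold data):

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

# Hypothetical validation labels and predicted probabilities for class 1.
y_val = np.array([0, 1, 1, 0, 1, 0, 1, 0])
p_val = np.array([0.1, 0.8, 0.7, 0.3, 0.9, 0.2, 0.6, 0.4])

lloss = log_loss(y_val, p_val)              # binary cross-entropy ("Log loss")
ap = average_precision_score(y_val, p_val)  # area under PR curve ("aucpr")
auc = roc_auc_score(y_val, p_val)           # area under ROC curve ("auc")
```

Note that lower is better only for log loss; average precision and ROC-AUC are maximized.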
Trial 64, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0345
Trial 64, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0242
[0] validation-logloss:0.65294 validation-auc:0.90143 validation-aucpr:0.85366
...
[88] validation-logloss:0.20347 validation-auc:0.97041 validation-aucpr:0.97431
{'best_iteration': '82', 'best_score': '0.9746318812315987'}
Trial 64, Fold 5: Log loss = 0.20347, Average precision = 0.97431, ROC-AUC = 0.97041, Elapsed time = 17.03 s
Optimization Progress: 65% (65/100 trials) [elapsed 3:01:43, remaining ~31:19, 53.69 s/trial]
Trial 65, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0398
Trial 65, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0035
[0] validation-logloss:0.63304 validation-auc:0.95532 validation-aucpr:0.95790
...
[96] validation-logloss:0.19096 validation-auc:0.97291 validation-aucpr:0.97684
{'best_iteration': '84', 'best_score': '0.9772210952959658'}
Trial 65, Fold 1: Log loss = 0.19096, Average precision = 0.97684, ROC-AUC = 0.97291, Elapsed time = 2.46 s
Trial 65, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0236
Trial 65, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0689
[0] validation-logloss:0.63236 validation-auc:0.95699 validation-aucpr:0.95729
...
[96] validation-logloss:0.18067 validation-auc:0.97416 validation-aucpr:0.97698
{'best_iteration': '90', 'best_score': '0.9769978487826361'}
Trial 65, Fold 2: Log loss = 0.18067, Average precision = 0.97698, ROC-AUC = 0.97416, Elapsed time = 2.70 s
Trial 65, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.0346
Trial 65, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235
[0] validation-logloss:0.63211 validation-auc:0.96084 validation-aucpr:0.96028
[1] validation-logloss:0.58061 validation-auc:0.96425 validation-aucpr:0.96580
[2] validation-logloss:0.53715 validation-auc:0.96485 validation-aucpr:0.96824
[3] validation-logloss:0.49834 validation-auc:0.96757 validation-aucpr:0.97166
[4] validation-logloss:0.46540 validation-auc:0.96855 validation-aucpr:0.97223
[5] validation-logloss:0.44027 validation-auc:0.96884 validation-aucpr:0.97263
[6] validation-logloss:0.41484 validation-auc:0.96813 validation-aucpr:0.97219
[7] validation-logloss:0.39235 validation-auc:0.96884 validation-aucpr:0.97298
[8] validation-logloss:0.37443 validation-auc:0.96879 validation-aucpr:0.97298
[9] validation-logloss:0.35600 validation-auc:0.96889 validation-aucpr:0.97311
[10] validation-logloss:0.33951 validation-auc:0.96917 validation-aucpr:0.97336
[11] validation-logloss:0.32506 validation-auc:0.96931 validation-aucpr:0.97343
[12] validation-logloss:0.31221 validation-auc:0.96947 validation-aucpr:0.97350
[13] validation-logloss:0.30117 validation-auc:0.96930 validation-aucpr:0.97327
[14] validation-logloss:0.29119 validation-auc:0.96934 validation-aucpr:0.97330
[15] validation-logloss:0.28308 validation-auc:0.96966 validation-aucpr:0.97344
[16] validation-logloss:0.27460 validation-auc:0.96978 validation-aucpr:0.97354
[17] validation-logloss:0.26643 validation-auc:0.97007 validation-aucpr:0.97385
[18] validation-logloss:0.25933 validation-auc:0.97013 validation-aucpr:0.97389
[19] validation-logloss:0.25286 validation-auc:0.97020 validation-aucpr:0.97391
[20] validation-logloss:0.24692 validation-auc:0.97047 validation-aucpr:0.97409
[21] validation-logloss:0.24162 validation-auc:0.97063 validation-aucpr:0.97421
[22] validation-logloss:0.23680 validation-auc:0.97057 validation-aucpr:0.97411
[23] validation-logloss:0.23231 validation-auc:0.97077 validation-aucpr:0.97420
[24] validation-logloss:0.22834 validation-auc:0.97082 validation-aucpr:0.97423
[25] validation-logloss:0.22476 validation-auc:0.97112 validation-aucpr:0.97444
[26] validation-logloss:0.22110 validation-auc:0.97147 validation-aucpr:0.97470
[27] validation-logloss:0.21840 validation-auc:0.97152 validation-aucpr:0.97472
[28] validation-logloss:0.21588 validation-auc:0.97153 validation-aucpr:0.97474
[29] validation-logloss:0.21366 validation-auc:0.97166 validation-aucpr:0.97483
[30] validation-logloss:0.21179 validation-auc:0.97159 validation-aucpr:0.97451
[31] validation-logloss:0.20962 validation-auc:0.97178 validation-aucpr:0.97465
[32] validation-logloss:0.20780 validation-auc:0.97166 validation-aucpr:0.97457
[33] validation-logloss:0.20572 validation-auc:0.97179 validation-aucpr:0.97467
[34] validation-logloss:0.20411 validation-auc:0.97195 validation-aucpr:0.97475
[35] validation-logloss:0.20245 validation-auc:0.97203 validation-aucpr:0.97482
[36] validation-logloss:0.20095 validation-auc:0.97214 validation-aucpr:0.97492
[37] validation-logloss:0.19975 validation-auc:0.97212 validation-aucpr:0.97401
[38] validation-logloss:0.19858 validation-auc:0.97216 validation-aucpr:0.97422
[39] validation-logloss:0.19728 validation-auc:0.97234 validation-aucpr:0.97433
[40] validation-logloss:0.19622 validation-auc:0.97247 validation-aucpr:0.97430
[41] validation-logloss:0.19521 validation-auc:0.97259 validation-aucpr:0.97542
...	(iterations [42]-[95] elided; validation logloss decreased overall from 0.19429 to 0.18470, best validation AUC-PR 0.97664 at iteration 93)
[96] validation-logloss:0.18468 validation-auc:0.97396 validation-aucpr:0.97645
{'best_iteration': '93', 'best_score': '0.9766434736304546'}
Trial 65, Fold 3: Log loss = 0.18468022857613342, Average precision = 0.9764594632367195, ROC-AUC = 0.9739638511138292, Elapsed Time = 2.7284695000016654 seconds
Trial 65, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 65, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.63227 validation-auc:0.96018 validation-aucpr:0.96040
...	(iterations [1]-[95] elided; validation logloss decreased overall from 0.58161 to 0.18942, best validation AUC-PR 0.97608 at iteration 57)
[96] validation-logloss:0.18949 validation-auc:0.97170 validation-aucpr:0.97574
{'best_iteration': '57', 'best_score': '0.976076300519751'}
Trial 65, Fold 4: Log loss = 0.18949136138858005, Average precision = 0.9757470371557592, ROC-AUC = 0.9716955657131412, Elapsed Time = 2.6781521999982942 seconds
Trial 65, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 65, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.63258 validation-auc:0.95470 validation-aucpr:0.95545
...	(iterations [1]-[95] elided; validation logloss decreased overall from 0.58266 to 0.19087, best validation AUC-PR 0.97648 at iteration 84)
[96] validation-logloss:0.19068 validation-auc:0.97310 validation-aucpr:0.97600
{'best_iteration': '84', 'best_score': '0.9764759330080807'}
Trial 65, Fold 5: Log loss = 0.19067964677015295, Average precision = 0.9760026014555567, ROC-AUC = 0.9731033944596176, Elapsed Time = 2.712488900000608 seconds
Optimization Progress: 66%|######6 | 66/100 [3:02:04<24:52, 43.90s/it]
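The fold-level lines above report each fold's log loss, average precision, and ROC-AUC for a trial. A minimal sketch of how such per-fold values can be aggregated into a trial-level summary (mean and standard deviation); the numbers below are copied verbatim from the Trial 65, Folds 3-5 output shown here, and the `fold_metrics` dict is an illustrative structure, not code from the notebook:

```python
import statistics

# Per-fold metrics for Trial 65, taken from the logged fold summaries
# (only Folds 3-5 appear in this portion of the output).
fold_metrics = {
    "log_loss": [0.18468022857613342, 0.18949136138858005, 0.19067964677015295],
    "avg_precision": [0.9764594632367195, 0.9757470371557592, 0.9760026014555567],
    "roc_auc": [0.9739638511138292, 0.9716955657131412, 0.9731033944596176],
}

# Trial-level summary: mean and sample standard deviation across folds.
summary = {
    name: (statistics.mean(vals), statistics.stdev(vals))
    for name, vals in fold_metrics.items()
}
for name, (mean, sd) in summary.items():
    print(f"{name}: mean={mean:.5f}, sd={sd:.5f}")
```

Cross-fold means like these are what a multi-objective Optuna study would typically receive as the trial's objective values.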
Trial 66, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 66, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.66326 validation-auc:0.94744 validation-aucpr:0.92501
...	(iterations [1]-[59] elided; validation logloss decreased overall from 0.63903 to 0.22296)
[60] validation-logloss:0.22122 validation-auc:0.97151 validation-aucpr:0.97356
{'best_iteration': '60', 'best_score': '0.973559109648379'}
Trial 66, Fold 1: Log loss = 0.22121907431323887, Average precision = 0.9740674502888876, ROC-AUC = 0.9715058066034805, Elapsed Time = 1.7955819999988307 seconds
Trial 66, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 66, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.66278 validation-auc:0.94974 validation-aucpr:0.94237
...	(iterations [1]-[59] elided; validation logloss decreased overall from 0.63510 to 0.21187)
[60] validation-logloss:0.21044 validation-auc:0.97316 validation-aucpr:0.97561
{'best_iteration': '60', 'best_score': '0.9756137684891409'}
Trial 66, Fold 2: Log loss = 0.2104350181056542, Average precision = 0.9755512232417399, ROC-AUC = 0.9731561961200518, Elapsed Time = 1.952999400000408 seconds
Trial 66, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 66, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.66268 validation-auc:0.95210 validation-aucpr:0.95295
...	(iterations [1]-[51] elided; validation logloss decreased overall from 0.63570 to 0.22729)
[52] validation-logloss:0.22544 validation-auc:0.97220 validation-aucpr:0.97621
[53] validation-logloss:0.22340 validation-auc:0.97220 validation-aucpr:0.97620
[54] validation-logloss:0.22168 validation-auc:0.97216 validation-aucpr:0.97617
[55] validation-logloss:0.22050 validation-auc:0.97210 validation-aucpr:0.97609
[56] validation-logloss:0.21922 validation-auc:0.97210 validation-aucpr:0.97606
[57] validation-logloss:0.21764 validation-auc:0.97215 validation-aucpr:0.97610
[58] validation-logloss:0.21587 validation-auc:0.97229 validation-aucpr:0.97622
[59] validation-logloss:0.21453 validation-auc:0.97230 validation-aucpr:0.97622
[60] validation-logloss:0.21306 validation-auc:0.97238 validation-aucpr:0.97629
{'best_iteration': '60', 'best_score': '0.9762925367757626'}
Trial 66, Fold 3: Log loss = 0.2130604057966504, Average precision = 0.9762961959857607, ROC-AUC = 0.9723778905143936, Elapsed Time = 2.1198683000002347 seconds
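The per-fold summary lines above report log loss, average precision, and ROC-AUC on the validation split. These can be reproduced with the scikit-learn metric functions imported at the top of the notebook; the sketch below is illustrative only (`fold_metrics` is a hypothetical helper and the predictions are synthetic, not the notebook's actual fold outputs):

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

def fold_metrics(y_true, y_prob):
    """Compute the three per-fold metrics reported in the training log.

    y_true: binary labels (0/1); y_prob: predicted probability of class 1.
    """
    return {
        "log_loss": log_loss(y_true, y_prob),
        "average_precision": average_precision_score(y_true, y_prob),
        "roc_auc": roc_auc_score(y_true, y_prob),
    }

# Synthetic, well-separated predictions for demonstration.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=1000)
p = np.clip(y * 0.7 + rng.normal(0.15, 0.2, size=1000), 1e-6, 1 - 1e-6)
m = fold_metrics(y, p)
```
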
Trial 66, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 66, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.66279 validation-auc:0.94747 validation-aucpr:0.92289
... (iterations [1]-[59] omitted)
[60] validation-logloss:0.21548 validation-auc:0.97218 validation-aucpr:0.97651
{'best_iteration': '60', 'best_score': '0.9765077996628351'}
Trial 66, Fold 4: Log loss = 0.2154846902786677, Average precision = 0.976506590273033, ROC-AUC = 0.9721760242828291, Elapsed Time = 2.0137092000004486 seconds
Trial 66, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 66, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.66316 validation-auc:0.95118 validation-aucpr:0.94310
... (iterations [1]-[59] omitted)
[60] validation-logloss:0.22073 validation-auc:0.97145 validation-aucpr:0.97512
{'best_iteration': '60', 'best_score': '0.975122968769573'}
Trial 66, Fold 5: Log loss = 0.2207293657428227, Average precision = 0.975127238399295, ROC-AUC = 0.9714478197053305, Elapsed Time = 2.1293003000027966 seconds
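Each fold's log ends with XGBoost's early-stopping bookkeeping, e.g. `{'best_iteration': '60', 'best_score': '0.975...'}`; when several evaluation metrics are supplied, XGBoost tracks the last one in the list (here `validation-aucpr`) and keeps the first iteration that achieved the best value. The selection rule can be sketched in plain Python (the `aucpr_history` values are illustrative, not taken from the run above):

```python
def best_iteration(aucpr_history):
    """Return (best_iteration, best_score) for a maximised metric.

    Mirrors XGBoost's early-stopping bookkeeping on the monitored
    metric: the earliest iteration attaining the maximum wins.
    """
    best_it = max(range(len(aucpr_history)), key=aucpr_history.__getitem__)
    return best_it, aucpr_history[best_it]

# Illustrative AUC-PR trajectory over four boosting rounds.
history = [0.9231, 0.9640, 0.9667, 0.9751]
it, score = best_iteration(history)  # it == 3, score == 0.9751
```
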
Optimization Progress: 67%|######7 | 67/100 [3:02:22<19:51, 36.11s/it]
Trial 67, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 67, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.67790 validation-auc:0.95553 validation-aucpr:0.96092
... (iterations [1]-[59] omitted)
[60] validation-logloss:0.31202 validation-auc:0.96727 validation-aucpr:0.97234
{'best_iteration': '59', 'best_score': '0.9723680795391926'}
Trial 67, Fold 1: Log loss = 0.31202037811636174, Average precision = 0.9723428888836552, ROC-AUC = 0.9672663856268192, Elapsed Time = 1.4499145000008866 seconds
Trial 67, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 67, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.67734 validation-auc:0.95774 validation-aucpr:0.96414
... (iterations [1]-[59] omitted)
[60] validation-logloss:0.30382 validation-auc:0.97008 validation-aucpr:0.97338
{'best_iteration': '59', 'best_score': '0.9734417834944555'}
Trial 67, Fold 2: Log loss = 0.3038166756126164, Average precision = 0.9733877376478585, ROC-AUC = 0.9700820136081343, Elapsed Time = 1.8838468999965698 seconds
Trial 67, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 67, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.67764 validation-auc:0.95744 validation-aucpr:0.96229
... (iterations [1]-[59] omitted)
[60] validation-logloss:0.30496 validation-auc:0.97042 validation-aucpr:0.97456
{'best_iteration': '60', 'best_score': '0.9745641931920642'}
Trial 67, Fold 3: Log loss = 0.30495799683777036, Average precision = 0.9745684705321301, ROC-AUC = 0.9704159433358053, Elapsed Time = 1.829988699999376 seconds
Trial 67, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 67, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.67757 validation-auc:0.95496 validation-aucpr:0.95939
... (iterations [1]-[55] omitted)
[56] validation-logloss:0.31963 validation-auc:0.96790 validation-aucpr:0.97309
[57] validation-logloss:0.31729 validation-auc:0.96789 validation-aucpr:0.97306
[58] validation-logloss:0.31484 validation-auc:0.96793 validation-aucpr:0.97308
[59] validation-logloss:0.31261 validation-auc:0.96791 validation-aucpr:0.97305
[60] validation-logloss:0.30951 validation-auc:0.96800 validation-aucpr:0.97316
{'best_iteration': '60', 'best_score': '0.973162413525967'}
Trial 67, Fold 4: Log loss = 0.3095125853597949, Average precision = 0.9731652736050046, ROC-AUC = 0.9680012953306103, Elapsed Time = 1.7693674999973155 seconds
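The `{'best_iteration': …, 'best_score': …}` dict printed above holds both values as strings, which is how XGBoost's `Booster.attributes()` exposes early-stopping metadata; they must be cast before numeric use. A minimal sketch, assuming the `attrs` dict mirrors the logged output (the values are copied from the line above for illustration):

```python
# Booster.attributes() returns all attribute values as strings,
# which matches the {'best_iteration': '60', ...} form in the log.
attrs = {'best_iteration': '60', 'best_score': '0.973162413525967'}

# Cast before using them as the early-stopped tree count / metric value.
best_iteration = int(attrs['best_iteration'])
best_score = float(attrs['best_score'])
```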
Trial 67, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 67, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.67778 validation-auc:0.95233 validation-aucpr:0.94662
[1] validation-logloss:0.66519 validation-auc:0.95641 validation-aucpr:0.95980
[2] validation-logloss:0.65176 validation-auc:0.95921 validation-aucpr:0.96469
[3] validation-logloss:0.63841 validation-auc:0.96148 validation-aucpr:0.96672
[4] validation-logloss:0.62749 validation-auc:0.96130 validation-aucpr:0.96454
[5] validation-logloss:0.61654 validation-auc:0.96128 validation-aucpr:0.96399
[6] validation-logloss:0.60449 validation-auc:0.96262 validation-aucpr:0.96727
[7] validation-logloss:0.59433 validation-auc:0.96241 validation-aucpr:0.96695
[8] validation-logloss:0.58470 validation-auc:0.96183 validation-aucpr:0.96658
[9] validation-logloss:0.57403 validation-auc:0.96261 validation-aucpr:0.96779
[10] validation-logloss:0.56505 validation-auc:0.96257 validation-aucpr:0.96766
[11] validation-logloss:0.55493 validation-auc:0.96299 validation-aucpr:0.96804
[12] validation-logloss:0.54536 validation-auc:0.96287 validation-aucpr:0.96811
[13] validation-logloss:0.53602 validation-auc:0.96317 validation-aucpr:0.96838
[14] validation-logloss:0.52718 validation-auc:0.96346 validation-aucpr:0.96848
[15] validation-logloss:0.51805 validation-auc:0.96377 validation-aucpr:0.96878
[16] validation-logloss:0.50944 validation-auc:0.96411 validation-aucpr:0.96910
[17] validation-logloss:0.50215 validation-auc:0.96407 validation-aucpr:0.96905
[18] validation-logloss:0.49389 validation-auc:0.96439 validation-aucpr:0.96929
[19] validation-logloss:0.48625 validation-auc:0.96439 validation-aucpr:0.96932
[20] validation-logloss:0.47948 validation-auc:0.96439 validation-aucpr:0.96934
[21] validation-logloss:0.47332 validation-auc:0.96432 validation-aucpr:0.96932
[22] validation-logloss:0.46714 validation-auc:0.96431 validation-aucpr:0.96932
[23] validation-logloss:0.46117 validation-auc:0.96429 validation-aucpr:0.96926
[24] validation-logloss:0.45547 validation-auc:0.96415 validation-aucpr:0.96903
[25] validation-logloss:0.44895 validation-auc:0.96429 validation-aucpr:0.96922
[26] validation-logloss:0.44355 validation-auc:0.96422 validation-aucpr:0.96909
[27] validation-logloss:0.43807 validation-auc:0.96430 validation-aucpr:0.96919
[28] validation-logloss:0.43289 validation-auc:0.96442 validation-aucpr:0.96933
[29] validation-logloss:0.42702 validation-auc:0.96449 validation-aucpr:0.96955
[30] validation-logloss:0.42219 validation-auc:0.96460 validation-aucpr:0.96958
[31] validation-logloss:0.41743 validation-auc:0.96464 validation-aucpr:0.96958
[32] validation-logloss:0.41158 validation-auc:0.96483 validation-aucpr:0.96975
[33] validation-logloss:0.40693 validation-auc:0.96488 validation-aucpr:0.96979
[34] validation-logloss:0.40250 validation-auc:0.96490 validation-aucpr:0.96977
[35] validation-logloss:0.39846 validation-auc:0.96490 validation-aucpr:0.96976
[36] validation-logloss:0.39429 validation-auc:0.96495 validation-aucpr:0.96975
[37] validation-logloss:0.39031 validation-auc:0.96499 validation-aucpr:0.96978
[38] validation-logloss:0.38533 validation-auc:0.96517 validation-aucpr:0.96995
[39] validation-logloss:0.38162 validation-auc:0.96511 validation-aucpr:0.96990
[40] validation-logloss:0.37792 validation-auc:0.96504 validation-aucpr:0.96985
[41] validation-logloss:0.37434 validation-auc:0.96505 validation-aucpr:0.96983
[42] validation-logloss:0.36979 validation-auc:0.96519 validation-aucpr:0.96996
[43] validation-logloss:0.36572 validation-auc:0.96532 validation-aucpr:0.97006
[44] validation-logloss:0.36266 validation-auc:0.96526 validation-aucpr:0.97001
[45] validation-logloss:0.35940 validation-auc:0.96528 validation-aucpr:0.97003
[46] validation-logloss:0.35533 validation-auc:0.96547 validation-aucpr:0.97019
[47] validation-logloss:0.35212 validation-auc:0.96552 validation-aucpr:0.97020
[48] validation-logloss:0.34907 validation-auc:0.96555 validation-aucpr:0.97021
[49] validation-logloss:0.34531 validation-auc:0.96568 validation-aucpr:0.97031
[50] validation-logloss:0.34247 validation-auc:0.96561 validation-aucpr:0.97027
[51] validation-logloss:0.33961 validation-auc:0.96564 validation-aucpr:0.97027
[52] validation-logloss:0.33617 validation-auc:0.96572 validation-aucpr:0.97039
[53] validation-logloss:0.33334 validation-auc:0.96578 validation-aucpr:0.97044
[54] validation-logloss:0.33000 validation-auc:0.96594 validation-aucpr:0.97056
[55] validation-logloss:0.32669 validation-auc:0.96616 validation-aucpr:0.97075
[56] validation-logloss:0.32369 validation-auc:0.96624 validation-aucpr:0.97087
[57] validation-logloss:0.32123 validation-auc:0.96626 validation-aucpr:0.97088
[58] validation-logloss:0.31879 validation-auc:0.96631 validation-aucpr:0.97091
[59] validation-logloss:0.31655 validation-auc:0.96631 validation-aucpr:0.97093
[60] validation-logloss:0.31453 validation-auc:0.96627 validation-aucpr:0.97087
{'best_iteration': '59', 'best_score': '0.9709289988232364'}
Trial 67, Fold 5: Log loss = 0.31453136502821993, Average precision = 0.9708783903919904, ROC-AUC = 0.9662654998792338, Elapsed Time = 1.8825285999992047 seconds
Optimization Progress: 68%|######8 | 68/100 [3:02:39<16:08, 30.27s/it]
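The per-iteration lines above follow a fixed `[i] validation-logloss:… validation-auc:… validation-aucpr:…` format, so the learning curves can be recovered from the text log alone. A small stdlib sketch (the `parse_eval_log` helper and the two sample lines are illustrative, not part of the notebook):

```python
import re

# Matches lines like:
# "[60] validation-logloss:0.30951 validation-auc:0.96800 validation-aucpr:0.97316"
LINE_RE = re.compile(
    r"\[(?P<it>\d+)\]\s+"
    r"validation-logloss:(?P<logloss>[\d.]+)\s+"
    r"validation-auc:(?P<auc>[\d.]+)\s+"
    r"validation-aucpr:(?P<aucpr>[\d.]+)"
)

def parse_eval_log(lines):
    """Recover (iteration, logloss, auc, aucpr) rows from XGBoost eval-log text."""
    rows = []
    for line in lines:
        m = LINE_RE.search(line)
        if m:
            rows.append((int(m.group("it")),
                         float(m.group("logloss")),
                         float(m.group("auc")),
                         float(m.group("aucpr"))))
    return rows

sample = [
    "[59] validation-logloss:0.31655 validation-auc:0.96631 validation-aucpr:0.97093",
    "[60] validation-logloss:0.31453 validation-auc:0.96627 validation-aucpr:0.97087",
]
rows = parse_eval_log(sample)
# Picking the row with the highest aucpr mirrors how best_iteration is chosen
# when aucpr is the early-stopping metric.
best = max(rows, key=lambda r: r[3])
```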
Trial 68, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 68, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[21:01:39] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[0] validation-logloss:0.68643 validation-auc:0.93634 validation-aucpr:0.92382
[1]	validation-logloss:0.67955	validation-auc:0.95075	validation-aucpr:0.95354
[2]	validation-logloss:0.67197	validation-auc:0.96136	validation-aucpr:0.96432
[3]	validation-logloss:0.66458	validation-auc:0.96420	validation-aucpr:0.96792
[4]	validation-logloss:0.65817	validation-auc:0.96400	validation-aucpr:0.96774
[5]	validation-logloss:0.65122	validation-auc:0.96518	validation-aucpr:0.96927
[6]	validation-logloss:0.64440	validation-auc:0.96529	validation-aucpr:0.96981
[7]	validation-logloss:0.63837	validation-auc:0.96549	validation-aucpr:0.97009
[8]	validation-logloss:0.63246	validation-auc:0.96565	validation-aucpr:0.97025
[9]	validation-logloss:0.62587	validation-auc:0.96619	validation-aucpr:0.97074
[10]	validation-logloss:0.62027	validation-auc:0.96605	validation-aucpr:0.97063
[11]	validation-logloss:0.61491	validation-auc:0.96615	validation-aucpr:0.97098
[12]	validation-logloss:0.60966	validation-auc:0.96584	validation-aucpr:0.97063
[13]	validation-logloss:0.60360	validation-auc:0.96613	validation-aucpr:0.97083
[14]	validation-logloss:0.59817	validation-auc:0.96625	validation-aucpr:0.97099
[15]	validation-logloss:0.59299	validation-auc:0.96619	validation-aucpr:0.97126
[16]	validation-logloss:0.58802	validation-auc:0.96595	validation-aucpr:0.97105
[17]	validation-logloss:0.58288	validation-auc:0.96607	validation-aucpr:0.97115
[18]	validation-logloss:0.57784	validation-auc:0.96617	validation-aucpr:0.97119
[19]	validation-logloss:0.57238	validation-auc:0.96641	validation-aucpr:0.97141
[20]	validation-logloss:0.56763	validation-auc:0.96634	validation-aucpr:0.97130
[21]	validation-logloss:0.56304	validation-auc:0.96638	validation-aucpr:0.97128
[22]	validation-logloss:0.55770	validation-auc:0.96661	validation-aucpr:0.97156
[23]	validation-logloss:0.55317	validation-auc:0.96658	validation-aucpr:0.97154
[24]	validation-logloss:0.54886	validation-auc:0.96654	validation-aucpr:0.97156
[25]	validation-logloss:0.54391	validation-auc:0.96670	validation-aucpr:0.97173
[26]	validation-logloss:0.53888	validation-auc:0.96695	validation-aucpr:0.97197
[27]	validation-logloss:0.53436	validation-auc:0.96701	validation-aucpr:0.97205
[28]	validation-logloss:0.52955	validation-auc:0.96725	validation-aucpr:0.97229
[29]	validation-logloss:0.52496	validation-auc:0.96734	validation-aucpr:0.97234
[30]	validation-logloss:0.52040	validation-auc:0.96734	validation-aucpr:0.97237
[31]	validation-logloss:0.51643	validation-auc:0.96727	validation-aucpr:0.97229
[32]	validation-logloss:0.51259	validation-auc:0.96725	validation-aucpr:0.97225
[33]	validation-logloss:0.50820	validation-auc:0.96743	validation-aucpr:0.97239
[34]	validation-logloss:0.50399	validation-auc:0.96754	validation-aucpr:0.97247
[35]	validation-logloss:0.49977	validation-auc:0.96767	validation-aucpr:0.97259
[36]	validation-logloss:0.49556	validation-auc:0.96775	validation-aucpr:0.97268
[37]	validation-logloss:0.49180	validation-auc:0.96770	validation-aucpr:0.97265
[38]	validation-logloss:0.48782	validation-auc:0.96772	validation-aucpr:0.97265
[39]	validation-logloss:0.48388	validation-auc:0.96779	validation-aucpr:0.97271
[40]	validation-logloss:0.48042	validation-auc:0.96783	validation-aucpr:0.97271
[41]	validation-logloss:0.47707	validation-auc:0.96787	validation-aucpr:0.97272
[42]	validation-logloss:0.47352	validation-auc:0.96792	validation-aucpr:0.97275
[43]	validation-logloss:0.46981	validation-auc:0.96787	validation-aucpr:0.97274
[44]	validation-logloss:0.46617	validation-auc:0.96789	validation-aucpr:0.97276
[45]	validation-logloss:0.46295	validation-auc:0.96792	validation-aucpr:0.97277
[46]	validation-logloss:0.45980	validation-auc:0.96796	validation-aucpr:0.97278
[47]	validation-logloss:0.45621	validation-auc:0.96800	validation-aucpr:0.97283
[48]	validation-logloss:0.45315	validation-auc:0.96802	validation-aucpr:0.97284
[49]	validation-logloss:0.45015	validation-auc:0.96798	validation-aucpr:0.97279
[50]	validation-logloss:0.44729	validation-auc:0.96787	validation-aucpr:0.97269
[51]	validation-logloss:0.44443	validation-auc:0.96776	validation-aucpr:0.97260
[52]	validation-logloss:0.44106	validation-auc:0.96792	validation-aucpr:0.97274
[53]	validation-logloss:0.43840	validation-auc:0.96791	validation-aucpr:0.97279
[54]	validation-logloss:0.43565	validation-auc:0.96788	validation-aucpr:0.97274
[55]	validation-logloss:0.43241	validation-auc:0.96791	validation-aucpr:0.97278
[56]	validation-logloss:0.42971	validation-auc:0.96788	validation-aucpr:0.97275
[57]	validation-logloss:0.42714	validation-auc:0.96782	validation-aucpr:0.97268
[58]	validation-logloss:0.42414	validation-auc:0.96789	validation-aucpr:0.97274
[59]	validation-logloss:0.42154	validation-auc:0.96791	validation-aucpr:0.97277
[60]	validation-logloss:0.41885	validation-auc:0.96793	validation-aucpr:0.97280
[61]	validation-logloss:0.41589	validation-auc:0.96797	validation-aucpr:0.97284
[62]	validation-logloss:0.41363	validation-auc:0.96792	validation-aucpr:0.97279
[63]	validation-logloss:0.41117	validation-auc:0.96795	validation-aucpr:0.97280
[64]	validation-logloss:0.40885	validation-auc:0.96794	validation-aucpr:0.97276
[65]	validation-logloss:0.40655	validation-auc:0.96788	validation-aucpr:0.97271
[66]	validation-logloss:0.40418	validation-auc:0.96789	validation-aucpr:0.97272
[67]	validation-logloss:0.40192	validation-auc:0.96789	validation-aucpr:0.97271
[68]	validation-logloss:0.39976	validation-auc:0.96786	validation-aucpr:0.97267
[69]	validation-logloss:0.39712	validation-auc:0.96792	validation-aucpr:0.97274
[70]	validation-logloss:0.39457	validation-auc:0.96789	validation-aucpr:0.97274
[71]	validation-logloss:0.39198	validation-auc:0.96798	validation-aucpr:0.97281
[72]	validation-logloss:0.38990	validation-auc:0.96793	validation-aucpr:0.97275
[73]	validation-logloss:0.38772	validation-auc:0.96794	validation-aucpr:0.97276
[74]	validation-logloss:0.38536	validation-auc:0.96799	validation-aucpr:0.97281
[75]	validation-logloss:0.38328	validation-auc:0.96796	validation-aucpr:0.97278
[76]	validation-logloss:0.38117	validation-auc:0.96800	validation-aucpr:0.97281
[77]	validation-logloss:0.37916	validation-auc:0.96797	validation-aucpr:0.97278
[78]	validation-logloss:0.37723	validation-auc:0.96795	validation-aucpr:0.97276
[79]	validation-logloss:0.37531	validation-auc:0.96796	validation-aucpr:0.97276
{'best_iteration': '61', 'best_score': '0.9728415399042182'}
Trial 68, Fold 1: Log loss = 0.375314446571651, Average precision = 0.9727612448617955, ROC-AUC = 0.96795927670476, Elapsed Time = 21.340347599998495 seconds
Trial 68, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 68, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0]	validation-logloss:0.68585	validation-auc:0.94018	validation-aucpr:0.92886
[1]	validation-logloss:0.67924	validation-auc:0.95496	validation-aucpr:0.95415
[2]	validation-logloss:0.67287	validation-auc:0.95648	validation-aucpr:0.95858
[3]	validation-logloss:0.66572	validation-auc:0.96356	validation-aucpr:0.96661
[4]	validation-logloss:0.65871	validation-auc:0.96475	validation-aucpr:0.96835
[5]	validation-logloss:0.65247	validation-auc:0.96501	validation-aucpr:0.96876
[6]	validation-logloss:0.64641	validation-auc:0.96517	validation-aucpr:0.96893
[7]	validation-logloss:0.64049	validation-auc:0.96497	validation-aucpr:0.96863
[8]	validation-logloss:0.63470	validation-auc:0.96491	validation-aucpr:0.96839
[9]	validation-logloss:0.62821	validation-auc:0.96618	validation-aucpr:0.96964
[10]	validation-logloss:0.62270	validation-auc:0.96555	validation-aucpr:0.96901
[11]	validation-logloss:0.61712	validation-auc:0.96548	validation-aucpr:0.96892
[12]	validation-logloss:0.61173	validation-auc:0.96526	validation-aucpr:0.96870
[13]	validation-logloss:0.60566	validation-auc:0.96592	validation-aucpr:0.96904
[14]	validation-logloss:0.60029	validation-auc:0.96619	validation-aucpr:0.96922
[15]	validation-logloss:0.59502	validation-auc:0.96631	validation-aucpr:0.96930
[16]	validation-logloss:0.58994	validation-auc:0.96609	validation-aucpr:0.96899
[17]	validation-logloss:0.58488	validation-auc:0.96626	validation-aucpr:0.96906
[18] validation-logloss:0.58003 validation-auc:0.96635 validation-aucpr:0.96903
[21:02:04] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[19] validation-logloss:0.57499 validation-auc:0.96649 validation-aucpr:0.96910
[21:02:04] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[20] validation-logloss:0.56932 validation-auc:0.96739 validation-aucpr:0.97000
[21:02:04] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[21] validation-logloss:0.56390 validation-auc:0.96790 validation-aucpr:0.97093
[21:02:04] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[22] validation-logloss:0.55941 validation-auc:0.96782 validation-aucpr:0.97080
[21:02:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[23] validation-logloss:0.55421 validation-auc:0.96820 validation-aucpr:0.97119
[21:02:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[24] validation-logloss:0.54981 validation-auc:0.96805 validation-aucpr:0.97106
[21:02:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[25] validation-logloss:0.54544 validation-auc:0.96805 validation-aucpr:0.97101
[21:02:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[26] validation-logloss:0.54035 validation-auc:0.96839 validation-aucpr:0.97166
[21:02:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[27] validation-logloss:0.53552 validation-auc:0.96857 validation-aucpr:0.97187
[21:02:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[28] validation-logloss:0.53068 validation-auc:0.96883 validation-aucpr:0.97212
[21:02:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[29] validation-logloss:0.52663 validation-auc:0.96890 validation-aucpr:0.97239
[21:02:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[30] validation-logloss:0.52248 validation-auc:0.96893 validation-aucpr:0.97240
[21:02:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[31] validation-logloss:0.51801 validation-auc:0.96903 validation-aucpr:0.97250
[21:02:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[32] validation-logloss:0.51421 validation-auc:0.96897 validation-aucpr:0.97243
[21:02:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[33] validation-logloss:0.51029 validation-auc:0.96893 validation-aucpr:0.97238
[21:02:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[34] validation-logloss:0.50595 validation-auc:0.96907 validation-aucpr:0.97252
[21:02:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[35] validation-logloss:0.50151 validation-auc:0.96922 validation-aucpr:0.97268
[21:02:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[36] validation-logloss:0.49719 validation-auc:0.96937 validation-aucpr:0.97283
[21:02:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[37] validation-logloss:0.49299 validation-auc:0.96953 validation-aucpr:0.97299
[21:02:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[38] validation-logloss:0.48876 validation-auc:0.96973 validation-aucpr:0.97317
[21:02:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[39] validation-logloss:0.48465 validation-auc:0.96984 validation-aucpr:0.97328
[21:02:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[40] validation-logloss:0.48079 validation-auc:0.96987 validation-aucpr:0.97330
[21:02:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[41] validation-logloss:0.47745 validation-auc:0.96973 validation-aucpr:0.97318
[21:02:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[42] validation-logloss:0.47368 validation-auc:0.96976 validation-aucpr:0.97320
[21:02:08] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[43] validation-logloss:0.47010 validation-auc:0.96984 validation-aucpr:0.97325
[21:02:08] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[44] validation-logloss:0.46629 validation-auc:0.96992 validation-aucpr:0.97335
[21:02:08] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[45] validation-logloss:0.46309 validation-auc:0.96999 validation-aucpr:0.97341
[21:02:08] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[46] validation-logloss:0.45994 validation-auc:0.97000 validation-aucpr:0.97346
[21:02:08] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[47] validation-logloss:0.45643 validation-auc:0.97010 validation-aucpr:0.97354
[21:02:09] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[48] validation-logloss:0.45339 validation-auc:0.97010 validation-aucpr:0.97353
[21:02:09] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[49] validation-logloss:0.45029 validation-auc:0.97017 validation-aucpr:0.97356
[21:02:09] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[50] validation-logloss:0.44666 validation-auc:0.97034 validation-aucpr:0.97374
[21:02:09] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[51] validation-logloss:0.44320 validation-auc:0.97039 validation-aucpr:0.97378
[21:02:09] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[52] validation-logloss:0.44045 validation-auc:0.97037 validation-aucpr:0.97374
[21:02:10] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[53] validation-logloss:0.43728 validation-auc:0.97040 validation-aucpr:0.97376
[21:02:10] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[54] validation-logloss:0.43448 validation-auc:0.97038 validation-aucpr:0.97375
[21:02:10] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[55] validation-logloss:0.43176 validation-auc:0.97037 validation-aucpr:0.97372
[21:02:10] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[56] validation-logloss:0.42857 validation-auc:0.97046 validation-aucpr:0.97381
[21:02:10] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[57] validation-logloss:0.42593 validation-auc:0.97038 validation-aucpr:0.97374
[21:02:11] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[58] validation-logloss:0.42281 validation-auc:0.97051 validation-aucpr:0.97386
[21:02:11] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[59] validation-logloss:0.41975 validation-auc:0.97058 validation-aucpr:0.97392
[21:02:11] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[60] validation-logloss:0.41688 validation-auc:0.97064 validation-aucpr:0.97393
[21:02:11] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[61] validation-logloss:0.41397 validation-auc:0.97061 validation-aucpr:0.97393
[21:02:12] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[62] validation-logloss:0.41103 validation-auc:0.97066 validation-aucpr:0.97398
[21:02:12] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[63] validation-logloss:0.40804 validation-auc:0.97082 validation-aucpr:0.97412
[21:02:12] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[64] validation-logloss:0.40532 validation-auc:0.97085 validation-aucpr:0.97410
[21:02:12] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[65] validation-logloss:0.40301 validation-auc:0.97084 validation-aucpr:0.97408
[21:02:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[66] validation-logloss:0.40069 validation-auc:0.97088 validation-aucpr:0.97410
[21:02:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[67] validation-logloss:0.39790 validation-auc:0.97092 validation-aucpr:0.97416
[21:02:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[68] validation-logloss:0.39520 validation-auc:0.97095 validation-aucpr:0.97419
[21:02:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[69] validation-logloss:0.39255 validation-auc:0.97099 validation-aucpr:0.97422
[21:02:14] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[70] validation-logloss:0.39039 validation-auc:0.97096 validation-aucpr:0.97413
[21:02:14] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[71] validation-logloss:0.38783 validation-auc:0.97102 validation-aucpr:0.97419
[21:02:14] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[72] validation-logloss:0.38525 validation-auc:0.97109 validation-aucpr:0.97427
[21:02:14] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[73] validation-logloss:0.38296 validation-auc:0.97111 validation-aucpr:0.97428
[21:02:15] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[74] validation-logloss:0.38046 validation-auc:0.97114 validation-aucpr:0.97431
[21:02:15] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[75] validation-logloss:0.37843 validation-auc:0.97114 validation-aucpr:0.97429
[21:02:15] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[76] validation-logloss:0.37634 validation-auc:0.97119 validation-aucpr:0.97433
[21:02:15] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[77] validation-logloss:0.37389 validation-auc:0.97122 validation-aucpr:0.97437
[21:02:16] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[78] validation-logloss:0.37190 validation-auc:0.97120 validation-aucpr:0.97435
[21:02:16] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[79] validation-logloss:0.36992 validation-auc:0.97119 validation-aucpr:0.97435
{'best_iteration': '77', 'best_score': '0.9743684100818301'}
Trial 68, Fold 2: Log loss = 0.36992255065135127, Average precision = 0.9743500422679949, ROC-AUC = 0.9711869651080246, Elapsed Time = 21.50118659999862 seconds
Trial 68, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 68, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.68606 validation-auc:0.94324 validation-aucpr:0.92817
... (iterations [1]–[58] and the repeated XGBoost INFO lines "gbtree.cc:887: drop 0 trees, weight = 1" omitted for brevity; validation metrics improved steadily across rounds) ...
[59] validation-logloss:0.41898 validation-auc:0.97067 validation-aucpr:0.97471
[60] validation-logloss:0.41594 validation-auc:0.97070 validation-aucpr:0.97473
[21:02:32] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[61] validation-logloss:0.41292 validation-auc:0.97075 validation-aucpr:0.97479
[21:02:33] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[62] validation-logloss:0.41018 validation-auc:0.97075 validation-aucpr:0.97480
[21:02:33] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[63] validation-logloss:0.40774 validation-auc:0.97075 validation-aucpr:0.97478
[21:02:33] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[64] validation-logloss:0.40523 validation-auc:0.97079 validation-aucpr:0.97482
[21:02:33] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[65] validation-logloss:0.40234 validation-auc:0.97084 validation-aucpr:0.97487
[21:02:34] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[66] validation-logloss:0.40002 validation-auc:0.97082 validation-aucpr:0.97484
[21:02:34] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[67] validation-logloss:0.39774 validation-auc:0.97080 validation-aucpr:0.97485
[21:02:34] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[68] validation-logloss:0.39507 validation-auc:0.97083 validation-aucpr:0.97488
[21:02:34] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[69] validation-logloss:0.39241 validation-auc:0.97086 validation-aucpr:0.97491
[21:02:35] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[70] validation-logloss:0.38984 validation-auc:0.97091 validation-aucpr:0.97496
[21:02:35] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[71] validation-logloss:0.38778 validation-auc:0.97090 validation-aucpr:0.97495
[21:02:35] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[72] validation-logloss:0.38522 validation-auc:0.97094 validation-aucpr:0.97498
[21:02:35] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[73] validation-logloss:0.38323 validation-auc:0.97089 validation-aucpr:0.97493
[21:02:36] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[74] validation-logloss:0.38125 validation-auc:0.97086 validation-aucpr:0.97489
[21:02:36] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[75] validation-logloss:0.37876 validation-auc:0.97088 validation-aucpr:0.97492
[21:02:36] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[76] validation-logloss:0.37672 validation-auc:0.97091 validation-aucpr:0.97492
[21:02:36] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[77] validation-logloss:0.37427 validation-auc:0.97095 validation-aucpr:0.97496
[21:02:37] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[78] validation-logloss:0.37222 validation-auc:0.97096 validation-aucpr:0.97496
[21:02:37] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[79] validation-logloss:0.37027 validation-auc:0.97096 validation-aucpr:0.97494
{'best_iteration': '72', 'best_score': '0.9749780107157653'}
Trial 68, Fold 3: Log loss = 0.37027450129392103, Average precision = 0.9749421781509551, ROC-AUC = 0.970958667209215, Elapsed Time = 20.445873399999982 seconds
Trial 68, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 68, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[21:02:44] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
(identical INFO lines repeated before each boosting round omitted)
[0] validation-logloss:0.68598 validation-auc:0.93902 validation-aucpr:0.91408
  ⋮ (rounds 1–78: logloss decreases monotonically, auc/aucpr plateau near 0.968/0.973)
[79] validation-logloss:0.37494 validation-auc:0.96840 validation-aucpr:0.97347
{'best_iteration': '79', 'best_score': '0.9734720863134078'}
Trial 68, Fold 4: Log loss = 0.37494391354788814, Average precision = 0.9734713875983636, ROC-AUC = 0.9684023977624698, Elapsed Time = 20.46157889999813 seconds
Trial 68, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 68, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[21:03:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
(identical INFO lines repeated before each boosting round omitted)
[0] validation-logloss:0.68626 validation-auc:0.93212 validation-aucpr:0.90712
  ⋮ (rounds 1–18: logloss decreases monotonically, auc/aucpr plateau near 0.966/0.969)
[19] validation-logloss:0.57202 validation-auc:0.96594 validation-aucpr:0.96934
[20] validation-logloss:0.56720 validation-auc:0.96599 validation-aucpr:0.96933
[21:03:08] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[21] validation-logloss:0.56273 validation-auc:0.96605 validation-aucpr:0.97055
[21:03:08] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[22] validation-logloss:0.55763 validation-auc:0.96608 validation-aucpr:0.97061
[21:03:08] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[23] validation-logloss:0.55254 validation-auc:0.96627 validation-aucpr:0.97083
[21:03:08] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[24] validation-logloss:0.54770 validation-auc:0.96634 validation-aucpr:0.97090
[21:03:08] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[25] validation-logloss:0.54271 validation-auc:0.96662 validation-aucpr:0.97113
[21:03:08] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[26] validation-logloss:0.53779 validation-auc:0.96695 validation-aucpr:0.97139
[21:03:08] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[27] validation-logloss:0.53358 validation-auc:0.96693 validation-aucpr:0.97133
[21:03:09] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[28] validation-logloss:0.52943 validation-auc:0.96698 validation-aucpr:0.97134
[21:03:09] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[29] validation-logloss:0.52477 validation-auc:0.96706 validation-aucpr:0.97144
[21:03:09] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[30] validation-logloss:0.52087 validation-auc:0.96713 validation-aucpr:0.97146
[21:03:09] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[31] validation-logloss:0.51688 validation-auc:0.96710 validation-aucpr:0.97142
[21:03:09] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[32] validation-logloss:0.51252 validation-auc:0.96714 validation-aucpr:0.97146
[21:03:09] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[33] validation-logloss:0.50807 validation-auc:0.96724 validation-aucpr:0.97147
[21:03:09] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[34] validation-logloss:0.50383 validation-auc:0.96736 validation-aucpr:0.97158
[21:03:10] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[35] validation-logloss:0.49960 validation-auc:0.96747 validation-aucpr:0.97176
[21:03:10] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[36] validation-logloss:0.49597 validation-auc:0.96741 validation-aucpr:0.97171
[21:03:10] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[37] validation-logloss:0.49244 validation-auc:0.96733 validation-aucpr:0.97162
[21:03:10] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[38] validation-logloss:0.48907 validation-auc:0.96722 validation-aucpr:0.97153
[21:03:10] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[39] validation-logloss:0.48555 validation-auc:0.96721 validation-aucpr:0.97152
[21:03:10] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[40] validation-logloss:0.48221 validation-auc:0.96719 validation-aucpr:0.97149
[21:03:11] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[41] validation-logloss:0.47892 validation-auc:0.96705 validation-aucpr:0.97034
[21:03:11] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[42] validation-logloss:0.47504 validation-auc:0.96710 validation-aucpr:0.97037
[21:03:11] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[43] validation-logloss:0.47134 validation-auc:0.96718 validation-aucpr:0.97047
[21:03:11] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[44] validation-logloss:0.46757 validation-auc:0.96724 validation-aucpr:0.97053
[21:03:11] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[45] validation-logloss:0.46451 validation-auc:0.96726 validation-aucpr:0.97053
[21:03:12] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[46] validation-logloss:0.46120 validation-auc:0.96721 validation-aucpr:0.97052
[21:03:12] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[47] validation-logloss:0.45826 validation-auc:0.96716 validation-aucpr:0.97046
[21:03:12] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[48] validation-logloss:0.45537 validation-auc:0.96708 validation-aucpr:0.97038
[21:03:12] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[49] validation-logloss:0.45190 validation-auc:0.96718 validation-aucpr:0.97046
[21:03:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[50] validation-logloss:0.44860 validation-auc:0.96730 validation-aucpr:0.97188
[21:03:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[51] validation-logloss:0.44541 validation-auc:0.96736 validation-aucpr:0.97195
[21:03:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[52] validation-logloss:0.44204 validation-auc:0.96749 validation-aucpr:0.97206
[21:03:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[53] validation-logloss:0.43928 validation-auc:0.96747 validation-aucpr:0.97204
[21:03:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[54] validation-logloss:0.43600 validation-auc:0.96756 validation-aucpr:0.97212
[21:03:14] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[55] validation-logloss:0.43282 validation-auc:0.96759 validation-aucpr:0.97217
[21:03:14] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[56] validation-logloss:0.43003 validation-auc:0.96762 validation-aucpr:0.97219
[21:03:14] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[57] validation-logloss:0.42742 validation-auc:0.96760 validation-aucpr:0.97216
[21:03:14] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[58] validation-logloss:0.42445 validation-auc:0.96764 validation-aucpr:0.97220
[21:03:15] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[59] validation-logloss:0.42182 validation-auc:0.96761 validation-aucpr:0.97216
[21:03:15] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[60] validation-logloss:0.41935 validation-auc:0.96756 validation-aucpr:0.97211
[21:03:15] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[61] validation-logloss:0.41680 validation-auc:0.96758 validation-aucpr:0.97212
[21:03:15] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[62] validation-logloss:0.41390 validation-auc:0.96764 validation-aucpr:0.97216
[21:03:15] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[63] validation-logloss:0.41158 validation-auc:0.96757 validation-aucpr:0.97211
[21:03:16] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[64] validation-logloss:0.40918 validation-auc:0.96756 validation-aucpr:0.97208
[21:03:16] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[65] validation-logloss:0.40676 validation-auc:0.96759 validation-aucpr:0.97211
[21:03:16] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[66] validation-logloss:0.40403 validation-auc:0.96765 validation-aucpr:0.97216
[21:03:16] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[67] validation-logloss:0.40197 validation-auc:0.96755 validation-aucpr:0.97204
[21:03:17] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[68] validation-logloss:0.39975 validation-auc:0.96756 validation-aucpr:0.97203
[21:03:17] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[69] validation-logloss:0.39749 validation-auc:0.96760 validation-aucpr:0.97207
[21:03:17] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[70] validation-logloss:0.39523 validation-auc:0.96766 validation-aucpr:0.97210
[21:03:17] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[71] validation-logloss:0.39312 validation-auc:0.96770 validation-aucpr:0.97212
[21:03:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[72] validation-logloss:0.39102 validation-auc:0.96766 validation-aucpr:0.97209
[21:03:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[73] validation-logloss:0.38856 validation-auc:0.96772 validation-aucpr:0.97215
[21:03:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[74] validation-logloss:0.38605 validation-auc:0.96777 validation-aucpr:0.97221
[21:03:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[75] validation-logloss:0.38359 validation-auc:0.96786 validation-aucpr:0.97228
[21:03:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[76] validation-logloss:0.38163 validation-auc:0.96783 validation-aucpr:0.97225
[21:03:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[77] validation-logloss:0.37916 validation-auc:0.96798 validation-aucpr:0.97236
[21:03:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[78] validation-logloss:0.37683 validation-auc:0.96807 validation-aucpr:0.97244
[21:03:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[79] validation-logloss:0.37495 validation-auc:0.96801 validation-aucpr:0.97239
{'best_iteration': '78', 'best_score': '0.9724375478366029'}
Trial 68, Fold 5: Log loss = 0.3749512595837695, Average precision = 0.9723950680655613, ROC-AUC = 0.968007729037772, Elapsed Time = 20.992388499998924 seconds
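The per-fold summary above reports log loss, average precision, and ROC-AUC on the validation fold. Using the sklearn metrics imported at the top of the notebook, such a summary line could be produced roughly as follows; the labels and predicted probabilities here are hypothetical stand-ins for one fold's validation data, not the notebook's actual arrays:

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

# Hypothetical validation labels and predicted probabilities for one fold
y_val = np.array([0, 1, 1, 0, 1, 0, 1, 1])
p_val = np.array([0.2, 0.8, 0.7, 0.4, 0.9, 0.1, 0.6, 0.55])

lloss = log_loss(y_val, p_val)              # penalizes confident wrong probabilities
ap = average_precision_score(y_val, p_val)  # area under the precision-recall curve
auc = roc_auc_score(y_val, p_val)           # ranking quality of the scores
print(f"Log loss = {lloss}, Average precision = {ap}, ROC-AUC = {auc}")
```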
Optimization Progress: 69%|######9 | 69/100 [3:04:32<28:27, 55.10s/it]
Trial 69, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 69, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.68494 validation-auc:0.94993 validation-aucpr:0.95429
[1] validation-logloss:0.67660 validation-auc:0.96044 validation-aucpr:0.96753
[2] validation-logloss:0.66935 validation-auc:0.96113 validation-aucpr:0.96789
[3] validation-logloss:0.66162 validation-auc:0.96248 validation-aucpr:0.96906
[4] validation-logloss:0.65393 validation-auc:0.96286 validation-aucpr:0.96944
[5] validation-logloss:0.64613 validation-auc:0.96437 validation-aucpr:0.97053
[6] validation-logloss:0.63846 validation-auc:0.96489 validation-aucpr:0.97098
[7] validation-logloss:0.63111 validation-auc:0.96482 validation-aucpr:0.97094
[8] validation-logloss:0.62390 validation-auc:0.96504 validation-aucpr:0.97111
[9] validation-logloss:0.61769 validation-auc:0.96516 validation-aucpr:0.97119
[10] validation-logloss:0.61073 validation-auc:0.96544 validation-aucpr:0.97130
[11] validation-logloss:0.60435 validation-auc:0.96559 validation-aucpr:0.97146
[12] validation-logloss:0.59850 validation-auc:0.96545 validation-aucpr:0.97138
[13] validation-logloss:0.59280 validation-auc:0.96511 validation-aucpr:0.97118
[14] validation-logloss:0.58673 validation-auc:0.96511 validation-aucpr:0.97116
[15] validation-logloss:0.58072 validation-auc:0.96524 validation-aucpr:0.97124
[16] validation-logloss:0.57469 validation-auc:0.96531 validation-aucpr:0.97129
[17] validation-logloss:0.56882 validation-auc:0.96531 validation-aucpr:0.97131
[18] validation-logloss:0.56296 validation-auc:0.96556 validation-aucpr:0.97151
[19] validation-logloss:0.55714 validation-auc:0.96571 validation-aucpr:0.97162
[20] validation-logloss:0.55160 validation-auc:0.96580 validation-aucpr:0.97166
[21] validation-logloss:0.54667 validation-auc:0.96565 validation-aucpr:0.97151
[22] validation-logloss:0.54111 validation-auc:0.96575 validation-aucpr:0.97162
[23] validation-logloss:0.53590 validation-auc:0.96576 validation-aucpr:0.97169
[24] validation-logloss:0.53075 validation-auc:0.96564 validation-aucpr:0.97162
[25] validation-logloss:0.52555 validation-auc:0.96581 validation-aucpr:0.97174
[26] validation-logloss:0.52044 validation-auc:0.96595 validation-aucpr:0.97183
[27] validation-logloss:0.51603 validation-auc:0.96604 validation-aucpr:0.97188
[28] validation-logloss:0.51126 validation-auc:0.96600 validation-aucpr:0.97184
[29] validation-logloss:0.50662 validation-auc:0.96602 validation-aucpr:0.97185
[30] validation-logloss:0.50194 validation-auc:0.96606 validation-aucpr:0.97189
[31] validation-logloss:0.49789 validation-auc:0.96605 validation-aucpr:0.97187
[32] validation-logloss:0.49328 validation-auc:0.96611 validation-aucpr:0.97198
[33] validation-logloss:0.48890 validation-auc:0.96615 validation-aucpr:0.97207
[34] validation-logloss:0.48444 validation-auc:0.96629 validation-aucpr:0.97219
[35] validation-logloss:0.48077 validation-auc:0.96618 validation-aucpr:0.97212
[36] validation-logloss:0.47658 validation-auc:0.96630 validation-aucpr:0.97218
[37] validation-logloss:0.47257 validation-auc:0.96632 validation-aucpr:0.97219
[38] validation-logloss:0.46894 validation-auc:0.96635 validation-aucpr:0.97219
[39] validation-logloss:0.46498 validation-auc:0.96644 validation-aucpr:0.97226
[40] validation-logloss:0.46151 validation-auc:0.96640 validation-aucpr:0.97222
[41] validation-logloss:0.45768 validation-auc:0.96630 validation-aucpr:0.97217
[42] validation-logloss:0.45436 validation-auc:0.96630 validation-aucpr:0.97217
[43] validation-logloss:0.45053 validation-auc:0.96632 validation-aucpr:0.97218
[44] validation-logloss:0.44741 validation-auc:0.96644 validation-aucpr:0.97225
[45] validation-logloss:0.44381 validation-auc:0.96652 validation-aucpr:0.97231
[46] validation-logloss:0.44069 validation-auc:0.96649 validation-aucpr:0.97226
[47] validation-logloss:0.43726 validation-auc:0.96656 validation-aucpr:0.97229
[48] validation-logloss:0.43423 validation-auc:0.96657 validation-aucpr:0.97229
[49] validation-logloss:0.43079 validation-auc:0.96660 validation-aucpr:0.97231
[50] validation-logloss:0.42747 validation-auc:0.96669 validation-aucpr:0.97243
[51] validation-logloss:0.42479 validation-auc:0.96657 validation-aucpr:0.97232
[52] validation-logloss:0.42171 validation-auc:0.96654 validation-aucpr:0.97229
[53] validation-logloss:0.41860 validation-auc:0.96654 validation-aucpr:0.97233
[54] validation-logloss:0.41548 validation-auc:0.96653 validation-aucpr:0.97233
[55] validation-logloss:0.41248 validation-auc:0.96654 validation-aucpr:0.97235
[56] validation-logloss:0.40935 validation-auc:0.96667 validation-aucpr:0.97244
[57] validation-logloss:0.40640 validation-auc:0.96665 validation-aucpr:0.97243
[58] validation-logloss:0.40349 validation-auc:0.96666 validation-aucpr:0.97243
[59] validation-logloss:0.40094 validation-auc:0.96668 validation-aucpr:0.97245
[60] validation-logloss:0.39808 validation-auc:0.96673 validation-aucpr:0.97250
[61] validation-logloss:0.39529 validation-auc:0.96674 validation-aucpr:0.97251
[62] validation-logloss:0.39260 validation-auc:0.96681 validation-aucpr:0.97256
[63] validation-logloss:0.39021 validation-auc:0.96679 validation-aucpr:0.97253
[64] validation-logloss:0.38798 validation-auc:0.96680 validation-aucpr:0.97253
[65] validation-logloss:0.38527 validation-auc:0.96686 validation-aucpr:0.97257
[66] validation-logloss:0.38302 validation-auc:0.96686 validation-aucpr:0.97256
[67] validation-logloss:0.38048 validation-auc:0.96689 validation-aucpr:0.97259
[68] validation-logloss:0.37784 validation-auc:0.96701 validation-aucpr:0.97267
[69] validation-logloss:0.37537 validation-auc:0.96709 validation-aucpr:0.97275
[70] validation-logloss:0.37297 validation-auc:0.96709 validation-aucpr:0.97275
[71] validation-logloss:0.37057 validation-auc:0.96710 validation-aucpr:0.97276
[72] validation-logloss:0.36824 validation-auc:0.96709 validation-aucpr:0.97275
[73] validation-logloss:0.36621 validation-auc:0.96706 validation-aucpr:0.97272
[74] validation-logloss:0.36432 validation-auc:0.96706 validation-aucpr:0.97272
[75] validation-logloss:0.36204 validation-auc:0.96707 validation-aucpr:0.97273
[76] validation-logloss:0.35981 validation-auc:0.96710 validation-aucpr:0.97274
[77] validation-logloss:0.35762 validation-auc:0.96715 validation-aucpr:0.97278
[78] validation-logloss:0.35583 validation-auc:0.96720 validation-aucpr:0.97282
[79] validation-logloss:0.35371 validation-auc:0.96723 validation-aucpr:0.97284
[80] validation-logloss:0.35171 validation-auc:0.96720 validation-aucpr:0.97282
[81] validation-logloss:0.34987 validation-auc:0.96716 validation-aucpr:0.97280
[82] validation-logloss:0.34788 validation-auc:0.96716 validation-aucpr:0.97280
[83] validation-logloss:0.34615 validation-auc:0.96720 validation-aucpr:0.97282
[84] validation-logloss:0.34419 validation-auc:0.96723 validation-aucpr:0.97285
[85] validation-logloss:0.34231 validation-auc:0.96722 validation-aucpr:0.97283
[86] validation-logloss:0.34046 validation-auc:0.96725 validation-aucpr:0.97286
[87] validation-logloss:0.33885 validation-auc:0.96721 validation-aucpr:0.97283
[88] validation-logloss:0.33715 validation-auc:0.96720 validation-aucpr:0.97282
[89] validation-logloss:0.33550 validation-auc:0.96725 validation-aucpr:0.97285
[90] validation-logloss:0.33369 validation-auc:0.96728 validation-aucpr:0.97289
[91] validation-logloss:0.33189 validation-auc:0.96731 validation-aucpr:0.97292
[92] validation-logloss:0.33006 validation-auc:0.96733 validation-aucpr:0.97294
[93] validation-logloss:0.32858 validation-auc:0.96733 validation-aucpr:0.97294
[94] validation-logloss:0.32682 validation-auc:0.96733 validation-aucpr:0.97293
{'best_iteration': '92', 'best_score': '0.9729422850626687'}
Trial 69, Fold 1: Log loss = 0.3268189989634485, Average precision = 0.9729359479843955, ROC-AUC = 0.9673291177356218, Elapsed Time = 225.05377579999913 seconds
Trial 69, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 69, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.68561 validation-auc:0.94421 validation-aucpr:0.94823
[1] validation-logloss:0.67842 validation-auc:0.95073 validation-aucpr:0.95487
[2] validation-logloss:0.67043 validation-auc:0.96058 validation-aucpr:0.96496
[3] validation-logloss:0.66257 validation-auc:0.96352 validation-aucpr:0.96778
[4] validation-logloss:0.65561 validation-auc:0.96301 validation-aucpr:0.96716
[5] validation-logloss:0.64796 validation-auc:0.96437 validation-aucpr:0.96852
[6] validation-logloss:0.64048 validation-auc:0.96521 validation-aucpr:0.96932
[7] validation-logloss:0.63298 validation-auc:0.96616 validation-aucpr:0.97020
[8] validation-logloss:0.62661 validation-auc:0.96600 validation-aucpr:0.97003
[9] validation-logloss:0.61972 validation-auc:0.96651 validation-aucpr:0.97040
[10] validation-logloss:0.61268 validation-auc:0.96686 validation-aucpr:0.97069
[11] validation-logloss:0.60598 validation-auc:0.96712 validation-aucpr:0.97086
[12] validation-logloss:0.59942 validation-auc:0.96718 validation-aucpr:0.97083
[13] validation-logloss:0.59307 validation-auc:0.96720 validation-aucpr:0.97083
[14] validation-logloss:0.58693 validation-auc:0.96746 validation-aucpr:0.97097
[15] validation-logloss:0.58063 validation-auc:0.96756 validation-aucpr:0.97109
[16] validation-logloss:0.57519 validation-auc:0.96792 validation-aucpr:0.97135
[17] validation-logloss:0.56923 validation-auc:0.96796 validation-aucpr:0.97137
[18] validation-logloss:0.56344 validation-auc:0.96804 validation-aucpr:0.97143
[19] validation-logloss:0.55817 validation-auc:0.96793 validation-aucpr:0.97136
[20] validation-logloss:0.55290 validation-auc:0.96793 validation-aucpr:0.97134
[21] validation-logloss:0.54778 validation-auc:0.96813 validation-aucpr:0.97149
[22] validation-logloss:0.54307 validation-auc:0.96811 validation-aucpr:0.97145
[23] validation-logloss:0.53786 validation-auc:0.96805 validation-aucpr:0.97143
[24] validation-logloss:0.53277 validation-auc:0.96801 validation-aucpr:0.97139
[25] validation-logloss:0.52814 validation-auc:0.96810 validation-aucpr:0.97148
[26] validation-logloss:0.52382 validation-auc:0.96783 validation-aucpr:0.97126
[27] validation-logloss:0.51912 validation-auc:0.96785 validation-aucpr:0.97125
[28] validation-logloss:0.51440 validation-auc:0.96789 validation-aucpr:0.97129
[29] validation-logloss:0.51001 validation-auc:0.96797 validation-aucpr:0.97132
[30] validation-logloss:0.50575 validation-auc:0.96791 validation-aucpr:0.97125
[31] validation-logloss:0.50123 validation-auc:0.96790 validation-aucpr:0.97123
[32] validation-logloss:0.49706 validation-auc:0.96792 validation-aucpr:0.97125
[33] validation-logloss:0.49246 validation-auc:0.96805 validation-aucpr:0.97135
[34] validation-logloss:0.48838 validation-auc:0.96803 validation-aucpr:0.97133
[35] validation-logloss:0.48394 validation-auc:0.96816 validation-aucpr:0.97145
[36] validation-logloss:0.47971 validation-auc:0.96822 validation-aucpr:0.97150
[37] validation-logloss:0.47591 validation-auc:0.96808 validation-aucpr:0.97140
[38] validation-logloss:0.47193 validation-auc:0.96812 validation-aucpr:0.97144
[39] validation-logloss:0.46856 validation-auc:0.96800 validation-aucpr:0.97135
[40] validation-logloss:0.46463 validation-auc:0.96799 validation-aucpr:0.97135
[41] validation-logloss:0.46079 validation-auc:0.96802 validation-aucpr:0.97171
[42] validation-logloss:0.45762 validation-auc:0.96799 validation-aucpr:0.97168
[43] validation-logloss:0.45391 validation-auc:0.96806 validation-aucpr:0.97172
[44] validation-logloss:0.45072 validation-auc:0.96800 validation-aucpr:0.97167
[45] validation-logloss:0.44714 validation-auc:0.96801 validation-aucpr:0.97170
[46] validation-logloss:0.44354 validation-auc:0.96808 validation-aucpr:0.97173
[47] validation-logloss:0.44006 validation-auc:0.96815 validation-aucpr:0.97181
[48] validation-logloss:0.43663 validation-auc:0.96814 validation-aucpr:0.97180
[49] validation-logloss:0.43313 validation-auc:0.96819 validation-aucpr:0.97184
[50] validation-logloss:0.42995 validation-auc:0.96822 validation-aucpr:0.97188
[51] validation-logloss:0.42668 validation-auc:0.96825 validation-aucpr:0.97197
[52] validation-logloss:0.42336 validation-auc:0.96832 validation-aucpr:0.97202
[53] validation-logloss:0.42027 validation-auc:0.96832 validation-aucpr:0.97203
[54] validation-logloss:0.41698 validation-auc:0.96846 validation-aucpr:0.97214
[55] validation-logloss:0.41379 validation-auc:0.96854 validation-aucpr:0.97219
[56] validation-logloss:0.41085 validation-auc:0.96856 validation-aucpr:0.97223
[57] validation-logloss:0.40817 validation-auc:0.96864 validation-aucpr:0.97230
[58] validation-logloss:0.40516 validation-auc:0.96863 validation-aucpr:0.97230
[59] validation-logloss:0.40219 validation-auc:0.96863 validation-aucpr:0.97231
[60] validation-logloss:0.39939 validation-auc:0.96868 validation-aucpr:0.97235
[61] validation-logloss:0.39654 validation-auc:0.96873 validation-aucpr:0.97239
[62] validation-logloss:0.39413 validation-auc:0.96877 validation-aucpr:0.97237
[63] validation-logloss:0.39183 validation-auc:0.96873 validation-aucpr:0.97236
[64] validation-logloss:0.38915 validation-auc:0.96873 validation-aucpr:0.97236
[65] validation-logloss:0.38649 validation-auc:0.96877 validation-aucpr:0.97239
[66] validation-logloss:0.38388 validation-auc:0.96876 validation-aucpr:0.97237
[67] validation-logloss:0.38162 validation-auc:0.96878 validation-aucpr:0.97238
[68] validation-logloss:0.37930 validation-auc:0.96883 validation-aucpr:0.97241
[69] validation-logloss:0.37690 validation-auc:0.96886 validation-aucpr:0.97247
[70] validation-logloss:0.37485 validation-auc:0.96885 validation-aucpr:0.97247
[71] validation-logloss:0.37228 validation-auc:0.96894 validation-aucpr:0.97268
[72] validation-logloss:0.36996 validation-auc:0.96892 validation-aucpr:0.97265
[73] validation-logloss:0.36750 validation-auc:0.96902 validation-aucpr:0.97272
[74] validation-logloss:0.36517 validation-auc:0.96904 validation-aucpr:0.97271
[75] validation-logloss:0.36289 validation-auc:0.96908 validation-aucpr:0.97275
[76] validation-logloss:0.36083 validation-auc:0.96902 validation-aucpr:0.97270
[77] validation-logloss:0.35893 validation-auc:0.96905 validation-aucpr:0.97272
[78] validation-logloss:0.35683 validation-auc:0.96905 validation-aucpr:0.97273
[79] validation-logloss:0.35486 validation-auc:0.96907 validation-aucpr:0.97273
[80] validation-logloss:0.35274 validation-auc:0.96914 validation-aucpr:0.97278
[81] validation-logloss:0.35074 validation-auc:0.96915 validation-aucpr:0.97278
[82] validation-logloss:0.34911 validation-auc:0.96911 validation-aucpr:0.97275
[83] validation-logloss:0.34711 validation-auc:0.96907 validation-aucpr:0.97273
[84] validation-logloss:0.34509 validation-auc:0.96910 validation-aucpr:0.97275
[85] validation-logloss:0.34296 validation-auc:0.96915 validation-aucpr:0.97279
[86] validation-logloss:0.34092 validation-auc:0.96914 validation-aucpr:0.97278
[87] validation-logloss:0.33909 validation-auc:0.96919 validation-aucpr:0.97281
[88] validation-logloss:0.33724 validation-auc:0.96918 validation-aucpr:0.97282
[89] validation-logloss:0.33542 validation-auc:0.96913 validation-aucpr:0.97279
[90] validation-logloss:0.33359 validation-auc:0.96919 validation-aucpr:0.97283
[91] validation-logloss:0.33178 validation-auc:0.96916 validation-aucpr:0.97281
[92] validation-logloss:0.32998 validation-auc:0.96918 validation-aucpr:0.97284
[93] validation-logloss:0.32836 validation-auc:0.96916 validation-aucpr:0.97282
[94] validation-logloss:0.32659 validation-auc:0.96920 validation-aucpr:0.97282
{'best_iteration': '92', 'best_score': '0.9728419835631521'}
Trial 69, Fold 2: Log loss = 0.32658608862626937, Average precision = 0.9728280145197388, ROC-AUC = 0.9691953512976407, Elapsed Time = 241.70224149999922 seconds
Trial 69, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 69, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.68478 validation-auc:0.95556 validation-aucpr:0.95743
[1] validation-logloss:0.67744 validation-auc:0.95982 validation-aucpr:0.96365
[2] validation-logloss:0.66920 validation-auc:0.96338 validation-aucpr:0.96775
[3] validation-logloss:0.66125 validation-auc:0.96505 validation-aucpr:0.97031
[4] validation-logloss:0.65337 validation-auc:0.96585 validation-aucpr:0.97118
[5] validation-logloss:0.64588 validation-auc:0.96634 validation-aucpr:0.97159
[6] validation-logloss:0.63863 validation-auc:0.96692 validation-aucpr:0.97206
[7] validation-logloss:0.63219 validation-auc:0.96703 validation-aucpr:0.97199
[8] validation-logloss:0.62536 validation-auc:0.96675 validation-aucpr:0.97179
[9] validation-logloss:0.61903 validation-auc:0.96691 validation-aucpr:0.97196
[10] validation-logloss:0.61233 validation-auc:0.96685 validation-aucpr:0.97188
[11] validation-logloss:0.60578 validation-auc:0.96700 validation-aucpr:0.97205
[12] validation-logloss:0.59993 validation-auc:0.96694 validation-aucpr:0.97197
[13] validation-logloss:0.59354 validation-auc:0.96707 validation-aucpr:0.97206
[14] validation-logloss:0.58764 validation-auc:0.96695 validation-aucpr:0.97197
[15] validation-logloss:0.58139 validation-auc:0.96713 validation-aucpr:0.97214
[16] validation-logloss:0.57531 validation-auc:0.96726 validation-aucpr:0.97226
[17] validation-logloss:0.57023 validation-auc:0.96709 validation-aucpr:0.97215
[18] validation-logloss:0.56528 validation-auc:0.96693 validation-aucpr:0.97198
[19] validation-logloss:0.55934 validation-auc:0.96706 validation-aucpr:0.97207
[20] validation-logloss:0.55377 validation-auc:0.96719 validation-aucpr:0.97216
[21] validation-logloss:0.54888 validation-auc:0.96724 validation-aucpr:0.97223
[22] validation-logloss:0.54391 validation-auc:0.96712 validation-aucpr:0.97214
[23] validation-logloss:0.53870 validation-auc:0.96724 validation-aucpr:0.97222
[24] validation-logloss:0.53423 validation-auc:0.96712 validation-aucpr:0.97213
[25] validation-logloss:0.52912 validation-auc:0.96713 validation-aucpr:0.97217
[26] validation-logloss:0.52390 validation-auc:0.96730 validation-aucpr:0.97231
[27] validation-logloss:0.51896 validation-auc:0.96742 validation-aucpr:0.97246
[28] validation-logloss:0.51421 validation-auc:0.96744 validation-aucpr:0.97253
[29] validation-logloss:0.51012 validation-auc:0.96734 validation-aucpr:0.97246
[30] validation-logloss:0.50516 validation-auc:0.96758 validation-aucpr:0.97264
[31] validation-logloss:0.50080 validation-auc:0.96771 validation-aucpr:0.97274
[32] validation-logloss:0.49618 validation-auc:0.96779 validation-aucpr:0.97283
[33] validation-logloss:0.49223 validation-auc:0.96778 validation-aucpr:0.97280
[34] validation-logloss:0.48800 validation-auc:0.96778 validation-aucpr:0.97281
[35] validation-logloss:0.48351 validation-auc:0.96778 validation-aucpr:0.97280
[36] validation-logloss:0.47911 validation-auc:0.96784 validation-aucpr:0.97283
[37] validation-logloss:0.47512 validation-auc:0.96787 validation-aucpr:0.97280
[38] validation-logloss:0.47139 validation-auc:0.96785 validation-aucpr:0.97276
[39] validation-logloss:0.46742 validation-auc:0.96787 validation-aucpr:0.97279
[40] validation-logloss:0.46338 validation-auc:0.96788 validation-aucpr:0.97279
[41] validation-logloss:0.45957 validation-auc:0.96785 validation-aucpr:0.97275
[42] validation-logloss:0.45572 validation-auc:0.96794 validation-aucpr:0.97282
[43] validation-logloss:0.45255 validation-auc:0.96786 validation-aucpr:0.97275
[44] validation-logloss:0.44899 validation-auc:0.96783 validation-aucpr:0.97273
[45] validation-logloss:0.44538 validation-auc:0.96786 validation-aucpr:0.97272
[46] validation-logloss:0.44168 validation-auc:0.96786 validation-aucpr:0.97273
[47] validation-logloss:0.43839 validation-auc:0.96790 validation-aucpr:0.97275
[48] validation-logloss:0.43487 validation-auc:0.96794 validation-aucpr:0.97278
[49] validation-logloss:0.43167 validation-auc:0.96798 validation-aucpr:0.97280
[50] validation-logloss:0.42851 validation-auc:0.96799 validation-aucpr:0.97281
[51] validation-logloss:0.42521 validation-auc:0.96796 validation-aucpr:0.97281
[52] validation-logloss:0.42191 validation-auc:0.96799 validation-aucpr:0.97281
[53] validation-logloss:0.41898 validation-auc:0.96796 validation-aucpr:0.97279
[54] validation-logloss:0.41648 validation-auc:0.96792 validation-aucpr:0.97280
[55] validation-logloss:0.41341 validation-auc:0.96791 validation-aucpr:0.97280
[56] validation-logloss:0.41050 validation-auc:0.96797 validation-aucpr:0.97283
[57] validation-logloss:0.40751 validation-auc:0.96802 validation-aucpr:0.97287
[58] validation-logloss:0.40434 validation-auc:0.96813 validation-aucpr:0.97299
[59] validation-logloss:0.40141 validation-auc:0.96818 validation-aucpr:0.97302
[60] validation-logloss:0.39847 validation-auc:0.96820 validation-aucpr:0.97303
[61] validation-logloss:0.39561 validation-auc:0.96823 validation-aucpr:0.97306
[62] validation-logloss:0.39310 validation-auc:0.96822 validation-aucpr:0.97308
[63] validation-logloss:0.39056 validation-auc:0.96817 validation-aucpr:0.97306
[64] validation-logloss:0.38795 validation-auc:0.96818 validation-aucpr:0.97311
[65] validation-logloss:0.38532 validation-auc:0.96818 validation-aucpr:0.97310
[66] validation-logloss:0.38278 validation-auc:0.96818 validation-aucpr:0.97311
[67] validation-logloss:0.38021 validation-auc:0.96818 validation-aucpr:0.97312
[68] validation-logloss:0.37769 validation-auc:0.96815 validation-aucpr:0.97309
[69] validation-logloss:0.37552 validation-auc:0.96818 validation-aucpr:0.97310
[70] validation-logloss:0.37327 validation-auc:0.96822 validation-aucpr:0.97315
[71] validation-logloss:0.37082 validation-auc:0.96820 validation-aucpr:0.97312
[72] validation-logloss:0.36878 validation-auc:0.96821 validation-aucpr:0.97311
[73] validation-logloss:0.36662 validation-auc:0.96822 validation-aucpr:0.97312
[74] validation-logloss:0.36460 validation-auc:0.96826 validation-aucpr:0.97314
[75] validation-logloss:0.36261 validation-auc:0.96826 validation-aucpr:0.97314
[76] validation-logloss:0.36049 validation-auc:0.96829 validation-aucpr:0.97317
[77] validation-logloss:0.35832 validation-auc:0.96828 validation-aucpr:0.97317
[78] validation-logloss:0.35622 validation-auc:0.96829 validation-aucpr:0.97319
[79] validation-logloss:0.35412 validation-auc:0.96828 validation-aucpr:0.97320
[80] validation-logloss:0.35194 validation-auc:0.96829 validation-aucpr:0.97319
[81] validation-logloss:0.34980 validation-auc:0.96832 validation-aucpr:0.97324
[82] validation-logloss:0.34772 validation-auc:0.96834 validation-aucpr:0.97327
[83] validation-logloss:0.34588 validation-auc:0.96839 validation-aucpr:0.97331
[84] validation-logloss:0.34381 validation-auc:0.96841 validation-aucpr:0.97332
[85] validation-logloss:0.34215 validation-auc:0.96836 validation-aucpr:0.97328
[86] validation-logloss:0.34027 validation-auc:0.96837 validation-aucpr:0.97328
[87] validation-logloss:0.33843 validation-auc:0.96839 validation-aucpr:0.97329
[88] validation-logloss:0.33662 validation-auc:0.96837 validation-aucpr:0.97325
[89] validation-logloss:0.33471 validation-auc:0.96846 validation-aucpr:0.97331
[90] validation-logloss:0.33318 validation-auc:0.96843 validation-aucpr:0.97331
[91] validation-logloss:0.33140 validation-auc:0.96845 validation-aucpr:0.97332
[92] validation-logloss:0.32956 validation-auc:0.96846 validation-aucpr:0.97333
[93] validation-logloss:0.32801 validation-auc:0.96845 validation-aucpr:0.97331
[94] validation-logloss:0.32630 validation-auc:0.96847 validation-aucpr:0.97336
{'best_iteration': '94', 'best_score': '0.9733583680595139'}
Trial 69, Fold 3: Log loss = 0.3263034784487951, Average precision = 0.9733629369980912, ROC-AUC = 0.9684713019233178, Elapsed Time = 236.4137340999987 seconds
Trial 69, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 69, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.68452 validation-auc:0.95758 validation-aucpr:0.96321
[1] validation-logloss:0.67640 validation-auc:0.96182 validation-aucpr:0.96821
[2] validation-logloss:0.66914 validation-auc:0.96363 validation-aucpr:0.96952
[3] validation-logloss:0.66134 validation-auc:0.96442 validation-aucpr:0.97021
[4] validation-logloss:0.65399 validation-auc:0.96564 validation-aucpr:0.97100
[5] validation-logloss:0.64650 validation-auc:0.96615 validation-aucpr:0.97148
[6] validation-logloss:0.63961 validation-auc:0.96606 validation-aucpr:0.97145
[7] validation-logloss:0.63302 validation-auc:0.96570 validation-aucpr:0.97114
[8] validation-logloss:0.62576 validation-auc:0.96606 validation-aucpr:0.97146
[9] validation-logloss:0.61873 validation-auc:0.96616 validation-aucpr:0.97157
[10] validation-logloss:0.61255 validation-auc:0.96591 validation-aucpr:0.97140
[11] validation-logloss:0.60637 validation-auc:0.96552 validation-aucpr:0.97106
[12] validation-logloss:0.60045 validation-auc:0.96559 validation-aucpr:0.97108
[13] validation-logloss:0.59454 validation-auc:0.96563 validation-aucpr:0.97111
[14] validation-logloss:0.58813 validation-auc:0.96600 validation-aucpr:0.97142
[15] validation-logloss:0.58211 validation-auc:0.96608 validation-aucpr:0.97149
[16] validation-logloss:0.57596 validation-auc:0.96631 validation-aucpr:0.97170
[17] validation-logloss:0.57065 validation-auc:0.96631 validation-aucpr:0.97170
[18] validation-logloss:0.56469 validation-auc:0.96647 validation-aucpr:0.97184
[19] validation-logloss:0.55907 validation-auc:0.96656 validation-aucpr:0.97194
[20] validation-logloss:0.55338 validation-auc:0.96684 validation-aucpr:0.97211
[21] validation-logloss:0.54792 validation-auc:0.96700 validation-aucpr:0.97226
[22] validation-logloss:0.54317 validation-auc:0.96689 validation-aucpr:0.97218
[23] validation-logloss:0.53779 validation-auc:0.96698 validation-aucpr:0.97229
[24] validation-logloss:0.53263 validation-auc:0.96700 validation-aucpr:0.97235
[25] validation-logloss:0.52735 validation-auc:0.96719 validation-aucpr:0.97249
[26] validation-logloss:0.52237 validation-auc:0.96729 validation-aucpr:0.97255
[27] validation-logloss:0.51794 validation-auc:0.96719 validation-aucpr:0.97247
[28] validation-logloss:0.51329 validation-auc:0.96722 validation-aucpr:0.97247
[29] validation-logloss:0.50862 validation-auc:0.96730 validation-aucpr:0.97253
[30] validation-logloss:0.50467 validation-auc:0.96714 validation-aucpr:0.97242
[31] validation-logloss:0.50051 validation-auc:0.96704 validation-aucpr:0.97234
[32] validation-logloss:0.49631 validation-auc:0.96702 validation-aucpr:0.97232
[33] validation-logloss:0.49186 validation-auc:0.96706 validation-aucpr:0.97236
[34] validation-logloss:0.48751 validation-auc:0.96701 validation-aucpr:0.97233
[35] validation-logloss:0.48323 validation-auc:0.96701 validation-aucpr:0.97235
[36] validation-logloss:0.47897 validation-auc:0.96706 validation-aucpr:0.97237
[37] validation-logloss:0.47481 validation-auc:0.96708 validation-aucpr:0.97240
[38] validation-logloss:0.47113 validation-auc:0.96708 validation-aucpr:0.97242
[39] validation-logloss:0.46767 validation-auc:0.96708 validation-aucpr:0.97241
[40] validation-logloss:0.46401 validation-auc:0.96710 validation-aucpr:0.97241
[41] validation-logloss:0.46064 validation-auc:0.96698 validation-aucpr:0.97234
[42] validation-logloss:0.45676 validation-auc:0.96710 validation-aucpr:0.97244
[43] validation-logloss:0.45310 validation-auc:0.96710 validation-aucpr:0.97245
[44] validation-logloss:0.44988 validation-auc:0.96704 validation-aucpr:0.97239
[45] validation-logloss:0.44606 validation-auc:0.96716 validation-aucpr:0.97248
[46] validation-logloss:0.44242 validation-auc:0.96721 validation-aucpr:0.97251
[47] validation-logloss:0.43920 validation-auc:0.96715 validation-aucpr:0.97247
[48] validation-logloss:0.43570 validation-auc:0.96718 validation-aucpr:0.97250
[49] validation-logloss:0.43247 validation-auc:0.96716 validation-aucpr:0.97249
[50] validation-logloss:0.42909 validation-auc:0.96719 validation-aucpr:0.97252
[51] validation-logloss:0.42565 validation-auc:0.96722 validation-aucpr:0.97255
[52] validation-logloss:0.42268 validation-auc:0.96718 validation-aucpr:0.97251
[53] validation-logloss:0.41992 validation-auc:0.96711 validation-aucpr:0.97246
[54] validation-logloss:0.41715 validation-auc:0.96704 validation-aucpr:0.97239
[55] validation-logloss:0.41408 validation-auc:0.96701 validation-aucpr:0.97237
[56] validation-logloss:0.41105 validation-auc:0.96701 validation-aucpr:0.97239
[57] validation-logloss:0.40809 validation-auc:0.96704 validation-aucpr:0.97241
[58] validation-logloss:0.40565 validation-auc:0.96701 validation-aucpr:0.97240
[59] validation-logloss:0.40272 validation-auc:0.96706 validation-aucpr:0.97242
[60] validation-logloss:0.39971 validation-auc:0.96714 validation-aucpr:0.97248
[61] validation-logloss:0.39706 validation-auc:0.96717 validation-aucpr:0.97251
[62] validation-logloss:0.39443 validation-auc:0.96712 validation-aucpr:0.97250
[63] validation-logloss:0.39194 validation-auc:0.96708 validation-aucpr:0.97248
[64] validation-logloss:0.38917 validation-auc:0.96720 validation-aucpr:0.97255
[65] validation-logloss:0.38651 validation-auc:0.96729 validation-aucpr:0.97262
[66] validation-logloss:0.38382 validation-auc:0.96734 validation-aucpr:0.97267
[67] validation-logloss:0.38119 validation-auc:0.96744 validation-aucpr:0.97274
[68] validation-logloss:0.37895 validation-auc:0.96744 validation-aucpr:0.97274
[69] validation-logloss:0.37639 validation-auc:0.96751 validation-aucpr:0.97279
[70] validation-logloss:0.37431 validation-auc:0.96748 validation-aucpr:0.97276
[71] validation-logloss:0.37188 validation-auc:0.96750 validation-aucpr:0.97279
[72] validation-logloss:0.36964 validation-auc:0.96750 validation-aucpr:0.97278
[73] validation-logloss:0.36731 validation-auc:0.96754 validation-aucpr:0.97282
[74] validation-logloss:0.36508 validation-auc:0.96758 validation-aucpr:0.97286
[75] validation-logloss:0.36278 validation-auc:0.96765 validation-aucpr:0.97292
[76] validation-logloss:0.36053 validation-auc:0.96771 validation-aucpr:0.97295
[77] validation-logloss:0.35854 validation-auc:0.96770 validation-aucpr:0.97294
[78] validation-logloss:0.35635 validation-auc:0.96766 validation-aucpr:0.97292
[79] validation-logloss:0.35453 validation-auc:0.96762 validation-aucpr:0.97290
[80] validation-logloss:0.35250 validation-auc:0.96763 validation-aucpr:0.97290
[81] validation-logloss:0.35063 validation-auc:0.96761 validation-aucpr:0.97289
[82] validation-logloss:0.34857 validation-auc:0.96765 validation-aucpr:0.97291
[83] validation-logloss:0.34649 validation-auc:0.96771 validation-aucpr:0.97296
[84] validation-logloss:0.34451 validation-auc:0.96773 validation-aucpr:0.97297
[85] validation-logloss:0.34293 validation-auc:0.96769 validation-aucpr:0.97295
[86] validation-logloss:0.34126 validation-auc:0.96770 validation-aucpr:0.97296
[87] validation-logloss:0.33943 validation-auc:0.96771 validation-aucpr:0.97297
[88] validation-logloss:0.33794 validation-auc:0.96765 validation-aucpr:0.97292
[89] validation-logloss:0.33604 validation-auc:0.96772 validation-aucpr:0.97297
[90] validation-logloss:0.33416 validation-auc:0.96774 validation-aucpr:0.97299
[91] validation-logloss:0.33269 validation-auc:0.96772 validation-aucpr:0.97299
[92] validation-logloss:0.33092 validation-auc:0.96773 validation-aucpr:0.97300
[93] validation-logloss:0.32919 validation-auc:0.96776 validation-aucpr:0.97302
[94] validation-logloss:0.32744 validation-auc:0.96775 validation-aucpr:0.97301
{'best_iteration': '93', 'best_score': '0.9730163723177887'}
Trial 69, Fold 4: Log loss = 0.3274444627638355, Average precision = 0.9730170377555121, ROC-AUC = 0.9677517519450822, Elapsed Time = 237.86633739999888 seconds
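The per-fold summary metrics (log loss, average precision, ROC-AUC) match the `sklearn.metrics` functions imported at the top of the notebook, applied to the validation-set probabilities. A sketch with synthetic probabilities standing in for the booster's predictions — note that in the notebook the reported elapsed time covers the whole fold including training, whereas here the timer only wraps the metric calls:

```python
import time
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

rng = np.random.default_rng(7)
n = 5000
y_true = rng.integers(0, 2, size=n)
# Stand-in for booster.predict(dvalid): well-separated class-1 probabilities
y_prob = np.clip(np.where(y_true == 1, 0.9, 0.1)
                 + rng.normal(0.0, 0.15, size=n), 1e-3, 1 - 1e-3)

start = time.perf_counter()
lloss = log_loss(y_true, y_prob)
ap = average_precision_score(y_true, y_prob)
auc = roc_auc_score(y_true, y_prob)
elapsed = time.perf_counter() - start

print(f"Log loss = {lloss}, Average precision = {ap}, "
      f"ROC-AUC = {auc}, Elapsed Time = {elapsed} seconds")
```

`average_precision_score` and the `aucpr` eval metric estimate the same quantity slightly differently, which is why the fold's "Average precision" agrees with `best_score` only to four or five decimal places.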
Trial 69, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 69, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.68493 validation-auc:0.95046 validation-aucpr:0.95399
[1] validation-logloss:0.67694 validation-auc:0.95990 validation-aucpr:0.96461
[2] validation-logloss:0.66914 validation-auc:0.96105 validation-aucpr:0.96499
[3] validation-logloss:0.66165 validation-auc:0.96195 validation-aucpr:0.96693
[4] validation-logloss:0.65396 validation-auc:0.96267 validation-aucpr:0.96765
[5] validation-logloss:0.64658 validation-auc:0.96307 validation-aucpr:0.96790
[6] validation-logloss:0.63891 validation-auc:0.96409 validation-aucpr:0.96888
[7] validation-logloss:0.63213 validation-auc:0.96397 validation-aucpr:0.96866
[8] validation-logloss:0.62531 validation-auc:0.96398 validation-aucpr:0.96874
[9] validation-logloss:0.61839 validation-auc:0.96379 validation-aucpr:0.96873
[10] validation-logloss:0.61217 validation-auc:0.96334 validation-aucpr:0.96830
[11] validation-logloss:0.60558 validation-auc:0.96357 validation-aucpr:0.96842
[12] validation-logloss:0.59874 validation-auc:0.96420 validation-aucpr:0.96899
[13] validation-logloss:0.59292 validation-auc:0.96441 validation-aucpr:0.96913
[14] validation-logloss:0.58738 validation-auc:0.96409 validation-aucpr:0.96885
[15] validation-logloss:0.58129 validation-auc:0.96420 validation-aucpr:0.96902
[16] validation-logloss:0.57541 validation-auc:0.96426 validation-aucpr:0.96905
[17] validation-logloss:0.56948 validation-auc:0.96426 validation-aucpr:0.96911
[18] validation-logloss:0.56387 validation-auc:0.96429 validation-aucpr:0.96911
[19] validation-logloss:0.55828 validation-auc:0.96445 validation-aucpr:0.96984
[20] validation-logloss:0.55308 validation-auc:0.96446 validation-aucpr:0.96981
[21] validation-logloss:0.54753 validation-auc:0.96475 validation-aucpr:0.97003
[22] validation-logloss:0.54246 validation-auc:0.96482 validation-aucpr:0.97010
[23] validation-logloss:0.53751 validation-auc:0.96474 validation-aucpr:0.97002
[24] validation-logloss:0.53250 validation-auc:0.96470 validation-aucpr:0.96998
[25] validation-logloss:0.52740 validation-auc:0.96477 validation-aucpr:0.96992
[26] validation-logloss:0.52294 validation-auc:0.96469 validation-aucpr:0.96988
[27] validation-logloss:0.51824 validation-auc:0.96464 validation-aucpr:0.96984
[28] validation-logloss:0.51332 validation-auc:0.96482 validation-aucpr:0.96998
[29] validation-logloss:0.50853 validation-auc:0.96488 validation-aucpr:0.97001
[30] validation-logloss:0.50417 validation-auc:0.96470 validation-aucpr:0.96984
[31] validation-logloss:0.50006 validation-auc:0.96473 validation-aucpr:0.96984
[32] validation-logloss:0.49612 validation-auc:0.96455 validation-aucpr:0.96972
[33] validation-logloss:0.49167 validation-auc:0.96460 validation-aucpr:0.96976
[34] validation-logloss:0.48801 validation-auc:0.96458 validation-aucpr:0.96973
[35] validation-logloss:0.48381 validation-auc:0.96464 validation-aucpr:0.96977
[36] validation-logloss:0.47958 validation-auc:0.96474 validation-aucpr:0.96986
[37] validation-logloss:0.47541 validation-auc:0.96489 validation-aucpr:0.96999
[38] validation-logloss:0.47202 validation-auc:0.96487 validation-aucpr:0.96996
[39] validation-logloss:0.46872 validation-auc:0.96491 validation-aucpr:0.97000
[40] validation-logloss:0.46496 validation-auc:0.96482 validation-aucpr:0.96994
[41] validation-logloss:0.46153 validation-auc:0.96470 validation-aucpr:0.96985
[42] validation-logloss:0.45785 validation-auc:0.96477 validation-aucpr:0.96991
[43] validation-logloss:0.45456 validation-auc:0.96469 validation-aucpr:0.97012
[44] validation-logloss:0.45077 validation-auc:0.96477 validation-aucpr:0.97020
[45] validation-logloss:0.44710 validation-auc:0.96490 validation-aucpr:0.97032
[46] validation-logloss:0.44408 validation-auc:0.96485 validation-aucpr:0.97026
[47] validation-logloss:0.44092 validation-auc:0.96477 validation-aucpr:0.97019
[48] validation-logloss:0.43762 validation-auc:0.96485 validation-aucpr:0.97022
[49] validation-logloss:0.43417 validation-auc:0.96503 validation-aucpr:0.97037
[50] validation-logloss:0.43138 validation-auc:0.96496 validation-aucpr:0.97029
[51] validation-logloss:0.42814 validation-auc:0.96499 validation-aucpr:0.97031
[52] validation-logloss:0.42542 validation-auc:0.96496 validation-aucpr:0.97028
[53] validation-logloss:0.42229 validation-auc:0.96492 validation-aucpr:0.97025
[54] validation-logloss:0.41907 validation-auc:0.96506 validation-aucpr:0.97036
[55] validation-logloss:0.41616 validation-auc:0.96511 validation-aucpr:0.97042
[56] validation-logloss:0.41330 validation-auc:0.96513 validation-aucpr:0.97042
[57] validation-logloss:0.41071 validation-auc:0.96511 validation-aucpr:0.97040
[58] validation-logloss:0.40823 validation-auc:0.96506 validation-aucpr:0.97035
[59] validation-logloss:0.40569 validation-auc:0.96504 validation-aucpr:0.97032
[60] validation-logloss:0.40293 validation-auc:0.96503 validation-aucpr:0.97031
[61] validation-logloss:0.40006 validation-auc:0.96506 validation-aucpr:0.97035
[62] validation-logloss:0.39768 validation-auc:0.96502 validation-aucpr:0.97032
[63] validation-logloss:0.39522 validation-auc:0.96505 validation-aucpr:0.97034
[64] validation-logloss:0.39298 validation-auc:0.96497 validation-aucpr:0.97028
[65] validation-logloss:0.39063 validation-auc:0.96502 validation-aucpr:0.97032
[66] validation-logloss:0.38803 validation-auc:0.96505 validation-aucpr:0.97036
[67] validation-logloss:0.38546 validation-auc:0.96511 validation-aucpr:0.97038
[68] validation-logloss:0.38287 validation-auc:0.96522 validation-aucpr:0.97047
[69] validation-logloss:0.38055 validation-auc:0.96527 validation-aucpr:0.97051
[70] validation-logloss:0.37836 validation-auc:0.96528 validation-aucpr:0.97052
[71] validation-logloss:0.37625 validation-auc:0.96528 validation-aucpr:0.97051
[72] validation-logloss:0.37369 validation-auc:0.96535 validation-aucpr:0.97055
[73] validation-logloss:0.37119 validation-auc:0.96544 validation-aucpr:0.97064
[74] validation-logloss:0.36919 validation-auc:0.96541 validation-aucpr:0.97060
[75] validation-logloss:0.36697 validation-auc:0.96544 validation-aucpr:0.97061
[76] validation-logloss:0.36490 validation-auc:0.96545 validation-aucpr:0.97062
[77] validation-logloss:0.36263 validation-auc:0.96550 validation-aucpr:0.97065
[78] validation-logloss:0.36040 validation-auc:0.96547 validation-aucpr:0.97064
[79] validation-logloss:0.35812 validation-auc:0.96554 validation-aucpr:0.97069
[80] validation-logloss:0.35614 validation-auc:0.96560 validation-aucpr:0.97075
[81] validation-logloss:0.35434 validation-auc:0.96563 validation-aucpr:0.97078
[82] validation-logloss:0.35230 validation-auc:0.96567 validation-aucpr:0.97081
[83] validation-logloss:0.35058 validation-auc:0.96565 validation-aucpr:0.97079
[84] validation-logloss:0.34858 validation-auc:0.96562 validation-aucpr:0.97078
[85] validation-logloss:0.34677 validation-auc:0.96568 validation-aucpr:0.97081
[86] validation-logloss:0.34478 validation-auc:0.96569 validation-aucpr:0.97082
[87] validation-logloss:0.34317 validation-auc:0.96567 validation-aucpr:0.97080
[88] validation-logloss:0.34135 validation-auc:0.96572 validation-aucpr:0.97084
[89] validation-logloss:0.33957 validation-auc:0.96574 validation-aucpr:0.97085
[90] validation-logloss:0.33773 validation-auc:0.96576 validation-aucpr:0.97085
[91] validation-logloss:0.33624 validation-auc:0.96572 validation-aucpr:0.97080
[92] validation-logloss:0.33443 validation-auc:0.96579 validation-aucpr:0.97086
[93] validation-logloss:0.33281 validation-auc:0.96582 validation-aucpr:0.97087
[94] validation-logloss:0.33098 validation-auc:0.96591 validation-aucpr:0.97094
{'best_iteration': '94', 'best_score': '0.9709447835729923'}
Trial 69, Fold 5: Log loss = 0.3309771129872184, Average precision = 0.9709498977967147, ROC-AUC = 0.9659115248128123, Elapsed Time = 239.0036919000013 seconds
Optimization Progress: 70%|####### | 70/100 [3:24:19<3:17:27, 394.91s/it]
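The `Optimization Progress: 70%|####### | 70/100 [...]` line is a `tqdm` progress bar advanced once per completed trial. A minimal reproduction of that bar style — the `sleep` stands in for a full trial's five CV folds, and wiring the bar to Optuna (e.g. via a callback that calls `update`) is omitted:

```python
import time
from tqdm import tqdm

n_trials = 10          # the notebook runs 100
completed = 0
# ascii=True yields the '#'-style bar seen in the log; desc sets the prefix.
for _ in tqdm(range(n_trials), desc="Optimization Progress", ascii=True):
    time.sleep(0.001)  # stand-in for one trial (five CV folds)
    completed += 1
print(completed)
```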
Trial 70, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 70, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.67291 validation-auc:0.94441 validation-aucpr:0.94450
[1] validation-logloss:0.65387 validation-auc:0.95691 validation-aucpr:0.96256
[2] validation-logloss:0.63586 validation-auc:0.96020 validation-aucpr:0.96376
[3] validation-logloss:0.61769 validation-auc:0.96293 validation-aucpr:0.96850
[4] validation-logloss:0.60076 validation-auc:0.96394 validation-aucpr:0.96921
[5] validation-logloss:0.58723 validation-auc:0.96332 validation-aucpr:0.96823
[6] validation-logloss:0.57224 validation-auc:0.96408 validation-aucpr:0.96918
[7] validation-logloss:0.55889 validation-auc:0.96397 validation-aucpr:0.96910
[8] validation-logloss:0.54658 validation-auc:0.96418 validation-aucpr:0.96924
[9] validation-logloss:0.53292 validation-auc:0.96483 validation-aucpr:0.96994
[10] validation-logloss:0.52180 validation-auc:0.96490 validation-aucpr:0.96996
[11] validation-logloss:0.50941 validation-auc:0.96516 validation-aucpr:0.97043
[12] validation-logloss:0.49834 validation-auc:0.96540 validation-aucpr:0.97074
[13] validation-logloss:0.48728 validation-auc:0.96566 validation-aucpr:0.97120
[14] validation-logloss:0.47779 validation-auc:0.96546 validation-aucpr:0.97096
[15] validation-logloss:0.46747 validation-auc:0.96574 validation-aucpr:0.97117
[16] validation-logloss:0.45875 validation-auc:0.96578 validation-aucpr:0.97124
[17] validation-logloss:0.44974 validation-auc:0.96594 validation-aucpr:0.97139
[18] validation-logloss:0.44066 validation-auc:0.96607 validation-aucpr:0.97160
[19] validation-logloss:0.43277 validation-auc:0.96617 validation-aucpr:0.97168
[20] validation-logloss:0.42529 validation-auc:0.96644 validation-aucpr:0.97186
[21] validation-logloss:0.41727 validation-auc:0.96663 validation-aucpr:0.97197
[22] validation-logloss:0.41021 validation-auc:0.96679 validation-aucpr:0.97210
[23] validation-logloss:0.40394 validation-auc:0.96694 validation-aucpr:0.97215
[24] validation-logloss:0.39655 validation-auc:0.96709 validation-aucpr:0.97228
[25] validation-logloss:0.39057 validation-auc:0.96711 validation-aucpr:0.97230
[26] validation-logloss:0.38485 validation-auc:0.96710 validation-aucpr:0.97230
[27] validation-logloss:0.37832 validation-auc:0.96731 validation-aucpr:0.97260
[28] validation-logloss:0.37198 validation-auc:0.96762 validation-aucpr:0.97282
[29] validation-logloss:0.36582 validation-auc:0.96785 validation-aucpr:0.97302
[30] validation-logloss:0.36099 validation-auc:0.96803 validation-aucpr:0.97311
[31] validation-logloss:0.35578 validation-auc:0.96814 validation-aucpr:0.97320
[32] validation-logloss:0.35058 validation-auc:0.96816 validation-aucpr:0.97322
[33] validation-logloss:0.34594 validation-auc:0.96811 validation-aucpr:0.97314
[34] validation-logloss:0.34170 validation-auc:0.96801 validation-aucpr:0.97304
[35] validation-logloss:0.33696 validation-auc:0.96801 validation-aucpr:0.97313
[36] validation-logloss:0.33270 validation-auc:0.96811 validation-aucpr:0.97318
[37] validation-logloss:0.32847 validation-auc:0.96819 validation-aucpr:0.97324
[38] validation-logloss:0.32489 validation-auc:0.96816 validation-aucpr:0.97321
[39] validation-logloss:0.32016 validation-auc:0.96835 validation-aucpr:0.97338
[40] validation-logloss:0.31690 validation-auc:0.96838 validation-aucpr:0.97335
[41] validation-logloss:0.31364 validation-auc:0.96827 validation-aucpr:0.97328
[42] validation-logloss:0.31050 validation-auc:0.96820 validation-aucpr:0.97321
[43] validation-logloss:0.30775 validation-auc:0.96798 validation-aucpr:0.97303
[44] validation-logloss:0.30455 validation-auc:0.96808 validation-aucpr:0.97308
[45] validation-logloss:0.30127 validation-auc:0.96800 validation-aucpr:0.97305
[46] validation-logloss:0.29862 validation-auc:0.96796 validation-aucpr:0.97302
[47] validation-logloss:0.29582 validation-auc:0.96800 validation-aucpr:0.97299
[48] validation-logloss:0.29328 validation-auc:0.96803 validation-aucpr:0.97300
[49] validation-logloss:0.29100 validation-auc:0.96788 validation-aucpr:0.97292
[50] validation-logloss:0.28807 validation-auc:0.96798 validation-aucpr:0.97301
[51] validation-logloss:0.28487 validation-auc:0.96820 validation-aucpr:0.97319
[52] validation-logloss:0.28176 validation-auc:0.96824 validation-aucpr:0.97325
... (per-iteration validation log truncated) ...
[97] validation-logloss:0.21745 validation-auc:0.97011 validation-aucpr:0.97480
{'best_iteration': '97', 'best_score': '0.9747997415825582'}
Trial 70, Fold 1: Log loss = 0.21744990998351488, Average precision = 0.9748039295492716, ROC-AUC = 0.970106283128528, Elapsed Time = 17.291013699999894 seconds
Trial 70, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 70, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.67321 validation-auc:0.94151 validation-aucpr:0.94429
... (per-iteration validation log truncated) ...
[97] validation-logloss:0.20795 validation-auc:0.97204 validation-aucpr:0.97515
{'best_iteration': '93', 'best_score': '0.9752944596780706'}
Trial 70, Fold 2: Log loss = 0.2079469238659796, Average precision = 0.9751581080665546, ROC-AUC = 0.972035993967722, Elapsed Time = 17.38810250000097 seconds
Trial 70, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 70, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.67257 validation-auc:0.94873 validation-aucpr:0.94502
... (per-iteration validation log truncated) ...
[97] validation-logloss:0.21155 validation-auc:0.97165 validation-aucpr:0.97564
{'best_iteration': '97', 'best_score': '0.9756388331758201'}
Trial 70, Fold 3: Log loss = 0.21154853985304786, Average precision = 0.9756430835055213, ROC-AUC = 0.9716489415492483, Elapsed Time = 17.922511900000245 seconds
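The per-fold summary metrics (Log loss, Average precision, ROC-AUC) correspond to the sklearn metrics imported in the header, evaluated on validation-set probabilities. A minimal sketch with hypothetical labels and predicted probabilities:

```python
# Hedged sketch: the per-fold summary metrics above, computed from
# validation labels and predicted positive-class probabilities.
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

y_val = np.array([0, 1, 1, 0, 1, 0, 1, 0])
proba = np.array([0.1, 0.8, 0.7, 0.3, 0.9, 0.2, 0.6, 0.4])  # hypothetical predict_proba output

ll = log_loss(y_val, proba)               # "Log loss = ..."
ap = average_precision_score(y_val, proba)  # "Average precision = ..."
auc = roc_auc_score(y_val, proba)           # "ROC-AUC = ..."
print(f"Log loss = {ll}, Average precision = {ap}, ROC-AUC = {auc}")
```

Note that Average precision here is computed from the probabilities directly (threshold-free), which is why it can differ slightly from the aucpr `best_score` XGBoost reports at the best iteration.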
Trial 70, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 70, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.67400 validation-auc:0.93355 validation-aucpr:0.93296
... (per-iteration validation log truncated) ...
[97] validation-logloss:0.21186 validation-auc:0.97055 validation-aucpr:0.97522
{'best_iteration': '92', 'best_score': '0.9752471491651576'}
Trial 70, Fold 4: Log loss = 0.2118629863121309, Average precision = 0.9752193204989157, ROC-AUC = 0.9705480685088616, Elapsed Time = 17.842629599999782 seconds
Trial 70, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 70, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.67302 validation-auc:0.93436 validation-aucpr:0.93150
... (per-iteration validation log truncated) ...
[90] validation-logloss:0.22215 validation-auc:0.96922 validation-aucpr:0.97317
[91] validation-logloss:0.22134 validation-auc:0.96919 validation-aucpr:0.97314
[92] validation-logloss:0.22078 validation-auc:0.96918 validation-aucpr:0.97314
[93] validation-logloss:0.22009 validation-auc:0.96926 validation-aucpr:0.97321
[94] validation-logloss:0.21960 validation-auc:0.96922 validation-aucpr:0.97317
[95] validation-logloss:0.21907 validation-auc:0.96923 validation-aucpr:0.97318
[96] validation-logloss:0.21822 validation-auc:0.96934 validation-aucpr:0.97325
[97] validation-logloss:0.21768 validation-auc:0.96934 validation-aucpr:0.97323
{'best_iteration': '96', 'best_score': '0.9732505139835228'}
Trial 70, Fold 5: Log loss = 0.21767668827980782, Average precision = 0.973240621004372, ROC-AUC = 0.969341879865485, Elapsed Time = 19.734101900001406 seconds
Optimization Progress: 71%|#######1 | 71/100 [3:25:57<2:27:49, 305.83s/it]
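The `{'best_iteration': …, 'best_score': …}` dicts printed after each fold record the boosting round that maximized validation-aucpr (e.g. round 96 with 0.97325 above). A minimal sketch of that bookkeeping, assuming a maximize-style metric; `best_iteration` here is an illustrative stand-in, not the xgboost attribute itself:

```python
def best_iteration(scores):
    """Return (round index, score) of the maximizing eval metric,
    mirroring what xgboost reports as best_iteration/best_score
    when the tracked metric (here validation-aucpr) is maximized."""
    best = max(range(len(scores)), key=lambda i: scores[i])
    return best, scores[best]

# Example with a short aucpr trace that peaks before the last round:
best_iteration([0.80837, 0.97108, 0.97325, 0.97323])  # round 2 is best
```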
Trial 71, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 71, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0]..[79] (80 per-iteration rows condensed) validation-logloss:0.68602->0.37296 validation-auc:0.87396->0.96770 validation-aucpr:0.80837->0.97262
{'best_iteration': '75', 'best_score': '0.9727365897316989'}
Trial 71, Fold 1: Log loss = 0.37296171100424547, Average precision = 0.9726960282798622, ROC-AUC = 0.9677037180424714, Elapsed Time = 6.413239400000748 seconds
Trial 71, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 71, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0]..[79] (80 per-iteration rows condensed) validation-logloss:0.68524->0.36909 validation-auc:0.89374->0.97148 validation-aucpr:0.83216->0.97464
{'best_iteration': '68', 'best_score': '0.9747844255814891'}
Trial 71, Fold 2: Log loss = 0.36908960303185984, Average precision = 0.9747278218123956, ROC-AUC = 0.9714808246945849, Elapsed Time = 5.026995500000339 seconds
Trial 71, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 71, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0]..[79] (80 per-iteration rows condensed) validation-logloss:0.68556->0.36866 validation-auc:0.87778->0.96927 validation-aucpr:0.79278->0.97259
{'best_iteration': '58', 'best_score': '0.9730494269389566'}
Trial 71, Fold 3: Log loss = 0.3686551244484038, Average precision = 0.9727055199215334, ROC-AUC = 0.9692703768624011, Elapsed Time = 5.743944699999702 seconds
Trial 71, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 71, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0]..[79] (80 per-iteration rows condensed) validation-logloss:0.68543->0.37004 validation-auc:0.91799->0.96888 validation-aucpr:0.87581->0.97398
{'best_iteration': '78', 'best_score': '0.9739904032743923'}
Trial 71, Fold 4: Log loss = 0.3700416251857441, Average precision = 0.9739796125881007, ROC-AUC = 0.9688816641272703, Elapsed Time = 5.088956399999006 seconds
Trial 71, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 71, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0]..[46] (47 per-iteration rows condensed) validation-logloss:0.68568->0.45957 validation-auc:0.90673->0.96687 validation-aucpr:0.84899->0.97135
[47] validation-logloss:0.45643 validation-auc:0.96674 validation-aucpr:0.97071
[48] validation-logloss:0.45305 validation-auc:0.96686 validation-aucpr:0.97075
[49] validation-logloss:0.44976 validation-auc:0.96692 validation-aucpr:0.97082
[50] validation-logloss:0.44649 validation-auc:0.96708 validation-aucpr:0.97092
[51] validation-logloss:0.44328 validation-auc:0.96704 validation-aucpr:0.97099
[52] validation-logloss:0.44019 validation-auc:0.96706 validation-aucpr:0.97104
[53] validation-logloss:0.43749 validation-auc:0.96706 validation-aucpr:0.97101
[54] validation-logloss:0.43445 validation-auc:0.96709 validation-aucpr:0.97076
[55] validation-logloss:0.43169 validation-auc:0.96697 validation-aucpr:0.97064
[56] validation-logloss:0.42873 validation-auc:0.96710 validation-aucpr:0.97073
[57] validation-logloss:0.42576 validation-auc:0.96723 validation-aucpr:0.97083
[58] validation-logloss:0.42287 validation-auc:0.96729 validation-aucpr:0.97082
[59] validation-logloss:0.42014 validation-auc:0.96737 validation-aucpr:0.97087
[60] validation-logloss:0.41733 validation-auc:0.96746 validation-aucpr:0.97095
[61] validation-logloss:0.41460 validation-auc:0.96755 validation-aucpr:0.97099
[62] validation-logloss:0.41189 validation-auc:0.96754 validation-aucpr:0.97099
[63] validation-logloss:0.40923 validation-auc:0.96754 validation-aucpr:0.97098
[64] validation-logloss:0.40654 validation-auc:0.96758 validation-aucpr:0.97100
[65] validation-logloss:0.40382 validation-auc:0.96776 validation-aucpr:0.97108
[66] validation-logloss:0.40127 validation-auc:0.96771 validation-aucpr:0.97110
[67] validation-logloss:0.39909 validation-auc:0.96751 validation-aucpr:0.97096
[68] validation-logloss:0.39665 validation-auc:0.96750 validation-aucpr:0.97094
[69] validation-logloss:0.39411 validation-auc:0.96754 validation-aucpr:0.97084
[70] validation-logloss:0.39181 validation-auc:0.96768 validation-aucpr:0.97102
[71] validation-logloss:0.38938 validation-auc:0.96780 validation-aucpr:0.97112
[72] validation-logloss:0.38701 validation-auc:0.96779 validation-aucpr:0.97111
[73] validation-logloss:0.38481 validation-auc:0.96771 validation-aucpr:0.97105
[74] validation-logloss:0.38263 validation-auc:0.96781 validation-aucpr:0.97095
[75] validation-logloss:0.38030 validation-auc:0.96784 validation-aucpr:0.97097
[76] validation-logloss:0.37836 validation-auc:0.96788 validation-aucpr:0.97091
[77] validation-logloss:0.37626 validation-auc:0.96793 validation-aucpr:0.97094
[78] validation-logloss:0.37414 validation-auc:0.96793 validation-aucpr:0.97103
[79] validation-logloss:0.37183 validation-auc:0.96796 validation-aucpr:0.97104
{'best_iteration': '46', 'best_score': '0.9713535849713747'}
Trial 71, Fold 5: Log loss = 0.3718332745016931, Average precision = 0.9711940729228865, ROC-AUC = 0.9679583077866339, Elapsed Time = 5.344965599997522 seconds
Optimization Progress: 72%|#######2 | 72/100 [3:26:35<1:45:09, 225.35s/it]
Trial 72, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 72, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[21:25:30] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[0]	validation-logloss:0.63479	validation-auc:0.95862	validation-aucpr:0.96396
[1]	validation-logloss:0.58509	validation-auc:0.96387	validation-aucpr:0.96927
[2]	validation-logloss:0.54597	validation-auc:0.96471	validation-aucpr:0.96937
[3]	validation-logloss:0.51280	validation-auc:0.96481	validation-aucpr:0.96913
[4]	validation-logloss:0.48051	validation-auc:0.96534	validation-aucpr:0.96970
[5]	validation-logloss:0.45177	validation-auc:0.96644	validation-aucpr:0.97122
[6]	validation-logloss:0.42585	validation-auc:0.96732	validation-aucpr:0.97216
[7]	validation-logloss:0.40293	validation-auc:0.96790	validation-aucpr:0.97295
[8]	validation-logloss:0.38464	validation-auc:0.96848	validation-aucpr:0.97343
[9]	validation-logloss:0.36603	validation-auc:0.96873	validation-aucpr:0.97162
[10]	validation-logloss:0.34968	validation-auc:0.96902	validation-aucpr:0.97160
[11]	validation-logloss:0.33498	validation-auc:0.96952	validation-aucpr:0.97203
[12]	validation-logloss:0.32329	validation-auc:0.96981	validation-aucpr:0.97218
[13]	validation-logloss:0.31113	validation-auc:0.96993	validation-aucpr:0.97227
[14]	validation-logloss:0.30081	validation-auc:0.97003	validation-aucpr:0.97234
[15]	validation-logloss:0.29289	validation-auc:0.96968	validation-aucpr:0.97212
[16]	validation-logloss:0.28387	validation-auc:0.96992	validation-aucpr:0.97225
[17]	validation-logloss:0.27559	validation-auc:0.97030	validation-aucpr:0.97337
[18]	validation-logloss:0.26793	validation-auc:0.97054	validation-aucpr:0.97357
[19]	validation-logloss:0.26252	validation-auc:0.97042	validation-aucpr:0.97332
[20]	validation-logloss:0.25663	validation-auc:0.97087	validation-aucpr:0.97361
[21]	validation-logloss:0.25189	validation-auc:0.97096	validation-aucpr:0.97534
[22]	validation-logloss:0.24657	validation-auc:0.97100	validation-aucpr:0.97546
[23]	validation-logloss:0.24305	validation-auc:0.97094	validation-aucpr:0.97538
[24]	validation-logloss:0.23976	validation-auc:0.97078	validation-aucpr:0.97523
[25]	validation-logloss:0.23582	validation-auc:0.97094	validation-aucpr:0.97531
[26]	validation-logloss:0.23299	validation-auc:0.97091	validation-aucpr:0.97540
[27]	validation-logloss:0.22990	validation-auc:0.97096	validation-aucpr:0.97539
[28]	validation-logloss:0.22666	validation-auc:0.97112	validation-aucpr:0.97550
[29]	validation-logloss:0.22455	validation-auc:0.97102	validation-aucpr:0.97539
[30]	validation-logloss:0.22152	validation-auc:0.97107	validation-aucpr:0.97542
[31]	validation-logloss:0.21898	validation-auc:0.97098	validation-aucpr:0.97540
[32]	validation-logloss:0.21673	validation-auc:0.97116	validation-aucpr:0.97561
[33]	validation-logloss:0.21486	validation-auc:0.97120	validation-aucpr:0.97564
[34]	validation-logloss:0.21233	validation-auc:0.97142	validation-aucpr:0.97582
[35]	validation-logloss:0.21038	validation-auc:0.97157	validation-aucpr:0.97596
[36]	validation-logloss:0.20846	validation-auc:0.97172	validation-aucpr:0.97608
[37]	validation-logloss:0.20671	validation-auc:0.97190	validation-aucpr:0.97618
[38]	validation-logloss:0.20543	validation-auc:0.97184	validation-aucpr:0.97612
[39]	validation-logloss:0.20442	validation-auc:0.97188	validation-aucpr:0.97614
[40]	validation-logloss:0.20296	validation-auc:0.97203	validation-aucpr:0.97638
[41]	validation-logloss:0.20167	validation-auc:0.97220	validation-aucpr:0.97650
[42]	validation-logloss:0.20079	validation-auc:0.97223	validation-aucpr:0.97654
[43]	validation-logloss:0.19997	validation-auc:0.97220	validation-aucpr:0.97652
[44]	validation-logloss:0.19926	validation-auc:0.97224	validation-aucpr:0.97655
[45]	validation-logloss:0.19850	validation-auc:0.97228	validation-aucpr:0.97652
[46]	validation-logloss:0.19752	validation-auc:0.97240	validation-aucpr:0.97663
[47]	validation-logloss:0.19686	validation-auc:0.97242	validation-aucpr:0.97664
[48]	validation-logloss:0.19625	validation-auc:0.97250	validation-aucpr:0.97673
[49]	validation-logloss:0.19582	validation-auc:0.97253	validation-aucpr:0.97676
[50]	validation-logloss:0.19535	validation-auc:0.97259	validation-aucpr:0.97677
[51]	validation-logloss:0.19509	validation-auc:0.97254	validation-aucpr:0.97673
[52]	validation-logloss:0.19491	validation-auc:0.97253	validation-aucpr:0.97668
[53]	validation-logloss:0.19474	validation-auc:0.97249	validation-aucpr:0.97666
{'best_iteration': '50', 'best_score': '0.9767685335164963'}
Trial 72, Fold 1: Log loss = 0.19473958140355582, Average precision = 0.9766591920176859, ROC-AUC = 0.9724918209279094, Elapsed Time = 6.8159644000006665 seconds
Trial 72, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 72, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0]	validation-logloss:0.63593	validation-auc:0.95733	validation-aucpr:0.96176
[1]	validation-logloss:0.58589	validation-auc:0.96568	validation-aucpr:0.96911
[2]	validation-logloss:0.54672	validation-auc:0.96585	validation-aucpr:0.96950
[3]	validation-logloss:0.51010	validation-auc:0.96673	validation-aucpr:0.97029
[4]	validation-logloss:0.47851	validation-auc:0.96782	validation-aucpr:0.97105
[5]	validation-logloss:0.44841	validation-auc:0.96928	validation-aucpr:0.97230
[6]	validation-logloss:0.42594	validation-auc:0.96903	validation-aucpr:0.97225
[7]	validation-logloss:0.40177	validation-auc:0.96956	validation-aucpr:0.97282
[8]	validation-logloss:0.38107	validation-auc:0.96962	validation-aucpr:0.97294
[9]	validation-logloss:0.36276	validation-auc:0.96967	validation-aucpr:0.97292
[10]	validation-logloss:0.34636	validation-auc:0.97010	validation-aucpr:0.97326
[11]	validation-logloss:0.33115	validation-auc:0.97042	validation-aucpr:0.97348
[12]	validation-logloss:0.31946	validation-auc:0.97029	validation-aucpr:0.97335
[13]	validation-logloss:0.30711	validation-auc:0.97063	validation-aucpr:0.97359
[14]	validation-logloss:0.29606	validation-auc:0.97065	validation-aucpr:0.97363
[15]	validation-logloss:0.28644	validation-auc:0.97086	validation-aucpr:0.97384
[16]	validation-logloss:0.27874	validation-auc:0.97075	validation-aucpr:0.97379
[17]	validation-logloss:0.27079	validation-auc:0.97097	validation-aucpr:0.97392
[18]	validation-logloss:0.26468	validation-auc:0.97095	validation-aucpr:0.97386
[19]	validation-logloss:0.25857	validation-auc:0.97102	validation-aucpr:0.97387
[20]	validation-logloss:0.25311	validation-auc:0.97115	validation-aucpr:0.97400
[21]	validation-logloss:0.24857	validation-auc:0.97111	validation-aucpr:0.97394
[22]	validation-logloss:0.24284	validation-auc:0.97130	validation-aucpr:0.97401
[23]	validation-logloss:0.23756	validation-auc:0.97153	validation-aucpr:0.97423
[24]	validation-logloss:0.23293	validation-auc:0.97175	validation-aucpr:0.97438
[25]	validation-logloss:0.22852	validation-auc:0.97176	validation-aucpr:0.97367
[26]	validation-logloss:0.22477	validation-auc:0.97195	validation-aucpr:0.97488
[27]	validation-logloss:0.22184	validation-auc:0.97204	validation-aucpr:0.97486
[28]	validation-logloss:0.21841	validation-auc:0.97235	validation-aucpr:0.97510
[29]	validation-logloss:0.21566	validation-auc:0.97224	validation-aucpr:0.97506
[30]	validation-logloss:0.21296	validation-auc:0.97209	validation-aucpr:0.97493
[31]	validation-logloss:0.21077	validation-auc:0.97212	validation-aucpr:0.97496
[32]	validation-logloss:0.20933	validation-auc:0.97190	validation-aucpr:0.97476
[33]	validation-logloss:0.20709	validation-auc:0.97187	validation-aucpr:0.97478
[34]	validation-logloss:0.20542	validation-auc:0.97196	validation-aucpr:0.97481
[35]	validation-logloss:0.20374	validation-auc:0.97217	validation-aucpr:0.97503
[36]	validation-logloss:0.20176	validation-auc:0.97225	validation-aucpr:0.97467
[37]	validation-logloss:0.20049	validation-auc:0.97217	validation-aucpr:0.97460
[38]	validation-logloss:0.19912	validation-auc:0.97228	validation-aucpr:0.97468
[39]	validation-logloss:0.19782	validation-auc:0.97235	validation-aucpr:0.97472
[40]	validation-logloss:0.19660	validation-auc:0.97227	validation-aucpr:0.97460
[41]	validation-logloss:0.19510	validation-auc:0.97249	validation-aucpr:0.97478
[42]	validation-logloss:0.19476	validation-auc:0.97236	validation-aucpr:0.97465
[43]	validation-logloss:0.19377	validation-auc:0.97232	validation-aucpr:0.97459
[44]	validation-logloss:0.19245	validation-auc:0.97248	validation-aucpr:0.97473
[45]	validation-logloss:0.19133	validation-auc:0.97263	validation-aucpr:0.97485
[46]	validation-logloss:0.19048	validation-auc:0.97281	validation-aucpr:0.97575
[47]	validation-logloss:0.18984	validation-auc:0.97290	validation-aucpr:0.97578
[48]	validation-logloss:0.18904	validation-auc:0.97293	validation-aucpr:0.97580
[49]	validation-logloss:0.18872	validation-auc:0.97293	validation-aucpr:0.97582
[50]	validation-logloss:0.18799	validation-auc:0.97307	validation-aucpr:0.97587
[51]	validation-logloss:0.18766	validation-auc:0.97295	validation-aucpr:0.97571
[52]	validation-logloss:0.18715	validation-auc:0.97306	validation-aucpr:0.97591
[53]	validation-logloss:0.18652	validation-auc:0.97301	validation-aucpr:0.97578
{'best_iteration': '52', 'best_score': '0.9759079413380883'}
Trial 72, Fold 2: Log loss = 0.18651565331945985, Average precision = 0.9757842427832537, ROC-AUC = 0.9730090766420401, Elapsed Time = 7.354374000002281 seconds
Trial 72, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 72, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.63908 validation-auc:0.94558 validation-aucpr:0.95267
[1] validation-logloss:0.58812 validation-auc:0.96595 validation-aucpr:0.96939
[2] validation-logloss:0.54491 validation-auc:0.96801 validation-aucpr:0.97151
[3] validation-logloss:0.51018 validation-auc:0.96880 validation-aucpr:0.97192
[4] validation-logloss:0.47795 validation-auc:0.96847 validation-aucpr:0.97190
[5] validation-logloss:0.44799 validation-auc:0.96963 validation-aucpr:0.97300
[6] validation-logloss:0.42190 validation-auc:0.96994 validation-aucpr:0.97329
[7] validation-logloss:0.39899 validation-auc:0.97041 validation-aucpr:0.97357
[8] validation-logloss:0.38088 validation-auc:0.97020 validation-aucpr:0.97320
[9] validation-logloss:0.36289 validation-auc:0.97018 validation-aucpr:0.97293
[10] validation-logloss:0.34803 validation-auc:0.97027 validation-aucpr:0.97300
[11] validation-logloss:0.33386 validation-auc:0.96978 validation-aucpr:0.97263
[12] validation-logloss:0.31990 validation-auc:0.97017 validation-aucpr:0.97294
[13] validation-logloss:0.30734 validation-auc:0.97054 validation-aucpr:0.97360
[14] validation-logloss:0.29624 validation-auc:0.97062 validation-aucpr:0.97389
[15] validation-logloss:0.28787 validation-auc:0.97062 validation-aucpr:0.97384
[16] validation-logloss:0.27853 validation-auc:0.97114 validation-aucpr:0.97420
[17] validation-logloss:0.26982 validation-auc:0.97168 validation-aucpr:0.97560
[18] validation-logloss:0.26188 validation-auc:0.97177 validation-aucpr:0.97568
[19] validation-logloss:0.25544 validation-auc:0.97156 validation-aucpr:0.97546
[20] validation-logloss:0.24884 validation-auc:0.97182 validation-aucpr:0.97572
[21] validation-logloss:0.24330 validation-auc:0.97174 validation-aucpr:0.97568
[22] validation-logloss:0.23817 validation-auc:0.97181 validation-aucpr:0.97574
[23] validation-logloss:0.23423 validation-auc:0.97183 validation-aucpr:0.97576
[24] validation-logloss:0.22991 validation-auc:0.97193 validation-aucpr:0.97588
[25] validation-logloss:0.22601 validation-auc:0.97189 validation-aucpr:0.97583
[26] validation-logloss:0.22288 validation-auc:0.97187 validation-aucpr:0.97485
[27] validation-logloss:0.22021 validation-auc:0.97179 validation-aucpr:0.97472
[28] validation-logloss:0.21686 validation-auc:0.97195 validation-aucpr:0.97486
[29] validation-logloss:0.21445 validation-auc:0.97192 validation-aucpr:0.97495
[30] validation-logloss:0.21232 validation-auc:0.97211 validation-aucpr:0.97629
[31] validation-logloss:0.21030 validation-auc:0.97195 validation-aucpr:0.97614
[32] validation-logloss:0.20858 validation-auc:0.97183 validation-aucpr:0.97604
[33] validation-logloss:0.20687 validation-auc:0.97183 validation-aucpr:0.97600
[34] validation-logloss:0.20543 validation-auc:0.97173 validation-aucpr:0.97594
[35] validation-logloss:0.20346 validation-auc:0.97191 validation-aucpr:0.97606
[36] validation-logloss:0.20212 validation-auc:0.97192 validation-aucpr:0.97606
[37] validation-logloss:0.20056 validation-auc:0.97203 validation-aucpr:0.97622
[38] validation-logloss:0.19926 validation-auc:0.97200 validation-aucpr:0.97617
[39] validation-logloss:0.19855 validation-auc:0.97191 validation-aucpr:0.97608
[40] validation-logloss:0.19808 validation-auc:0.97186 validation-aucpr:0.97596
[41] validation-logloss:0.19748 validation-auc:0.97173 validation-aucpr:0.97586
[42] validation-logloss:0.19673 validation-auc:0.97167 validation-aucpr:0.97579
[43] validation-logloss:0.19598 validation-auc:0.97167 validation-aucpr:0.97583
[44] validation-logloss:0.19544 validation-auc:0.97171 validation-aucpr:0.97593
[45] validation-logloss:0.19466 validation-auc:0.97180 validation-aucpr:0.97606
[46] validation-logloss:0.19373 validation-auc:0.97197 validation-aucpr:0.97620
[47] validation-logloss:0.19315 validation-auc:0.97201 validation-aucpr:0.97620
[48] validation-logloss:0.19244 validation-auc:0.97202 validation-aucpr:0.97619
[49] validation-logloss:0.19208 validation-auc:0.97200 validation-aucpr:0.97618
[50] validation-logloss:0.19152 validation-auc:0.97208 validation-aucpr:0.97621
[51] validation-logloss:0.19106 validation-auc:0.97219 validation-aucpr:0.97640
[52] validation-logloss:0.19105 validation-auc:0.97216 validation-aucpr:0.97635
[53] validation-logloss:0.19085 validation-auc:0.97213 validation-aucpr:0.97630
{'best_iteration': '51', 'best_score': '0.9764002137695778'}
Trial 72, Fold 3: Log loss = 0.19085098155620686, Average precision = 0.9763054316523052, ROC-AUC = 0.97213192290838, Elapsed Time = 6.96847870000056 seconds
Trial 72, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 72, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.63561 validation-auc:0.95069 validation-aucpr:0.95005
[1] validation-logloss:0.58593 validation-auc:0.96165 validation-aucpr:0.96418
[2] validation-logloss:0.54323 validation-auc:0.96460 validation-aucpr:0.97045
[3] validation-logloss:0.50675 validation-auc:0.96527 validation-aucpr:0.97088
[4] validation-logloss:0.47843 validation-auc:0.96388 validation-aucpr:0.96988
[5] validation-logloss:0.44957 validation-auc:0.96529 validation-aucpr:0.97101
[6] validation-logloss:0.42455 validation-auc:0.96597 validation-aucpr:0.97152
[7] validation-logloss:0.40281 validation-auc:0.96649 validation-aucpr:0.97180
[8] validation-logloss:0.38329 validation-auc:0.96671 validation-aucpr:0.97206
[9] validation-logloss:0.36544 validation-auc:0.96715 validation-aucpr:0.97231
[10] validation-logloss:0.34904 validation-auc:0.96763 validation-aucpr:0.97277
[11] validation-logloss:0.33525 validation-auc:0.96778 validation-aucpr:0.97278
[12] validation-logloss:0.32436 validation-auc:0.96765 validation-aucpr:0.97286
[13] validation-logloss:0.31452 validation-auc:0.96707 validation-aucpr:0.97249
[14] validation-logloss:0.30327 validation-auc:0.96725 validation-aucpr:0.97266
[15] validation-logloss:0.29394 validation-auc:0.96786 validation-aucpr:0.97306
[16] validation-logloss:0.28626 validation-auc:0.96785 validation-aucpr:0.97308
[17] validation-logloss:0.27920 validation-auc:0.96788 validation-aucpr:0.97313
[18] validation-logloss:0.27295 validation-auc:0.96782 validation-aucpr:0.97313
[19] validation-logloss:0.26733 validation-auc:0.96778 validation-aucpr:0.97311
[20] validation-logloss:0.26043 validation-auc:0.96818 validation-aucpr:0.97343
[21] validation-logloss:0.25523 validation-auc:0.96840 validation-aucpr:0.97360
[22] validation-logloss:0.25057 validation-auc:0.96855 validation-aucpr:0.97368
[23] validation-logloss:0.24550 validation-auc:0.96880 validation-aucpr:0.97388
[24] validation-logloss:0.24164 validation-auc:0.96888 validation-aucpr:0.97396
[25] validation-logloss:0.23699 validation-auc:0.96919 validation-aucpr:0.97420
[26] validation-logloss:0.23358 validation-auc:0.96911 validation-aucpr:0.97413
[27] validation-logloss:0.22971 validation-auc:0.96938 validation-aucpr:0.97434
[28] validation-logloss:0.22671 validation-auc:0.96965 validation-aucpr:0.97454
[29] validation-logloss:0.22409 validation-auc:0.96963 validation-aucpr:0.97458
[30] validation-logloss:0.22053 validation-auc:0.97000 validation-aucpr:0.97485
[31] validation-logloss:0.21709 validation-auc:0.97037 validation-aucpr:0.97511
[32] validation-logloss:0.21527 validation-auc:0.97041 validation-aucpr:0.97514
[33] validation-logloss:0.21306 validation-auc:0.97060 validation-aucpr:0.97527
[34] validation-logloss:0.21116 validation-auc:0.97075 validation-aucpr:0.97539
[35] validation-logloss:0.20924 validation-auc:0.97087 validation-aucpr:0.97548
[36] validation-logloss:0.20719 validation-auc:0.97112 validation-aucpr:0.97569
[37] validation-logloss:0.20612 validation-auc:0.97105 validation-aucpr:0.97563
[38] validation-logloss:0.20411 validation-auc:0.97136 validation-aucpr:0.97587
[39] validation-logloss:0.20268 validation-auc:0.97139 validation-aucpr:0.97588
[40] validation-logloss:0.20090 validation-auc:0.97168 validation-aucpr:0.97609
[41] validation-logloss:0.19927 validation-auc:0.97192 validation-aucpr:0.97628
[42] validation-logloss:0.19854 validation-auc:0.97184 validation-aucpr:0.97620
[43] validation-logloss:0.19782 validation-auc:0.97172 validation-aucpr:0.97612
[44] validation-logloss:0.19699 validation-auc:0.97185 validation-aucpr:0.97622
[45] validation-logloss:0.19609 validation-auc:0.97191 validation-aucpr:0.97625
[46] validation-logloss:0.19544 validation-auc:0.97200 validation-aucpr:0.97629
[47] validation-logloss:0.19455 validation-auc:0.97209 validation-aucpr:0.97635
[48] validation-logloss:0.19404 validation-auc:0.97216 validation-aucpr:0.97641
[49] validation-logloss:0.19402 validation-auc:0.97199 validation-aucpr:0.97630
[50] validation-logloss:0.19337 validation-auc:0.97204 validation-aucpr:0.97634
[51] validation-logloss:0.19285 validation-auc:0.97203 validation-aucpr:0.97634
[52] validation-logloss:0.19231 validation-auc:0.97218 validation-aucpr:0.97643
[53] validation-logloss:0.19179 validation-auc:0.97232 validation-aucpr:0.97653
{'best_iteration': '53', 'best_score': '0.9765328333816946'}
Trial 72, Fold 4: Log loss = 0.1917944273199209, Average precision = 0.9765367007556147, ROC-AUC = 0.9723240067144979, Elapsed Time = 7.33326169999782 seconds
Trial 72, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 72, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0]	validation-logloss:0.63486	validation-auc:0.95834	validation-aucpr:0.96338
[1]	validation-logloss:0.58509	validation-auc:0.96294	validation-aucpr:0.96551
[2]	validation-logloss:0.54780	validation-auc:0.96268	validation-aucpr:0.96743
[3]	validation-logloss:0.51017	validation-auc:0.96475	validation-aucpr:0.96938
[4]	validation-logloss:0.48221	validation-auc:0.96398	validation-aucpr:0.96878
[5]	validation-logloss:0.45338	validation-auc:0.96498	validation-aucpr:0.96938
[6]	validation-logloss:0.42716	validation-auc:0.96556	validation-aucpr:0.96986
[7]	validation-logloss:0.40416	validation-auc:0.96620	validation-aucpr:0.97036
[8]	validation-logloss:0.38420	validation-auc:0.96707	validation-aucpr:0.97107
[9]	validation-logloss:0.36584	validation-auc:0.96771	validation-aucpr:0.97156
[10]	validation-logloss:0.34969	validation-auc:0.96790	validation-aucpr:0.97172
[11]	validation-logloss:0.33478	validation-auc:0.96846	validation-aucpr:0.97211
[12]	validation-logloss:0.32356	validation-auc:0.96850	validation-aucpr:0.97202
[13]	validation-logloss:0.31169	validation-auc:0.96855	validation-aucpr:0.97208
[14]	validation-logloss:0.30058	validation-auc:0.96894	validation-aucpr:0.97226
[15]	validation-logloss:0.29067	validation-auc:0.96948	validation-aucpr:0.97377
[16]	validation-logloss:0.28270	validation-auc:0.96956	validation-aucpr:0.97385
[17]	validation-logloss:0.27603	validation-auc:0.96973	validation-aucpr:0.97381
[18]	validation-logloss:0.26845	validation-auc:0.96985	validation-aucpr:0.97391
[19]	validation-logloss:0.26232	validation-auc:0.96991	validation-aucpr:0.97387
[20]	validation-logloss:0.25577	validation-auc:0.97001	validation-aucpr:0.97398
[21]	validation-logloss:0.25121	validation-auc:0.96978	validation-aucpr:0.97236
[22]	validation-logloss:0.24593	validation-auc:0.97007	validation-aucpr:0.97257
[23]	validation-logloss:0.24169	validation-auc:0.97005	validation-aucpr:0.97258
[24]	validation-logloss:0.23741	validation-auc:0.97020	validation-aucpr:0.97269
[25]	validation-logloss:0.23375	validation-auc:0.97025	validation-aucpr:0.97275
[26]	validation-logloss:0.23013	validation-auc:0.97046	validation-aucpr:0.97287
[27]	validation-logloss:0.22742	validation-auc:0.97048	validation-aucpr:0.97303
[28]	validation-logloss:0.22396	validation-auc:0.97086	validation-aucpr:0.97326
[29]	validation-logloss:0.22091	validation-auc:0.97115	validation-aucpr:0.97339
[30]	validation-logloss:0.21820	validation-auc:0.97140	validation-aucpr:0.97357
[31]	validation-logloss:0.21586	validation-auc:0.97135	validation-aucpr:0.97351
[32]	validation-logloss:0.21414	validation-auc:0.97127	validation-aucpr:0.97348
[33]	validation-logloss:0.21251	validation-auc:0.97135	validation-aucpr:0.97377
[34]	validation-logloss:0.21091	validation-auc:0.97125	validation-aucpr:0.97342
[35]	validation-logloss:0.20952	validation-auc:0.97126	validation-aucpr:0.97338
[36]	validation-logloss:0.20803	validation-auc:0.97137	validation-aucpr:0.97350
[37]	validation-logloss:0.20695	validation-auc:0.97140	validation-aucpr:0.97347
[38]	validation-logloss:0.20588	validation-auc:0.97137	validation-aucpr:0.97344
[39]	validation-logloss:0.20455	validation-auc:0.97153	validation-aucpr:0.97355
[40]	validation-logloss:0.20322	validation-auc:0.97188	validation-aucpr:0.97498
[41]	validation-logloss:0.20194	validation-auc:0.97207	validation-aucpr:0.97512
[42]	validation-logloss:0.20116	validation-auc:0.97209	validation-aucpr:0.97501
[43]	validation-logloss:0.20019	validation-auc:0.97212	validation-aucpr:0.97495
[44]	validation-logloss:0.19953	validation-auc:0.97216	validation-aucpr:0.97554
[45]	validation-logloss:0.19895	validation-auc:0.97216	validation-aucpr:0.97544
[46]	validation-logloss:0.19823	validation-auc:0.97221	validation-aucpr:0.97545
[47]	validation-logloss:0.19771	validation-auc:0.97212	validation-aucpr:0.97538
[48]	validation-logloss:0.19706	validation-auc:0.97212	validation-aucpr:0.97536
[49]	validation-logloss:0.19638	validation-auc:0.97222	validation-aucpr:0.97542
[50]	validation-logloss:0.19578	validation-auc:0.97223	validation-aucpr:0.97537
[51]	validation-logloss:0.19542	validation-auc:0.97223	validation-aucpr:0.97537
[52]	validation-logloss:0.19499	validation-auc:0.97235	validation-aucpr:0.97552
[53] validation-logloss:0.19465 validation-auc:0.97235 validation-aucpr:0.97560
{'best_iteration': '53', 'best_score': '0.9756027609368605'}
Trial 72, Fold 5: Log loss = 0.19464730760572593, Average precision = 0.9756072587949387, ROC-AUC = 0.9723509280419153, Elapsed Time = 7.446603000000323 seconds
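The per-fold summary lines above report log loss, average precision, and ROC-AUC for the fold's validation set. A minimal sketch of how such lines can be produced with the `sklearn.metrics` functions imported at the top of the notebook — the variable names and synthetic data here are illustrative, not the notebook's actual objects:

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

rng = np.random.default_rng(0)
y_val = rng.integers(0, 2, size=1000)  # stand-in for a fold's validation labels
# stand-in for predicted probabilities, loosely correlated with the labels
y_prob = 0.5 * y_val + 0.5 * rng.random(1000)

lloss = log_loss(y_val, y_prob)              # lower is better
ap = average_precision_score(y_val, y_prob)  # area under the precision-recall curve
auc = roc_auc_score(y_val, y_prob)           # ranking quality across thresholds
print(f"Log loss = {lloss}, Average precision = {ap}, ROC-AUC = {auc}")
```

Note that average precision is computed from the same probabilities the booster reports as `validation-aucpr`, which is why the two values in each summary line agree to several decimal places but not exactly (XGBoost and scikit-learn use slightly different PR-curve interpolation).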
Optimization Progress: 73%|#######3 | 73/100 [3:27:19<1:16:53, 170.88s/it]
Trial 73, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 73, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.66221 validation-auc:0.95893 validation-aucpr:0.96232
[1] validation-logloss:0.63373 validation-auc:0.96436 validation-aucpr:0.96920
[2] validation-logloss:0.60796 validation-auc:0.96393 validation-aucpr:0.96892
[3] validation-logloss:0.58642 validation-auc:0.96412 validation-aucpr:0.96906
[4] validation-logloss:0.56384 validation-auc:0.96473 validation-aucpr:0.96955
[5] validation-logloss:0.54293 validation-auc:0.96522 validation-aucpr:0.97028
[6] validation-logloss:0.52508 validation-auc:0.96544 validation-aucpr:0.97039
[7] validation-logloss:0.50736 validation-auc:0.96576 validation-aucpr:0.97049
[8] validation-logloss:0.49035 validation-auc:0.96622 validation-aucpr:0.97123
[9] validation-logloss:0.47459 validation-auc:0.96620 validation-aucpr:0.97130
[10] validation-logloss:0.45961 validation-auc:0.96631 validation-aucpr:0.97137
[11] validation-logloss:0.44543 validation-auc:0.96639 validation-aucpr:0.97145
[12] validation-logloss:0.43265 validation-auc:0.96645 validation-aucpr:0.97152
[13] validation-logloss:0.42071 validation-auc:0.96654 validation-aucpr:0.97190
[14] validation-logloss:0.40943 validation-auc:0.96661 validation-aucpr:0.97197
[15] validation-logloss:0.39851 validation-auc:0.96699 validation-aucpr:0.97223
[16] validation-logloss:0.38834 validation-auc:0.96691 validation-aucpr:0.97217
[17] validation-logloss:0.37839 validation-auc:0.96715 validation-aucpr:0.97232
[18] validation-logloss:0.36913 validation-auc:0.96727 validation-aucpr:0.97239
[19] validation-logloss:0.36175 validation-auc:0.96725 validation-aucpr:0.97234
[20] validation-logloss:0.35374 validation-auc:0.96734 validation-aucpr:0.97237
[21] validation-logloss:0.34657 validation-auc:0.96720 validation-aucpr:0.97230
[22] validation-logloss:0.33906 validation-auc:0.96747 validation-aucpr:0.97248
[23] validation-logloss:0.33236 validation-auc:0.96742 validation-aucpr:0.97247
[24] validation-logloss:0.32558 validation-auc:0.96758 validation-aucpr:0.97255
[25] validation-logloss:0.31926 validation-auc:0.96771 validation-aucpr:0.97264
[26] validation-logloss:0.31390 validation-auc:0.96783 validation-aucpr:0.97274
[27] validation-logloss:0.30809 validation-auc:0.96792 validation-aucpr:0.97281
[28] validation-logloss:0.30267 validation-auc:0.96801 validation-aucpr:0.97285
[29] validation-logloss:0.29765 validation-auc:0.96828 validation-aucpr:0.97363
[30] validation-logloss:0.29344 validation-auc:0.96830 validation-aucpr:0.97362
[31] validation-logloss:0.28881 validation-auc:0.96856 validation-aucpr:0.97380
[32] validation-logloss:0.28421 validation-auc:0.96874 validation-aucpr:0.97393
[33] validation-logloss:0.28055 validation-auc:0.96886 validation-aucpr:0.97401
[34] validation-logloss:0.27648 validation-auc:0.96907 validation-aucpr:0.97411
[35] validation-logloss:0.27261 validation-auc:0.96921 validation-aucpr:0.97422
[36] validation-logloss:0.26899 validation-auc:0.96927 validation-aucpr:0.97425
[37] validation-logloss:0.26567 validation-auc:0.96930 validation-aucpr:0.97426
[38] validation-logloss:0.26241 validation-auc:0.96932 validation-aucpr:0.97428
[39] validation-logloss:0.25972 validation-auc:0.96938 validation-aucpr:0.97430
[40] validation-logloss:0.25659 validation-auc:0.96949 validation-aucpr:0.97436
[41] validation-logloss:0.25374 validation-auc:0.96959 validation-aucpr:0.97445
[42] validation-logloss:0.25129 validation-auc:0.96970 validation-aucpr:0.97457
{'best_iteration': '42', 'best_score': '0.9745716988150757'}
Trial 73, Fold 1: Log loss = 0.25128757216862274, Average precision = 0.9745746531559298, ROC-AUC = 0.9696964333510179, Elapsed Time = 1.0446025999990525 seconds
Trial 73, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 73, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.66232 validation-auc:0.96079 validation-aucpr:0.96036
[1] validation-logloss:0.63411 validation-auc:0.96436 validation-aucpr:0.96732
[2] validation-logloss:0.60766 validation-auc:0.96736 validation-aucpr:0.97094
[3] validation-logloss:0.58347 validation-auc:0.96748 validation-aucpr:0.97103
[4] validation-logloss:0.56134 validation-auc:0.96845 validation-aucpr:0.97189
[5] validation-logloss:0.54033 validation-auc:0.96884 validation-aucpr:0.97221
[6] validation-logloss:0.52085 validation-auc:0.96888 validation-aucpr:0.97220
[7] validation-logloss:0.50500 validation-auc:0.96859 validation-aucpr:0.97199
[8] validation-logloss:0.48768 validation-auc:0.96905 validation-aucpr:0.97233
[9] validation-logloss:0.47156 validation-auc:0.96946 validation-aucpr:0.97264
[10] validation-logloss:0.45651 validation-auc:0.96976 validation-aucpr:0.97284
[11] validation-logloss:0.44261 validation-auc:0.96967 validation-aucpr:0.97278
[12] validation-logloss:0.43086 validation-auc:0.96969 validation-aucpr:0.97283
[13] validation-logloss:0.41852 validation-auc:0.97000 validation-aucpr:0.97301
[14] validation-logloss:0.40708 validation-auc:0.97007 validation-aucpr:0.97302
[15] validation-logloss:0.39585 validation-auc:0.97020 validation-aucpr:0.97312
[16] validation-logloss:0.38553 validation-auc:0.97022 validation-aucpr:0.97314
[17] validation-logloss:0.37584 validation-auc:0.97032 validation-aucpr:0.97322
[18] validation-logloss:0.36656 validation-auc:0.97034 validation-aucpr:0.97323
[19] validation-logloss:0.35874 validation-auc:0.97055 validation-aucpr:0.97338
[20] validation-logloss:0.35063 validation-auc:0.97049 validation-aucpr:0.97336
[21] validation-logloss:0.34274 validation-auc:0.97056 validation-aucpr:0.97342
[22] validation-logloss:0.33619 validation-auc:0.97062 validation-aucpr:0.97348
[23] validation-logloss:0.32934 validation-auc:0.97070 validation-aucpr:0.97356
[24] validation-logloss:0.32258 validation-auc:0.97070 validation-aucpr:0.97358
[25] validation-logloss:0.31621 validation-auc:0.97081 validation-aucpr:0.97366
[26] validation-logloss:0.31095 validation-auc:0.97078 validation-aucpr:0.97364
[27] validation-logloss:0.30513 validation-auc:0.97087 validation-aucpr:0.97372
[28] validation-logloss:0.30036 validation-auc:0.97077 validation-aucpr:0.97365
[29] validation-logloss:0.29538 validation-auc:0.97076 validation-aucpr:0.97365
[30] validation-logloss:0.29032 validation-auc:0.97088 validation-aucpr:0.97374
[31] validation-logloss:0.28535 validation-auc:0.97102 validation-aucpr:0.97386
[32] validation-logloss:0.28084 validation-auc:0.97104 validation-aucpr:0.97388
[33] validation-logloss:0.27670 validation-auc:0.97105 validation-aucpr:0.97389
[34] validation-logloss:0.27271 validation-auc:0.97118 validation-aucpr:0.97396
[35] validation-logloss:0.26930 validation-auc:0.97129 validation-aucpr:0.97405
[36] validation-logloss:0.26552 validation-auc:0.97131 validation-aucpr:0.97408
[37] validation-logloss:0.26201 validation-auc:0.97139 validation-aucpr:0.97407
[38] validation-logloss:0.25857 validation-auc:0.97142 validation-aucpr:0.97409
[39] validation-logloss:0.25520 validation-auc:0.97151 validation-aucpr:0.97417
[40] validation-logloss:0.25217 validation-auc:0.97151 validation-aucpr:0.97418
[41] validation-logloss:0.24933 validation-auc:0.97154 validation-aucpr:0.97418
[42] validation-logloss:0.24689 validation-auc:0.97166 validation-aucpr:0.97428
{'best_iteration': '42', 'best_score': '0.9742818391162781'}
Trial 73, Fold 2: Log loss = 0.24689143229259836, Average precision = 0.9742754776508732, ROC-AUC = 0.9716611769376143, Elapsed Time = 1.2765495000021474 seconds
Trial 73, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 73, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.66213 validation-auc:0.96246 validation-aucpr:0.96394
[1] validation-logloss:0.63405 validation-auc:0.96415 validation-aucpr:0.96683
[2] validation-logloss:0.60788 validation-auc:0.96467 validation-aucpr:0.96728
[3] validation-logloss:0.58403 validation-auc:0.96499 validation-aucpr:0.96704
[4] validation-logloss:0.56355 validation-auc:0.96602 validation-aucpr:0.96843
[5] validation-logloss:0.54273 validation-auc:0.96700 validation-aucpr:0.97014
[6] validation-logloss:0.52286 validation-auc:0.96751 validation-aucpr:0.97058
[7] validation-logloss:0.50495 validation-auc:0.96759 validation-aucpr:0.97057
[8] validation-logloss:0.48795 validation-auc:0.96782 validation-aucpr:0.97069
[9] validation-logloss:0.47198 validation-auc:0.96846 validation-aucpr:0.97249
[10] validation-logloss:0.45684 validation-auc:0.96869 validation-aucpr:0.97273
[11] validation-logloss:0.44263 validation-auc:0.96889 validation-aucpr:0.97289
[12] validation-logloss:0.42975 validation-auc:0.96923 validation-aucpr:0.97347
[13] validation-logloss:0.41851 validation-auc:0.96942 validation-aucpr:0.97364
[14] validation-logloss:0.40790 validation-auc:0.96956 validation-aucpr:0.97377
[15] validation-logloss:0.39684 validation-auc:0.96962 validation-aucpr:0.97380
[16] validation-logloss:0.38670 validation-auc:0.96960 validation-aucpr:0.97383
[17] validation-logloss:0.37721 validation-auc:0.96959 validation-aucpr:0.97380
[18] validation-logloss:0.36798 validation-auc:0.96964 validation-aucpr:0.97385
[19] validation-logloss:0.35934 validation-auc:0.96989 validation-aucpr:0.97404
[20] validation-logloss:0.35106 validation-auc:0.96995 validation-aucpr:0.97412
[21] validation-logloss:0.34318 validation-auc:0.97001 validation-aucpr:0.97415
[22] validation-logloss:0.33563 validation-auc:0.97003 validation-aucpr:0.97415
[23] validation-logloss:0.32875 validation-auc:0.96995 validation-aucpr:0.97407
[24] validation-logloss:0.32259 validation-auc:0.97005 validation-aucpr:0.97415
[25] validation-logloss:0.31612 validation-auc:0.97013 validation-aucpr:0.97421
[26] validation-logloss:0.31093 validation-auc:0.97012 validation-aucpr:0.97419
[27] validation-logloss:0.30530 validation-auc:0.97012 validation-aucpr:0.97419
[28] validation-logloss:0.29981 validation-auc:0.97017 validation-aucpr:0.97432
[29] validation-logloss:0.29480 validation-auc:0.97022 validation-aucpr:0.97435
[30] validation-logloss:0.28996 validation-auc:0.97030 validation-aucpr:0.97440
[31] validation-logloss:0.28523 validation-auc:0.97043 validation-aucpr:0.97449
[32] validation-logloss:0.28082 validation-auc:0.97047 validation-aucpr:0.97452
[33] validation-logloss:0.27672 validation-auc:0.97045 validation-aucpr:0.97450
[34] validation-logloss:0.27259 validation-auc:0.97059 validation-aucpr:0.97462
[35] validation-logloss:0.26864 validation-auc:0.97063 validation-aucpr:0.97466
[36] validation-logloss:0.26501 validation-auc:0.97065 validation-aucpr:0.97467
[37] validation-logloss:0.26154 validation-auc:0.97071 validation-aucpr:0.97473
[38] validation-logloss:0.25815 validation-auc:0.97079 validation-aucpr:0.97479
[39] validation-logloss:0.25506 validation-auc:0.97086 validation-aucpr:0.97481
[40] validation-logloss:0.25242 validation-auc:0.97096 validation-aucpr:0.97489
[41] validation-logloss:0.24960 validation-auc:0.97098 validation-aucpr:0.97490
[42] validation-logloss:0.24717 validation-auc:0.97102 validation-aucpr:0.97493
{'best_iteration': '42', 'best_score': '0.974931153155075'}
Trial 73, Fold 3: Log loss = 0.24717121871577077, Average precision = 0.9749117773427436, ROC-AUC = 0.9710236766957845, Elapsed Time = 1.2841706999970484 seconds
Trial 73, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 73, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.66257 validation-auc:0.95890 validation-aucpr:0.96040
[1] validation-logloss:0.63457 validation-auc:0.96546 validation-aucpr:0.97052
[2] validation-logloss:0.61137 validation-auc:0.96493 validation-aucpr:0.96950
[3] validation-logloss:0.58698 validation-auc:0.96604 validation-aucpr:0.97025
[4] validation-logloss:0.56421 validation-auc:0.96624 validation-aucpr:0.97053
[5] validation-logloss:0.54369 validation-auc:0.96600 validation-aucpr:0.97028
[6] validation-logloss:0.52434 validation-auc:0.96650 validation-aucpr:0.97091
[7] validation-logloss:0.50810 validation-auc:0.96650 validation-aucpr:0.97129
[8] validation-logloss:0.49099 validation-auc:0.96656 validation-aucpr:0.97141
[9] validation-logloss:0.47485 validation-auc:0.96692 validation-aucpr:0.97174
[10] validation-logloss:0.45963 validation-auc:0.96734 validation-aucpr:0.97209
[11] validation-logloss:0.44534 validation-auc:0.96782 validation-aucpr:0.97259
[12] validation-logloss:0.43252 validation-auc:0.96829 validation-aucpr:0.97304
[13] validation-logloss:0.41980 validation-auc:0.96853 validation-aucpr:0.97322
[14] validation-logloss:0.40819 validation-auc:0.96883 validation-aucpr:0.97345
[15] validation-logloss:0.39732 validation-auc:0.96898 validation-aucpr:0.97356
[16] validation-logloss:0.38820 validation-auc:0.96881 validation-aucpr:0.97349
[17] validation-logloss:0.37830 validation-auc:0.96895 validation-aucpr:0.97365
[18] validation-logloss:0.36889 validation-auc:0.96922 validation-aucpr:0.97384
[19] validation-logloss:0.35996 validation-auc:0.96937 validation-aucpr:0.97395
[20] validation-logloss:0.35176 validation-auc:0.96947 validation-aucpr:0.97400
[21] validation-logloss:0.34387 validation-auc:0.96957 validation-aucpr:0.97410
[22] validation-logloss:0.33636 validation-auc:0.96968 validation-aucpr:0.97423
[23] validation-logloss:0.33032 validation-auc:0.96953 validation-aucpr:0.97410
[24] validation-logloss:0.32371 validation-auc:0.96965 validation-aucpr:0.97419
[25] validation-logloss:0.31749 validation-auc:0.96959 validation-aucpr:0.97416
[26] validation-logloss:0.31213 validation-auc:0.96955 validation-aucpr:0.97414
[27] validation-logloss:0.30647 validation-auc:0.96970 validation-aucpr:0.97425
[28] validation-logloss:0.30094 validation-auc:0.96982 validation-aucpr:0.97434
[29] validation-logloss:0.29647 validation-auc:0.96972 validation-aucpr:0.97428
[30] validation-logloss:0.29150 validation-auc:0.96974 validation-aucpr:0.97431
[31] validation-logloss:0.28749 validation-auc:0.96973 validation-aucpr:0.97428
[32] validation-logloss:0.28299 validation-auc:0.96979 validation-aucpr:0.97435
[33] validation-logloss:0.27873 validation-auc:0.96998 validation-aucpr:0.97449
[34] validation-logloss:0.27479 validation-auc:0.96994 validation-aucpr:0.97446
[35] validation-logloss:0.27084 validation-auc:0.97003 validation-aucpr:0.97453
[36] validation-logloss:0.26714 validation-auc:0.97005 validation-aucpr:0.97456
[37] validation-logloss:0.26348 validation-auc:0.97020 validation-aucpr:0.97466
[38] validation-logloss:0.26006 validation-auc:0.97026 validation-aucpr:0.97472
[39] validation-logloss:0.25698 validation-auc:0.97032 validation-aucpr:0.97476
[40] validation-logloss:0.25433 validation-auc:0.97038 validation-aucpr:0.97481
[41] validation-logloss:0.25143 validation-auc:0.97040 validation-aucpr:0.97483
[42] validation-logloss:0.24939 validation-auc:0.97027 validation-aucpr:0.97472
{'best_iteration': '41', 'best_score': '0.9748262195518107'}
Trial 73, Fold 4: Log loss = 0.2493894087452096, Average precision = 0.9747037275156256, ROC-AUC = 0.9702705828212813, Elapsed Time = 1.254519900001469 seconds
Trial 73, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 73, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.66242 validation-auc:0.95556 validation-aucpr:0.95546
[1] validation-logloss:0.63502 validation-auc:0.96010 validation-aucpr:0.96559
[2] validation-logloss:0.60888 validation-auc:0.96173 validation-aucpr:0.96679
[3] validation-logloss:0.58724 validation-auc:0.96131 validation-aucpr:0.96589
[4] validation-logloss:0.56487 validation-auc:0.96170 validation-aucpr:0.96641
[5] validation-logloss:0.54386 validation-auc:0.96290 validation-aucpr:0.96790
[6] validation-logloss:0.52449 validation-auc:0.96336 validation-aucpr:0.96819
[7] validation-logloss:0.50653 validation-auc:0.96335 validation-aucpr:0.96823
[8] validation-logloss:0.48948 validation-auc:0.96382 validation-aucpr:0.96888
[9] validation-logloss:0.47386 validation-auc:0.96408 validation-aucpr:0.96901
[10] validation-logloss:0.45934 validation-auc:0.96484 validation-aucpr:0.96940
[11] validation-logloss:0.44557 validation-auc:0.96517 validation-aucpr:0.96966
[12] validation-logloss:0.43241 validation-auc:0.96547 validation-aucpr:0.96988
[13] validation-logloss:0.42017 validation-auc:0.96572 validation-aucpr:0.97002
[14] validation-logloss:0.40878 validation-auc:0.96587 validation-aucpr:0.97011
[15] validation-logloss:0.39920 validation-auc:0.96635 validation-aucpr:0.97125
[16] validation-logloss:0.38891 validation-auc:0.96658 validation-aucpr:0.97139
[17] validation-logloss:0.37948 validation-auc:0.96659 validation-aucpr:0.97141
[18] validation-logloss:0.37016 validation-auc:0.96668 validation-aucpr:0.97148
[19] validation-logloss:0.36176 validation-auc:0.96682 validation-aucpr:0.97153
[20] validation-logloss:0.35374 validation-auc:0.96694 validation-aucpr:0.97161
[21] validation-logloss:0.34619 validation-auc:0.96718 validation-aucpr:0.97177
[22] validation-logloss:0.33880 validation-auc:0.96730 validation-aucpr:0.97181
[23] validation-logloss:0.33212 validation-auc:0.96728 validation-aucpr:0.97182
[24] validation-logloss:0.32574 validation-auc:0.96739 validation-aucpr:0.97187
[25] validation-logloss:0.31957 validation-auc:0.96749 validation-aucpr:0.97197
[26] validation-logloss:0.31394 validation-auc:0.96761 validation-aucpr:0.97200
[27] validation-logloss:0.30834 validation-auc:0.96768 validation-aucpr:0.97202
[28] validation-logloss:0.30353 validation-auc:0.96781 validation-aucpr:0.97214
[29] validation-logloss:0.29902 validation-auc:0.96801 validation-aucpr:0.97225
[30] validation-logloss:0.29430 validation-auc:0.96807 validation-aucpr:0.97231
[31] validation-logloss:0.28972 validation-auc:0.96814 validation-aucpr:0.97233
[32] validation-logloss:0.28566 validation-auc:0.96823 validation-aucpr:0.97244
[33] validation-logloss:0.28187 validation-auc:0.96811 validation-aucpr:0.97235
[34] validation-logloss:0.27788 validation-auc:0.96821 validation-aucpr:0.97243
[35] validation-logloss:0.27406 validation-auc:0.96836 validation-aucpr:0.97255
[36] validation-logloss:0.27055 validation-auc:0.96846 validation-aucpr:0.97262
[37] validation-logloss:0.26719 validation-auc:0.96855 validation-aucpr:0.97267
[38] validation-logloss:0.26443 validation-auc:0.96872 validation-aucpr:0.97284
[39] validation-logloss:0.26149 validation-auc:0.96881 validation-aucpr:0.97291
[40] validation-logloss:0.25859 validation-auc:0.96882 validation-aucpr:0.97292
[41] validation-logloss:0.25619 validation-auc:0.96884 validation-aucpr:0.97291
[42] validation-logloss:0.25354 validation-auc:0.96887 validation-aucpr:0.97293
{'best_iteration': '42', 'best_score': '0.9729319944965127'}
Trial 73, Fold 5: Log loss = 0.25353626477607055, Average precision = 0.9729489695128064, ROC-AUC = 0.968868922207978, Elapsed Time = 1.2993155000003753 seconds
Optimization Progress: 74%|#######4 | 74/100 [3:27:33<53:39, 123.83s/it]
Trial 74, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 74, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.68198 validation-auc:0.94945 validation-aucpr:0.95422
[1] validation-logloss:0.67126 validation-auc:0.95865 validation-aucpr:0.96317
[2] validation-logloss:0.66095 validation-auc:0.96173 validation-aucpr:0.96655
[3] validation-logloss:0.65066 validation-auc:0.96379 validation-aucpr:0.96784
[4] validation-logloss:0.64073 validation-auc:0.96415 validation-aucpr:0.96847
[5] validation-logloss:0.63119 validation-auc:0.96441 validation-aucpr:0.96861
{'best_iteration': '5', 'best_score': '0.968609721886661'}
Trial 74, Fold 1: Log loss = 0.6311894079791622, Average precision = 0.9684504452958994, ROC-AUC = 0.9644080418978793, Elapsed Time = 0.4654045999996015 seconds
Trial 74, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 74, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.68211 validation-auc:0.95194 validation-aucpr:0.95339
[1] validation-logloss:0.67099 validation-auc:0.96093 validation-aucpr:0.96313
[2] validation-logloss:0.66030 validation-auc:0.96405 validation-aucpr:0.96679
[3] validation-logloss:0.65158 validation-auc:0.96409 validation-aucpr:0.96527
[4] validation-logloss:0.64153 validation-auc:0.96526 validation-aucpr:0.96644
[5] validation-logloss:0.63184 validation-auc:0.96703 validation-aucpr:0.96916
{'best_iteration': '5', 'best_score': '0.9691621140187124'}
Trial 74, Fold 2: Log loss = 0.6318357320456525, Average precision = 0.9696004383607427, ROC-AUC = 0.9670322624896773, Elapsed Time = 0.4971901999997499 seconds
Trial 74, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 74, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.68193 validation-auc:0.95612 validation-aucpr:0.95651
[1] validation-logloss:0.67224 validation-auc:0.95906 validation-aucpr:0.96173
[2] validation-logloss:0.66207 validation-auc:0.96266 validation-aucpr:0.96536
[3] validation-logloss:0.65210 validation-auc:0.96273 validation-aucpr:0.96726
[4] validation-logloss:0.64232 validation-auc:0.96351 validation-aucpr:0.96798
[5] validation-logloss:0.63264 validation-auc:0.96433 validation-aucpr:0.96881
{'best_iteration': '5', 'best_score': '0.9688053773871279'}
Trial 74, Fold 3: Log loss = 0.6326356588194274, Average precision = 0.9686266263110179, ROC-AUC = 0.9643291291858885, Elapsed Time = 0.5874459000006027 seconds
Trial 74, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 74, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.68194 validation-auc:0.95429 validation-aucpr:0.96112
[1] validation-logloss:0.67105 validation-auc:0.95921 validation-aucpr:0.96506
[2] validation-logloss:0.66069 validation-auc:0.96126 validation-aucpr:0.96711
[3] validation-logloss:0.65036 validation-auc:0.96230 validation-aucpr:0.96624
[4] validation-logloss:0.64061 validation-auc:0.96461 validation-aucpr:0.96963
[5] validation-logloss:0.63092 validation-auc:0.96428 validation-aucpr:0.96778
{'best_iteration': '4', 'best_score': '0.969634442922803'}
Trial 74, Fold 4: Log loss = 0.6309225193685589, Average precision = 0.9680199805645648, ROC-AUC = 0.9642778904411397, Elapsed Time = 0.5735706000014034 seconds
Trial 74, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 74, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.68212 validation-auc:0.94916 validation-aucpr:0.94737
[1] validation-logloss:0.67140 validation-auc:0.95587 validation-aucpr:0.95626
[2] validation-logloss:0.66091 validation-auc:0.95810 validation-aucpr:0.95942
[3] validation-logloss:0.65063 validation-auc:0.95948 validation-aucpr:0.96048
[4] validation-logloss:0.64079 validation-auc:0.96033 validation-aucpr:0.96113
[5] validation-logloss:0.63112 validation-auc:0.96114 validation-aucpr:0.96241
{'best_iteration': '5', 'best_score': '0.9624078909143001'}
Trial 74, Fold 5: Log loss = 0.6311159088796015, Average precision = 0.9619067479372349, ROC-AUC = 0.9611428571428572, Elapsed Time = 0.578624599998875 seconds
Optimization Progress: 75%|#######5 | 75/100 [3:27:43<37:24, 89.78s/it]
Trial 75, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 75, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.68520 validation-auc:0.96108 validation-aucpr:0.96645
[1] validation-logloss:0.67745 validation-auc:0.96484 validation-aucpr:0.96898
[2] validation-logloss:0.67011 validation-auc:0.96581 validation-aucpr:0.97084
[3] validation-logloss:0.66370 validation-auc:0.96473 validation-aucpr:0.97008
[4] validation-logloss:0.65640 validation-auc:0.96545 validation-aucpr:0.97065
[5] validation-logloss:0.64993 validation-auc:0.96599 validation-aucpr:0.97107
[6] validation-logloss:0.64306 validation-auc:0.96662 validation-aucpr:0.97164
[7] validation-logloss:0.63706 validation-auc:0.96605 validation-aucpr:0.97109
[8] validation-logloss:0.63108 validation-auc:0.96553 validation-aucpr:0.97105
[9] validation-logloss:0.62520 validation-auc:0.96538 validation-aucpr:0.97085
[10] validation-logloss:0.61936 validation-auc:0.96544 validation-aucpr:0.97082
[11] validation-logloss:0.61288 validation-auc:0.96615 validation-aucpr:0.97143
[12] validation-logloss:0.60647 validation-auc:0.96655 validation-aucpr:0.97180
[13] validation-logloss:0.60124 validation-auc:0.96619 validation-aucpr:0.97155
[14] validation-logloss:0.59540 validation-auc:0.96616 validation-aucpr:0.97171
[15] validation-logloss:0.58946 validation-auc:0.96650 validation-aucpr:0.97201
[16] validation-logloss:0.58392 validation-auc:0.96635 validation-aucpr:0.97186
[17] validation-logloss:0.57882 validation-auc:0.96627 validation-aucpr:0.97173
[18] validation-logloss:0.57378 validation-auc:0.96627 validation-aucpr:0.97164
[19] validation-logloss:0.56900 validation-auc:0.96604 validation-aucpr:0.97146
[20] validation-logloss:0.56450 validation-auc:0.96591 validation-aucpr:0.97131
[21] validation-logloss:0.55906 validation-auc:0.96615 validation-aucpr:0.97154
[22] validation-logloss:0.55365 validation-auc:0.96640 validation-aucpr:0.97173
[23] validation-logloss:0.54929 validation-auc:0.96628 validation-aucpr:0.97165
[24] validation-logloss:0.54412 validation-auc:0.96654 validation-aucpr:0.97186
[25] validation-logloss:0.53992 validation-auc:0.96637 validation-aucpr:0.97168
[26] validation-logloss:0.53571 validation-auc:0.96629 validation-aucpr:0.97159
[27] validation-logloss:0.53092 validation-auc:0.96637 validation-aucpr:0.97166
[28] validation-logloss:0.52619 validation-auc:0.96648 validation-aucpr:0.97176
[29] validation-logloss:0.52200 validation-auc:0.96650 validation-aucpr:0.97175
[30] validation-logloss:0.51787 validation-auc:0.96643 validation-aucpr:0.97169
[31] validation-logloss:0.51321 validation-auc:0.96664 validation-aucpr:0.97184
[32] validation-logloss:0.50940 validation-auc:0.96661 validation-aucpr:0.97178
[33] validation-logloss:0.50574 validation-auc:0.96653 validation-aucpr:0.97172
[34] validation-logloss:0.50147 validation-auc:0.96674 validation-aucpr:0.97189
[35] validation-logloss:0.49728 validation-auc:0.96680 validation-aucpr:0.97196
[36] validation-logloss:0.49297 validation-auc:0.96695 validation-aucpr:0.97210
[37] validation-logloss:0.48882 validation-auc:0.96709 validation-aucpr:0.97226
[38] validation-logloss:0.48489 validation-auc:0.96720 validation-aucpr:0.97230
{'best_iteration': '38', 'best_score': '0.9723014364966841'}
Trial 75, Fold 1: Log loss = 0.48489445876785703, Average precision = 0.9722991842862466, ROC-AUC = 0.9671955132800888, Elapsed Time = 0.7628678000000946 seconds
Trial 75, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 75, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.68538 validation-auc:0.96151 validation-aucpr:0.96532
[1] validation-logloss:0.67845 validation-auc:0.96332 validation-aucpr:0.96709
[2] validation-logloss:0.67073 validation-auc:0.96615 validation-aucpr:0.97000
[3] validation-logloss:0.66326 validation-auc:0.96708 validation-aucpr:0.97098
[4] validation-logloss:0.65601 validation-auc:0.96793 validation-aucpr:0.97179
[5] validation-logloss:0.64977 validation-auc:0.96821 validation-aucpr:0.97174
[6] validation-logloss:0.64270 validation-auc:0.96862 validation-aucpr:0.97212
[7] validation-logloss:0.63596 validation-auc:0.96870 validation-aucpr:0.97229
[8] validation-logloss:0.62931 validation-auc:0.96865 validation-aucpr:0.97229
[9] validation-logloss:0.62352 validation-auc:0.96906 validation-aucpr:0.97239
[10] validation-logloss:0.61717 validation-auc:0.96915 validation-aucpr:0.97245
[11] validation-logloss:0.61139 validation-auc:0.96916 validation-aucpr:0.97241
[12] validation-logloss:0.60508 validation-auc:0.96939 validation-aucpr:0.97263
[13] validation-logloss:0.59969 validation-auc:0.96937 validation-aucpr:0.97257
[14] validation-logloss:0.59419 validation-auc:0.96928 validation-aucpr:0.97249
[15] validation-logloss:0.58887 validation-auc:0.96934 validation-aucpr:0.97253
[16] validation-logloss:0.58375 validation-auc:0.96933 validation-aucpr:0.97252
[17] validation-logloss:0.57853 validation-auc:0.96920 validation-aucpr:0.97237
[18] validation-logloss:0.57300 validation-auc:0.96915 validation-aucpr:0.97231
[19] validation-logloss:0.56746 validation-auc:0.96921 validation-aucpr:0.97235
[20] validation-logloss:0.56253 validation-auc:0.96917 validation-aucpr:0.97231
[21] validation-logloss:0.55782 validation-auc:0.96909 validation-aucpr:0.97222
[22] validation-logloss:0.55249 validation-auc:0.96914 validation-aucpr:0.97229
[23] validation-logloss:0.54732 validation-auc:0.96930 validation-aucpr:0.97243
[24] validation-logloss:0.54210 validation-auc:0.96950 validation-aucpr:0.97260
[25] validation-logloss:0.53759 validation-auc:0.96943 validation-aucpr:0.97254
[26] validation-logloss:0.53268 validation-auc:0.96970 validation-aucpr:0.97278
[27] validation-logloss:0.52785 validation-auc:0.96976 validation-aucpr:0.97285
[28] validation-logloss:0.52319 validation-auc:0.96973 validation-aucpr:0.97284
[29] validation-logloss:0.51905 validation-auc:0.96957 validation-aucpr:0.97267
[30] validation-logloss:0.51499 validation-auc:0.96955 validation-aucpr:0.97265
[31] validation-logloss:0.51099 validation-auc:0.96950 validation-aucpr:0.97259
[32] validation-logloss:0.50692 validation-auc:0.96960 validation-aucpr:0.97267
[33] validation-logloss:0.50297 validation-auc:0.96962 validation-aucpr:0.97266
[34] validation-logloss:0.49860 validation-auc:0.96977 validation-aucpr:0.97280
[35] validation-logloss:0.49434 validation-auc:0.96983 validation-aucpr:0.97286
[36] validation-logloss:0.49055 validation-auc:0.96971 validation-aucpr:0.97275
[37] validation-logloss:0.48643 validation-auc:0.96980 validation-aucpr:0.97283
[38] validation-logloss:0.48235 validation-auc:0.96987 validation-aucpr:0.97289
{'best_iteration': '38', 'best_score': '0.9728927317821991'}
Trial 75, Fold 2: Log loss = 0.4823548176019976, Average precision = 0.9728926908705433, ROC-AUC = 0.969873663898686, Elapsed Time = 0.9614445000006526 seconds
Trial 75, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 75, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.68520 validation-auc:0.96196 validation-aucpr:0.96490
[1] validation-logloss:0.67738 validation-auc:0.96760 validation-aucpr:0.97189
[2] validation-logloss:0.67076 validation-auc:0.96759 validation-aucpr:0.97146
[3] validation-logloss:0.66369 validation-auc:0.96795 validation-aucpr:0.97194
[4] validation-logloss:0.65705 validation-auc:0.96823 validation-aucpr:0.97223
[5] validation-logloss:0.64979 validation-auc:0.96885 validation-aucpr:0.97271
[6] validation-logloss:0.64351 validation-auc:0.96870 validation-aucpr:0.97261
[7] validation-logloss:0.63662 validation-auc:0.96945 validation-aucpr:0.97353
[8] validation-logloss:0.62978 validation-auc:0.96968 validation-aucpr:0.97380
[9] validation-logloss:0.62306 validation-auc:0.96995 validation-aucpr:0.97404
[10] validation-logloss:0.61646 validation-auc:0.97016 validation-aucpr:0.97423
[11] validation-logloss:0.61004 validation-auc:0.97055 validation-aucpr:0.97490
[12] validation-logloss:0.60383 validation-auc:0.97055 validation-aucpr:0.97492
[13] validation-logloss:0.59832 validation-auc:0.97024 validation-aucpr:0.97463
[14] validation-logloss:0.59295 validation-auc:0.97010 validation-aucpr:0.97450
[15] validation-logloss:0.58770 validation-auc:0.96991 validation-aucpr:0.97429
[16] validation-logloss:0.58254 validation-auc:0.96987 validation-aucpr:0.97420
[17] validation-logloss:0.57739 validation-auc:0.96978 validation-aucpr:0.97417
[18] validation-logloss:0.57163 validation-auc:0.97000 validation-aucpr:0.97437
[19] validation-logloss:0.56656 validation-auc:0.96997 validation-aucpr:0.97429
[20] validation-logloss:0.56101 validation-auc:0.97011 validation-aucpr:0.97444
[21] validation-logloss:0.55568 validation-auc:0.97028 validation-aucpr:0.97452
[22] validation-logloss:0.55091 validation-auc:0.97017 validation-aucpr:0.97439
[23] validation-logloss:0.54635 validation-auc:0.97017 validation-aucpr:0.97437
[24] validation-logloss:0.54135 validation-auc:0.97024 validation-aucpr:0.97447
[25] validation-logloss:0.53701 validation-auc:0.97022 validation-aucpr:0.97446
[26] validation-logloss:0.53260 validation-auc:0.97022 validation-aucpr:0.97444
[27] validation-logloss:0.52838 validation-auc:0.97018 validation-aucpr:0.97437
[28] validation-logloss:0.52434 validation-auc:0.97013 validation-aucpr:0.97433
[29] validation-logloss:0.51953 validation-auc:0.97017 validation-aucpr:0.97438
[30] validation-logloss:0.51557 validation-auc:0.97010 validation-aucpr:0.97433
[31] validation-logloss:0.51100 validation-auc:0.97017 validation-aucpr:0.97440
[32] validation-logloss:0.50649 validation-auc:0.97016 validation-aucpr:0.97442
[33] validation-logloss:0.50268 validation-auc:0.97012 validation-aucpr:0.97436
[34] validation-logloss:0.49827 validation-auc:0.97025 validation-aucpr:0.97447
[35] validation-logloss:0.49390 validation-auc:0.97036 validation-aucpr:0.97458
[36] validation-logloss:0.49026 validation-auc:0.97030 validation-aucpr:0.97451
[37] validation-logloss:0.48610 validation-auc:0.97043 validation-aucpr:0.97462
[38] validation-logloss:0.48194 validation-auc:0.97053 validation-aucpr:0.97472
{'best_iteration': '12', 'best_score': '0.9749222820063228'}
Trial 75, Fold 3: Log loss = 0.4819393169675427, Average precision = 0.9747282369942932, ROC-AUC = 0.970529709937302, Elapsed Time = 1.0155684999990626 seconds
Trial 75, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 75, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.68524 validation-auc:0.96118 validation-aucpr:0.96774
[1] validation-logloss:0.67771 validation-auc:0.96477 validation-aucpr:0.97058
[2] validation-logloss:0.67012 validation-auc:0.96584 validation-aucpr:0.97144
[3] validation-logloss:0.66333 validation-auc:0.96563 validation-aucpr:0.97149
[4] validation-logloss:0.65606 validation-auc:0.96598 validation-aucpr:0.97182
[5] validation-logloss:0.64892 validation-auc:0.96707 validation-aucpr:0.97260
[6] validation-logloss:0.64224 validation-auc:0.96750 validation-aucpr:0.97303
[7] validation-logloss:0.63609 validation-auc:0.96701 validation-aucpr:0.97258
[8] validation-logloss:0.63014 validation-auc:0.96695 validation-aucpr:0.97246
[9] validation-logloss:0.62353 validation-auc:0.96755 validation-aucpr:0.97287
[10] validation-logloss:0.61771 validation-auc:0.96768 validation-aucpr:0.97289
[11] validation-logloss:0.61214 validation-auc:0.96745 validation-aucpr:0.97265
[12] validation-logloss:0.60590 validation-auc:0.96768 validation-aucpr:0.97291
[13] validation-logloss:0.59968 validation-auc:0.96809 validation-aucpr:0.97325
[14] validation-logloss:0.59357 validation-auc:0.96821 validation-aucpr:0.97337
[15] validation-logloss:0.58771 validation-auc:0.96852 validation-aucpr:0.97367
[16] validation-logloss:0.58210 validation-auc:0.96875 validation-aucpr:0.97383
[17] validation-logloss:0.57638 validation-auc:0.96893 validation-aucpr:0.97396
[18] validation-logloss:0.57147 validation-auc:0.96859 validation-aucpr:0.97367
[19] validation-logloss:0.56593 validation-auc:0.96873 validation-aucpr:0.97380
[20] validation-logloss:0.56130 validation-auc:0.96857 validation-aucpr:0.97365
[21] validation-logloss:0.55594 validation-auc:0.96875 validation-aucpr:0.97382
[22] validation-logloss:0.55143 validation-auc:0.96863 validation-aucpr:0.97371
[23] validation-logloss:0.54674 validation-auc:0.96870 validation-aucpr:0.97374
[24] validation-logloss:0.54222 validation-auc:0.96875 validation-aucpr:0.97377
[25] validation-logloss:0.53794 validation-auc:0.96867 validation-aucpr:0.97372
[26] validation-logloss:0.53304 validation-auc:0.96872 validation-aucpr:0.97378
[27] validation-logloss:0.52814 validation-auc:0.96885 validation-aucpr:0.97390
[28] validation-logloss:0.52396 validation-auc:0.96884 validation-aucpr:0.97388
[29] validation-logloss:0.51916 validation-auc:0.96895 validation-aucpr:0.97396
[30] validation-logloss:0.51457 validation-auc:0.96906 validation-aucpr:0.97407
[31] validation-logloss:0.51000 validation-auc:0.96926 validation-aucpr:0.97423
[32] validation-logloss:0.50563 validation-auc:0.96934 validation-aucpr:0.97432
[33] validation-logloss:0.50120 validation-auc:0.96942 validation-aucpr:0.97438
[34] validation-logloss:0.49739 validation-auc:0.96937 validation-aucpr:0.97433
[35] validation-logloss:0.49321 validation-auc:0.96932 validation-aucpr:0.97430
[36] validation-logloss:0.48949 validation-auc:0.96923 validation-aucpr:0.97421
[37] validation-logloss:0.48535 validation-auc:0.96925 validation-aucpr:0.97423
[38] validation-logloss:0.48120 validation-auc:0.96934 validation-aucpr:0.97431
{'best_iteration': '33', 'best_score': '0.974379595658719'}
Trial 75, Fold 4: Log loss = 0.48119895275105207, Average precision = 0.9743045863136259, ROC-AUC = 0.9693426748547299, Elapsed Time = 0.9456118000016431 seconds
Trial 75, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 75, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.68529 validation-auc:0.95807 validation-aucpr:0.96371
[1] validation-logloss:0.67757 validation-auc:0.96367 validation-aucpr:0.96895
[2] validation-logloss:0.67090 validation-auc:0.96391 validation-aucpr:0.96884
[3] validation-logloss:0.66434 validation-auc:0.96452 validation-aucpr:0.96890
[4] validation-logloss:0.65789 validation-auc:0.96409 validation-aucpr:0.96841
[5] validation-logloss:0.65181 validation-auc:0.96425 validation-aucpr:0.96843
[6] validation-logloss:0.64589 validation-auc:0.96385 validation-aucpr:0.96789
[7] validation-logloss:0.63896 validation-auc:0.96482 validation-aucpr:0.96886
[8] validation-logloss:0.63318 validation-auc:0.96461 validation-aucpr:0.96856
[9] validation-logloss:0.62740 validation-auc:0.96456 validation-aucpr:0.96842
[10] validation-logloss:0.62077 validation-auc:0.96512 validation-aucpr:0.96908
[11] validation-logloss:0.61483 validation-auc:0.96502 validation-aucpr:0.96897
[12] validation-logloss:0.60916 validation-auc:0.96548 validation-aucpr:0.96932
[13] validation-logloss:0.60311 validation-auc:0.96566 validation-aucpr:0.96953
[14] validation-logloss:0.59711 validation-auc:0.96600 validation-aucpr:0.96975
[15] validation-logloss:0.59199 validation-auc:0.96596 validation-aucpr:0.96970
[16] validation-logloss:0.58622 validation-auc:0.96617 validation-aucpr:0.96995
[17] validation-logloss:0.58052 validation-auc:0.96626 validation-aucpr:0.97007
[18] validation-logloss:0.57547 validation-auc:0.96631 validation-aucpr:0.97015
[19] validation-logloss:0.57066 validation-auc:0.96624 validation-aucpr:0.97005
[20] validation-logloss:0.56587 validation-auc:0.96624 validation-aucpr:0.97042
[21] validation-logloss:0.56041 validation-auc:0.96650 validation-aucpr:0.97064
[22] validation-logloss:0.55513 validation-auc:0.96657 validation-aucpr:0.97071
[23] validation-logloss:0.55006 validation-auc:0.96663 validation-aucpr:0.97075
[24] validation-logloss:0.54507 validation-auc:0.96667 validation-aucpr:0.97081
[25] validation-logloss:0.54009 validation-auc:0.96688 validation-aucpr:0.97100
[26] validation-logloss:0.53589 validation-auc:0.96673 validation-aucpr:0.97082
[27] validation-logloss:0.53166 validation-auc:0.96667 validation-aucpr:0.97077
[28] validation-logloss:0.52735 validation-auc:0.96674 validation-aucpr:0.97081
[29] validation-logloss:0.52271 validation-auc:0.96686 validation-aucpr:0.97092
[30] validation-logloss:0.51806 validation-auc:0.96694 validation-aucpr:0.97099
[31] validation-logloss:0.51408 validation-auc:0.96690 validation-aucpr:0.97094
[32] validation-logloss:0.50964 validation-auc:0.96694 validation-aucpr:0.97099
[33] validation-logloss:0.50519 validation-auc:0.96710 validation-aucpr:0.97112
[34] validation-logloss:0.50157 validation-auc:0.96701 validation-aucpr:0.97104
[35] validation-logloss:0.49784 validation-auc:0.96700 validation-aucpr:0.97101
[36] validation-logloss:0.49433 validation-auc:0.96699 validation-aucpr:0.97096
[37] validation-logloss:0.49012 validation-auc:0.96718 validation-aucpr:0.97113
[38] validation-logloss:0.48670 validation-auc:0.96704 validation-aucpr:0.97102
{'best_iteration': '37', 'best_score': '0.9711287945679274'}
Trial 75, Fold 5: Log loss = 0.4866973349365086, Average precision = 0.9710253972086283, ROC-AUC = 0.9670447949761255, Elapsed Time = 0.9918016000010539 seconds
Optimization Progress: 76%|#######6 | 76/100 [3:27:56<26:38, 66.61s/it]
Trial 76, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 76, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.64608 validation-auc:0.94500 validation-aucpr:0.93574
[1] validation-logloss:0.60419 validation-auc:0.96165 validation-aucpr:0.95555
[2] validation-logloss:0.57083 validation-auc:0.96431 validation-aucpr:0.96414
[3] validation-logloss:0.53722 validation-auc:0.96636 validation-aucpr:0.96455
[4] validation-logloss:0.50807 validation-auc:0.96670 validation-aucpr:0.96465
[5] validation-logloss:0.48209 validation-auc:0.96723 validation-aucpr:0.96472
[6] validation-logloss:0.45801 validation-auc:0.96734 validation-aucpr:0.96508
[7] validation-logloss:0.43931 validation-auc:0.96743 validation-aucpr:0.96713
[8] validation-logloss:0.41907 validation-auc:0.96811 validation-aucpr:0.96767
[9] validation-logloss:0.40061 validation-auc:0.96850 validation-aucpr:0.96786
[10] validation-logloss:0.38401 validation-auc:0.96878 validation-aucpr:0.96794
[11] validation-logloss:0.37031 validation-auc:0.96934 validation-aucpr:0.97037
[12] validation-logloss:0.35621 validation-auc:0.96947 validation-aucpr:0.97043
[13] validation-logloss:0.34498 validation-auc:0.96927 validation-aucpr:0.96968
[14] validation-logloss:0.33586 validation-auc:0.96946 validation-aucpr:0.97358
[15] validation-logloss:0.32629 validation-auc:0.96934 validation-aucpr:0.97342
[16] validation-logloss:0.31768 validation-auc:0.96928 validation-aucpr:0.97342
[17] validation-logloss:0.30808 validation-auc:0.96968 validation-aucpr:0.97400
[18] validation-logloss:0.30050 validation-auc:0.96970 validation-aucpr:0.97400
[19] validation-logloss:0.29237 validation-auc:0.96995 validation-aucpr:0.97420
[20] validation-logloss:0.28472 validation-auc:0.97006 validation-aucpr:0.97427
[21] validation-logloss:0.27861 validation-auc:0.97014 validation-aucpr:0.97434
[22] validation-logloss:0.27216 validation-auc:0.97022 validation-aucpr:0.97441
[23] validation-logloss:0.26724 validation-auc:0.97028 validation-aucpr:0.97440
[24] validation-logloss:0.26157 validation-auc:0.97037 validation-aucpr:0.97456
[25] validation-logloss:0.25602 validation-auc:0.97080 validation-aucpr:0.97492
[26] validation-logloss:0.25123 validation-auc:0.97089 validation-aucpr:0.97498
[27] validation-logloss:0.24680 validation-auc:0.97091 validation-aucpr:0.97432
[28] validation-logloss:0.24280 validation-auc:0.97109 validation-aucpr:0.97516
[29] validation-logloss:0.23996 validation-auc:0.97102 validation-aucpr:0.97549
[30] validation-logloss:0.23712 validation-auc:0.97100 validation-aucpr:0.97544
[31] validation-logloss:0.23370 validation-auc:0.97107 validation-aucpr:0.97549
[32] validation-logloss:0.23060 validation-auc:0.97115 validation-aucpr:0.97555
[33] validation-logloss:0.22813 validation-auc:0.97123 validation-aucpr:0.97557
[34] validation-logloss:0.22565 validation-auc:0.97129 validation-aucpr:0.97564
[35] validation-logloss:0.22329 validation-auc:0.97126 validation-aucpr:0.97562
[36] validation-logloss:0.22090 validation-auc:0.97135 validation-aucpr:0.97574
[37] validation-logloss:0.21873 validation-auc:0.97132 validation-aucpr:0.97572
[38] validation-logloss:0.21661 validation-auc:0.97138 validation-aucpr:0.97574
[39] validation-logloss:0.21496 validation-auc:0.97133 validation-aucpr:0.97571
[40] validation-logloss:0.21345 validation-auc:0.97143 validation-aucpr:0.97580
[41] validation-logloss:0.21238 validation-auc:0.97142 validation-aucpr:0.97577
[42] validation-logloss:0.21103 validation-auc:0.97146 validation-aucpr:0.97579
[43] validation-logloss:0.20994 validation-auc:0.97140 validation-aucpr:0.97574
[44] validation-logloss:0.20875 validation-auc:0.97145 validation-aucpr:0.97579
[45] validation-logloss:0.20809 validation-auc:0.97132 validation-aucpr:0.97567
[46] validation-logloss:0.20702 validation-auc:0.97147 validation-aucpr:0.97575
[47] validation-logloss:0.20582 validation-auc:0.97161 validation-aucpr:0.97588
[48] validation-logloss:0.20496 validation-auc:0.97156 validation-aucpr:0.97579
[49] validation-logloss:0.20457 validation-auc:0.97150 validation-aucpr:0.97572
[50] validation-logloss:0.20404 validation-auc:0.97145 validation-aucpr:0.97567
[51] validation-logloss:0.20346 validation-auc:0.97140 validation-aucpr:0.97561
[52] validation-logloss:0.20295 validation-auc:0.97141 validation-aucpr:0.97563
[53] validation-logloss:0.20212 validation-auc:0.97149 validation-aucpr:0.97569
[54] validation-logloss:0.20126 validation-auc:0.97159 validation-aucpr:0.97578
[55] validation-logloss:0.20067 validation-auc:0.97161 validation-aucpr:0.97580
[56] validation-logloss:0.20049 validation-auc:0.97162 validation-aucpr:0.97596
[57] validation-logloss:0.19994 validation-auc:0.97166 validation-aucpr:0.97600
[58] validation-logloss:0.19936 validation-auc:0.97167 validation-aucpr:0.97601
[59] validation-logloss:0.19872 validation-auc:0.97173 validation-aucpr:0.97606
[60] validation-logloss:0.19836 validation-auc:0.97175 validation-aucpr:0.97612
[61] validation-logloss:0.19770 validation-auc:0.97190 validation-aucpr:0.97628
[62] validation-logloss:0.19746 validation-auc:0.97187 validation-aucpr:0.97627
[63] validation-logloss:0.19706 validation-auc:0.97193 validation-aucpr:0.97635
[64] validation-logloss:0.19684 validation-auc:0.97189 validation-aucpr:0.97634
[65] validation-logloss:0.19653 validation-auc:0.97196 validation-aucpr:0.97643
[66] validation-logloss:0.19623 validation-auc:0.97199 validation-aucpr:0.97647
[67] validation-logloss:0.19586 validation-auc:0.97205 validation-aucpr:0.97651
[68] validation-logloss:0.19568 validation-auc:0.97205 validation-aucpr:0.97651
[69] validation-logloss:0.19548 validation-auc:0.97208 validation-aucpr:0.97653
[70] validation-logloss:0.19539 validation-auc:0.97206 validation-aucpr:0.97650
[71] validation-logloss:0.19514 validation-auc:0.97210 validation-aucpr:0.97653
[72] validation-logloss:0.19503 validation-auc:0.97209 validation-aucpr:0.97651
{'best_iteration': '69', 'best_score': '0.9765318621486272'}
Trial 76, Fold 1: Log loss = 0.19503386461973604, Average precision = 0.9765164255619482, ROC-AUC = 0.9720876469126243, Elapsed Time = 3.084799500000372 seconds
Trial 76, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 76, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.64557 validation-auc:0.95077 validation-aucpr:0.92546
[1] validation-logloss:0.60372 validation-auc:0.96275 validation-aucpr:0.95631
[2] validation-logloss:0.57040 validation-auc:0.96634 validation-aucpr:0.97026
[3] validation-logloss:0.54066 validation-auc:0.96691 validation-aucpr:0.96904
[4] validation-logloss:0.51390 validation-auc:0.96783 validation-aucpr:0.97153
[5] validation-logloss:0.48639 validation-auc:0.96948 validation-aucpr:0.97304
[6] validation-logloss:0.46163 validation-auc:0.97035 validation-aucpr:0.97373
[7] validation-logloss:0.43963 validation-auc:0.97058 validation-aucpr:0.97403
[8] validation-logloss:0.41950 validation-auc:0.97099 validation-aucpr:0.97475
[9] validation-logloss:0.40118 validation-auc:0.97112 validation-aucpr:0.97479
[10] validation-logloss:0.38650 validation-auc:0.97084 validation-aucpr:0.97436
[11] validation-logloss:0.37254 validation-auc:0.97097 validation-aucpr:0.97435
[12] validation-logloss:0.35792 validation-auc:0.97144 validation-aucpr:0.97482
[13] validation-logloss:0.34459 validation-auc:0.97170 validation-aucpr:0.97503
[14] validation-logloss:0.33389 validation-auc:0.97156 validation-aucpr:0.97489
[15] validation-logloss:0.32223 validation-auc:0.97189 validation-aucpr:0.97520
[16] validation-logloss:0.31181 validation-auc:0.97223 validation-aucpr:0.97517
[17] validation-logloss:0.30233 validation-auc:0.97223 validation-aucpr:0.97517
[18] validation-logloss:0.29361 validation-auc:0.97226 validation-aucpr:0.97521
[19] validation-logloss:0.28562 validation-auc:0.97217 validation-aucpr:0.97516
[20] validation-logloss:0.27810 validation-auc:0.97242 validation-aucpr:0.97536
[21] validation-logloss:0.27126 validation-auc:0.97241 validation-aucpr:0.97532
[22] validation-logloss:0.26546 validation-auc:0.97249 validation-aucpr:0.97535
[23] validation-logloss:0.26019 validation-auc:0.97277 validation-aucpr:0.97551
[24] validation-logloss:0.25444 validation-auc:0.97282 validation-aucpr:0.97552
[25] validation-logloss:0.25027 validation-auc:0.97260 validation-aucpr:0.97533
[26] validation-logloss:0.24577 validation-auc:0.97270 validation-aucpr:0.97539
[27] validation-logloss:0.24190 validation-auc:0.97264 validation-aucpr:0.97534
[28] validation-logloss:0.23761 validation-auc:0.97258 validation-aucpr:0.97543
[29] validation-logloss:0.23434 validation-auc:0.97240 validation-aucpr:0.97531
[30] validation-logloss:0.23048 validation-auc:0.97250 validation-aucpr:0.97561
[31] validation-logloss:0.22696 validation-auc:0.97247 validation-aucpr:0.97559
[32] validation-logloss:0.22367 validation-auc:0.97251 validation-aucpr:0.97562
[33] validation-logloss:0.22111 validation-auc:0.97261 validation-aucpr:0.97567
[34] validation-logloss:0.21842 validation-auc:0.97280 validation-aucpr:0.97580
[35] validation-logloss:0.21641 validation-auc:0.97269 validation-aucpr:0.97571
[36] validation-logloss:0.21438 validation-auc:0.97265 validation-aucpr:0.97570
[37] validation-logloss:0.21240 validation-auc:0.97270 validation-aucpr:0.97575
[38] validation-logloss:0.21037 validation-auc:0.97278 validation-aucpr:0.97583
[39] validation-logloss:0.20847 validation-auc:0.97294 validation-aucpr:0.97595
[40] validation-logloss:0.20676 validation-auc:0.97300 validation-aucpr:0.97601
[41] validation-logloss:0.20475 validation-auc:0.97308 validation-aucpr:0.97608
[42] validation-logloss:0.20273 validation-auc:0.97312 validation-aucpr:0.97613
[43] validation-logloss:0.20166 validation-auc:0.97301 validation-aucpr:0.97605
[44] validation-logloss:0.19983 validation-auc:0.97318 validation-aucpr:0.97619
[45] validation-logloss:0.19863 validation-auc:0.97321 validation-aucpr:0.97622
[46] validation-logloss:0.19721 validation-auc:0.97331 validation-aucpr:0.97628
[47] validation-logloss:0.19570 validation-auc:0.97342 validation-aucpr:0.97635
[48] validation-logloss:0.19442 validation-auc:0.97350 validation-aucpr:0.97645
[49] validation-logloss:0.19342 validation-auc:0.97356 validation-aucpr:0.97650
[50] validation-logloss:0.19208 validation-auc:0.97368 validation-aucpr:0.97669
[51] validation-logloss:0.19119 validation-auc:0.97372 validation-aucpr:0.97671
[52] validation-logloss:0.19047 validation-auc:0.97377 validation-aucpr:0.97674
[53] validation-logloss:0.18961 validation-auc:0.97380 validation-aucpr:0.97678
[54] validation-logloss:0.18924 validation-auc:0.97378 validation-aucpr:0.97677
[55] validation-logloss:0.18846 validation-auc:0.97384 validation-aucpr:0.97683
[56] validation-logloss:0.18736 validation-auc:0.97403 validation-aucpr:0.97698
[57] validation-logloss:0.18648 validation-auc:0.97412 validation-aucpr:0.97703
[58] validation-logloss:0.18609 validation-auc:0.97403 validation-aucpr:0.97690
[59] validation-logloss:0.18595 validation-auc:0.97391 validation-aucpr:0.97683
[60] validation-logloss:0.18555 validation-auc:0.97395 validation-aucpr:0.97693
[61] validation-logloss:0.18504 validation-auc:0.97390 validation-aucpr:0.97629
[62] validation-logloss:0.18472 validation-auc:0.97388 validation-aucpr:0.97627
[63] validation-logloss:0.18416 validation-auc:0.97390 validation-aucpr:0.97619
[64] validation-logloss:0.18404 validation-auc:0.97383 validation-aucpr:0.97600
[65] validation-logloss:0.18367 validation-auc:0.97382 validation-aucpr:0.97597
[66] validation-logloss:0.18338 validation-auc:0.97383 validation-aucpr:0.97602
[67] validation-logloss:0.18301 validation-auc:0.97382 validation-aucpr:0.97591
[68] validation-logloss:0.18276 validation-auc:0.97383 validation-aucpr:0.97589
[69] validation-logloss:0.18241 validation-auc:0.97393 validation-aucpr:0.97594
[70] validation-logloss:0.18215 validation-auc:0.97391 validation-aucpr:0.97590
[71] validation-logloss:0.18196 validation-auc:0.97397 validation-aucpr:0.97656
[72] validation-logloss:0.18172 validation-auc:0.97394 validation-aucpr:0.97627
{'best_iteration': '57', 'best_score': '0.9770323608248587'}
Trial 76, Fold 2: Log loss = 0.18172125853299292, Average precision = 0.9762585221188115, ROC-AUC = 0.9739378489630013, Elapsed Time = 3.2547775999992155 seconds
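Each fold summary above reports three held-out scores. A minimal sketch of how such per-fold metrics can be computed with scikit-learn (already imported in this notebook), where `y_val` and `p_val` are illustrative stand-ins for the fold's labels and predicted positive-class probabilities:

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

# Illustrative labels and predicted positive-class probabilities for one fold
y_val = np.array([0, 1, 1, 0, 1, 0])
p_val = np.array([0.1, 0.9, 0.7, 0.3, 0.8, 0.2])

fold_logloss = log_loss(y_val, p_val)            # lower is better
fold_ap = average_precision_score(y_val, p_val)  # area under the PR curve
fold_auc = roc_auc_score(y_val, p_val)           # area under the ROC curve
print(f"Log loss = {fold_logloss}, Average precision = {fold_ap}, ROC-AUC = {fold_auc}")
```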
Trial 76, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 76, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.64618 validation-auc:0.94209 validation-aucpr:0.91996
[1] validation-logloss:0.60481 validation-auc:0.95842 validation-aucpr:0.95056
[2] validation-logloss:0.57157 validation-auc:0.96142 validation-aucpr:0.96026
[3] validation-logloss:0.54184 validation-auc:0.96483 validation-aucpr:0.96469
[4] validation-logloss:0.51496 validation-auc:0.96569 validation-aucpr:0.96904
[5] validation-logloss:0.48797 validation-auc:0.96747 validation-aucpr:0.97111
[6] validation-logloss:0.46349 validation-auc:0.96825 validation-aucpr:0.96973
[7] validation-logloss:0.44460 validation-auc:0.96793 validation-aucpr:0.97005
[8] validation-logloss:0.42620 validation-auc:0.96820 validation-aucpr:0.97088
[9] validation-logloss:0.40728 validation-auc:0.96878 validation-aucpr:0.97074
[10] validation-logloss:0.39298 validation-auc:0.96857 validation-aucpr:0.96968
[11] validation-logloss:0.37665 validation-auc:0.96890 validation-aucpr:0.97021
[12] validation-logloss:0.36352 validation-auc:0.96897 validation-aucpr:0.97048
[13] validation-logloss:0.35209 validation-auc:0.96927 validation-aucpr:0.97079
[14] validation-logloss:0.33936 validation-auc:0.96943 validation-aucpr:0.97088
[15] validation-logloss:0.32925 validation-auc:0.96986 validation-aucpr:0.97121
[16] validation-logloss:0.31842 validation-auc:0.97015 validation-aucpr:0.97058
[17] validation-logloss:0.30837 validation-auc:0.97037 validation-aucpr:0.97086
[18] validation-logloss:0.30054 validation-auc:0.97033 validation-aucpr:0.97073
[19] validation-logloss:0.29165 validation-auc:0.97055 validation-aucpr:0.97123
[20] validation-logloss:0.28526 validation-auc:0.97080 validation-aucpr:0.97218
[21] validation-logloss:0.27761 validation-auc:0.97107 validation-aucpr:0.97264
[22] validation-logloss:0.27075 validation-auc:0.97118 validation-aucpr:0.97272
[23] validation-logloss:0.26455 validation-auc:0.97124 validation-aucpr:0.97274
[24] validation-logloss:0.25865 validation-auc:0.97135 validation-aucpr:0.97273
[25] validation-logloss:0.25437 validation-auc:0.97123 validation-aucpr:0.97269
[26] validation-logloss:0.25003 validation-auc:0.97130 validation-aucpr:0.97267
[27] validation-logloss:0.24507 validation-auc:0.97150 validation-aucpr:0.97372
[28] validation-logloss:0.24065 validation-auc:0.97138 validation-aucpr:0.97272
[29] validation-logloss:0.23704 validation-auc:0.97174 validation-aucpr:0.97447
[30] validation-logloss:0.23280 validation-auc:0.97198 validation-aucpr:0.97471
[31] validation-logloss:0.23012 validation-auc:0.97187 validation-aucpr:0.97438
[32] validation-logloss:0.22757 validation-auc:0.97177 validation-aucpr:0.97445
[33] validation-logloss:0.22501 validation-auc:0.97195 validation-aucpr:0.97587
[34] validation-logloss:0.22200 validation-auc:0.97211 validation-aucpr:0.97599
[35] validation-logloss:0.21891 validation-auc:0.97231 validation-aucpr:0.97615
[36] validation-logloss:0.21710 validation-auc:0.97230 validation-aucpr:0.97614
[37] validation-logloss:0.21514 validation-auc:0.97238 validation-aucpr:0.97618
[38] validation-logloss:0.21320 validation-auc:0.97251 validation-aucpr:0.97628
[39] validation-logloss:0.21105 validation-auc:0.97252 validation-aucpr:0.97627
[40] validation-logloss:0.20893 validation-auc:0.97261 validation-aucpr:0.97636
[41] validation-logloss:0.20670 validation-auc:0.97279 validation-aucpr:0.97648
[42] validation-logloss:0.20553 validation-auc:0.97276 validation-aucpr:0.97656
[43] validation-logloss:0.20443 validation-auc:0.97272 validation-aucpr:0.97654
[44] validation-logloss:0.20301 validation-auc:0.97284 validation-aucpr:0.97661
[45] validation-logloss:0.20141 validation-auc:0.97298 validation-aucpr:0.97674
[46] validation-logloss:0.20023 validation-auc:0.97290 validation-aucpr:0.97667
[47] validation-logloss:0.19882 validation-auc:0.97301 validation-aucpr:0.97677
[48] validation-logloss:0.19801 validation-auc:0.97303 validation-aucpr:0.97676
[49] validation-logloss:0.19686 validation-auc:0.97310 validation-aucpr:0.97682
[50] validation-logloss:0.19611 validation-auc:0.97319 validation-aucpr:0.97688
[51] validation-logloss:0.19541 validation-auc:0.97315 validation-aucpr:0.97688
[52] validation-logloss:0.19477 validation-auc:0.97318 validation-aucpr:0.97688
[53] validation-logloss:0.19429 validation-auc:0.97324 validation-aucpr:0.97692
[54] validation-logloss:0.19368 validation-auc:0.97328 validation-aucpr:0.97694
[55] validation-logloss:0.19292 validation-auc:0.97329 validation-aucpr:0.97692
[56] validation-logloss:0.19235 validation-auc:0.97324 validation-aucpr:0.97688
[57] validation-logloss:0.19209 validation-auc:0.97321 validation-aucpr:0.97685
[58] validation-logloss:0.19150 validation-auc:0.97318 validation-aucpr:0.97676
[59] validation-logloss:0.19086 validation-auc:0.97325 validation-aucpr:0.97683
[60] validation-logloss:0.19028 validation-auc:0.97327 validation-aucpr:0.97685
[61] validation-logloss:0.18947 validation-auc:0.97347 validation-aucpr:0.97700
[62] validation-logloss:0.18888 validation-auc:0.97358 validation-aucpr:0.97709
[63] validation-logloss:0.18880 validation-auc:0.97356 validation-aucpr:0.97707
[64] validation-logloss:0.18843 validation-auc:0.97360 validation-aucpr:0.97708
[65] validation-logloss:0.18815 validation-auc:0.97360 validation-aucpr:0.97706
[66] validation-logloss:0.18784 validation-auc:0.97362 validation-aucpr:0.97702
[67] validation-logloss:0.18781 validation-auc:0.97360 validation-aucpr:0.97701
[68] validation-logloss:0.18744 validation-auc:0.97360 validation-aucpr:0.97694
[69] validation-logloss:0.18704 validation-auc:0.97363 validation-aucpr:0.97684
[70] validation-logloss:0.18681 validation-auc:0.97365 validation-aucpr:0.97685
[71] validation-logloss:0.18665 validation-auc:0.97371 validation-aucpr:0.97689
[72] validation-logloss:0.18657 validation-auc:0.97372 validation-aucpr:0.97690
{'best_iteration': '62', 'best_score': '0.977090692573389'}
Trial 76, Fold 3: Log loss = 0.18657246436692548, Average precision = 0.9769053106660622, ROC-AUC = 0.9737197645693483, Elapsed Time = 3.1280955000001995 seconds
Trial 76, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 76, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.64659 validation-auc:0.93764 validation-aucpr:0.92147
[1] validation-logloss:0.60502 validation-auc:0.95930 validation-aucpr:0.95793
[2] validation-logloss:0.57194 validation-auc:0.96263 validation-aucpr:0.96390
[3] validation-logloss:0.53894 validation-auc:0.96500 validation-aucpr:0.96913
[4] validation-logloss:0.50959 validation-auc:0.96607 validation-aucpr:0.96995
[5] validation-logloss:0.48599 validation-auc:0.96667 validation-aucpr:0.97177
[6] validation-logloss:0.46111 validation-auc:0.96792 validation-aucpr:0.97280
[7] validation-logloss:0.44152 validation-auc:0.96790 validation-aucpr:0.97294
[8] validation-logloss:0.42174 validation-auc:0.96815 validation-aucpr:0.97308
[9] validation-logloss:0.40373 validation-auc:0.96839 validation-aucpr:0.97337
[10] validation-logloss:0.38711 validation-auc:0.96869 validation-aucpr:0.97357
[11] validation-logloss:0.37177 validation-auc:0.96899 validation-aucpr:0.97384
[12] validation-logloss:0.35759 validation-auc:0.96949 validation-aucpr:0.97417
[13] validation-logloss:0.34603 validation-auc:0.96966 validation-aucpr:0.97439
[14] validation-logloss:0.33556 validation-auc:0.96959 validation-aucpr:0.97433
[15] validation-logloss:0.32472 validation-auc:0.96965 validation-aucpr:0.97439
[16] validation-logloss:0.31410 validation-auc:0.97007 validation-aucpr:0.97476
[17] validation-logloss:0.30594 validation-auc:0.97003 validation-aucpr:0.97471
[18] validation-logloss:0.29787 validation-auc:0.97028 validation-aucpr:0.97485
[19] validation-logloss:0.29105 validation-auc:0.97005 validation-aucpr:0.97471
[20] validation-logloss:0.28309 validation-auc:0.97055 validation-aucpr:0.97511
[21] validation-logloss:0.27581 validation-auc:0.97072 validation-aucpr:0.97526
[22] validation-logloss:0.27026 validation-auc:0.97068 validation-aucpr:0.97524
[23] validation-logloss:0.26511 validation-auc:0.97075 validation-aucpr:0.97528
[24] validation-logloss:0.26040 validation-auc:0.97078 validation-aucpr:0.97528
[25] validation-logloss:0.25613 validation-auc:0.97061 validation-aucpr:0.97514
[26] validation-logloss:0.25064 validation-auc:0.97101 validation-aucpr:0.97545
[27] validation-logloss:0.24679 validation-auc:0.97109 validation-aucpr:0.97550
[28] validation-logloss:0.24251 validation-auc:0.97112 validation-aucpr:0.97554
[29] validation-logloss:0.23924 validation-auc:0.97112 validation-aucpr:0.97556
[30] validation-logloss:0.23540 validation-auc:0.97110 validation-aucpr:0.97556
[31] validation-logloss:0.23229 validation-auc:0.97122 validation-aucpr:0.97562
[32] validation-logloss:0.22890 validation-auc:0.97123 validation-aucpr:0.97562
[33] validation-logloss:0.22648 validation-auc:0.97112 validation-aucpr:0.97558
[34] validation-logloss:0.22369 validation-auc:0.97105 validation-aucpr:0.97553
[35] validation-logloss:0.22082 validation-auc:0.97119 validation-aucpr:0.97565
[36] validation-logloss:0.21816 validation-auc:0.97136 validation-aucpr:0.97577
[37] validation-logloss:0.21624 validation-auc:0.97142 validation-aucpr:0.97582
[38] validation-logloss:0.21421 validation-auc:0.97138 validation-aucpr:0.97580
[39] validation-logloss:0.21197 validation-auc:0.97148 validation-aucpr:0.97590
[40] validation-logloss:0.21039 validation-auc:0.97161 validation-aucpr:0.97597
[41] validation-logloss:0.20841 validation-auc:0.97182 validation-aucpr:0.97613
[42] validation-logloss:0.20650 validation-auc:0.97197 validation-aucpr:0.97628
[43] validation-logloss:0.20530 validation-auc:0.97199 validation-aucpr:0.97631
[44] validation-logloss:0.20385 validation-auc:0.97198 validation-aucpr:0.97632
[45] validation-logloss:0.20265 validation-auc:0.97202 validation-aucpr:0.97634
[46] validation-logloss:0.20146 validation-auc:0.97200 validation-aucpr:0.97630
[47] validation-logloss:0.20000 validation-auc:0.97225 validation-aucpr:0.97650
[48] validation-logloss:0.19882 validation-auc:0.97229 validation-aucpr:0.97654
[49] validation-logloss:0.19821 validation-auc:0.97220 validation-aucpr:0.97646
[50] validation-logloss:0.19717 validation-auc:0.97227 validation-aucpr:0.97651
[51] validation-logloss:0.19675 validation-auc:0.97223 validation-aucpr:0.97648
[52] validation-logloss:0.19603 validation-auc:0.97231 validation-aucpr:0.97653
[53] validation-logloss:0.19545 validation-auc:0.97238 validation-aucpr:0.97656
[54] validation-logloss:0.19468 validation-auc:0.97244 validation-aucpr:0.97660
[55] validation-logloss:0.19380 validation-auc:0.97252 validation-aucpr:0.97666
[56] validation-logloss:0.19348 validation-auc:0.97249 validation-aucpr:0.97662
[57] validation-logloss:0.19285 validation-auc:0.97254 validation-aucpr:0.97664
[58] validation-logloss:0.19217 validation-auc:0.97260 validation-aucpr:0.97668
[59] validation-logloss:0.19195 validation-auc:0.97258 validation-aucpr:0.97667
[60] validation-logloss:0.19129 validation-auc:0.97267 validation-aucpr:0.97673
[61] validation-logloss:0.19107 validation-auc:0.97262 validation-aucpr:0.97670
[62] validation-logloss:0.19046 validation-auc:0.97270 validation-aucpr:0.97676
[63] validation-logloss:0.19013 validation-auc:0.97268 validation-aucpr:0.97675
[64] validation-logloss:0.18967 validation-auc:0.97274 validation-aucpr:0.97679
[65] validation-logloss:0.18939 validation-auc:0.97272 validation-aucpr:0.97679
[66] validation-logloss:0.18891 validation-auc:0.97281 validation-aucpr:0.97687
[67] validation-logloss:0.18854 validation-auc:0.97287 validation-aucpr:0.97690
[68] validation-logloss:0.18828 validation-auc:0.97288 validation-aucpr:0.97690
[69] validation-logloss:0.18805 validation-auc:0.97292 validation-aucpr:0.97692
[70] validation-logloss:0.18774 validation-auc:0.97300 validation-aucpr:0.97697
[71] validation-logloss:0.18749 validation-auc:0.97302 validation-aucpr:0.97698
[72] validation-logloss:0.18729 validation-auc:0.97305 validation-aucpr:0.97700
{'best_iteration': '72', 'best_score': '0.9770020497729272'}
Trial 76, Fold 4: Log loss = 0.18728633940029324, Average precision = 0.9770056518527942, ROC-AUC = 0.9730516242599385, Elapsed Time = 2.883299500001158 seconds
Trial 76, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 76, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.64594 validation-auc:0.94709 validation-aucpr:0.92992
[1] validation-logloss:0.60545 validation-auc:0.96148 validation-aucpr:0.96125
[2] validation-logloss:0.57336 validation-auc:0.96311 validation-aucpr:0.96487
[3] validation-logloss:0.54018 validation-auc:0.96505 validation-aucpr:0.96831
[4] validation-logloss:0.51340 validation-auc:0.96578 validation-aucpr:0.96861
[5] validation-logloss:0.48612 validation-auc:0.96710 validation-aucpr:0.96967
[6] validation-logloss:0.46197 validation-auc:0.96746 validation-aucpr:0.96999
[7] validation-logloss:0.44051 validation-auc:0.96773 validation-aucpr:0.97035
[8] validation-logloss:0.41998 validation-auc:0.96834 validation-aucpr:0.97097
[9] validation-logloss:0.40408 validation-auc:0.96789 validation-aucpr:0.97060
[10] validation-logloss:0.38930 validation-auc:0.96816 validation-aucpr:0.97255
[11] validation-logloss:0.37574 validation-auc:0.96845 validation-aucpr:0.97268
[12] validation-logloss:0.36118 validation-auc:0.96933 validation-aucpr:0.97334
[13] validation-logloss:0.34984 validation-auc:0.96969 validation-aucpr:0.97361
[14] validation-logloss:0.33777 validation-auc:0.97013 validation-aucpr:0.97401
[15] validation-logloss:0.32650 validation-auc:0.97030 validation-aucpr:0.97411
[16] validation-logloss:0.31739 validation-auc:0.97023 validation-aucpr:0.97403
[17] validation-logloss:0.30940 validation-auc:0.96984 validation-aucpr:0.97373
[18] validation-logloss:0.30048 validation-auc:0.96986 validation-aucpr:0.97384
[19] validation-logloss:0.29383 validation-auc:0.96981 validation-aucpr:0.97382
[20] validation-logloss:0.28712 validation-auc:0.96993 validation-aucpr:0.97393
[21] validation-logloss:0.28001 validation-auc:0.97012 validation-aucpr:0.97411
[22] validation-logloss:0.27424 validation-auc:0.97037 validation-aucpr:0.97430
[23] validation-logloss:0.26793 validation-auc:0.97078 validation-aucpr:0.97452
[24] validation-logloss:0.26223 validation-auc:0.97117 validation-aucpr:0.97466
[25] validation-logloss:0.25717 validation-auc:0.97115 validation-aucpr:0.97469
[26] validation-logloss:0.25309 validation-auc:0.97124 validation-aucpr:0.97474
[27] validation-logloss:0.24919 validation-auc:0.97120 validation-aucpr:0.97470
[28] validation-logloss:0.24611 validation-auc:0.97104 validation-aucpr:0.97451
[29] validation-logloss:0.24272 validation-auc:0.97102 validation-aucpr:0.97449
[30] validation-logloss:0.23873 validation-auc:0.97117 validation-aucpr:0.97465
[31] validation-logloss:0.23509 validation-auc:0.97129 validation-aucpr:0.97475
[32] validation-logloss:0.23236 validation-auc:0.97137 validation-aucpr:0.97479
[33] validation-logloss:0.22995 validation-auc:0.97134 validation-aucpr:0.97474
[34] validation-logloss:0.22790 validation-auc:0.97138 validation-aucpr:0.97504
[35] validation-logloss:0.22538 validation-auc:0.97154 validation-aucpr:0.97517
[36] validation-logloss:0.22357 validation-auc:0.97143 validation-aucpr:0.97508
[37] validation-logloss:0.22097 validation-auc:0.97155 validation-aucpr:0.97518
[38] validation-logloss:0.21880 validation-auc:0.97151 validation-aucpr:0.97511
[39] validation-logloss:0.21714 validation-auc:0.97158 validation-aucpr:0.97513
[40] validation-logloss:0.21491 validation-auc:0.97177 validation-aucpr:0.97530
[41] validation-logloss:0.21340 validation-auc:0.97190 validation-aucpr:0.97541
[42] validation-logloss:0.21208 validation-auc:0.97198 validation-aucpr:0.97541
[43] validation-logloss:0.21097 validation-auc:0.97192 validation-aucpr:0.97530
[44] validation-logloss:0.20991 validation-auc:0.97188 validation-aucpr:0.97526
[45] validation-logloss:0.20820 validation-auc:0.97204 validation-aucpr:0.97540
[46] validation-logloss:0.20674 validation-auc:0.97206 validation-aucpr:0.97544
[47] validation-logloss:0.20572 validation-auc:0.97211 validation-aucpr:0.97546
[48] validation-logloss:0.20460 validation-auc:0.97214 validation-aucpr:0.97551
[49] validation-logloss:0.20351 validation-auc:0.97221 validation-aucpr:0.97558
[50] validation-logloss:0.20318 validation-auc:0.97206 validation-aucpr:0.97547
[51] validation-logloss:0.20244 validation-auc:0.97213 validation-aucpr:0.97552
[52] validation-logloss:0.20129 validation-auc:0.97221 validation-aucpr:0.97559
[53] validation-logloss:0.20044 validation-auc:0.97220 validation-aucpr:0.97563
[54] validation-logloss:0.19983 validation-auc:0.97225 validation-aucpr:0.97567
[55] validation-logloss:0.19925 validation-auc:0.97225 validation-aucpr:0.97556
[56] validation-logloss:0.19840 validation-auc:0.97231 validation-aucpr:0.97561
[57] validation-logloss:0.19826 validation-auc:0.97225 validation-aucpr:0.97554
[58] validation-logloss:0.19822 validation-auc:0.97210 validation-aucpr:0.97541
[59] validation-logloss:0.19797 validation-auc:0.97204 validation-aucpr:0.97527
[60] validation-logloss:0.19733 validation-auc:0.97206 validation-aucpr:0.97526
[61] validation-logloss:0.19700 validation-auc:0.97207 validation-aucpr:0.97518
[62] validation-logloss:0.19668 validation-auc:0.97205 validation-aucpr:0.97530
[63] validation-logloss:0.19610 validation-auc:0.97211 validation-aucpr:0.97533
[64] validation-logloss:0.19541 validation-auc:0.97221 validation-aucpr:0.97537
[65] validation-logloss:0.19489 validation-auc:0.97238 validation-aucpr:0.97573
[66] validation-logloss:0.19482 validation-auc:0.97234 validation-aucpr:0.97567
[67] validation-logloss:0.19435 validation-auc:0.97239 validation-aucpr:0.97570
[68] validation-logloss:0.19395 validation-auc:0.97242 validation-aucpr:0.97569
[69] validation-logloss:0.19384 validation-auc:0.97243 validation-aucpr:0.97567
[70] validation-logloss:0.19325 validation-auc:0.97259 validation-aucpr:0.97575
[71] validation-logloss:0.19291 validation-auc:0.97266 validation-aucpr:0.97588
[72] validation-logloss:0.19265 validation-auc:0.97267 validation-aucpr:0.97589
{'best_iteration': '72', 'best_score': '0.9758881601556328'}
Trial 76, Fold 5: Log loss = 0.19264611417775424, Average precision = 0.9758934275824167, ROC-AUC = 0.9726742099102614, Elapsed Time = 2.890385700000479 seconds
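Once all five folds of a trial are scored, the per-fold results have to be reduced to the value(s) Optuna optimizes. A hedged sketch of one plausible reduction (mean log loss to minimize, mean ROC-AUC to maximize, as in a multi-objective study with `directions=["minimize", "maximize"]`); the fold numbers below are illustrative and the notebook's actual objective may differ:

```python
import numpy as np

# Illustrative per-fold scores for one trial (five folds)
fold_logloss = [0.1817, 0.1866, 0.1873, 0.1926, 0.1839]
fold_auc = [0.9739, 0.9737, 0.9731, 0.9727, 0.9741]

# In a multi-objective study the objective would return both values
mean_logloss = float(np.mean(fold_logloss))
mean_auc = float(np.mean(fold_auc))
print(mean_logloss, mean_auc)
```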
Optimization Progress: 77%|#######7 | 77/100 [3:28:19<20:32, 53.57s/it]
Trial 77, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 77, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[21:27:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[0] validation-logloss:0.64124 validation-auc:0.94910 validation-aucpr:0.95290
[21:27:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[1] validation-logloss:0.59675 validation-auc:0.95926 validation-aucpr:0.95782
[21:27:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[2] validation-logloss:0.55690 validation-auc:0.96340 validation-aucpr:0.96624
[21:27:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[3] validation-logloss:0.52630 validation-auc:0.96339 validation-aucpr:0.96582
[21:27:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[4] validation-logloss:0.49429 validation-auc:0.96532 validation-aucpr:0.97072
[21:27:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[5] validation-logloss:0.46657 validation-auc:0.96587 validation-aucpr:0.97128
[21:27:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[6] validation-logloss:0.44524 validation-auc:0.96575 validation-aucpr:0.97127
[21:27:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[7] validation-logloss:0.42281 validation-auc:0.96566 validation-aucpr:0.97133
[21:27:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[8] validation-logloss:0.40169 validation-auc:0.96646 validation-aucpr:0.97201
[21:27:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[9] validation-logloss:0.38353 validation-auc:0.96657 validation-aucpr:0.97228
[21:27:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[10] validation-logloss:0.36888 validation-auc:0.96677 validation-aucpr:0.97232
[21:27:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[11] validation-logloss:0.35346 validation-auc:0.96710 validation-aucpr:0.97270
[21:27:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[12] validation-logloss:0.33982 validation-auc:0.96740 validation-aucpr:0.97293
[21:27:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[13] validation-logloss:0.32805 validation-auc:0.96748 validation-aucpr:0.97297
[21:27:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[14] validation-logloss:0.31919 validation-auc:0.96714 validation-aucpr:0.97255
[21:27:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[15] validation-logloss:0.31018 validation-auc:0.96737 validation-aucpr:0.97268
[21:27:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[16] validation-logloss:0.30094 validation-auc:0.96755 validation-aucpr:0.97280
[21:27:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[17] validation-logloss:0.29209 validation-auc:0.96757 validation-aucpr:0.97283
[21:27:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[18] validation-logloss:0.28461 validation-auc:0.96795 validation-aucpr:0.97308
[21:27:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[19] validation-logloss:0.27727 validation-auc:0.96816 validation-aucpr:0.97319
[21:27:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[20] validation-logloss:0.27078 validation-auc:0.96829 validation-aucpr:0.97325
[21:27:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[21] validation-logloss:0.26427 validation-auc:0.96878 validation-aucpr:0.97368
[21:27:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[22] validation-logloss:0.25896 validation-auc:0.96872 validation-aucpr:0.97359
[21:27:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[23] validation-logloss:0.25461 validation-auc:0.96873 validation-aucpr:0.97364
[21:27:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[24] validation-logloss:0.25072 validation-auc:0.96865 validation-aucpr:0.97356
[21:27:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[25] validation-logloss:0.24637 validation-auc:0.96885 validation-aucpr:0.97376
[21:27:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[26] validation-logloss:0.24208 validation-auc:0.96912 validation-aucpr:0.97392
[21:27:21] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[27] validation-logloss:0.23809 validation-auc:0.96920 validation-aucpr:0.97398
[21:27:21] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[28] validation-logloss:0.23496 validation-auc:0.96919 validation-aucpr:0.97395
[21:27:21] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[29] validation-logloss:0.23216 validation-auc:0.96917 validation-aucpr:0.97398
[21:27:21] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[30] validation-logloss:0.22992 validation-auc:0.96904 validation-aucpr:0.97387
[21:27:21] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[31] validation-logloss:0.22818 validation-auc:0.96874 validation-aucpr:0.97278
[21:27:21] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[32] validation-logloss:0.22556 validation-auc:0.96893 validation-aucpr:0.97293
[21:27:21] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[33] validation-logloss:0.22341 validation-auc:0.96901 validation-aucpr:0.97297
[34] validation-logloss:0.22187 validation-auc:0.96892 validation-aucpr:0.97290
[35] validation-logloss:0.22019 validation-auc:0.96889 validation-aucpr:0.97289
[36] validation-logloss:0.21836 validation-auc:0.96900 validation-aucpr:0.97297
[37] validation-logloss:0.21702 validation-auc:0.96911 validation-aucpr:0.97398
[38] validation-logloss:0.21616 validation-auc:0.96908 validation-aucpr:0.97396
[39] validation-logloss:0.21498 validation-auc:0.96915 validation-aucpr:0.97400
[40] validation-logloss:0.21394 validation-auc:0.96921 validation-aucpr:0.97400
[41] validation-logloss:0.21258 validation-auc:0.96940 validation-aucpr:0.97419
[42] validation-logloss:0.21167 validation-auc:0.96935 validation-aucpr:0.97416
[43] validation-logloss:0.21047 validation-auc:0.96956 validation-aucpr:0.97432
[44] validation-logloss:0.20967 validation-auc:0.96958 validation-aucpr:0.97431
[45] validation-logloss:0.20897 validation-auc:0.96965 validation-aucpr:0.97435
[46] validation-logloss:0.20833 validation-auc:0.96972 validation-aucpr:0.97439
[47] validation-logloss:0.20764 validation-auc:0.96980 validation-aucpr:0.97450
[48] validation-logloss:0.20710 validation-auc:0.96982 validation-aucpr:0.97459
[49] validation-logloss:0.20713 validation-auc:0.96972 validation-aucpr:0.97450
[50] validation-logloss:0.20682 validation-auc:0.96979 validation-aucpr:0.97454
[51] validation-logloss:0.20640 validation-auc:0.96980 validation-aucpr:0.97460
[52] validation-logloss:0.20576 validation-auc:0.96987 validation-aucpr:0.97469
[53] validation-logloss:0.20552 validation-auc:0.96992 validation-aucpr:0.97471
[54] validation-logloss:0.20507 validation-auc:0.97011 validation-aucpr:0.97483
[55] validation-logloss:0.20495 validation-auc:0.97007 validation-aucpr:0.97480
[56] validation-logloss:0.20468 validation-auc:0.97001 validation-aucpr:0.97478
[57] validation-logloss:0.20445 validation-auc:0.97007 validation-aucpr:0.97480
[58] validation-logloss:0.20439 validation-auc:0.97007 validation-aucpr:0.97479
[59] validation-logloss:0.20412 validation-auc:0.97010 validation-aucpr:0.97479
[60] validation-logloss:0.20385 validation-auc:0.97023 validation-aucpr:0.97489
[61] validation-logloss:0.20420 validation-auc:0.97013 validation-aucpr:0.97483
[62] validation-logloss:0.20448 validation-auc:0.97007 validation-aucpr:0.97476
[63] validation-logloss:0.20443 validation-auc:0.97011 validation-aucpr:0.97480
[64] validation-logloss:0.20463 validation-auc:0.97012 validation-aucpr:0.97480
[65] validation-logloss:0.20464 validation-auc:0.97011 validation-aucpr:0.97481
[66] validation-logloss:0.20489 validation-auc:0.97001 validation-aucpr:0.97472
[67] validation-logloss:0.20531 validation-auc:0.96993 validation-aucpr:0.97466
[68] validation-logloss:0.20533 validation-auc:0.97004 validation-aucpr:0.97474
[69] validation-logloss:0.20556 validation-auc:0.97008 validation-aucpr:0.97478
[70] validation-logloss:0.20598 validation-auc:0.96997 validation-aucpr:0.97471
[71] validation-logloss:0.20631 validation-auc:0.96991 validation-aucpr:0.97469
[72] validation-logloss:0.20650 validation-auc:0.96992 validation-aucpr:0.97470
[73] validation-logloss:0.20667 validation-auc:0.96998 validation-aucpr:0.97474
{'best_iteration': '60', 'best_score': '0.9748918052690786'}
Trial 77, Fold 1: Log loss = 0.20666904008919706, Average precision = 0.9747405903084637, ROC-AUC = 0.9699798480568579, Elapsed Time = 17.862047299997357 seconds
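The per-fold summary line above reports three scores on the held-out fold. A minimal sketch of how such a line could be assembled from validation labels and predicted probabilities with scikit-learn (the variable names and toy data here are illustrative, not the notebook's actual code):

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

def fold_metrics(y_val, p_val):
    """Scores matching the per-fold summary line: log loss, average precision, ROC-AUC."""
    return {
        "log_loss": log_loss(y_val, p_val),
        "average_precision": average_precision_score(y_val, p_val),
        "roc_auc": roc_auc_score(y_val, p_val),
    }

# Toy validation fold: a fairly well-separated binary problem.
rng = np.random.default_rng(0)
y = np.array([0] * 50 + [1] * 50)
p = np.clip(
    np.concatenate([rng.normal(0.2, 0.1, 50), rng.normal(0.8, 0.1, 50)]),
    0.01, 0.99,
)
m = fold_metrics(y, p)
print(f"Log loss = {m['log_loss']}, Average precision = {m['average_precision']}, "
      f"ROC-AUC = {m['roc_auc']}")
```

Note that `average_precision_score` summarizes the precision-recall curve (the `aucpr` column in the training log), while `roc_auc_score` corresponds to the `auc` column; both take raw probabilities, not hard labels.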
Trial 77, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 77, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.63982 validation-auc:0.95206 validation-aucpr:0.94168
[1] validation-logloss:0.59510 validation-auc:0.96174 validation-aucpr:0.96048
[2] validation-logloss:0.55920 validation-auc:0.96427 validation-aucpr:0.96810
[3] validation-logloss:0.52369 validation-auc:0.96565 validation-aucpr:0.96799
[4] validation-logloss:0.49574 validation-auc:0.96634 validation-aucpr:0.97002
[5] validation-logloss:0.46684 validation-auc:0.96787 validation-aucpr:0.97141
[6] validation-logloss:0.44162 validation-auc:0.96820 validation-aucpr:0.97189
[7] validation-logloss:0.42232 validation-auc:0.96779 validation-aucpr:0.97148
[8] validation-logloss:0.40197 validation-auc:0.96833 validation-aucpr:0.97181
[9] validation-logloss:0.38408 validation-auc:0.96835 validation-aucpr:0.97189
[10] validation-logloss:0.36711 validation-auc:0.96926 validation-aucpr:0.97262
[11] validation-logloss:0.35153 validation-auc:0.96962 validation-aucpr:0.97289
[12] validation-logloss:0.33790 validation-auc:0.96951 validation-aucpr:0.97287
[13] validation-logloss:0.32695 validation-auc:0.96956 validation-aucpr:0.97287
[14] validation-logloss:0.31540 validation-auc:0.96969 validation-aucpr:0.97313
[15] validation-logloss:0.30624 validation-auc:0.96981 validation-aucpr:0.97323
[16] validation-logloss:0.29676 validation-auc:0.96988 validation-aucpr:0.97329
[17] validation-logloss:0.28729 validation-auc:0.97045 validation-aucpr:0.97365
[18] validation-logloss:0.27911 validation-auc:0.97067 validation-aucpr:0.97382
[19] validation-logloss:0.27139 validation-auc:0.97089 validation-aucpr:0.97400
[20] validation-logloss:0.26394 validation-auc:0.97127 validation-aucpr:0.97431
[21] validation-logloss:0.25736 validation-auc:0.97158 validation-aucpr:0.97449
[22] validation-logloss:0.25109 validation-auc:0.97186 validation-aucpr:0.97469
[23] validation-logloss:0.24653 validation-auc:0.97176 validation-aucpr:0.97456
[24] validation-logloss:0.24181 validation-auc:0.97197 validation-aucpr:0.97475
[25] validation-logloss:0.23683 validation-auc:0.97206 validation-aucpr:0.97486
[26] validation-logloss:0.23278 validation-auc:0.97201 validation-aucpr:0.97505
[27] validation-logloss:0.22876 validation-auc:0.97204 validation-aucpr:0.97507
[28] validation-logloss:0.22594 validation-auc:0.97197 validation-aucpr:0.97499
[29] validation-logloss:0.22266 validation-auc:0.97187 validation-aucpr:0.97491
[30] validation-logloss:0.21937 validation-auc:0.97199 validation-aucpr:0.97504
[31] validation-logloss:0.21704 validation-auc:0.97178 validation-aucpr:0.97450
[32] validation-logloss:0.21484 validation-auc:0.97169 validation-aucpr:0.97442
[33] validation-logloss:0.21261 validation-auc:0.97189 validation-aucpr:0.97460
[34] validation-logloss:0.21034 validation-auc:0.97192 validation-aucpr:0.97461
[35] validation-logloss:0.20812 validation-auc:0.97202 validation-aucpr:0.97465
[36] validation-logloss:0.20627 validation-auc:0.97203 validation-aucpr:0.97475
[37] validation-logloss:0.20410 validation-auc:0.97220 validation-aucpr:0.97485
[38] validation-logloss:0.20256 validation-auc:0.97226 validation-aucpr:0.97485
[39] validation-logloss:0.20162 validation-auc:0.97220 validation-aucpr:0.97476
[40] validation-logloss:0.20080 validation-auc:0.97212 validation-aucpr:0.97473
[41] validation-logloss:0.19963 validation-auc:0.97205 validation-aucpr:0.97470
[42] validation-logloss:0.19851 validation-auc:0.97219 validation-aucpr:0.97470
[43] validation-logloss:0.19726 validation-auc:0.97221 validation-aucpr:0.97464
[44] validation-logloss:0.19623 validation-auc:0.97226 validation-aucpr:0.97467
[45] validation-logloss:0.19520 validation-auc:0.97231 validation-aucpr:0.97449
[46] validation-logloss:0.19427 validation-auc:0.97242 validation-aucpr:0.97498
[47] validation-logloss:0.19366 validation-auc:0.97235 validation-aucpr:0.97490
[48] validation-logloss:0.19305 validation-auc:0.97229 validation-aucpr:0.97507
[49] validation-logloss:0.19242 validation-auc:0.97234 validation-aucpr:0.97512
[50] validation-logloss:0.19197 validation-auc:0.97234 validation-aucpr:0.97497
[51] validation-logloss:0.19120 validation-auc:0.97256 validation-aucpr:0.97510
[52] validation-logloss:0.19063 validation-auc:0.97257 validation-aucpr:0.97511
[53] validation-logloss:0.19044 validation-auc:0.97257 validation-aucpr:0.97508
[54] validation-logloss:0.18966 validation-auc:0.97262 validation-aucpr:0.97514
[55] validation-logloss:0.18946 validation-auc:0.97255 validation-aucpr:0.97510
[56] validation-logloss:0.18899 validation-auc:0.97261 validation-aucpr:0.97515
[57] validation-logloss:0.18876 validation-auc:0.97262 validation-aucpr:0.97453
[58] validation-logloss:0.18818 validation-auc:0.97279 validation-aucpr:0.97452
[59] validation-logloss:0.18800 validation-auc:0.97268 validation-aucpr:0.97436
[60] validation-logloss:0.18768 validation-auc:0.97278 validation-aucpr:0.97457
[61] validation-logloss:0.18733 validation-auc:0.97280 validation-aucpr:0.97455
[62] validation-logloss:0.18702 validation-auc:0.97300 validation-aucpr:0.97566
[63] validation-logloss:0.18684 validation-auc:0.97299 validation-aucpr:0.97563
[64] validation-logloss:0.18671 validation-auc:0.97296 validation-aucpr:0.97573
[65] validation-logloss:0.18660 validation-auc:0.97299 validation-aucpr:0.97587
[66] validation-logloss:0.18659 validation-auc:0.97301 validation-aucpr:0.97581
[67] validation-logloss:0.18641 validation-auc:0.97300 validation-aucpr:0.97577
[68] validation-logloss:0.18650 validation-auc:0.97295 validation-aucpr:0.97565
[69] validation-logloss:0.18637 validation-auc:0.97299 validation-aucpr:0.97565
[70] validation-logloss:0.18649 validation-auc:0.97301 validation-aucpr:0.97569
[71] validation-logloss:0.18668 validation-auc:0.97299 validation-aucpr:0.97563
[72] validation-logloss:0.18699 validation-auc:0.97290 validation-aucpr:0.97542
[73] validation-logloss:0.18704 validation-auc:0.97290 validation-aucpr:0.97538
{'best_iteration': '65', 'best_score': '0.9758713375134983'}
Trial 77, Fold 2: Log loss = 0.18703814435806546, Average precision = 0.9753815790402707, ROC-AUC = 0.9728992112453606, Elapsed Time = 18.450700099998357 seconds
Trial 77, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 77, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.64064 validation-auc:0.95030 validation-aucpr:0.94347
[1] validation-logloss:0.59775 validation-auc:0.95936 validation-aucpr:0.95585
[2] validation-logloss:0.56142 validation-auc:0.96249 validation-aucpr:0.96331
[3] validation-logloss:0.52527 validation-auc:0.96578 validation-aucpr:0.96963
[4] validation-logloss:0.49362 validation-auc:0.96684 validation-aucpr:0.96861
[5] validation-logloss:0.46545 validation-auc:0.96709 validation-aucpr:0.96766
[6] validation-logloss:0.44088 validation-auc:0.96728 validation-aucpr:0.96980
[7] validation-logloss:0.42082 validation-auc:0.96754 validation-aucpr:0.96976
[8] validation-logloss:0.40211 validation-auc:0.96831 validation-aucpr:0.97035
[9] validation-logloss:0.38525 validation-auc:0.96887 validation-aucpr:0.97070
[10] validation-logloss:0.37034 validation-auc:0.96902 validation-aucpr:0.97083
[11] validation-logloss:0.35419 validation-auc:0.96927 validation-aucpr:0.97099
[12] validation-logloss:0.33951 validation-auc:0.96979 validation-aucpr:0.97125
[13] validation-logloss:0.32854 validation-auc:0.96988 validation-aucpr:0.97128
[14] validation-logloss:0.31829 validation-auc:0.97010 validation-aucpr:0.97147
[15] validation-logloss:0.30711 validation-auc:0.97035 validation-aucpr:0.97172
[16] validation-logloss:0.29755 validation-auc:0.97019 validation-aucpr:0.97164
[17] validation-logloss:0.28795 validation-auc:0.97057 validation-aucpr:0.97186
[18] validation-logloss:0.27982 validation-auc:0.97077 validation-aucpr:0.97199
[19] validation-logloss:0.27195 validation-auc:0.97071 validation-aucpr:0.97213
[20] validation-logloss:0.26461 validation-auc:0.97088 validation-aucpr:0.97224
[21:27:58] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[21] validation-logloss:0.25850 validation-auc:0.97078 validation-aucpr:0.97215
[21:27:58] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[22] validation-logloss:0.25362 validation-auc:0.97092 validation-aucpr:0.97221
[21:27:58] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[23] validation-logloss:0.24877 validation-auc:0.97084 validation-aucpr:0.97199
[21:27:58] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[24] validation-logloss:0.24395 validation-auc:0.97121 validation-aucpr:0.97519
[21:27:59] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[25] validation-logloss:0.23963 validation-auc:0.97115 validation-aucpr:0.97515
[21:27:59] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[26] validation-logloss:0.23578 validation-auc:0.97095 validation-aucpr:0.97498
[21:27:59] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[27] validation-logloss:0.23217 validation-auc:0.97090 validation-aucpr:0.97494
[21:27:59] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[28] validation-logloss:0.22843 validation-auc:0.97122 validation-aucpr:0.97520
[21:27:59] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[29] validation-logloss:0.22485 validation-auc:0.97142 validation-aucpr:0.97532
[21:27:59] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[30] validation-logloss:0.22164 validation-auc:0.97161 validation-aucpr:0.97558
[21:27:59] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[31] validation-logloss:0.21937 validation-auc:0.97172 validation-aucpr:0.97564
[21:28:00] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[32] validation-logloss:0.21728 validation-auc:0.97175 validation-aucpr:0.97564
[21:28:00] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[33] validation-logloss:0.21538 validation-auc:0.97160 validation-aucpr:0.97554
[21:28:00] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[34] validation-logloss:0.21333 validation-auc:0.97164 validation-aucpr:0.97559
[21:28:00] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[35] validation-logloss:0.21184 validation-auc:0.97162 validation-aucpr:0.97556
[21:28:00] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[36] validation-logloss:0.20987 validation-auc:0.97176 validation-aucpr:0.97568
[21:28:00] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[37] validation-logloss:0.20825 validation-auc:0.97177 validation-aucpr:0.97571
[21:28:01] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[38] validation-logloss:0.20673 validation-auc:0.97190 validation-aucpr:0.97577
[21:28:01] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[39] validation-logloss:0.20513 validation-auc:0.97195 validation-aucpr:0.97581
[21:28:01] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[40] validation-logloss:0.20368 validation-auc:0.97196 validation-aucpr:0.97578
[21:28:01] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[41] validation-logloss:0.20266 validation-auc:0.97185 validation-aucpr:0.97571
[21:28:01] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[42] validation-logloss:0.20183 validation-auc:0.97185 validation-aucpr:0.97571
[21:28:01] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[43] validation-logloss:0.20085 validation-auc:0.97184 validation-aucpr:0.97572
[21:28:02] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[44] validation-logloss:0.19977 validation-auc:0.97193 validation-aucpr:0.97588
[21:28:02] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[45] validation-logloss:0.19894 validation-auc:0.97192 validation-aucpr:0.97587
[21:28:02] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[46] validation-logloss:0.19785 validation-auc:0.97199 validation-aucpr:0.97594
[21:28:02] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[47] validation-logloss:0.19699 validation-auc:0.97212 validation-aucpr:0.97592
[21:28:02] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[48] validation-logloss:0.19584 validation-auc:0.97226 validation-aucpr:0.97605
[21:28:03] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[49] validation-logloss:0.19494 validation-auc:0.97240 validation-aucpr:0.97611
[21:28:03] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[50] validation-logloss:0.19429 validation-auc:0.97243 validation-aucpr:0.97613
[21:28:03] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[51] validation-logloss:0.19376 validation-auc:0.97251 validation-aucpr:0.97620
[21:28:03] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[52] validation-logloss:0.19347 validation-auc:0.97253 validation-aucpr:0.97619
[21:28:04] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[53] validation-logloss:0.19315 validation-auc:0.97257 validation-aucpr:0.97619
[21:28:04] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[54] validation-logloss:0.19251 validation-auc:0.97267 validation-aucpr:0.97620
[21:28:04] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[55] validation-logloss:0.19262 validation-auc:0.97260 validation-aucpr:0.97612
[21:28:04] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[56] validation-logloss:0.19228 validation-auc:0.97267 validation-aucpr:0.97609
[21:28:04] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[57] validation-logloss:0.19174 validation-auc:0.97279 validation-aucpr:0.97618
[21:28:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[58] validation-logloss:0.19175 validation-auc:0.97272 validation-aucpr:0.97604
[21:28:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[59] validation-logloss:0.19162 validation-auc:0.97274 validation-aucpr:0.97603
[21:28:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[60] validation-logloss:0.19156 validation-auc:0.97272 validation-aucpr:0.97597
[21:28:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[61] validation-logloss:0.19153 validation-auc:0.97271 validation-aucpr:0.97596
[21:28:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[62] validation-logloss:0.19164 validation-auc:0.97263 validation-aucpr:0.97584
[21:28:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[63] validation-logloss:0.19160 validation-auc:0.97265 validation-aucpr:0.97576
[21:28:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[64] validation-logloss:0.19170 validation-auc:0.97265 validation-aucpr:0.97567
[21:28:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[65] validation-logloss:0.19170 validation-auc:0.97267 validation-aucpr:0.97556
[21:28:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[66] validation-logloss:0.19160 validation-auc:0.97266 validation-aucpr:0.97547
[21:28:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[67] validation-logloss:0.19156 validation-auc:0.97271 validation-aucpr:0.97554
[21:28:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[68] validation-logloss:0.19179 validation-auc:0.97268 validation-aucpr:0.97552
[21:28:08] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[69] validation-logloss:0.19168 validation-auc:0.97279 validation-aucpr:0.97556
[21:28:08] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[70] validation-logloss:0.19168 validation-auc:0.97282 validation-aucpr:0.97575
[21:28:08] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[71] validation-logloss:0.19206 validation-auc:0.97274 validation-aucpr:0.97561
[21:28:08] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[72] validation-logloss:0.19217 validation-auc:0.97279 validation-aucpr:0.97581
[21:28:09] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[73] validation-logloss:0.19239 validation-auc:0.97276 validation-aucpr:0.97567
{'best_iteration': '51', 'best_score': '0.9761959525993174'}
Trial 77, Fold 3: Log loss = 0.1923887270627478, Average precision = 0.975678562191017, ROC-AUC = 0.9727598212479895, Elapsed Time = 18.685897399998794 seconds
Trial 77, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 77, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[21:28:15] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[0] validation-logloss:0.64013 validation-auc:0.95125 validation-aucpr:0.95040
[21:28:15] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[1] validation-logloss:0.59626 validation-auc:0.95914 validation-aucpr:0.96319
[21:28:16] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[2] validation-logloss:0.55707 validation-auc:0.96243 validation-aucpr:0.96865
[21:28:16] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[3] validation-logloss:0.52566 validation-auc:0.96284 validation-aucpr:0.96903
[21:28:16] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[4] validation-logloss:0.49407 validation-auc:0.96478 validation-aucpr:0.97051
[21:28:16] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[5] validation-logloss:0.46892 validation-auc:0.96527 validation-aucpr:0.97100
[21:28:16] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[6] validation-logloss:0.44417 validation-auc:0.96540 validation-aucpr:0.97121
[21:28:16] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[7] validation-logloss:0.42383 validation-auc:0.96579 validation-aucpr:0.97156
[21:28:16] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[8] validation-logloss:0.40444 validation-auc:0.96615 validation-aucpr:0.97194
[21:28:16] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[9] validation-logloss:0.38775 validation-auc:0.96647 validation-aucpr:0.97221
[21:28:16] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[10] validation-logloss:0.37094 validation-auc:0.96684 validation-aucpr:0.97257
[21:28:16] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[11] validation-logloss:0.35772 validation-auc:0.96693 validation-aucpr:0.97252
[21:28:16] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[12] validation-logloss:0.34462 validation-auc:0.96700 validation-aucpr:0.97261
[21:28:16] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[13] validation-logloss:0.33165 validation-auc:0.96732 validation-aucpr:0.97289
[21:28:17] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[14] validation-logloss:0.31952 validation-auc:0.96784 validation-aucpr:0.97330
[21:28:17] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[15] validation-logloss:0.30864 validation-auc:0.96796 validation-aucpr:0.97340
[21:28:17] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[16] validation-logloss:0.29956 validation-auc:0.96796 validation-aucpr:0.97342
[21:28:17] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[17] validation-logloss:0.29048 validation-auc:0.96808 validation-aucpr:0.97352
[21:28:17] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[18] validation-logloss:0.28335 validation-auc:0.96802 validation-aucpr:0.97349
[21:28:17] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[19] validation-logloss:0.27546 validation-auc:0.96847 validation-aucpr:0.97385
[21:28:17] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[20] validation-logloss:0.26838 validation-auc:0.96876 validation-aucpr:0.97406
[21:28:17] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[21] validation-logloss:0.26266 validation-auc:0.96886 validation-aucpr:0.97414
[21:28:17] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[22] validation-logloss:0.25681 validation-auc:0.96910 validation-aucpr:0.97433
[21:28:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[23] validation-logloss:0.25150 validation-auc:0.96906 validation-aucpr:0.97430
[21:28:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[24] validation-logloss:0.24609 validation-auc:0.96944 validation-aucpr:0.97453
[21:28:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[25] validation-logloss:0.24177 validation-auc:0.96931 validation-aucpr:0.97447
[21:28:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[26] validation-logloss:0.23742 validation-auc:0.96954 validation-aucpr:0.97459
[21:28:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[27] validation-logloss:0.23340 validation-auc:0.96970 validation-aucpr:0.97472
[21:28:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[28] validation-logloss:0.23042 validation-auc:0.96939 validation-aucpr:0.97454
[21:28:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[29] validation-logloss:0.22666 validation-auc:0.96981 validation-aucpr:0.97482
[21:28:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[30] validation-logloss:0.22340 validation-auc:0.97017 validation-aucpr:0.97505
[21:28:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[31] validation-logloss:0.22031 validation-auc:0.97045 validation-aucpr:0.97527
[21:28:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[32] validation-logloss:0.21774 validation-auc:0.97064 validation-aucpr:0.97539
[21:28:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[33] validation-logloss:0.21522 validation-auc:0.97099 validation-aucpr:0.97562
[21:28:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[34] validation-logloss:0.21304 validation-auc:0.97106 validation-aucpr:0.97566
[21:28:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[35] validation-logloss:0.21174 validation-auc:0.97094 validation-aucpr:0.97559
[21:28:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[36] validation-logloss:0.20963 validation-auc:0.97109 validation-aucpr:0.97572
[21:28:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[37] validation-logloss:0.20797 validation-auc:0.97115 validation-aucpr:0.97578
[21:28:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[38] validation-logloss:0.20643 validation-auc:0.97114 validation-aucpr:0.97575
[21:28:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[39] validation-logloss:0.20468 validation-auc:0.97137 validation-aucpr:0.97591
[21:28:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[40] validation-logloss:0.20319 validation-auc:0.97156 validation-aucpr:0.97604
[21:28:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[41] validation-logloss:0.20203 validation-auc:0.97158 validation-aucpr:0.97605
[21:28:21] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[42] validation-logloss:0.20098 validation-auc:0.97157 validation-aucpr:0.97605
[21:28:21] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[43] validation-logloss:0.19991 validation-auc:0.97167 validation-aucpr:0.97612
[21:28:21] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[44] validation-logloss:0.19921 validation-auc:0.97167 validation-aucpr:0.97612
[21:28:21] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[45] validation-logloss:0.19885 validation-auc:0.97148 validation-aucpr:0.97597
[21:28:21] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[46] validation-logloss:0.19807 validation-auc:0.97158 validation-aucpr:0.97604
[21:28:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[47] validation-logloss:0.19750 validation-auc:0.97163 validation-aucpr:0.97609
[21:28:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[48] validation-logloss:0.19693 validation-auc:0.97164 validation-aucpr:0.97608
[21:28:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[49] validation-logloss:0.19615 validation-auc:0.97174 validation-aucpr:0.97616
[21:28:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[50] validation-logloss:0.19548 validation-auc:0.97183 validation-aucpr:0.97622
[21:28:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[51] validation-logloss:0.19544 validation-auc:0.97169 validation-aucpr:0.97611
[21:28:23] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[52] validation-logloss:0.19485 validation-auc:0.97177 validation-aucpr:0.97617
[21:28:23] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[53] validation-logloss:0.19432 validation-auc:0.97186 validation-aucpr:0.97624
[XGBoost training log, Trial 77, Fold 4, iterations 54–73 condensed; repeated gbtree.cc "drop 0 trees" INFO lines removed. validation-logloss improved from 0.19432 to 0.19275 at iteration 67 (validation-auc 0.97203, validation-aucpr 0.97639), then drifted up to 0.19371 by iteration 73.]
{'best_iteration': '67', 'best_score': '0.9763874139328066'}
Trial 77, Fold 4: Log loss = 0.19370908844916987, Average precision = 0.976315444049942, ROC-AUC = 0.9719556154042409, Elapsed Time = 18.724122399999032 seconds
Trial 77, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 77, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[XGBoost training log, Trial 77, Fold 5, iterations 0–73 condensed; repeated gbtree.cc "drop 0 trees" INFO lines removed. validation-logloss fell from 0.64019 to a minimum of 0.19935 at iteration 61; the best validation-aucpr, 0.97526, was reached at iteration 54 (validation-logloss 0.19947, validation-auc 0.97166).]
{'best_iteration': '54', 'best_score': '0.9752626205930159'}
Trial 77, Fold 5: Log loss = 0.20059891678402936, Average precision = 0.9749373043194629, ROC-AUC = 0.9713875480742434, Elapsed Time = 18.439465700001165 seconds
Optimization Progress: 78%|#######8 | 78/100 [3:29:59<24:45, 67.54s/it]
Trial 78, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 78, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[XGBoost training log, Trial 78, Fold 1, iterations 0–39 condensed (output truncated mid-run); repeated gbtree.cc "drop 0 trees" INFO lines removed. Over the logged iterations, validation-logloss fell from 0.66663 to 0.27906 and validation-auc rose from 0.94171 to 0.96955.]
[40] validation-logloss:0.27555 validation-auc:0.96953 validation-aucpr:0.97422
[21:28:59] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[41] validation-logloss:0.27196 validation-auc:0.96967 validation-aucpr:0.97432
[21:29:00] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[42] validation-logloss:0.26921 validation-auc:0.96966 validation-aucpr:0.97430
[21:29:00] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[43] validation-logloss:0.26660 validation-auc:0.96971 validation-aucpr:0.97429
[21:29:00] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[44] validation-logloss:0.26347 validation-auc:0.96975 validation-aucpr:0.97434
{'best_iteration': '44', 'best_score': '0.9743414384395959'}
Trial 78, Fold 1: Log loss = 0.2634666136903201, Average precision = 0.9743449721498787, ROC-AUC = 0.9697506518164828, Elapsed Time = 7.150237099998776 seconds
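Each per-fold summary line above reports log loss, average precision, and ROC-AUC on the held-out fold (the notebook computes these with `sklearn.metrics`). As a minimal stdlib-only sketch of what two of those numbers mean, the hypothetical helpers below compute log loss directly from its definition and ROC-AUC as the Mann-Whitney pairwise-ranking statistic:

```python
import math

def fold_log_loss(y_true, y_prob, eps=1e-15):
    # Mean negative log-likelihood of the true class, with probabilities
    # clipped away from 0 and 1 for numerical stability.
    total = 0.0
    for y, p in zip(y_true, y_prob):
        p = min(max(p, eps), 1 - eps)
        total += -(y * math.log(p) + (1 - y) * math.log(1 - p))
    return total / len(y_true)

def fold_roc_auc(y_true, y_prob):
    # ROC-AUC as the fraction of (positive, negative) pairs where the
    # positive example is scored higher (ties count as half a win).
    pos = [p for y, p in zip(y_true, y_prob) if y == 1]
    neg = [p for y, p in zip(y_true, y_prob) if y == 0]
    wins = sum(1.0 if pp > pn else 0.5 if pp == pn else 0.0
               for pp in pos for pn in neg)
    return wins / (len(pos) * len(neg))

# Tiny worked example (hypothetical probabilities, not notebook data):
y = [0, 0, 1, 1]
p = [0.1, 0.4, 0.35, 0.8]
print(fold_log_loss(y, p), fold_roc_auc(y, p))  # AUC here is 0.75
```

For real use the `sklearn.metrics` versions imported at the top of the notebook are preferred; this sketch only makes the reported quantities concrete.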
Trial 78, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 78, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
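The train/validation size lines above report each fold's label counts and the 0/1 class ratio. A small sketch of how such a line can be produced with `collections.Counter` (already imported in this notebook); the function name and example labels are hypothetical:

```python
from collections import Counter

def describe_fold(name, y):
    # Summarize a fold's label distribution in the same format as the log lines.
    counts = Counter(y)
    n0, n1 = counts[0], counts[1]
    print(f"{name}: size = {len(y)} where 0 = {n0}, 1 = {n1}, 0/1 = {n0 / n1}")
    return n0, n1

# Illustration only, not notebook data:
describe_fold("Fold 2: Train", [0, 0, 1, 0, 1, 1])
```

Near-1.0 ratios in every fold, as seen above, indicate the stratified grouped splits kept the classes balanced.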
[0] validation-logloss:0.66660 validation-auc:0.94258 validation-aucpr:0.91905
[1] validation-logloss:0.64419 validation-auc:0.96093 validation-aucpr:0.95317
[2] validation-logloss:0.62103 validation-auc:0.96378 validation-aucpr:0.96142
[3] validation-logloss:0.59913 validation-auc:0.96715 validation-aucpr:0.96839
[4] validation-logloss:0.57891 validation-auc:0.96845 validation-aucpr:0.97224
[5] validation-logloss:0.55967 validation-auc:0.96904 validation-aucpr:0.97274
[6] validation-logloss:0.54164 validation-auc:0.96914 validation-aucpr:0.97281
[7] validation-logloss:0.52494 validation-auc:0.96906 validation-aucpr:0.97278
[8] validation-logloss:0.50911 validation-auc:0.96911 validation-aucpr:0.97282
[9] validation-logloss:0.49435 validation-auc:0.96934 validation-aucpr:0.97302
[10] validation-logloss:0.48176 validation-auc:0.96954 validation-aucpr:0.97311
[11] validation-logloss:0.46845 validation-auc:0.96965 validation-aucpr:0.97315
[12] validation-logloss:0.45737 validation-auc:0.96975 validation-aucpr:0.97320
[13] validation-logloss:0.44521 validation-auc:0.96982 validation-aucpr:0.97322
[14] validation-logloss:0.43365 validation-auc:0.96982 validation-aucpr:0.97322
[15] validation-logloss:0.42311 validation-auc:0.96970 validation-aucpr:0.97314
[16] validation-logloss:0.41271 validation-auc:0.96989 validation-aucpr:0.97325
[17] validation-logloss:0.40279 validation-auc:0.97014 validation-aucpr:0.97347
[18] validation-logloss:0.39337 validation-auc:0.97046 validation-aucpr:0.97363
[19] validation-logloss:0.38431 validation-auc:0.97055 validation-aucpr:0.97366
[20] validation-logloss:0.37660 validation-auc:0.97064 validation-aucpr:0.97372
[21] validation-logloss:0.36910 validation-auc:0.97082 validation-aucpr:0.97395
[22] validation-logloss:0.36109 validation-auc:0.97101 validation-aucpr:0.97413
[23] validation-logloss:0.35379 validation-auc:0.97089 validation-aucpr:0.97404
[24] validation-logloss:0.34745 validation-auc:0.97110 validation-aucpr:0.97416
[25] validation-logloss:0.34142 validation-auc:0.97123 validation-aucpr:0.97425
[26] validation-logloss:0.33480 validation-auc:0.97137 validation-aucpr:0.97467
[27] validation-logloss:0.32924 validation-auc:0.97135 validation-aucpr:0.97470
[28] validation-logloss:0.32314 validation-auc:0.97155 validation-aucpr:0.97484
[29] validation-logloss:0.31739 validation-auc:0.97164 validation-aucpr:0.97491
[30] validation-logloss:0.31247 validation-auc:0.97165 validation-aucpr:0.97489
[31] validation-logloss:0.30778 validation-auc:0.97182 validation-aucpr:0.97503
[32] validation-logloss:0.30239 validation-auc:0.97213 validation-aucpr:0.97525
[33] validation-logloss:0.29785 validation-auc:0.97231 validation-aucpr:0.97538
[34] validation-logloss:0.29402 validation-auc:0.97216 validation-aucpr:0.97525
[35] validation-logloss:0.28944 validation-auc:0.97215 validation-aucpr:0.97526
[36] validation-logloss:0.28513 validation-auc:0.97215 validation-aucpr:0.97515
[37] validation-logloss:0.28078 validation-auc:0.97236 validation-aucpr:0.97533
[38] validation-logloss:0.27663 validation-auc:0.97248 validation-aucpr:0.97542
[39] validation-logloss:0.27339 validation-auc:0.97245 validation-aucpr:0.97537
[40] validation-logloss:0.26972 validation-auc:0.97247 validation-aucpr:0.97538
[41] validation-logloss:0.26687 validation-auc:0.97240 validation-aucpr:0.97532
[42] validation-logloss:0.26392 validation-auc:0.97255 validation-aucpr:0.97540
[43] validation-logloss:0.26067 validation-auc:0.97263 validation-aucpr:0.97548
[44] validation-logloss:0.25753 validation-auc:0.97261 validation-aucpr:0.97546
{'best_iteration': '43', 'best_score': '0.9754849993403878'}
Trial 78, Fold 2: Log loss = 0.25753349892157557, Average precision = 0.9754106024749805, ROC-AUC = 0.9726118768135756, Elapsed Time = 5.847930500000075 seconds
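The `{'best_iteration': ..., 'best_score': ...}` dicts printed after each fold record the boosting round with the best validation score (here the last-listed eval metric, aucpr). A hedged stdlib sketch of that bookkeeping, with a hypothetical helper name:

```python
def best_by_score(scores):
    # Given one score per boosting round for a metric that is maximized
    # (e.g. validation aucpr), return the round index and value of the best
    # score, stringified the way the dicts above are printed.
    best_it = max(range(len(scores)), key=lambda i: scores[i])
    return {"best_iteration": str(best_it), "best_score": str(scores[best_it])}

print(best_by_score([0.91, 0.95, 0.97, 0.96]))  # best at round 2
```

In practice XGBoost exposes the same information via the booster's `best_iteration` attribute when early stopping is enabled; this sketch only mirrors the printed format.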
Trial 78, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 78, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.66640 validation-auc:0.93718 validation-aucpr:0.89307
[1] validation-logloss:0.64415 validation-auc:0.95825 validation-aucpr:0.94779
[2] validation-logloss:0.62339 validation-auc:0.96143 validation-aucpr:0.95842
[3] validation-logloss:0.60217 validation-auc:0.96362 validation-aucpr:0.96591
[4] validation-logloss:0.58403 validation-auc:0.96465 validation-aucpr:0.96847
[5] validation-logloss:0.56457 validation-auc:0.96590 validation-aucpr:0.96969
[6] validation-logloss:0.54855 validation-auc:0.96645 validation-aucpr:0.97121
[7] validation-logloss:0.53085 validation-auc:0.96767 validation-aucpr:0.97206
[8] validation-logloss:0.51647 validation-auc:0.96795 validation-aucpr:0.97229
[9] validation-logloss:0.50280 validation-auc:0.96803 validation-aucpr:0.97265
[10] validation-logloss:0.48803 validation-auc:0.96832 validation-aucpr:0.97293
[11] validation-logloss:0.47399 validation-auc:0.96876 validation-aucpr:0.97335
[12] validation-logloss:0.46217 validation-auc:0.96925 validation-aucpr:0.97364
[13] validation-logloss:0.45143 validation-auc:0.96925 validation-aucpr:0.97369
[14] validation-logloss:0.43936 validation-auc:0.96967 validation-aucpr:0.97404
[15] validation-logloss:0.42941 validation-auc:0.96959 validation-aucpr:0.97400
[16] validation-logloss:0.41834 validation-auc:0.96986 validation-aucpr:0.97426
[17] validation-logloss:0.40824 validation-auc:0.97011 validation-aucpr:0.97446
[18] validation-logloss:0.39822 validation-auc:0.97030 validation-aucpr:0.97462
[19] validation-logloss:0.38872 validation-auc:0.97065 validation-aucpr:0.97496
[20] validation-logloss:0.37978 validation-auc:0.97085 validation-aucpr:0.97514
[21] validation-logloss:0.37121 validation-auc:0.97102 validation-aucpr:0.97527
[22] validation-logloss:0.36463 validation-auc:0.97073 validation-aucpr:0.97500
[23] validation-logloss:0.35712 validation-auc:0.97072 validation-aucpr:0.97499
[24] validation-logloss:0.35068 validation-auc:0.97078 validation-aucpr:0.97499
[25] validation-logloss:0.34402 validation-auc:0.97070 validation-aucpr:0.97495
[26] validation-logloss:0.33741 validation-auc:0.97082 validation-aucpr:0.97504
[27] validation-logloss:0.33091 validation-auc:0.97100 validation-aucpr:0.97518
[28] validation-logloss:0.32460 validation-auc:0.97108 validation-aucpr:0.97528
[29] validation-logloss:0.31868 validation-auc:0.97121 validation-aucpr:0.97539
[30] validation-logloss:0.31284 validation-auc:0.97138 validation-aucpr:0.97552
[31] validation-logloss:0.30823 validation-auc:0.97150 validation-aucpr:0.97560
[32] validation-logloss:0.30289 validation-auc:0.97156 validation-aucpr:0.97568
[33] validation-logloss:0.29858 validation-auc:0.97164 validation-aucpr:0.97568
[34] validation-logloss:0.29388 validation-auc:0.97170 validation-aucpr:0.97574
[35] validation-logloss:0.28937 validation-auc:0.97180 validation-aucpr:0.97580
[36] validation-logloss:0.28585 validation-auc:0.97180 validation-aucpr:0.97579
[37] validation-logloss:0.28155 validation-auc:0.97198 validation-aucpr:0.97590
[38] validation-logloss:0.27798 validation-auc:0.97193 validation-aucpr:0.97585
[39] validation-logloss:0.27399 validation-auc:0.97193 validation-aucpr:0.97586
[40] validation-logloss:0.27051 validation-auc:0.97180 validation-aucpr:0.97578
[41] validation-logloss:0.26770 validation-auc:0.97173 validation-aucpr:0.97570
[42] validation-logloss:0.26416 validation-auc:0.97176 validation-aucpr:0.97572
[43] validation-logloss:0.26119 validation-auc:0.97185 validation-aucpr:0.97577
[44] validation-logloss:0.25780 validation-auc:0.97200 validation-aucpr:0.97589
{'best_iteration': '37', 'best_score': '0.9759014238086381'}
Trial 78, Fold 3: Log loss = 0.25779883428118283, Average precision = 0.9758932484338817, ROC-AUC = 0.9719994209340177, Elapsed Time = 5.3621460999966075 seconds
Trial 78, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 78, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[21:29:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[0] validation-logloss:0.66641 validation-auc:0.93185 validation-aucpr:0.90151
[21:29:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[1] validation-logloss:0.64424 validation-auc:0.95972 validation-aucpr:0.95595
[21:29:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[2] validation-logloss:0.62119 validation-auc:0.96387 validation-aucpr:0.96402
[21:29:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[3] validation-logloss:0.60025 validation-auc:0.96519 validation-aucpr:0.96611
[21:29:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[4] validation-logloss:0.58232 validation-auc:0.96579 validation-aucpr:0.97122
[21:29:14] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[5] validation-logloss:0.56548 validation-auc:0.96609 validation-aucpr:0.97147
[21:29:14] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[6] validation-logloss:0.54932 validation-auc:0.96608 validation-aucpr:0.97150
[21:29:14] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[7] validation-logloss:0.53427 validation-auc:0.96647 validation-aucpr:0.97159
[21:29:14] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[8] validation-logloss:0.51783 validation-auc:0.96669 validation-aucpr:0.97192
[21:29:14] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[9] validation-logloss:0.50426 validation-auc:0.96679 validation-aucpr:0.97197
[10] validation-logloss:0.49001 validation-auc:0.96746 validation-aucpr:0.97252
[11] validation-logloss:0.47603 validation-auc:0.96777 validation-aucpr:0.97283
[12] validation-logloss:0.46507 validation-auc:0.96763 validation-aucpr:0.97266
[13] validation-logloss:0.45254 validation-auc:0.96763 validation-aucpr:0.97269
[14] validation-logloss:0.44031 validation-auc:0.96814 validation-aucpr:0.97312
[15] validation-logloss:0.43050 validation-auc:0.96822 validation-aucpr:0.97316
[16] validation-logloss:0.41979 validation-auc:0.96841 validation-aucpr:0.97337
[17] validation-logloss:0.40966 validation-auc:0.96841 validation-aucpr:0.97337
[18] validation-logloss:0.40134 validation-auc:0.96841 validation-aucpr:0.97334
[19] validation-logloss:0.39195 validation-auc:0.96851 validation-aucpr:0.97344
[20] validation-logloss:0.38308 validation-auc:0.96868 validation-aucpr:0.97362
[21] validation-logloss:0.37462 validation-auc:0.96881 validation-aucpr:0.97373
[22] validation-logloss:0.36652 validation-auc:0.96900 validation-aucpr:0.97390
[23] validation-logloss:0.36000 validation-auc:0.96900 validation-aucpr:0.97387
[24] validation-logloss:0.35396 validation-auc:0.96908 validation-aucpr:0.97392
[25] validation-logloss:0.34677 validation-auc:0.96928 validation-aucpr:0.97407
[26] validation-logloss:0.33972 validation-auc:0.96943 validation-aucpr:0.97419
[27] validation-logloss:0.33434 validation-auc:0.96947 validation-aucpr:0.97422
[28] validation-logloss:0.32940 validation-auc:0.96946 validation-aucpr:0.97422
[29] validation-logloss:0.32462 validation-auc:0.96946 validation-aucpr:0.97421
[30] validation-logloss:0.31964 validation-auc:0.96946 validation-aucpr:0.97420
[31] validation-logloss:0.31510 validation-auc:0.96937 validation-aucpr:0.97412
[32] validation-logloss:0.30987 validation-auc:0.96943 validation-aucpr:0.97420
[33] validation-logloss:0.30572 validation-auc:0.96942 validation-aucpr:0.97420
[34] validation-logloss:0.30157 validation-auc:0.96953 validation-aucpr:0.97426
[35] validation-logloss:0.29670 validation-auc:0.96957 validation-aucpr:0.97430
[36] validation-logloss:0.29218 validation-auc:0.96958 validation-aucpr:0.97433
[37] validation-logloss:0.28767 validation-auc:0.96979 validation-aucpr:0.97448
[38] validation-logloss:0.28379 validation-auc:0.96984 validation-aucpr:0.97451
[39] validation-logloss:0.28068 validation-auc:0.96976 validation-aucpr:0.97444
[40] validation-logloss:0.27762 validation-auc:0.96973 validation-aucpr:0.97440
[41] validation-logloss:0.27476 validation-auc:0.96966 validation-aucpr:0.97436
[42] validation-logloss:0.27107 validation-auc:0.96970 validation-aucpr:0.97441
[43] validation-logloss:0.26832 validation-auc:0.96968 validation-aucpr:0.97441
[44] validation-logloss:0.26568 validation-auc:0.96968 validation-aucpr:0.97444
{'best_iteration': '38', 'best_score': '0.9745071265456206'}
Trial 78, Fold 4: Log loss = 0.26568473621462146, Average precision = 0.9744429625257657, ROC-AUC = 0.9696823787348798, Elapsed Time = 5.064143999999942 seconds
Trial 78, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 78, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.66649 validation-auc:0.94023 validation-aucpr:0.90318
[1] validation-logloss:0.64191 validation-auc:0.96175 validation-aucpr:0.95572
[2] validation-logloss:0.62133 validation-auc:0.96121 validation-aucpr:0.95928
[3] validation-logloss:0.60181 validation-auc:0.96250 validation-aucpr:0.96298
[4] validation-logloss:0.58141 validation-auc:0.96401 validation-aucpr:0.96614
[5] validation-logloss:0.56207 validation-auc:0.96564 validation-aucpr:0.96820
[6] validation-logloss:0.54427 validation-auc:0.96648 validation-aucpr:0.97033
[7] validation-logloss:0.52953 validation-auc:0.96651 validation-aucpr:0.97007
[8] validation-logloss:0.51536 validation-auc:0.96716 validation-aucpr:0.97036
[9] validation-logloss:0.50229 validation-auc:0.96688 validation-aucpr:0.97000
[10] validation-logloss:0.49014 validation-auc:0.96678 validation-aucpr:0.97114
[11] validation-logloss:0.47858 validation-auc:0.96683 validation-aucpr:0.97107
[12] validation-logloss:0.46550 validation-auc:0.96703 validation-aucpr:0.97125
[13] validation-logloss:0.45298 validation-auc:0.96751 validation-aucpr:0.97163
[14] validation-logloss:0.44273 validation-auc:0.96774 validation-aucpr:0.97177
[15] validation-logloss:0.43314 validation-auc:0.96785 validation-aucpr:0.97181
[16] validation-logloss:0.42218 validation-auc:0.96810 validation-aucpr:0.97204
[17] validation-logloss:0.41345 validation-auc:0.96791 validation-aucpr:0.97184
[18] validation-logloss:0.40395 validation-auc:0.96798 validation-aucpr:0.97193
[19] validation-logloss:0.39503 validation-auc:0.96821 validation-aucpr:0.97217
[20] validation-logloss:0.38682 validation-auc:0.96847 validation-aucpr:0.97277
[21] validation-logloss:0.37973 validation-auc:0.96833 validation-aucpr:0.97262
[22] validation-logloss:0.37172 validation-auc:0.96864 validation-aucpr:0.97282
[23] validation-logloss:0.36388 validation-auc:0.96876 validation-aucpr:0.97294
[24] validation-logloss:0.35756 validation-auc:0.96862 validation-aucpr:0.97283
[25] validation-logloss:0.35167 validation-auc:0.96850 validation-aucpr:0.97271
[26] validation-logloss:0.34505 validation-auc:0.96853 validation-aucpr:0.97274
[27] validation-logloss:0.33843 validation-auc:0.96870 validation-aucpr:0.97293
[28] validation-logloss:0.33330 validation-auc:0.96864 validation-aucpr:0.97285
[29] validation-logloss:0.32694 validation-auc:0.96902 validation-aucpr:0.97314
[30] validation-logloss:0.32232 validation-auc:0.96896 validation-aucpr:0.97308
[31] validation-logloss:0.31683 validation-auc:0.96905 validation-aucpr:0.97314
[32] validation-logloss:0.31144 validation-auc:0.96921 validation-aucpr:0.97326
[33] validation-logloss:0.30640 validation-auc:0.96927 validation-aucpr:0.97333
[34] validation-logloss:0.30173 validation-auc:0.96941 validation-aucpr:0.97347
[35] validation-logloss:0.29740 validation-auc:0.96948 validation-aucpr:0.97352
[36] validation-logloss:0.29391 validation-auc:0.96941 validation-aucpr:0.97344
[37] validation-logloss:0.29059 validation-auc:0.96926 validation-aucpr:0.97330
[38] validation-logloss:0.28721 validation-auc:0.96928 validation-aucpr:0.97329
[39] validation-logloss:0.28318 validation-auc:0.96936 validation-aucpr:0.97336
[40] validation-logloss:0.27995 validation-auc:0.96946 validation-aucpr:0.97343
[41] validation-logloss:0.27615 validation-auc:0.96967 validation-aucpr:0.97362
[42] validation-logloss:0.27337 validation-auc:0.96978 validation-aucpr:0.97369
[43] validation-logloss:0.27063 validation-auc:0.96978 validation-aucpr:0.97367
[44] validation-logloss:0.26746 validation-auc:0.96984 validation-aucpr:0.97371
{'best_iteration': '44', 'best_score': '0.9737137481701315'}
Trial 78, Fold 5: Log loss = 0.26746121894649966, Average precision = 0.9737135246742918, ROC-AUC = 0.9698394366720547, Elapsed Time = 5.0369890999972995 seconds
Optimization Progress: 79%|#######9 | 79/100 [3:30:36<20:23, 58.25s/it]
Trial 79, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 79, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.68590 validation-auc:0.95495 validation-aucpr:0.95753
[1] validation-logloss:0.67954 validation-auc:0.95942 validation-aucpr:0.96511
[2] validation-logloss:0.67347 validation-auc:0.95824 validation-aucpr:0.96384
[3] validation-logloss:0.66658 validation-auc:0.96148 validation-aucpr:0.96716
[4] validation-logloss:0.65966 validation-auc:0.96332 validation-aucpr:0.96875
[5] validation-logloss:0.65328 validation-auc:0.96403 validation-aucpr:0.96940
[6] validation-logloss:0.64700 validation-auc:0.96436 validation-aucpr:0.97035
[7] validation-logloss:0.64135 validation-auc:0.96499 validation-aucpr:0.97074
[8] validation-logloss:0.63588 validation-auc:0.96486 validation-aucpr:0.97050
[9] validation-logloss:0.63057 validation-auc:0.96474 validation-aucpr:0.97038
[10] validation-logloss:0.62444 validation-auc:0.96540 validation-aucpr:0.97100
[11] validation-logloss:0.61837 validation-auc:0.96565 validation-aucpr:0.97126
[12] validation-logloss:0.61323 validation-auc:0.96516 validation-aucpr:0.97079
[13] validation-logloss:0.60753 validation-auc:0.96558 validation-aucpr:0.97115
[14] validation-logloss:0.60192 validation-auc:0.96600 validation-aucpr:0.97151
[15] validation-logloss:0.59630 validation-auc:0.96621 validation-aucpr:0.97173
[16] validation-logloss:0.59077 validation-auc:0.96616 validation-aucpr:0.97174
[17] validation-logloss:0.58610 validation-auc:0.96589 validation-aucpr:0.97150
[18] validation-logloss:0.58079 validation-auc:0.96603 validation-aucpr:0.97169
[19] validation-logloss:0.57555 validation-auc:0.96616 validation-aucpr:0.97180
[20] validation-logloss:0.57027 validation-auc:0.96643 validation-aucpr:0.97204
[21] validation-logloss:0.56511 validation-auc:0.96675 validation-aucpr:0.97233
[22] validation-logloss:0.56010 validation-auc:0.96681 validation-aucpr:0.97240
[23] validation-logloss:0.55540 validation-auc:0.96665 validation-aucpr:0.97234
[24] validation-logloss:0.55067 validation-auc:0.96663 validation-aucpr:0.97232
[25] validation-logloss:0.54595 validation-auc:0.96677 validation-aucpr:0.97242
[26] validation-logloss:0.54119 validation-auc:0.96690 validation-aucpr:0.97252
[27] validation-logloss:0.53729 validation-auc:0.96685 validation-aucpr:0.97246
[28] validation-logloss:0.53295 validation-auc:0.96689 validation-aucpr:0.97248
[29] validation-logloss:0.52850 validation-auc:0.96706 validation-aucpr:0.97259
[30] validation-logloss:0.52413 validation-auc:0.96740 validation-aucpr:0.97333
[31] validation-logloss:0.51988 validation-auc:0.96731 validation-aucpr:0.97326
[32] validation-logloss:0.51553 validation-auc:0.96726 validation-aucpr:0.97325
[33] validation-logloss:0.51140 validation-auc:0.96712 validation-aucpr:0.97316
[34] validation-logloss:0.50720 validation-auc:0.96729 validation-aucpr:0.97326
[35] validation-logloss:0.50314 validation-auc:0.96731 validation-aucpr:0.97330
[36] validation-logloss:0.49919 validation-auc:0.96744 validation-aucpr:0.97338
[37] validation-logloss:0.49563 validation-auc:0.96752 validation-aucpr:0.97341
[38] validation-logloss:0.49172 validation-auc:0.96757 validation-aucpr:0.97344
[39] validation-logloss:0.48777 validation-auc:0.96767 validation-aucpr:0.97354
[40] validation-logloss:0.48397 validation-auc:0.96771 validation-aucpr:0.97354
[41] validation-logloss:0.48072 validation-auc:0.96778 validation-aucpr:0.97357
[42] validation-logloss:0.47714 validation-auc:0.96778 validation-aucpr:0.97359
[43] validation-logloss:0.47366 validation-auc:0.96782 validation-aucpr:0.97361
[44] validation-logloss:0.47059 validation-auc:0.96796 validation-aucpr:0.97367
[45] validation-logloss:0.46750 validation-auc:0.96797 validation-aucpr:0.97369
[46] validation-logloss:0.46398 validation-auc:0.96806 validation-aucpr:0.97375
[47] validation-logloss:0.46073 validation-auc:0.96805 validation-aucpr:0.97376
[48] validation-logloss:0.45743 validation-auc:0.96809 validation-aucpr:0.97378
[49] validation-logloss:0.45467 validation-auc:0.96810 validation-aucpr:0.97373
[50] validation-logloss:0.45145 validation-auc:0.96808 validation-aucpr:0.97374
[51] validation-logloss:0.44830 validation-auc:0.96809 validation-aucpr:0.97373
[52] validation-logloss:0.44516 validation-auc:0.96812 validation-aucpr:0.97375
[53] validation-logloss:0.44251 validation-auc:0.96815 validation-aucpr:0.97374
[54] validation-logloss:0.43985 validation-auc:0.96815 validation-aucpr:0.97372
[21:30:48] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[55] validation-logloss:0.43695 validation-auc:0.96811 validation-aucpr:0.97371
[21:30:49] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[56] validation-logloss:0.43394 validation-auc:0.96817 validation-aucpr:0.97375
[21:30:51] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[57] validation-logloss:0.43096 validation-auc:0.96824 validation-aucpr:0.97382
[21:30:53] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[58] validation-logloss:0.42801 validation-auc:0.96829 validation-aucpr:0.97388
[21:30:54] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[59] validation-logloss:0.42564 validation-auc:0.96821 validation-aucpr:0.97380
[21:30:56] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[60] validation-logloss:0.42322 validation-auc:0.96819 validation-aucpr:0.97376
[21:30:58] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[61] validation-logloss:0.42055 validation-auc:0.96821 validation-aucpr:0.97378
[21:31:00] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[62] validation-logloss:0.41823 validation-auc:0.96823 validation-aucpr:0.97378
[21:31:01] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[63] validation-logloss:0.41586 validation-auc:0.96820 validation-aucpr:0.97373
[21:31:03] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[64] validation-logloss:0.41319 validation-auc:0.96822 validation-aucpr:0.97374
[21:31:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[65] validation-logloss:0.41059 validation-auc:0.96825 validation-aucpr:0.97378
[21:31:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[66] validation-logloss:0.40791 validation-auc:0.96829 validation-aucpr:0.97381
[21:31:09] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[67] validation-logloss:0.40536 validation-auc:0.96827 validation-aucpr:0.97379
[21:31:11] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[68] validation-logloss:0.40284 validation-auc:0.96830 validation-aucpr:0.97382
[21:31:12] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[69] validation-logloss:0.40034 validation-auc:0.96824 validation-aucpr:0.97377
[21:31:14] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[70] validation-logloss:0.39779 validation-auc:0.96831 validation-aucpr:0.97381
[21:31:16] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[71] validation-logloss:0.39525 validation-auc:0.96843 validation-aucpr:0.97390
[21:31:17] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[72] validation-logloss:0.39323 validation-auc:0.96841 validation-aucpr:0.97385
[21:31:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[73] validation-logloss:0.39084 validation-auc:0.96838 validation-aucpr:0.97384
[21:31:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[74] validation-logloss:0.38847 validation-auc:0.96844 validation-aucpr:0.97389
[21:31:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[75] validation-logloss:0.38612 validation-auc:0.96859 validation-aucpr:0.97400
[21:31:23] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[76] validation-logloss:0.38377 validation-auc:0.96862 validation-aucpr:0.97402
[21:31:25] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[77] validation-logloss:0.38149 validation-auc:0.96867 validation-aucpr:0.97406
[21:31:27] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[78] validation-logloss:0.37957 validation-auc:0.96865 validation-aucpr:0.97403
[21:31:28] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[79] validation-logloss:0.37742 validation-auc:0.96866 validation-aucpr:0.97404
[21:31:30] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[80] validation-logloss:0.37518 validation-auc:0.96869 validation-aucpr:0.97407
[21:31:31] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[81] validation-logloss:0.37317 validation-auc:0.96863 validation-aucpr:0.97404
[21:31:33] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[82] validation-logloss:0.37146 validation-auc:0.96857 validation-aucpr:0.97397
[21:31:34] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[83] validation-logloss:0.36946 validation-auc:0.96858 validation-aucpr:0.97399
[21:31:36] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[84] validation-logloss:0.36747 validation-auc:0.96860 validation-aucpr:0.97403
[21:31:38] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[85] validation-logloss:0.36542 validation-auc:0.96862 validation-aucpr:0.97405
[21:31:39] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[86] validation-logloss:0.36345 validation-auc:0.96864 validation-aucpr:0.97405
[21:31:41] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[87] validation-logloss:0.36151 validation-auc:0.96863 validation-aucpr:0.97406
[21:31:42] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[88] validation-logloss:0.35980 validation-auc:0.96863 validation-aucpr:0.97405
{'best_iteration': '80', 'best_score': '0.9740666100234081'}
Trial 79, Fold 1: Log loss = 0.35979713209064984, Average precision = 0.9740550756475213, ROC-AUC = 0.9686318545284218, Elapsed Time = 132.87924360000034 seconds
Trial 79, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 79, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[Trial 79, Fold 2 training log, iterations 0–86 (truncated): validation-logloss fell from 0.68605 to 0.36047, validation-auc rose from 0.95094 to 0.97053, and validation-aucpr from 0.95519 to 0.97363. The repeated per-iteration XGBoost INFO lines ("drop 0 trees, weight = 1") are omitted.]
[87] validation-logloss:0.35844 validation-auc:0.97050 validation-aucpr:0.97360
[21:33:49] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[88] validation-logloss:0.35648 validation-auc:0.97045 validation-aucpr:0.97357
{'best_iteration': '86', 'best_score': '0.9736287736436597'}
Trial 79, Fold 2: Log loss = 0.35647722594563985, Average precision = 0.9735348578673697, ROC-AUC = 0.9704456013021172, Elapsed Time = 126.08551309999893 seconds
Trial 79, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 79, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.68583 validation-auc:0.95657 validation-aucpr:0.95778
[1] validation-logloss:0.67871 validation-auc:0.96236 validation-aucpr:0.96444
[2] validation-logloss:0.67191 validation-auc:0.96376 validation-aucpr:0.96792
[3] validation-logloss:0.66503 validation-auc:0.96431 validation-aucpr:0.96843
[4] validation-logloss:0.65824 validation-auc:0.96523 validation-aucpr:0.96927
[5] validation-logloss:0.65242 validation-auc:0.96612 validation-aucpr:0.97044
[6] validation-logloss:0.64601 validation-auc:0.96652 validation-aucpr:0.97130
[7] validation-logloss:0.64026 validation-auc:0.96709 validation-aucpr:0.97163
[8] validation-logloss:0.63468 validation-auc:0.96723 validation-aucpr:0.97166
[9] validation-logloss:0.62852 validation-auc:0.96721 validation-aucpr:0.97173
[10] validation-logloss:0.62249 validation-auc:0.96728 validation-aucpr:0.97179
[11] validation-logloss:0.61643 validation-auc:0.96765 validation-aucpr:0.97207
[12] validation-logloss:0.61119 validation-auc:0.96762 validation-aucpr:0.97194
[13] validation-logloss:0.60593 validation-auc:0.96765 validation-aucpr:0.97194
[14] validation-logloss:0.60109 validation-auc:0.96752 validation-aucpr:0.97182
[15] validation-logloss:0.59564 validation-auc:0.96766 validation-aucpr:0.97191
[16] validation-logloss:0.59012 validation-auc:0.96772 validation-aucpr:0.97194
[17] validation-logloss:0.58486 validation-auc:0.96761 validation-aucpr:0.97175
[18] validation-logloss:0.57959 validation-auc:0.96774 validation-aucpr:0.97185
[19] validation-logloss:0.57496 validation-auc:0.96794 validation-aucpr:0.97201
[20] validation-logloss:0.56983 validation-auc:0.96814 validation-aucpr:0.97218
[21] validation-logloss:0.56496 validation-auc:0.96798 validation-aucpr:0.97206
[22] validation-logloss:0.55993 validation-auc:0.96808 validation-aucpr:0.97214
[23] validation-logloss:0.55505 validation-auc:0.96824 validation-aucpr:0.97227
[24] validation-logloss:0.55016 validation-auc:0.96836 validation-aucpr:0.97235
[25] validation-logloss:0.54544 validation-auc:0.96851 validation-aucpr:0.97246
[26] validation-logloss:0.54075 validation-auc:0.96850 validation-aucpr:0.97250
[27] validation-logloss:0.53626 validation-auc:0.96871 validation-aucpr:0.97296
[28] validation-logloss:0.53166 validation-auc:0.96884 validation-aucpr:0.97306
[29] validation-logloss:0.52730 validation-auc:0.96882 validation-aucpr:0.97302
[30] validation-logloss:0.52292 validation-auc:0.96878 validation-aucpr:0.97298
[31] validation-logloss:0.51867 validation-auc:0.96884 validation-aucpr:0.97301
[32] validation-logloss:0.51500 validation-auc:0.96880 validation-aucpr:0.97294
[33] validation-logloss:0.51143 validation-auc:0.96865 validation-aucpr:0.97283
[34] validation-logloss:0.50766 validation-auc:0.96869 validation-aucpr:0.97287
[35] validation-logloss:0.50364 validation-auc:0.96861 validation-aucpr:0.97281
[36] validation-logloss:0.49952 validation-auc:0.96876 validation-aucpr:0.97291
[37] validation-logloss:0.49552 validation-auc:0.96892 validation-aucpr:0.97303
[38] validation-logloss:0.49215 validation-auc:0.96895 validation-aucpr:0.97307
[39] validation-logloss:0.48833 validation-auc:0.96897 validation-aucpr:0.97310
[40] validation-logloss:0.48504 validation-auc:0.96901 validation-aucpr:0.97316
[41] validation-logloss:0.48146 validation-auc:0.96903 validation-aucpr:0.97317
[42] validation-logloss:0.47835 validation-auc:0.96893 validation-aucpr:0.97310
[43] validation-logloss:0.47475 validation-auc:0.96896 validation-aucpr:0.97312
[44] validation-logloss:0.47116 validation-auc:0.96895 validation-aucpr:0.97312
[45] validation-logloss:0.46805 validation-auc:0.96893 validation-aucpr:0.97311
[46] validation-logloss:0.46506 validation-auc:0.96896 validation-aucpr:0.97313
[47] validation-logloss:0.46207 validation-auc:0.96887 validation-aucpr:0.97303
[48] validation-logloss:0.45904 validation-auc:0.96891 validation-aucpr:0.97308
[49] validation-logloss:0.45585 validation-auc:0.96899 validation-aucpr:0.97314
[50] validation-logloss:0.45252 validation-auc:0.96915 validation-aucpr:0.97328
[51] validation-logloss:0.44978 validation-auc:0.96913 validation-aucpr:0.97324
[52] validation-logloss:0.44652 validation-auc:0.96923 validation-aucpr:0.97333
[53] validation-logloss:0.44379 validation-auc:0.96917 validation-aucpr:0.97326
[54] validation-logloss:0.44069 validation-auc:0.96928 validation-aucpr:0.97335
[55] validation-logloss:0.43803 validation-auc:0.96928 validation-aucpr:0.97341
[56] validation-logloss:0.43544 validation-auc:0.96919 validation-aucpr:0.97335
[57] validation-logloss:0.43241 validation-auc:0.96932 validation-aucpr:0.97345
[58] validation-logloss:0.42999 validation-auc:0.96926 validation-aucpr:0.97340
[59] validation-logloss:0.42703 validation-auc:0.96935 validation-aucpr:0.97358
[60] validation-logloss:0.42422 validation-auc:0.96936 validation-aucpr:0.97359
[61] validation-logloss:0.42131 validation-auc:0.96942 validation-aucpr:0.97366
[62] validation-logloss:0.41858 validation-auc:0.96942 validation-aucpr:0.97366
[63] validation-logloss:0.41625 validation-auc:0.96943 validation-aucpr:0.97365
[64] validation-logloss:0.41353 validation-auc:0.96947 validation-aucpr:0.97368
[65] validation-logloss:0.41079 validation-auc:0.96950 validation-aucpr:0.97370
[66] validation-logloss:0.40811 validation-auc:0.96951 validation-aucpr:0.97371
[67] validation-logloss:0.40588 validation-auc:0.96948 validation-aucpr:0.97367
[68] validation-logloss:0.40333 validation-auc:0.96954 validation-aucpr:0.97370
[69] validation-logloss:0.40120 validation-auc:0.96955 validation-aucpr:0.97370
[70] validation-logloss:0.39900 validation-auc:0.96954 validation-aucpr:0.97367
[71] validation-logloss:0.39696 validation-auc:0.96950 validation-aucpr:0.97365
[72] validation-logloss:0.39447 validation-auc:0.96954 validation-aucpr:0.97369
[73] validation-logloss:0.39209 validation-auc:0.96956 validation-aucpr:0.97371
[74] validation-logloss:0.38963 validation-auc:0.96956 validation-aucpr:0.97372
[75] validation-logloss:0.38724 validation-auc:0.96965 validation-aucpr:0.97378
[76] validation-logloss:0.38526 validation-auc:0.96961 validation-aucpr:0.97375
[77] validation-logloss:0.38296 validation-auc:0.96967 validation-aucpr:0.97380
[78] validation-logloss:0.38074 validation-auc:0.96964 validation-aucpr:0.97377
[79] validation-logloss:0.37877 validation-auc:0.96966 validation-aucpr:0.97378
[80] validation-logloss:0.37693 validation-auc:0.96961 validation-aucpr:0.97374
[81] validation-logloss:0.37473 validation-auc:0.96964 validation-aucpr:0.97376
[82] validation-logloss:0.37257 validation-auc:0.96966 validation-aucpr:0.97378
[83] validation-logloss:0.37076 validation-auc:0.96958 validation-aucpr:0.97371
[84] validation-logloss:0.36860 validation-auc:0.96965 validation-aucpr:0.97377
[85] validation-logloss:0.36651 validation-auc:0.96965 validation-aucpr:0.97377
[86] validation-logloss:0.36443 validation-auc:0.96969 validation-aucpr:0.97380
[87] validation-logloss:0.36237 validation-auc:0.96978 validation-aucpr:0.97385
[88] validation-logloss:0.36042 validation-auc:0.96981 validation-aucpr:0.97388
{'best_iteration': '88', 'best_score': '0.9738848959265011'}
Trial 79, Fold 3: Log loss = 0.3604212559366444, Average precision = 0.9738371377642552, ROC-AUC = 0.9698099405524363, Elapsed Time = 131.3719807000016 seconds
Trial 79, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 79, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.68578 validation-auc:0.95664 validation-aucpr:0.96286
[1] validation-logloss:0.67868 validation-auc:0.96245 validation-aucpr:0.96876
[2] validation-logloss:0.67179 validation-auc:0.96277 validation-aucpr:0.96940
[3] validation-logloss:0.66498 validation-auc:0.96323 validation-aucpr:0.96985
[4] validation-logloss:0.65822 validation-auc:0.96500 validation-aucpr:0.97089
[5] validation-logloss:0.65229 validation-auc:0.96512 validation-aucpr:0.97133
[6] validation-logloss:0.64572 validation-auc:0.96562 validation-aucpr:0.97165
[7] validation-logloss:0.63929 validation-auc:0.96661 validation-aucpr:0.97224
[8] validation-logloss:0.63388 validation-auc:0.96613 validation-aucpr:0.97183
[9] validation-logloss:0.62763 validation-auc:0.96682 validation-aucpr:0.97235
[10] validation-logloss:0.62161 validation-auc:0.96696 validation-aucpr:0.97247
[11] validation-logloss:0.61637 validation-auc:0.96676 validation-aucpr:0.97230
[12] validation-logloss:0.61072 validation-auc:0.96705 validation-aucpr:0.97248
[13] validation-logloss:0.60495 validation-auc:0.96696 validation-aucpr:0.97242
[14] validation-logloss:0.59995 validation-auc:0.96676 validation-aucpr:0.97223
[15] validation-logloss:0.59497 validation-auc:0.96685 validation-aucpr:0.97223
[16] validation-logloss:0.58957 validation-auc:0.96695 validation-aucpr:0.97230
[17] validation-logloss:0.58428 validation-auc:0.96711 validation-aucpr:0.97244
[18] validation-logloss:0.57904 validation-auc:0.96710 validation-aucpr:0.97245
[19] validation-logloss:0.57441 validation-auc:0.96706 validation-aucpr:0.97237
[20] validation-logloss:0.56929 validation-auc:0.96707 validation-aucpr:0.97240
[21] validation-logloss:0.56489 validation-auc:0.96687 validation-aucpr:0.97221
[22] validation-logloss:0.55997 validation-auc:0.96716 validation-aucpr:0.97242
[23] validation-logloss:0.55511 validation-auc:0.96734 validation-aucpr:0.97254
[24] validation-logloss:0.55020 validation-auc:0.96755 validation-aucpr:0.97269
[25] validation-logloss:0.54542 validation-auc:0.96763 validation-aucpr:0.97274
[26] validation-logloss:0.54071 validation-auc:0.96775 validation-aucpr:0.97283
[27] validation-logloss:0.53625 validation-auc:0.96798 validation-aucpr:0.97305
[28] validation-logloss:0.53193 validation-auc:0.96805 validation-aucpr:0.97309
[29] validation-logloss:0.52795 validation-auc:0.96809 validation-aucpr:0.97307
[21:36:44] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[30] validation-logloss:0.52363 validation-auc:0.96827 validation-aucpr:0.97320
[21:36:45] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[31] validation-logloss:0.51932 validation-auc:0.96834 validation-aucpr:0.97325
[21:36:47] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[32] validation-logloss:0.51512 validation-auc:0.96837 validation-aucpr:0.97328
[21:36:48] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[33] validation-logloss:0.51145 validation-auc:0.96831 validation-aucpr:0.97324
[21:36:50] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[34] validation-logloss:0.50748 validation-auc:0.96841 validation-aucpr:0.97332
[21:36:51] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[35] validation-logloss:0.50348 validation-auc:0.96838 validation-aucpr:0.97330
[21:36:52] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[36] validation-logloss:0.49946 validation-auc:0.96849 validation-aucpr:0.97340
[21:36:54] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[37] validation-logloss:0.49601 validation-auc:0.96840 validation-aucpr:0.97334
[21:36:55] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[38] validation-logloss:0.49213 validation-auc:0.96845 validation-aucpr:0.97338
[21:36:57] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[39] validation-logloss:0.48880 validation-auc:0.96820 validation-aucpr:0.97320
[21:36:58] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[40] validation-logloss:0.48505 validation-auc:0.96843 validation-aucpr:0.97337
[21:36:59] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[41] validation-logloss:0.48124 validation-auc:0.96851 validation-aucpr:0.97343
[21:37:01] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[42] validation-logloss:0.47749 validation-auc:0.96877 validation-aucpr:0.97362
[21:37:02] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[43] validation-logloss:0.47407 validation-auc:0.96887 validation-aucpr:0.97371
[21:37:04] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[44] validation-logloss:0.47047 validation-auc:0.96892 validation-aucpr:0.97374
[21:37:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[45] validation-logloss:0.46695 validation-auc:0.96902 validation-aucpr:0.97382
[21:37:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[46] validation-logloss:0.46362 validation-auc:0.96900 validation-aucpr:0.97380
[21:37:08] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[47] validation-logloss:0.46038 validation-auc:0.96900 validation-aucpr:0.97379
[21:37:09] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[48] validation-logloss:0.45711 validation-auc:0.96907 validation-aucpr:0.97385
[21:37:11] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[49] validation-logloss:0.45429 validation-auc:0.96889 validation-aucpr:0.97372
[21:37:12] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[50] validation-logloss:0.45099 validation-auc:0.96894 validation-aucpr:0.97375
[21:37:14] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[51] validation-logloss:0.44790 validation-auc:0.96892 validation-aucpr:0.97374
[21:37:15] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[52] validation-logloss:0.44479 validation-auc:0.96886 validation-aucpr:0.97370
[21:37:16] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[53] validation-logloss:0.44180 validation-auc:0.96880 validation-aucpr:0.97365
[21:37:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[54] validation-logloss:0.43888 validation-auc:0.96885 validation-aucpr:0.97371
[21:37:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[55] validation-logloss:0.43583 validation-auc:0.96887 validation-aucpr:0.97373
[21:37:21] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[56] validation-logloss:0.43292 validation-auc:0.96877 validation-aucpr:0.97365
[21:37:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[57] validation-logloss:0.42997 validation-auc:0.96874 validation-aucpr:0.97364
[21:37:24] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[58] validation-logloss:0.42704 validation-auc:0.96876 validation-aucpr:0.97365
[21:37:25] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[59] validation-logloss:0.42412 validation-auc:0.96886 validation-aucpr:0.97375
[21:37:27] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[60] validation-logloss:0.42132 validation-auc:0.96891 validation-aucpr:0.97377
[21:37:28] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[61] validation-logloss:0.41853 validation-auc:0.96886 validation-aucpr:0.97374
[21:37:30] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[62] validation-logloss:0.41579 validation-auc:0.96898 validation-aucpr:0.97382
[21:37:31] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[63] validation-logloss:0.41310 validation-auc:0.96891 validation-aucpr:0.97376
[21:37:33] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[64] validation-logloss:0.41074 validation-auc:0.96888 validation-aucpr:0.97374
[21:37:34] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[65] validation-logloss:0.40811 validation-auc:0.96889 validation-aucpr:0.97374
[21:37:36] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[66] validation-logloss:0.40561 validation-auc:0.96891 validation-aucpr:0.97375
[21:37:37] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[67] validation-logloss:0.40350 validation-auc:0.96882 validation-aucpr:0.97370
[21:37:39] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[68] validation-logloss:0.40123 validation-auc:0.96880 validation-aucpr:0.97369
[21:37:40] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[69] validation-logloss:0.39906 validation-auc:0.96889 validation-aucpr:0.97373
[21:37:42] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[70] validation-logloss:0.39657 validation-auc:0.96888 validation-aucpr:0.97373
[21:37:43] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[71] validation-logloss:0.39411 validation-auc:0.96892 validation-aucpr:0.97376
[21:37:45] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[72] validation-logloss:0.39167 validation-auc:0.96898 validation-aucpr:0.97380
[21:37:46] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[73] validation-logloss:0.38967 validation-auc:0.96896 validation-aucpr:0.97377
[21:37:48] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[74] validation-logloss:0.38737 validation-auc:0.96896 validation-aucpr:0.97377
[21:37:49] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[75] validation-logloss:0.38515 validation-auc:0.96897 validation-aucpr:0.97377
[21:37:51] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[76] validation-logloss:0.38326 validation-auc:0.96888 validation-aucpr:0.97370
[21:37:52] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[77] validation-logloss:0.38101 validation-auc:0.96886 validation-aucpr:0.97369
[21:37:54] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[78] validation-logloss:0.37884 validation-auc:0.96897 validation-aucpr:0.97377
[21:37:55] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[79] validation-logloss:0.37696 validation-auc:0.96898 validation-aucpr:0.97377
[21:37:57] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[80] validation-logloss:0.37519 validation-auc:0.96894 validation-aucpr:0.97373
[21:37:58] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[81] validation-logloss:0.37334 validation-auc:0.96894 validation-aucpr:0.97374
[21:38:00] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[82] validation-logloss:0.37123 validation-auc:0.96901 validation-aucpr:0.97379
[21:38:01] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[83] validation-logloss:0.36922 validation-auc:0.96899 validation-aucpr:0.97378
[21:38:03] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[84] validation-logloss:0.36718 validation-auc:0.96908 validation-aucpr:0.97384
[21:38:04] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[85] validation-logloss:0.36514 validation-auc:0.96917 validation-aucpr:0.97391
[21:38:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[86] validation-logloss:0.36313 validation-auc:0.96915 validation-aucpr:0.97389
[21:38:08] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[87] validation-logloss:0.36116 validation-auc:0.96913 validation-aucpr:0.97389
[21:38:09] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[88] validation-logloss:0.35930 validation-auc:0.96914 validation-aucpr:0.97389
{'best_iteration': '85', 'best_score': '0.9739107500294601'}
Trial 79, Fold 4: Log loss = 0.35929864099093195, Average precision = 0.9738837765660591, ROC-AUC = 0.9691444707921726, Elapsed Time = 127.51060520000101 seconds
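The per-fold summary above reports log loss, average precision (PR-AUC), and ROC-AUC on the fold's validation predictions. A minimal sketch of how these three metrics can be computed with scikit-learn, using synthetic labels and probabilities (`y_val`, `p_val`) as stand-ins for the notebook's actual fold predictions:

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

# Hypothetical fold predictions: y_val are true labels, p_val are predicted
# probabilities for the positive class at the booster's best iteration.
rng = np.random.default_rng(42)
y_val = rng.integers(0, 2, size=1000)
p_val = np.clip(y_val * 0.7 + rng.normal(0.15, 0.2, size=1000), 0.001, 0.999)

lloss = log_loss(y_val, p_val)              # corresponds to validation-logloss
ap = average_precision_score(y_val, p_val)  # corresponds to validation-aucpr
auc = roc_auc_score(y_val, p_val)           # corresponds to validation-auc
print(f"Log loss = {lloss}, Average precision = {ap}, ROC-AUC = {auc}")
```

All three take raw probabilities, not hard labels; thresholding before scoring would discard the ranking information that ROC-AUC and average precision measure.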
Trial 79, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 79, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[21:38:11] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
(the INFO line above repeats once per boosting round; repeats omitted)
[0] validation-logloss:0.68595 validation-auc:0.95081 validation-aucpr:0.95268
[1] validation-logloss:0.67976 validation-auc:0.95345 validation-aucpr:0.95767
...
[60] validation-logloss:0.42480 validation-auc:0.96660 validation-aucpr:0.97149
[61] validation-logloss:0.42201 validation-auc:0.96667 validation-aucpr:0.97154
[62] validation-logloss:0.41932 validation-auc:0.96665 validation-aucpr:0.97151
[21:39:44] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[63] validation-logloss:0.41651 validation-auc:0.96676 validation-aucpr:0.97161
[21:39:45] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[64] validation-logloss:0.41391 validation-auc:0.96678 validation-aucpr:0.97162
[21:39:47] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[65] validation-logloss:0.41126 validation-auc:0.96680 validation-aucpr:0.97162
[21:39:48] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[66] validation-logloss:0.40862 validation-auc:0.96699 validation-aucpr:0.97176
[21:39:50] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[67] validation-logloss:0.40622 validation-auc:0.96703 validation-aucpr:0.97178
[21:39:51] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[68] validation-logloss:0.40371 validation-auc:0.96707 validation-aucpr:0.97180
[21:39:53] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[69] validation-logloss:0.40130 validation-auc:0.96708 validation-aucpr:0.97180
[21:39:54] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[70] validation-logloss:0.39885 validation-auc:0.96710 validation-aucpr:0.97183
[21:39:56] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[71] validation-logloss:0.39646 validation-auc:0.96714 validation-aucpr:0.97186
[21:39:58] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[72] validation-logloss:0.39413 validation-auc:0.96717 validation-aucpr:0.97188
[21:39:59] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[73] validation-logloss:0.39208 validation-auc:0.96726 validation-aucpr:0.97194
[21:40:01] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[74] validation-logloss:0.38973 validation-auc:0.96734 validation-aucpr:0.97199
[21:40:03] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[75] validation-logloss:0.38750 validation-auc:0.96730 validation-aucpr:0.97195
[21:40:04] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[76] validation-logloss:0.38525 validation-auc:0.96736 validation-aucpr:0.97195
[21:40:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[77] validation-logloss:0.38305 validation-auc:0.96735 validation-aucpr:0.97195
[21:40:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[78] validation-logloss:0.38083 validation-auc:0.96743 validation-aucpr:0.97201
[21:40:09] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[79] validation-logloss:0.37870 validation-auc:0.96746 validation-aucpr:0.97216
[21:40:10] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[80] validation-logloss:0.37644 validation-auc:0.96762 validation-aucpr:0.97228
[21:40:12] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[81] validation-logloss:0.37436 validation-auc:0.96763 validation-aucpr:0.97231
[21:40:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[82] validation-logloss:0.37229 validation-auc:0.96769 validation-aucpr:0.97234
[21:40:15] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[83] validation-logloss:0.37056 validation-auc:0.96767 validation-aucpr:0.97231
[21:40:17] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[84] validation-logloss:0.36853 validation-auc:0.96770 validation-aucpr:0.97234
[21:40:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[85] validation-logloss:0.36651 validation-auc:0.96769 validation-aucpr:0.97233
[21:40:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[86] validation-logloss:0.36448 validation-auc:0.96781 validation-aucpr:0.97241
[21:40:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[87] validation-logloss:0.36248 validation-auc:0.96784 validation-aucpr:0.97243
[21:40:23] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[88] validation-logloss:0.36055 validation-auc:0.96792 validation-aucpr:0.97248
{'best_iteration': '88', 'best_score': '0.9724756751127037'}
Trial 79, Fold 5: Log loss = 0.36055336322219295, Average precision = 0.9724776594775508, ROC-AUC = 0.9679193653270907, Elapsed Time = 133.4455952999997 seconds
Optimization Progress: 80%|######## | 80/100 [3:41:35<1:19:33, 238.65s/it]
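The per-fold summary lines above follow a fixed format, so they can be aggregated into per-trial averages without re-running anything. A minimal stdlib-only sketch (the `summarize` helper and its field names are hypothetical; the line format is taken verbatim from the log output):

```python
import re
from statistics import mean

# Regex matching the per-fold summary lines printed after each fold above.
FOLD_RE = re.compile(
    r"Trial (?P<trial>\d+), Fold (?P<fold>\d+): "
    r"Log loss = (?P<lloss>[\d.]+), "
    r"Average precision = (?P<ap>[\d.]+), "
    r"ROC-AUC = (?P<auc>[\d.]+), "
    r"Elapsed Time = (?P<secs>[\d.]+) seconds"
)

def summarize(lines):
    """Group parsed fold metrics by trial number and average across folds."""
    trials = {}
    for line in lines:
        m = FOLD_RE.search(line)
        if m:
            trials.setdefault(int(m["trial"]), []).append(
                (float(m["lloss"]), float(m["ap"]), float(m["auc"]))
            )
    return {
        t: {"log_loss": mean(r[0] for r in rows),
            "avg_precision": mean(r[1] for r in rows),
            "roc_auc": mean(r[2] for r in rows)}
        for t, rows in trials.items()
    }

# Two summary lines copied from the Trial 80 output below.
log = [
    "Trial 80, Fold 1: Log loss = 0.22238400995920363, Average precision = 0.9733366848619621, ROC-AUC = 0.9687998869627144, Elapsed Time = 1.4775561999995261 seconds",
    "Trial 80, Fold 2: Log loss = 0.21186570833554003, Average precision = 0.9747205743499491, ROC-AUC = 0.9713305185130781, Elapsed Time = 1.8612036000013177 seconds",
]
print(summarize(log))
```

The same parse could feed the `df_metrics_ClassifierX.csv` output listed in the notebook header, though the notebook presumably records these metrics directly rather than by scraping its own logs.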
Trial 80, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 80, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.65533 validation-auc:0.93478 validation-aucpr:0.92750
[1] validation-logloss:0.61742 validation-auc:0.95600 validation-aucpr:0.96118
[2] validation-logloss:0.58746 validation-auc:0.95738 validation-aucpr:0.96255
[3] validation-logloss:0.56010 validation-auc:0.95831 validation-aucpr:0.96297
[4] validation-logloss:0.53597 validation-auc:0.95916 validation-aucpr:0.96383
[5] validation-logloss:0.51466 validation-auc:0.95938 validation-aucpr:0.96429
[6] validation-logloss:0.49433 validation-auc:0.95993 validation-aucpr:0.96474
[7] validation-logloss:0.47623 validation-auc:0.96009 validation-aucpr:0.96453
[8] validation-logloss:0.45946 validation-auc:0.96034 validation-aucpr:0.96501
[9] validation-logloss:0.44537 validation-auc:0.96008 validation-aucpr:0.96468
[10] validation-logloss:0.43100 validation-auc:0.96034 validation-aucpr:0.96487
[11] validation-logloss:0.41344 validation-auc:0.96248 validation-aucpr:0.96735
[12] validation-logloss:0.40175 validation-auc:0.96239 validation-aucpr:0.96726
[13] validation-logloss:0.38653 validation-auc:0.96354 validation-aucpr:0.96861
[14] validation-logloss:0.37610 validation-auc:0.96364 validation-aucpr:0.96870
[15] validation-logloss:0.36606 validation-auc:0.96361 validation-aucpr:0.96874
[16] validation-logloss:0.35691 validation-auc:0.96360 validation-aucpr:0.96869
[17] validation-logloss:0.34883 validation-auc:0.96371 validation-aucpr:0.96864
[18] validation-logloss:0.33813 validation-auc:0.96445 validation-aucpr:0.96954
[19] validation-logloss:0.33035 validation-auc:0.96463 validation-aucpr:0.96965
[20] validation-logloss:0.32439 validation-auc:0.96455 validation-aucpr:0.96955
[21] validation-logloss:0.31525 validation-auc:0.96498 validation-aucpr:0.97005
[22] validation-logloss:0.30812 validation-auc:0.96506 validation-aucpr:0.97014
[23] validation-logloss:0.30077 validation-auc:0.96533 validation-aucpr:0.97059
[24] validation-logloss:0.29346 validation-auc:0.96570 validation-aucpr:0.97091
[25] validation-logloss:0.28878 validation-auc:0.96569 validation-aucpr:0.97089
[26] validation-logloss:0.28417 validation-auc:0.96580 validation-aucpr:0.97092
[27] validation-logloss:0.27844 validation-auc:0.96609 validation-aucpr:0.97137
[28] validation-logloss:0.27446 validation-auc:0.96616 validation-aucpr:0.97139
[29] validation-logloss:0.27105 validation-auc:0.96626 validation-aucpr:0.97139
[30] validation-logloss:0.26756 validation-auc:0.96622 validation-aucpr:0.97138
[31] validation-logloss:0.26467 validation-auc:0.96624 validation-aucpr:0.97133
[32] validation-logloss:0.26140 validation-auc:0.96629 validation-aucpr:0.97136
[33] validation-logloss:0.25781 validation-auc:0.96646 validation-aucpr:0.97151
[34] validation-logloss:0.25492 validation-auc:0.96676 validation-aucpr:0.97178
[35] validation-logloss:0.25226 validation-auc:0.96688 validation-aucpr:0.97186
[36] validation-logloss:0.24984 validation-auc:0.96691 validation-aucpr:0.97188
[37] validation-logloss:0.24759 validation-auc:0.96695 validation-aucpr:0.97191
[38] validation-logloss:0.24360 validation-auc:0.96727 validation-aucpr:0.97219
[39] validation-logloss:0.24166 validation-auc:0.96745 validation-aucpr:0.97228
[40] validation-logloss:0.23828 validation-auc:0.96773 validation-aucpr:0.97256
[41] validation-logloss:0.23632 validation-auc:0.96784 validation-aucpr:0.97261
[42] validation-logloss:0.23461 validation-auc:0.96785 validation-aucpr:0.97262
[43] validation-logloss:0.23161 validation-auc:0.96809 validation-aucpr:0.97283
[44] validation-logloss:0.22868 validation-auc:0.96840 validation-aucpr:0.97313
[45] validation-logloss:0.22706 validation-auc:0.96857 validation-aucpr:0.97324
[46] validation-logloss:0.22572 validation-auc:0.96860 validation-aucpr:0.97328
[47] validation-logloss:0.22461 validation-auc:0.96864 validation-aucpr:0.97326
[48] validation-logloss:0.22363 validation-auc:0.96869 validation-aucpr:0.97329
[49] validation-logloss:0.22238 validation-auc:0.96880 validation-aucpr:0.97333
{'best_iteration': '49', 'best_score': '0.9733320179106767'}
Trial 80, Fold 1: Log loss = 0.22238400995920363, Average precision = 0.9733366848619621, ROC-AUC = 0.9687998869627144, Elapsed Time = 1.4775561999995261 seconds
Trial 80, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 80, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.65402 validation-auc:0.94272 validation-aucpr:0.94373
[1] validation-logloss:0.61759 validation-auc:0.95731 validation-aucpr:0.95915
[2] validation-logloss:0.58746 validation-auc:0.95880 validation-aucpr:0.95995
[3] validation-logloss:0.56106 validation-auc:0.95912 validation-aucpr:0.96075
[4] validation-logloss:0.53748 validation-auc:0.95895 validation-aucpr:0.96050
[5] validation-logloss:0.51407 validation-auc:0.96039 validation-aucpr:0.96260
[6] validation-logloss:0.49406 validation-auc:0.96073 validation-aucpr:0.96255
[7] validation-logloss:0.47616 validation-auc:0.96078 validation-aucpr:0.96316
[8] validation-logloss:0.45925 validation-auc:0.96111 validation-aucpr:0.96338
[9] validation-logloss:0.43884 validation-auc:0.96421 validation-aucpr:0.96687
[10] validation-logloss:0.42451 validation-auc:0.96405 validation-aucpr:0.96658
[11] validation-logloss:0.40673 validation-auc:0.96547 validation-aucpr:0.96827
[12] validation-logloss:0.39538 validation-auc:0.96546 validation-aucpr:0.96849
[13] validation-logloss:0.38054 validation-auc:0.96611 validation-aucpr:0.96924
[14] validation-logloss:0.37019 validation-auc:0.96609 validation-aucpr:0.96912
[15] validation-logloss:0.36089 validation-auc:0.96604 validation-aucpr:0.96992
[16] validation-logloss:0.34938 validation-auc:0.96651 validation-aucpr:0.97037
[17] validation-logloss:0.33788 validation-auc:0.96696 validation-aucpr:0.97084
[18] validation-logloss:0.33014 validation-auc:0.96715 validation-aucpr:0.97102
[19] validation-logloss:0.32366 validation-auc:0.96731 validation-aucpr:0.97107
[20] validation-logloss:0.31612 validation-auc:0.96756 validation-aucpr:0.97126
[21] validation-logloss:0.30994 validation-auc:0.96763 validation-aucpr:0.97124
[22] validation-logloss:0.30357 validation-auc:0.96766 validation-aucpr:0.97128
[23] validation-logloss:0.29569 validation-auc:0.96783 validation-aucpr:0.97148
[24] validation-logloss:0.28836 validation-auc:0.96809 validation-aucpr:0.97157
[25] validation-logloss:0.28353 validation-auc:0.96814 validation-aucpr:0.97158
[26] validation-logloss:0.27651 validation-auc:0.96861 validation-aucpr:0.97196
[27] validation-logloss:0.27213 validation-auc:0.96866 validation-aucpr:0.97194
[28] validation-logloss:0.26618 validation-auc:0.96891 validation-aucpr:0.97217
[29] validation-logloss:0.26230 validation-auc:0.96909 validation-aucpr:0.97225
[30] validation-logloss:0.25904 validation-auc:0.96921 validation-aucpr:0.97236
[31] validation-logloss:0.25592 validation-auc:0.96927 validation-aucpr:0.97239
[32] validation-logloss:0.25297 validation-auc:0.96934 validation-aucpr:0.97255
[33] validation-logloss:0.25040 validation-auc:0.96926 validation-aucpr:0.97248
[34] validation-logloss:0.24813 validation-auc:0.96914 validation-aucpr:0.97232
[35] validation-logloss:0.24381 validation-auc:0.96941 validation-aucpr:0.97254
[36] validation-logloss:0.24031 validation-auc:0.96954 validation-aucpr:0.97276
[37] validation-logloss:0.23785 validation-auc:0.96967 validation-aucpr:0.97287
[38] validation-logloss:0.23413 validation-auc:0.96995 validation-aucpr:0.97302
[39] validation-logloss:0.23209 validation-auc:0.97006 validation-aucpr:0.97312
[40] validation-logloss:0.23031 validation-auc:0.97007 validation-aucpr:0.97315
[41] validation-logloss:0.22808 validation-auc:0.97027 validation-aucpr:0.97335
[42] validation-logloss:0.22613 validation-auc:0.97032 validation-aucpr:0.97337
[43] validation-logloss:0.22338 validation-auc:0.97046 validation-aucpr:0.97359
[44] validation-logloss:0.22142 validation-auc:0.97062 validation-aucpr:0.97367
[45] validation-logloss:0.21851 validation-auc:0.97089 validation-aucpr:0.97399
[46] validation-logloss:0.21730 validation-auc:0.97093 validation-aucpr:0.97430
[47] validation-logloss:0.21561 validation-auc:0.97103 validation-aucpr:0.97439
[48] validation-logloss:0.21432 validation-auc:0.97114 validation-aucpr:0.97456
[49] validation-logloss:0.21187 validation-auc:0.97133 validation-aucpr:0.97472
{'best_iteration': '49', 'best_score': '0.9747160669731868'}
Trial 80, Fold 2: Log loss = 0.21186570833554003, Average precision = 0.9747205743499491, ROC-AUC = 0.9713305185130781, Elapsed Time = 1.8612036000013177 seconds
Trial 80, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 80, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.65376 validation-auc:0.94462 validation-aucpr:0.94750
[1] validation-logloss:0.62036 validation-auc:0.95173 validation-aucpr:0.95404
[2] validation-logloss:0.58383 validation-auc:0.96344 validation-aucpr:0.96719
[3] validation-logloss:0.55726 validation-auc:0.96357 validation-aucpr:0.96767
[4] validation-logloss:0.53355 validation-auc:0.96278 validation-aucpr:0.96694
[5] validation-logloss:0.51090 validation-auc:0.96318 validation-aucpr:0.96770
[6] validation-logloss:0.48865 validation-auc:0.96368 validation-aucpr:0.96833
[7] validation-logloss:0.46807 validation-auc:0.96439 validation-aucpr:0.96901
[8] validation-logloss:0.45127 validation-auc:0.96430 validation-aucpr:0.96885
[9] validation-logloss:0.43555 validation-auc:0.96475 validation-aucpr:0.96903
[10] validation-logloss:0.42111 validation-auc:0.96505 validation-aucpr:0.96934
[11] validation-logloss:0.40809 validation-auc:0.96506 validation-aucpr:0.96952
[12] validation-logloss:0.39647 validation-auc:0.96516 validation-aucpr:0.96950
[13] validation-logloss:0.38509 validation-auc:0.96521 validation-aucpr:0.96959
[14] validation-logloss:0.37432 validation-auc:0.96518 validation-aucpr:0.96949
[15] validation-logloss:0.36371 validation-auc:0.96543 validation-aucpr:0.96976
[16] validation-logloss:0.35475 validation-auc:0.96563 validation-aucpr:0.96986
[17] validation-logloss:0.34299 validation-auc:0.96641 validation-aucpr:0.97067
[18] validation-logloss:0.33576 validation-auc:0.96627 validation-aucpr:0.97060
[19] validation-logloss:0.32862 validation-auc:0.96637 validation-aucpr:0.97076
[20] validation-logloss:0.32162 validation-auc:0.96654 validation-aucpr:0.97094
[21] validation-logloss:0.31567 validation-auc:0.96653 validation-aucpr:0.97092
[22] validation-logloss:0.31016 validation-auc:0.96632 validation-aucpr:0.97075
[23] validation-logloss:0.30461 validation-auc:0.96633 validation-aucpr:0.97073
[24] validation-logloss:0.29695 validation-auc:0.96672 validation-aucpr:0.97113
[25] validation-logloss:0.29207 validation-auc:0.96689 validation-aucpr:0.97127
[26] validation-logloss:0.28713 validation-auc:0.96716 validation-aucpr:0.97144
[27] validation-logloss:0.28267 validation-auc:0.96724 validation-aucpr:0.97147
[28] validation-logloss:0.27852 validation-auc:0.96740 validation-aucpr:0.97162
[29] validation-logloss:0.27214 validation-auc:0.96778 validation-aucpr:0.97201
[30] validation-logloss:0.26870 validation-auc:0.96784 validation-aucpr:0.97212
[31] validation-logloss:0.26521 validation-auc:0.96788 validation-aucpr:0.97217
[32] validation-logloss:0.26241 validation-auc:0.96786 validation-aucpr:0.97212
[33] validation-logloss:0.25897 validation-auc:0.96807 validation-aucpr:0.97230
[34] validation-logloss:0.25367 validation-auc:0.96856 validation-aucpr:0.97273
[35] validation-logloss:0.25119 validation-auc:0.96863 validation-aucpr:0.97276
[36] validation-logloss:0.24718 validation-auc:0.96889 validation-aucpr:0.97298
[37] validation-logloss:0.24480 validation-auc:0.96903 validation-aucpr:0.97303
[38] validation-logloss:0.24199 validation-auc:0.96918 validation-aucpr:0.97315
[39] validation-logloss:0.23784 validation-auc:0.96954 validation-aucpr:0.97347
[40] validation-logloss:0.23613 validation-auc:0.96955 validation-aucpr:0.97350
[41] validation-logloss:0.23262 validation-auc:0.96974 validation-aucpr:0.97371
[42] validation-logloss:0.22988 validation-auc:0.97011 validation-aucpr:0.97406
[43] validation-logloss:0.22840 validation-auc:0.97010 validation-aucpr:0.97406
[44] validation-logloss:0.22517 validation-auc:0.97040 validation-aucpr:0.97436
[45] validation-logloss:0.22359 validation-auc:0.97055 validation-aucpr:0.97446
[46] validation-logloss:0.22068 validation-auc:0.97082 validation-aucpr:0.97471
[47] validation-logloss:0.21966 validation-auc:0.97081 validation-aucpr:0.97475
[48] validation-logloss:0.21864 validation-auc:0.97076 validation-aucpr:0.97470
[49] validation-logloss:0.21779 validation-auc:0.97073 validation-aucpr:0.97465
{'best_iteration': '47', 'best_score': '0.9747548137468102'}
Trial 80, Fold 3: Log loss = 0.21778798181689787, Average precision = 0.9746578127203783, ROC-AUC = 0.9707257165490075, Elapsed Time = 1.7147274000017205 seconds
Trial 80, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 80, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.65456 validation-auc:0.94003 validation-aucpr:0.94400
[1] validation-logloss:0.61672 validation-auc:0.95860 validation-aucpr:0.96379
[2] validation-logloss:0.58654 validation-auc:0.95883 validation-aucpr:0.96450
[3] validation-logloss:0.55935 validation-auc:0.95933 validation-aucpr:0.96450
[4] validation-logloss:0.53390 validation-auc:0.96083 validation-aucpr:0.96645
[5] validation-logloss:0.50728 validation-auc:0.96183 validation-aucpr:0.96783
[6] validation-logloss:0.48299 validation-auc:0.96273 validation-aucpr:0.96863
[7] validation-logloss:0.46173 validation-auc:0.96385 validation-aucpr:0.96969
[8] validation-logloss:0.44543 validation-auc:0.96361 validation-aucpr:0.96963
[9] validation-logloss:0.42956 validation-auc:0.96405 validation-aucpr:0.96999
[10] validation-logloss:0.41507 validation-auc:0.96418 validation-aucpr:0.97011
[11] validation-logloss:0.40258 validation-auc:0.96418 validation-aucpr:0.97002
[12] validation-logloss:0.39091 validation-auc:0.96441 validation-aucpr:0.97020
[13] validation-logloss:0.38037 validation-auc:0.96443 validation-aucpr:0.97017
[14] validation-logloss:0.37022 validation-auc:0.96479 validation-aucpr:0.97037
[15] validation-logloss:0.35768 validation-auc:0.96526 validation-aucpr:0.97088
[16] validation-logloss:0.34916 validation-auc:0.96534 validation-aucpr:0.97091
[17] validation-logloss:0.34153 validation-auc:0.96543 validation-aucpr:0.97093
[18] validation-logloss:0.33189 validation-auc:0.96590 validation-aucpr:0.97132
[19] validation-logloss:0.32530 validation-auc:0.96568 validation-aucpr:0.97115
[20] validation-logloss:0.31846 validation-auc:0.96567 validation-aucpr:0.97115
[21] validation-logloss:0.31212 validation-auc:0.96596 validation-aucpr:0.97131
[22] validation-logloss:0.30621 validation-auc:0.96611 validation-aucpr:0.97144
[23] validation-logloss:0.30028 validation-auc:0.96613 validation-aucpr:0.97147
[24] validation-logloss:0.29545 validation-auc:0.96614 validation-aucpr:0.97147
[25] validation-logloss:0.29112 validation-auc:0.96611 validation-aucpr:0.97139
[26] validation-logloss:0.28413 validation-auc:0.96639 validation-aucpr:0.97167
[27] validation-logloss:0.27983 validation-auc:0.96639 validation-aucpr:0.97169
[28] validation-logloss:0.27586 validation-auc:0.96645 validation-aucpr:0.97170
[29] validation-logloss:0.27226 validation-auc:0.96658 validation-aucpr:0.97183
[30] validation-logloss:0.26897 validation-auc:0.96641 validation-aucpr:0.97173
[31] validation-logloss:0.26566 validation-auc:0.96655 validation-aucpr:0.97179
[32] validation-logloss:0.25989 validation-auc:0.96693 validation-aucpr:0.97214
[33] validation-logloss:0.25542 validation-auc:0.96684 validation-aucpr:0.97219
[34] validation-logloss:0.25286 validation-auc:0.96698 validation-aucpr:0.97228
[35] validation-logloss:0.25021 validation-auc:0.96712 validation-aucpr:0.97237
[36] validation-logloss:0.24779 validation-auc:0.96714 validation-aucpr:0.97239
[37] validation-logloss:0.24595 validation-auc:0.96707 validation-aucpr:0.97237
[38] validation-logloss:0.24282 validation-auc:0.96712 validation-aucpr:0.97243
[39] validation-logloss:0.24117 validation-auc:0.96703 validation-aucpr:0.97236
[40] validation-logloss:0.23910 validation-auc:0.96724 validation-aucpr:0.97252
[41] validation-logloss:0.23552 validation-auc:0.96750 validation-aucpr:0.97277
[42] validation-logloss:0.23372 validation-auc:0.96762 validation-aucpr:0.97287
[43] validation-logloss:0.23213 validation-auc:0.96768 validation-aucpr:0.97291
[44] validation-logloss:0.22871 validation-auc:0.96799 validation-aucpr:0.97317
[45] validation-logloss:0.22711 validation-auc:0.96810 validation-aucpr:0.97326
[46] validation-logloss:0.22598 validation-auc:0.96814 validation-aucpr:0.97327
[47] validation-logloss:0.22322 validation-auc:0.96826 validation-aucpr:0.97341
[48] validation-logloss:0.22074 validation-auc:0.96835 validation-aucpr:0.97349
[49] validation-logloss:0.21964 validation-auc:0.96840 validation-aucpr:0.97350
{'best_iteration': '49', 'best_score': '0.9734999163688453'}
Trial 80, Fold 4: Log loss = 0.21964332522966643, Average precision = 0.973504239992713, ROC-AUC = 0.9684028448393027, Elapsed Time = 1.6603410000025178 seconds
Trial 80, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 80, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.64959 validation-auc:0.95433 validation-aucpr:0.95874
[1] validation-logloss:0.61442 validation-auc:0.95794 validation-aucpr:0.96264
[2] validation-logloss:0.58089 validation-auc:0.96110 validation-aucpr:0.96519
[3] validation-logloss:0.55395 validation-auc:0.96038 validation-aucpr:0.96570
[4] validation-logloss:0.53024 validation-auc:0.96010 validation-aucpr:0.96546
[5] validation-logloss:0.50856 validation-auc:0.96017 validation-aucpr:0.96536
[6] validation-logloss:0.48844 validation-auc:0.96061 validation-aucpr:0.96573
[7] validation-logloss:0.47027 validation-auc:0.96081 validation-aucpr:0.96596
[8] validation-logloss:0.45368 validation-auc:0.96094 validation-aucpr:0.96600
[9] validation-logloss:0.43961 validation-auc:0.96071 validation-aucpr:0.96577
[10] validation-logloss:0.42562 validation-auc:0.96077 validation-aucpr:0.96596
[11] validation-logloss:0.41295 validation-auc:0.96094 validation-aucpr:0.96608
[12] validation-logloss:0.40169 validation-auc:0.96074 validation-aucpr:0.96575
[13] validation-logloss:0.39029 validation-auc:0.96078 validation-aucpr:0.96587
[14] validation-logloss:0.37974 validation-auc:0.96091 validation-aucpr:0.96597
[15] validation-logloss:0.36909 validation-auc:0.96136 validation-aucpr:0.96638
[16] validation-logloss:0.35984 validation-auc:0.96172 validation-aucpr:0.96655
[17] validation-logloss:0.35116 validation-auc:0.96201 validation-aucpr:0.96675
[18] validation-logloss:0.34334 validation-auc:0.96227 validation-aucpr:0.96685
[19] validation-logloss:0.33647 validation-auc:0.96225 validation-aucpr:0.96678
[20] validation-logloss:0.32650 validation-auc:0.96319 validation-aucpr:0.96769
[21] validation-logloss:0.31995 validation-auc:0.96326 validation-aucpr:0.96774
[22] validation-logloss:0.31109 validation-auc:0.96390 validation-aucpr:0.96837
[23] validation-logloss:0.30295 validation-auc:0.96435 validation-aucpr:0.96889
[24] validation-logloss:0.29614 validation-auc:0.96462 validation-aucpr:0.96911
[25] validation-logloss:0.29111 validation-auc:0.96464 validation-aucpr:0.96909
[26] validation-logloss:0.28589 validation-auc:0.96482 validation-aucpr:0.96919
[27] validation-logloss:0.28150 validation-auc:0.96501 validation-aucpr:0.96941
[28] validation-logloss:0.27760 validation-auc:0.96518 validation-aucpr:0.96953
[29] validation-logloss:0.27148 validation-auc:0.96563 validation-aucpr:0.96996
[30] validation-logloss:0.26822 validation-auc:0.96572 validation-aucpr:0.97012
[31] validation-logloss:0.26320 validation-auc:0.96594 validation-aucpr:0.97031
[32] validation-logloss:0.26022 validation-auc:0.96598 validation-aucpr:0.97030
[33] validation-logloss:0.25566 validation-auc:0.96634 validation-aucpr:0.97063
[34] validation-logloss:0.25321 validation-auc:0.96639 validation-aucpr:0.97060
[35] validation-logloss:0.25107 validation-auc:0.96648 validation-aucpr:0.97062
[36] validation-logloss:0.24876 validation-auc:0.96649 validation-aucpr:0.97062
[37] validation-logloss:0.24628 validation-auc:0.96652 validation-aucpr:0.97066
[38] validation-logloss:0.24442 validation-auc:0.96651 validation-aucpr:0.97067
[39] validation-logloss:0.24284 validation-auc:0.96649 validation-aucpr:0.97052
[40] validation-logloss:0.23969 validation-auc:0.96669 validation-aucpr:0.97073
[41] validation-logloss:0.23814 validation-auc:0.96672 validation-aucpr:0.97080
[42] validation-logloss:0.23652 validation-auc:0.96671 validation-aucpr:0.97073
[43] validation-logloss:0.23515 validation-auc:0.96674 validation-aucpr:0.97072
[44] validation-logloss:0.23216 validation-auc:0.96702 validation-aucpr:0.97097
[45] validation-logloss:0.23103 validation-auc:0.96708 validation-aucpr:0.97133
[46] validation-logloss:0.22971 validation-auc:0.96722 validation-aucpr:0.97140
[47] validation-logloss:0.22879 validation-auc:0.96718 validation-aucpr:0.97132
[48] validation-logloss:0.22767 validation-auc:0.96728 validation-aucpr:0.97136
[49] validation-logloss:0.22644 validation-auc:0.96733 validation-aucpr:0.97139
{'best_iteration': '46', 'best_score': '0.9713952085868922'}
Trial 80, Fold 5: Log loss = 0.2264398003286916, Average precision = 0.971399548138062, ROC-AUC = 0.9673263846311058, Elapsed Time = 1.7558240999969712 seconds
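The `{'best_iteration': ..., 'best_score': ...}` record printed after each fold is consistent with early stopping on validation `aucpr`: the retained iteration is the one with the highest `aucpr`, even when later boosting rounds were still evaluated (here iteration 46, although training ran through iteration 49). A minimal sketch of that selection, using the tail of the fold-5 log above rounded to the printed 5 decimal places (the variable names are illustrative, not the notebook's):

```python
# Validation aucpr by boosting round, copied from the fold-5 log tail above.
aucpr = {44: 0.97097, 45: 0.97133, 46: 0.97140, 47: 0.97132,
         48: 0.97136, 49: 0.97139}

# Early stopping keeps the round whose monitored metric was best.
best_iteration = max(aucpr, key=aucpr.get)
print({"best_iteration": best_iteration, "best_score": aucpr[best_iteration]})
```

This also explains why the reported fold log loss (taken at the last evaluated round) can differ from what the best-`aucpr` round would give: the two metrics need not peak at the same iteration.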
Optimization Progress: 81%|########1 | 81/100 [3:41:52<54:27, 171.96s/it]
Trial 81, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 81, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
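The fold headers above report each split's size, per-class counts, and the 0/1 ratio, which is how one verifies that the grouped, stratified splits stay close to balanced. A stdlib-only sketch of that bookkeeping (the `class_balance` helper is hypothetical; the counts are copied from the Trial 81, Fold 1 header, with synthetic labels standing in for the real ones):

```python
from collections import Counter

def class_balance(y):
    """Return (size, n0, n1, n0/n1) as reported in the fold headers above."""
    c = Counter(y)
    return len(y), c[0], c[1], c[0] / c[1]

# Synthetic labels matching the Trial 81, Fold 1 train-split counts.
y_train = [0] * 10533 + [1] * 10130
size, n0, n1, ratio = class_balance(y_train)
print(f"Train size = {size} where 0 = {n0}, 1 = {n1}, 0/1 = {ratio}")
```

A ratio near 1.0 on both train and validation splits indicates the stratification is holding across folds despite the grouping constraint.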
[21:40:46] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[0] validation-logloss:0.66439 validation-auc:0.95973 validation-aucpr:0.96404
[1]	validation-logloss:0.63782	validation-auc:0.96425	validation-aucpr:0.96873
[2]	validation-logloss:0.61348	validation-auc:0.96444	validation-aucpr:0.96878
[3]	validation-logloss:0.59234	validation-auc:0.96531	validation-aucpr:0.96962
[4]	validation-logloss:0.57198	validation-auc:0.96639	validation-aucpr:0.97135
[5]	validation-logloss:0.55183	validation-auc:0.96726	validation-aucpr:0.97221
[6]	validation-logloss:0.53357	validation-auc:0.96764	validation-aucpr:0.97281
[7]	validation-logloss:0.51769	validation-auc:0.96735	validation-aucpr:0.97255
[8]	validation-logloss:0.50084	validation-auc:0.96778	validation-aucpr:0.97296
[9]	validation-logloss:0.48514	validation-auc:0.96789	validation-aucpr:0.97304
[10]	validation-logloss:0.47038	validation-auc:0.96796	validation-aucpr:0.97312
[11]	validation-logloss:0.45701	validation-auc:0.96781	validation-aucpr:0.97302
[12]	validation-logloss:0.44416	validation-auc:0.96798	validation-aucpr:0.97316
{'best_iteration': '12', 'best_score': '0.9731590378806133'}
Trial 81, Fold 1: Log loss = 0.44415862414143403, Average precision = 0.9731129177684463, ROC-AUC = 0.9679839214617895, Elapsed Time = 0.706329900000128 seconds
Trial 81, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 81, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0]	validation-logloss:0.66391	validation-auc:0.96426	validation-aucpr:0.96762
[1]	validation-logloss:0.64014	validation-auc:0.96421	validation-aucpr:0.96796
[2]	validation-logloss:0.61528	validation-auc:0.96705	validation-aucpr:0.97057
[3]	validation-logloss:0.59523	validation-auc:0.96707	validation-aucpr:0.97068
[4]	validation-logloss:0.57355	validation-auc:0.96800	validation-aucpr:0.97157
[5]	validation-logloss:0.55572	validation-auc:0.96775	validation-aucpr:0.97122
[6]	validation-logloss:0.53911	validation-auc:0.96777	validation-aucpr:0.97114
[7]	validation-logloss:0.52110	validation-auc:0.96827	validation-aucpr:0.97153
[8]	validation-logloss:0.50399	validation-auc:0.96878	validation-aucpr:0.97193
[9]	validation-logloss:0.49005	validation-auc:0.96905	validation-aucpr:0.97204
[10]	validation-logloss:0.47493	validation-auc:0.96971	validation-aucpr:0.97276
[11]	validation-logloss:0.46275	validation-auc:0.96938	validation-aucpr:0.97230
[12]	validation-logloss:0.44916	validation-auc:0.96987	validation-aucpr:0.97278
{'best_iteration': '12', 'best_score': '0.972779721455394'}
Trial 81, Fold 2: Log loss = 0.44916450695000154, Average precision = 0.9726980587779851, ROC-AUC = 0.9698728292858658, Elapsed Time = 0.8429739999992307 seconds
Trial 81, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 81, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0]	validation-logloss:0.66408	validation-auc:0.96278	validation-aucpr:0.96586
[1]	validation-logloss:0.63717	validation-auc:0.96630	validation-aucpr:0.97063
[2]	validation-logloss:0.61231	validation-auc:0.96720	validation-aucpr:0.97136
[3]	validation-logloss:0.58920	validation-auc:0.96770	validation-aucpr:0.97171
[4]	validation-logloss:0.56785	validation-auc:0.96851	validation-aucpr:0.97262
[5]	validation-logloss:0.54993	validation-auc:0.96975	validation-aucpr:0.97351
[6]	validation-logloss:0.53285	validation-auc:0.96942	validation-aucpr:0.97319
[7]	validation-logloss:0.51481	validation-auc:0.96965	validation-aucpr:0.97342
[8]	validation-logloss:0.49791	validation-auc:0.97032	validation-aucpr:0.97366
[9]	validation-logloss:0.48242	validation-auc:0.97037	validation-aucpr:0.97470
[10]	validation-logloss:0.46758	validation-auc:0.97087	validation-aucpr:0.97510
[11]	validation-logloss:0.45390	validation-auc:0.97110	validation-aucpr:0.97531
[12]	validation-logloss:0.44238	validation-auc:0.97106	validation-aucpr:0.97527
{'best_iteration': '11', 'best_score': '0.975307214390828'}
Trial 81, Fold 3: Log loss = 0.44238187145411595, Average precision = 0.9752327037809126, ROC-AUC = 0.9710605455018251, Elapsed Time = 0.8559263000024657 seconds
Trial 81, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 81, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0]	validation-logloss:0.66407	validation-auc:0.96168	validation-aucpr:0.96773
[1]	validation-logloss:0.63820	validation-auc:0.96481	validation-aucpr:0.97022
[2]	validation-logloss:0.61422	validation-auc:0.96630	validation-aucpr:0.97147
[3]	validation-logloss:0.59109	validation-auc:0.96894	validation-aucpr:0.97339
[4]	validation-logloss:0.56992	validation-auc:0.96826	validation-aucpr:0.97303
[5]	validation-logloss:0.55230	validation-auc:0.96743	validation-aucpr:0.97269
[6]	validation-logloss:0.53403	validation-auc:0.96768	validation-aucpr:0.97288
[7]	validation-logloss:0.51724	validation-auc:0.96822	validation-aucpr:0.97325
[8]	validation-logloss:0.50079	validation-auc:0.96870	validation-aucpr:0.97359
[9]	validation-logloss:0.48502	validation-auc:0.96899	validation-aucpr:0.97391
[10]	validation-logloss:0.47198	validation-auc:0.96888	validation-aucpr:0.97389
[11]	validation-logloss:0.45789	validation-auc:0.96910	validation-aucpr:0.97408
[12]	validation-logloss:0.44504	validation-auc:0.96901	validation-aucpr:0.97403
{'best_iteration': '11', 'best_score': '0.9740761196721902'}
Trial 81, Fold 4: Log loss = 0.44503664140822147, Average precision = 0.9739748843423582, ROC-AUC = 0.9690058769739933, Elapsed Time = 0.8716018000013719 seconds
Trial 81, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 81, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0]	validation-logloss:0.66431	validation-auc:0.96151	validation-aucpr:0.96659
[1]	validation-logloss:0.63806	validation-auc:0.96450	validation-aucpr:0.96951
[2]	validation-logloss:0.61372	validation-auc:0.96548	validation-aucpr:0.97003
[3]	validation-logloss:0.59104	validation-auc:0.96700	validation-aucpr:0.97096
[4]	validation-logloss:0.57164	validation-auc:0.96555	validation-aucpr:0.96999
[5]	validation-logloss:0.55163	validation-auc:0.96634	validation-aucpr:0.97067
[6]	validation-logloss:0.53362	validation-auc:0.96668	validation-aucpr:0.97090
[7]	validation-logloss:0.51637	validation-auc:0.96681	validation-aucpr:0.97106
[8]	validation-logloss:0.50208	validation-auc:0.96657	validation-aucpr:0.97070
[9]	validation-logloss:0.48906	validation-auc:0.96645	validation-aucpr:0.97083
[10]	validation-logloss:0.47680	validation-auc:0.96575	validation-aucpr:0.97021
[11]	validation-logloss:0.46304	validation-auc:0.96648	validation-aucpr:0.97075
[12]	validation-logloss:0.44987	validation-auc:0.96701	validation-aucpr:0.97118
{'best_iteration': '12', 'best_score': '0.9711795928982662'}
Trial 81, Fold 5: Log loss = 0.44987265941620824, Average precision = 0.9710778666302752, ROC-AUC = 0.9670054066105569, Elapsed Time = 0.9390024000022095 seconds
Optimization Progress: 82%|########2 | 82/100 [3:42:04<37:11, 123.97s/it]
Trial 82, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 82, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0]	validation-logloss:0.66022	validation-auc:0.95140	validation-aucpr:0.95862
[1]	validation-logloss:0.63158	validation-auc:0.95985	validation-aucpr:0.96379
[2]	validation-logloss:0.60299	validation-auc:0.96466	validation-aucpr:0.97005
[3]	validation-logloss:0.57893	validation-auc:0.96492	validation-aucpr:0.97004
[4]	validation-logloss:0.55352	validation-auc:0.96680	validation-aucpr:0.97186
[5]	validation-logloss:0.53363	validation-auc:0.96682	validation-aucpr:0.97172
[6]	validation-logloss:0.51515	validation-auc:0.96664	validation-aucpr:0.97154
[7]	validation-logloss:0.49533	validation-auc:0.96719	validation-aucpr:0.97214
[8]	validation-logloss:0.47949	validation-auc:0.96715	validation-aucpr:0.97228
[9]	validation-logloss:0.46524	validation-auc:0.96706	validation-aucpr:0.97213
[10]	validation-logloss:0.44909	validation-auc:0.96763	validation-aucpr:0.97263
[11]	validation-logloss:0.43392	validation-auc:0.96795	validation-aucpr:0.97296
[12]	validation-logloss:0.42032	validation-auc:0.96823	validation-aucpr:0.97318
[13]	validation-logloss:0.40691	validation-auc:0.96846	validation-aucpr:0.97343
[14]	validation-logloss:0.39433	validation-auc:0.96868	validation-aucpr:0.97365
[15]	validation-logloss:0.38313	validation-auc:0.96869	validation-aucpr:0.97370
[16]	validation-logloss:0.37264	validation-auc:0.96881	validation-aucpr:0.97378
[17]	validation-logloss:0.36408	validation-auc:0.96891	validation-aucpr:0.97381
[18]	validation-logloss:0.35645	validation-auc:0.96897	validation-aucpr:0.97384
[19]	validation-logloss:0.34878	validation-auc:0.96892	validation-aucpr:0.97379
[20]	validation-logloss:0.34190	validation-auc:0.96887	validation-aucpr:0.97367
[21]	validation-logloss:0.33506	validation-auc:0.96882	validation-aucpr:0.97370
[22]	validation-logloss:0.32710	validation-auc:0.96895	validation-aucpr:0.97384
[23]	validation-logloss:0.31955	validation-auc:0.96908	validation-aucpr:0.97398
[24]	validation-logloss:0.31255	validation-auc:0.96928	validation-aucpr:0.97413
[25]	validation-logloss:0.30734	validation-auc:0.96927	validation-aucpr:0.97414
[26]	validation-logloss:0.30157	validation-auc:0.96942	validation-aucpr:0.97431
[27]	validation-logloss:0.29611	validation-auc:0.96959	validation-aucpr:0.97460
[28]	validation-logloss:0.29173	validation-auc:0.96963	validation-aucpr:0.97462
[29]	validation-logloss:0.28730	validation-auc:0.96965	validation-aucpr:0.97463
[30]	validation-logloss:0.28223	validation-auc:0.96978	validation-aucpr:0.97480
[31]	validation-logloss:0.27879	validation-auc:0.96967	validation-aucpr:0.97471
[32]	validation-logloss:0.27538	validation-auc:0.96969	validation-aucpr:0.97473
[33]	validation-logloss:0.27105	validation-auc:0.96977	validation-aucpr:0.97482
[34]	validation-logloss:0.26781	validation-auc:0.96977	validation-aucpr:0.97482
[35]	validation-logloss:0.26466	validation-auc:0.96982	validation-aucpr:0.97486
[36]	validation-logloss:0.26094	validation-auc:0.96989	validation-aucpr:0.97496
[37]	validation-logloss:0.25720	validation-auc:0.97010	validation-aucpr:0.97511
[38]	validation-logloss:0.25449	validation-auc:0.97015	validation-aucpr:0.97513
[39]	validation-logloss:0.25112	validation-auc:0.97021	validation-aucpr:0.97520
[40]	validation-logloss:0.24876	validation-auc:0.97028	validation-aucpr:0.97524
[41]	validation-logloss:0.24637	validation-auc:0.97034	validation-aucpr:0.97526
[42]	validation-logloss:0.24430	validation-auc:0.97036	validation-aucpr:0.97525
[43]	validation-logloss:0.24175	validation-auc:0.97041	validation-aucpr:0.97530
[44]	validation-logloss:0.23970	validation-auc:0.97047	validation-aucpr:0.97533
[45]	validation-logloss:0.23714	validation-auc:0.97054	validation-aucpr:0.97540
[46]	validation-logloss:0.23480	validation-auc:0.97059	validation-aucpr:0.97546
[47]	validation-logloss:0.23270	validation-auc:0.97067	validation-aucpr:0.97552
[48]	validation-logloss:0.23110	validation-auc:0.97067	validation-aucpr:0.97551
[49]	validation-logloss:0.22951	validation-auc:0.97071	validation-aucpr:0.97556
[50]	validation-logloss:0.22767	validation-auc:0.97073	validation-aucpr:0.97561
[51]	validation-logloss:0.22565	validation-auc:0.97085	validation-aucpr:0.97571
[52]	validation-logloss:0.22424	validation-auc:0.97080	validation-aucpr:0.97567
[53]	validation-logloss:0.22297	validation-auc:0.97083	validation-aucpr:0.97569
[54]	validation-logloss:0.22182	validation-auc:0.97084	validation-aucpr:0.97569
[55]	validation-logloss:0.22034	validation-auc:0.97086	validation-aucpr:0.97569
[56]	validation-logloss:0.21888	validation-auc:0.97087	validation-aucpr:0.97572
[57]	validation-logloss:0.21785	validation-auc:0.97087	validation-aucpr:0.97573
[58]	validation-logloss:0.21675	validation-auc:0.97095	validation-aucpr:0.97579
[59]	validation-logloss:0.21552	validation-auc:0.97103	validation-aucpr:0.97587
[60] validation-logloss:0.21425 validation-auc:0.97114 validation-aucpr:0.97594
[21:41:14] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[61] validation-logloss:0.21330 validation-auc:0.97112 validation-aucpr:0.97593
[21:41:14] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[62] validation-logloss:0.21213 validation-auc:0.97117 validation-aucpr:0.97598
[21:41:15] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[63] validation-logloss:0.21138 validation-auc:0.97119 validation-aucpr:0.97597
[21:41:15] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[64] validation-logloss:0.21012 validation-auc:0.97127 validation-aucpr:0.97603
[21:41:15] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[65] validation-logloss:0.20956 validation-auc:0.97127 validation-aucpr:0.97602
[21:41:15] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[66] validation-logloss:0.20869 validation-auc:0.97126 validation-aucpr:0.97602
[21:41:16] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[67] validation-logloss:0.20776 validation-auc:0.97131 validation-aucpr:0.97605
[21:41:16] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[68] validation-logloss:0.20705 validation-auc:0.97137 validation-aucpr:0.97608
[21:41:16] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[69] validation-logloss:0.20626 validation-auc:0.97137 validation-aucpr:0.97608
[21:41:17] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[70] validation-logloss:0.20541 validation-auc:0.97144 validation-aucpr:0.97613
[21:41:17] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[71] validation-logloss:0.20478 validation-auc:0.97140 validation-aucpr:0.97614
[21:41:17] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[72] validation-logloss:0.20407 validation-auc:0.97143 validation-aucpr:0.97616
[21:41:17] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[73] validation-logloss:0.20360 validation-auc:0.97147 validation-aucpr:0.97617
[21:41:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[74] validation-logloss:0.20287 validation-auc:0.97156 validation-aucpr:0.97623
[21:41:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[75] validation-logloss:0.20243 validation-auc:0.97159 validation-aucpr:0.97625
[21:41:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[76] validation-logloss:0.20191 validation-auc:0.97162 validation-aucpr:0.97627
[21:41:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[77] validation-logloss:0.20146 validation-auc:0.97161 validation-aucpr:0.97628
[21:41:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[78] validation-logloss:0.20109 validation-auc:0.97161 validation-aucpr:0.97626
[21:41:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[79] validation-logloss:0.20056 validation-auc:0.97164 validation-aucpr:0.97629
{'best_iteration': '79', 'best_score': '0.9762865072365965'}
Trial 82, Fold 1: Log loss = 0.20056391807788446, Average precision = 0.9762904774080723, ROC-AUC = 0.9716443400104195, Elapsed Time = 21.972112199997355 seconds
Trial 82, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 82, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.65852 validation-auc:0.96324 validation-aucpr:0.96664
...
[73] validation-logloss:0.19613 validation-auc:0.97317 validation-aucpr:0.97617
...
[79] validation-logloss:0.19370 validation-auc:0.97299 validation-aucpr:0.97603
{'best_iteration': '73', 'best_score': '0.9761700567932763'}
Trial 82, Fold 2: Log loss = 0.193696307689738, Average precision = 0.9760314877461944, ROC-AUC = 0.9729945088546351, Elapsed Time = 27.715898099999322 seconds
Trial 82, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 82, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.65969 validation-auc:0.95616 validation-aucpr:0.96138
...
[20] validation-logloss:0.33938 validation-auc:0.96965 validation-aucpr:0.97418
[21] validation-logloss:0.33154 validation-auc:0.96965 validation-aucpr:0.97420
[21:41:58] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[22] validation-logloss:0.32370 validation-auc:0.96985 validation-aucpr:0.97436
[21:41:58] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[23] validation-logloss:0.31787 validation-auc:0.96971 validation-aucpr:0.97422
[21:41:58] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[24] validation-logloss:0.31264 validation-auc:0.96972 validation-aucpr:0.97422
[21:41:58] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[25] validation-logloss:0.30605 validation-auc:0.96987 validation-aucpr:0.97436
[21:41:58] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[26] validation-logloss:0.29983 validation-auc:0.96996 validation-aucpr:0.97445
[21:41:58] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[27] validation-logloss:0.29397 validation-auc:0.97007 validation-aucpr:0.97453
[21:41:59] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[28] validation-logloss:0.28841 validation-auc:0.97021 validation-aucpr:0.97462
[21:41:59] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[29] validation-logloss:0.28291 validation-auc:0.97035 validation-aucpr:0.97471
[21:41:59] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[30] validation-logloss:0.27821 validation-auc:0.97038 validation-aucpr:0.97474
[21:41:59] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[31] validation-logloss:0.27374 validation-auc:0.97038 validation-aucpr:0.97474
[21:41:59] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[32] validation-logloss:0.27010 validation-auc:0.97033 validation-aucpr:0.97469
[21:41:59] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[33] validation-logloss:0.26627 validation-auc:0.97025 validation-aucpr:0.97471
[21:41:59] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[34] validation-logloss:0.26275 validation-auc:0.97035 validation-aucpr:0.97480
[21:42:00] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[35] validation-logloss:0.25987 validation-auc:0.97041 validation-aucpr:0.97485
[21:42:00] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[36] validation-logloss:0.25676 validation-auc:0.97051 validation-aucpr:0.97491
[21:42:00] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[37] validation-logloss:0.25403 validation-auc:0.97044 validation-aucpr:0.97485
[21:42:00] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[38] validation-logloss:0.25157 validation-auc:0.97043 validation-aucpr:0.97482
[21:42:00] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[39] validation-logloss:0.24830 validation-auc:0.97055 validation-aucpr:0.97490
[21:42:01] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[40] validation-logloss:0.24546 validation-auc:0.97047 validation-aucpr:0.97483
[21:42:01] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[41] validation-logloss:0.24233 validation-auc:0.97059 validation-aucpr:0.97492
[21:42:01] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[42] validation-logloss:0.23956 validation-auc:0.97066 validation-aucpr:0.97495
[21:42:01] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[43] validation-logloss:0.23690 validation-auc:0.97066 validation-aucpr:0.97496
[21:42:01] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[44] validation-logloss:0.23419 validation-auc:0.97076 validation-aucpr:0.97505
[21:42:02] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[45] validation-logloss:0.23217 validation-auc:0.97087 validation-aucpr:0.97513
[21:42:02] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[46] validation-logloss:0.23050 validation-auc:0.97085 validation-aucpr:0.97510
[21:42:02] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[47] validation-logloss:0.22833 validation-auc:0.97092 validation-aucpr:0.97514
[21:42:02] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[48] validation-logloss:0.22673 validation-auc:0.97096 validation-aucpr:0.97522
[21:42:02] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[49] validation-logloss:0.22486 validation-auc:0.97104 validation-aucpr:0.97532
[21:42:03] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[50] validation-logloss:0.22282 validation-auc:0.97117 validation-aucpr:0.97541
[21:42:03] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[51] validation-logloss:0.22138 validation-auc:0.97122 validation-aucpr:0.97542
[21:42:03] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[52] validation-logloss:0.22030 validation-auc:0.97118 validation-aucpr:0.97537
[21:42:03] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[53] validation-logloss:0.21916 validation-auc:0.97119 validation-aucpr:0.97536
[21:42:03] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[54] validation-logloss:0.21785 validation-auc:0.97112 validation-aucpr:0.97530
[21:42:04] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[55] validation-logloss:0.21622 validation-auc:0.97122 validation-aucpr:0.97535
[21:42:04] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[56] validation-logloss:0.21481 validation-auc:0.97125 validation-aucpr:0.97537
[21:42:04] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[57] validation-logloss:0.21341 validation-auc:0.97131 validation-aucpr:0.97543
[21:42:04] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[58] validation-logloss:0.21206 validation-auc:0.97140 validation-aucpr:0.97553
[21:42:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[59] validation-logloss:0.21106 validation-auc:0.97145 validation-aucpr:0.97556
[21:42:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[60] validation-logloss:0.20988 validation-auc:0.97147 validation-aucpr:0.97558
[21:42:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[61] validation-logloss:0.20865 validation-auc:0.97154 validation-aucpr:0.97562
[21:42:05] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[62] validation-logloss:0.20731 validation-auc:0.97170 validation-aucpr:0.97572
[21:42:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[63] validation-logloss:0.20645 validation-auc:0.97176 validation-aucpr:0.97574
[21:42:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[64] validation-logloss:0.20568 validation-auc:0.97184 validation-aucpr:0.97578
[21:42:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[65] validation-logloss:0.20460 validation-auc:0.97191 validation-aucpr:0.97584
[21:42:06] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[66] validation-logloss:0.20367 validation-auc:0.97192 validation-aucpr:0.97583
[21:42:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[67] validation-logloss:0.20292 validation-auc:0.97199 validation-aucpr:0.97587
[21:42:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[68] validation-logloss:0.20232 validation-auc:0.97199 validation-aucpr:0.97586
[21:42:07] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[69] validation-logloss:0.20165 validation-auc:0.97197 validation-aucpr:0.97583
[21:42:08] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[70] validation-logloss:0.20095 validation-auc:0.97197 validation-aucpr:0.97583
[21:42:08] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[71] validation-logloss:0.20025 validation-auc:0.97200 validation-aucpr:0.97586
[21:42:08] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[72] validation-logloss:0.19964 validation-auc:0.97200 validation-aucpr:0.97582
[21:42:08] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[73] validation-logloss:0.19917 validation-auc:0.97201 validation-aucpr:0.97587
[21:42:09] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[74] validation-logloss:0.19870 validation-auc:0.97204 validation-aucpr:0.97589
[21:42:09] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[75] validation-logloss:0.19816 validation-auc:0.97208 validation-aucpr:0.97592
[21:42:09] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[76] validation-logloss:0.19755 validation-auc:0.97215 validation-aucpr:0.97595
[21:42:10] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[77] validation-logloss:0.19710 validation-auc:0.97212 validation-aucpr:0.97592
[21:42:10] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[78] validation-logloss:0.19660 validation-auc:0.97219 validation-aucpr:0.97597
[21:42:10] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[79] validation-logloss:0.19618 validation-auc:0.97219 validation-aucpr:0.97595
{'best_iteration': '78', 'best_score': '0.975968739866936'}
Trial 82, Fold 3: Log loss = 0.1961760106049815, Average precision = 0.9759582368256433, ROC-AUC = 0.9721851193285241, Elapsed Time = 22.21717020000142 seconds
Trial 82, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 82, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
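The fold summary lines above report four quantities per fold: the 0/1 class ratio of the split, and the validation log loss, average precision, and ROC-AUC at the best iteration. A minimal sketch of how such numbers can be reproduced with the `Counter` and `sklearn.metrics` imports from the top of this notebook (the label and probability arrays here are illustrative stand-ins, not the actual fold data):

```python
import numpy as np
from collections import Counter
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

# Illustrative stand-ins for one fold's validation labels and predicted P(y=1)
y_val = np.array([0, 0, 1, 1, 1, 0, 1, 0])
p_val = np.array([0.1, 0.3, 0.8, 0.7, 0.9, 0.4, 0.6, 0.2])

# Class balance of the split, matching the "0 = ..., 1 = ..., 0/1 = ..." summary
counts = Counter(y_val)
ratio = counts[0] / counts[1]

# The three validation metrics reported per fold
metrics = {
    "log_loss": log_loss(y_val, p_val),
    "avg_precision": average_precision_score(y_val, p_val),
    "roc_auc": roc_auc_score(y_val, p_val),
}
print(f"Validation size = {len(y_val)} where 0 = {counts[0]}, 1 = {counts[1]}, "
      f"0/1 = {ratio}", metrics)
```

All three metrics are threshold-free, which is why the log above tracks them per boosting iteration rather than accuracy at a fixed cutoff.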
[21:42:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[0] validation-logloss:0.65844 validation-auc:0.95920 validation-aucpr:0.96307
[21:42:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[1] validation-logloss:0.63012 validation-auc:0.96310 validation-aucpr:0.96845
[21:42:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[2] validation-logloss:0.60162 validation-auc:0.96409 validation-aucpr:0.97045
[21:42:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[3] validation-logloss:0.57455 validation-auc:0.96526 validation-aucpr:0.97157
[21:42:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[4] validation-logloss:0.55353 validation-auc:0.96459 validation-aucpr:0.97083
[21:42:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[5] validation-logloss:0.53391 validation-auc:0.96454 validation-aucpr:0.97067
[21:42:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[6] validation-logloss:0.51284 validation-auc:0.96493 validation-aucpr:0.97097
[21:42:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[7] validation-logloss:0.49576 validation-auc:0.96486 validation-aucpr:0.97070
[21:42:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[8] validation-logloss:0.47775 validation-auc:0.96532 validation-aucpr:0.97118
[21:42:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[9] validation-logloss:0.46365 validation-auc:0.96514 validation-aucpr:0.97099
[21:42:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[10] validation-logloss:0.44946 validation-auc:0.96532 validation-aucpr:0.97108
[21:42:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[11] validation-logloss:0.43454 validation-auc:0.96570 validation-aucpr:0.97135
[21:42:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[12] validation-logloss:0.42270 validation-auc:0.96575 validation-aucpr:0.97133
[21:42:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[13] validation-logloss:0.40997 validation-auc:0.96610 validation-aucpr:0.97167
[21:42:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[14] validation-logloss:0.39947 validation-auc:0.96607 validation-aucpr:0.97159
[21:42:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[15] validation-logloss:0.38966 validation-auc:0.96626 validation-aucpr:0.97165
[21:42:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[16] validation-logloss:0.38065 validation-auc:0.96626 validation-aucpr:0.97164
[21:42:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[17] validation-logloss:0.37008 validation-auc:0.96656 validation-aucpr:0.97190
[21:42:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[18] validation-logloss:0.35992 validation-auc:0.96693 validation-aucpr:0.97221
[21:42:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[19] validation-logloss:0.35059 validation-auc:0.96711 validation-aucpr:0.97239
[21:42:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[20] validation-logloss:0.34337 validation-auc:0.96713 validation-aucpr:0.97236
[21:42:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[21] validation-logloss:0.33478 validation-auc:0.96747 validation-aucpr:0.97266
[21:42:21] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[22] validation-logloss:0.32689 validation-auc:0.96788 validation-aucpr:0.97297
[21:42:21] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[23] validation-logloss:0.31944 validation-auc:0.96838 validation-aucpr:0.97337
[21:42:21] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[24] validation-logloss:0.31388 validation-auc:0.96822 validation-aucpr:0.97324
[21:42:21] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[25] validation-logloss:0.30846 validation-auc:0.96827 validation-aucpr:0.97328
[21:42:21] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[26] validation-logloss:0.30366 validation-auc:0.96811 validation-aucpr:0.97313
[21:42:21] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[27] validation-logloss:0.29902 validation-auc:0.96803 validation-aucpr:0.97309
[21:42:21] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[28] validation-logloss:0.29328 validation-auc:0.96811 validation-aucpr:0.97317
[21:42:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[29] validation-logloss:0.28777 validation-auc:0.96836 validation-aucpr:0.97341
[21:42:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[30] validation-logloss:0.28265 validation-auc:0.96856 validation-aucpr:0.97359
[21:42:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[31] validation-logloss:0.27772 validation-auc:0.96878 validation-aucpr:0.97377
[21:42:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[32] validation-logloss:0.27329 validation-auc:0.96877 validation-aucpr:0.97377
[21:42:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[33] validation-logloss:0.26900 validation-auc:0.96895 validation-aucpr:0.97392
[21:42:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[34] validation-logloss:0.26580 validation-auc:0.96890 validation-aucpr:0.97389
[21:42:23] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[35] validation-logloss:0.26284 validation-auc:0.96900 validation-aucpr:0.97396
[21:42:23] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[36] validation-logloss:0.25933 validation-auc:0.96913 validation-aucpr:0.97407
[21:42:23] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[37] validation-logloss:0.25564 validation-auc:0.96937 validation-aucpr:0.97426
[21:42:23] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[38] validation-logloss:0.25316 validation-auc:0.96944 validation-aucpr:0.97432
[21:42:23] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[39] validation-logloss:0.25002 validation-auc:0.96945 validation-aucpr:0.97434
[21:42:23] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[40] validation-logloss:0.24767 validation-auc:0.96954 validation-aucpr:0.97439
[21:42:24] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[41] validation-logloss:0.24454 validation-auc:0.96974 validation-aucpr:0.97455
[21:42:24] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[42] validation-logloss:0.24178 validation-auc:0.96984 validation-aucpr:0.97463
[21:42:24] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[43] validation-logloss:0.23897 validation-auc:0.97000 validation-aucpr:0.97475
[21:42:24] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[44] validation-logloss:0.23651 validation-auc:0.97014 validation-aucpr:0.97486
[21:42:24] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[45] validation-logloss:0.23400 validation-auc:0.97025 validation-aucpr:0.97494
[21:42:25] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[46] validation-logloss:0.23242 validation-auc:0.97021 validation-aucpr:0.97490
[21:42:25] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[47] validation-logloss:0.23029 validation-auc:0.97018 validation-aucpr:0.97490
[21:42:25] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[48] validation-logloss:0.22819 validation-auc:0.97028 validation-aucpr:0.97498
[49] validation-logloss:0.22653 validation-auc:0.97039 validation-aucpr:0.97504
[50] validation-logloss:0.22516 validation-auc:0.97042 validation-aucpr:0.97505
[51] validation-logloss:0.22384 validation-auc:0.97047 validation-aucpr:0.97509
[52] validation-logloss:0.22190 validation-auc:0.97060 validation-aucpr:0.97518
[53] validation-logloss:0.22055 validation-auc:0.97071 validation-aucpr:0.97524
[54] validation-logloss:0.21937 validation-auc:0.97071 validation-aucpr:0.97522
[55] validation-logloss:0.21775 validation-auc:0.97078 validation-aucpr:0.97527
[56] validation-logloss:0.21642 validation-auc:0.97081 validation-aucpr:0.97529
[57] validation-logloss:0.21533 validation-auc:0.97090 validation-aucpr:0.97535
[58] validation-logloss:0.21435 validation-auc:0.97091 validation-aucpr:0.97537
[59] validation-logloss:0.21337 validation-auc:0.97090 validation-aucpr:0.97535
[60] validation-logloss:0.21248 validation-auc:0.97091 validation-aucpr:0.97536
[61] validation-logloss:0.21157 validation-auc:0.97094 validation-aucpr:0.97538
[62] validation-logloss:0.21077 validation-auc:0.97094 validation-aucpr:0.97537
[63] validation-logloss:0.21026 validation-auc:0.97088 validation-aucpr:0.97533
[64] validation-logloss:0.20956 validation-auc:0.97087 validation-aucpr:0.97532
[65] validation-logloss:0.20875 validation-auc:0.97088 validation-aucpr:0.97533
[66] validation-logloss:0.20800 validation-auc:0.97089 validation-aucpr:0.97535
[67] validation-logloss:0.20728 validation-auc:0.97096 validation-aucpr:0.97540
[68] validation-logloss:0.20667 validation-auc:0.97099 validation-aucpr:0.97540
[69] validation-logloss:0.20610 validation-auc:0.97095 validation-aucpr:0.97539
[70] validation-logloss:0.20522 validation-auc:0.97102 validation-aucpr:0.97544
[71] validation-logloss:0.20453 validation-auc:0.97111 validation-aucpr:0.97550
[72] validation-logloss:0.20354 validation-auc:0.97124 validation-aucpr:0.97560
[73] validation-logloss:0.20273 validation-auc:0.97131 validation-aucpr:0.97565
[74] validation-logloss:0.20190 validation-auc:0.97144 validation-aucpr:0.97576
[75] validation-logloss:0.20140 validation-auc:0.97149 validation-aucpr:0.97578
[76] validation-logloss:0.20091 validation-auc:0.97150 validation-aucpr:0.97579
[77] validation-logloss:0.19996 validation-auc:0.97162 validation-aucpr:0.97589
[78] validation-logloss:0.19941 validation-auc:0.97164 validation-aucpr:0.97591
[79] validation-logloss:0.19866 validation-auc:0.97174 validation-aucpr:0.97598
{'best_iteration': '79', 'best_score': '0.9759841043421987'}
Trial 82, Fold 4: Log loss = 0.1986568267174289, Average precision = 0.9759880043423315, ROC-AUC = 0.9717427323190055, Elapsed Time = 22.36301789999925 seconds
Trial 82, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 82, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.65841 validation-auc:0.96003 validation-aucpr:0.96572
[1] validation-logloss:0.63094 validation-auc:0.96140 validation-aucpr:0.96647
[2] validation-logloss:0.60218 validation-auc:0.96399 validation-aucpr:0.96934
[3] validation-logloss:0.57621 validation-auc:0.96503 validation-aucpr:0.97020
[4] validation-logloss:0.55177 validation-auc:0.96614 validation-aucpr:0.97085
[5] validation-logloss:0.52964 validation-auc:0.96619 validation-aucpr:0.97119
[6] validation-logloss:0.50891 validation-auc:0.96641 validation-aucpr:0.97138
[7] validation-logloss:0.49048 validation-auc:0.96635 validation-aucpr:0.97131
[8] validation-logloss:0.47244 validation-auc:0.96709 validation-aucpr:0.97176
[9] validation-logloss:0.45609 validation-auc:0.96711 validation-aucpr:0.97180
[10] validation-logloss:0.44099 validation-auc:0.96714 validation-aucpr:0.97172
[11] validation-logloss:0.42691 validation-auc:0.96722 validation-aucpr:0.97176
[12] validation-logloss:0.41372 validation-auc:0.96777 validation-aucpr:0.97213
[13] validation-logloss:0.40274 validation-auc:0.96760 validation-aucpr:0.97210
[14] validation-logloss:0.39146 validation-auc:0.96752 validation-aucpr:0.97199
[15] validation-logloss:0.38195 validation-auc:0.96743 validation-aucpr:0.97194
[16] validation-logloss:0.37155 validation-auc:0.96770 validation-aucpr:0.97217
[17] validation-logloss:0.36297 validation-auc:0.96764 validation-aucpr:0.97215
[18] validation-logloss:0.35341 validation-auc:0.96786 validation-aucpr:0.97234
[19] validation-logloss:0.34495 validation-auc:0.96781 validation-aucpr:0.97232
[20] validation-logloss:0.33735 validation-auc:0.96792 validation-aucpr:0.97248
[21] validation-logloss:0.33008 validation-auc:0.96799 validation-aucpr:0.97252
[22] validation-logloss:0.32267 validation-auc:0.96813 validation-aucpr:0.97261
[23] validation-logloss:0.31655 validation-auc:0.96823 validation-aucpr:0.97272
[24] validation-logloss:0.31009 validation-auc:0.96820 validation-aucpr:0.97272
[25] validation-logloss:0.30439 validation-auc:0.96826 validation-aucpr:0.97287
[26] validation-logloss:0.29898 validation-auc:0.96817 validation-aucpr:0.97278
[27] validation-logloss:0.29418 validation-auc:0.96830 validation-aucpr:0.97284
[28] validation-logloss:0.28891 validation-auc:0.96837 validation-aucpr:0.97291
[29] validation-logloss:0.28382 validation-auc:0.96852 validation-aucpr:0.97302
[30] validation-logloss:0.27985 validation-auc:0.96865 validation-aucpr:0.97314
[31] validation-logloss:0.27534 validation-auc:0.96871 validation-aucpr:0.97320
[32] validation-logloss:0.27115 validation-auc:0.96881 validation-aucpr:0.97331
[33] validation-logloss:0.26803 validation-auc:0.96885 validation-aucpr:0.97347
[34] validation-logloss:0.26511 validation-auc:0.96876 validation-aucpr:0.97338
[35] validation-logloss:0.26264 validation-auc:0.96855 validation-aucpr:0.97319
[36] validation-logloss:0.25974 validation-auc:0.96859 validation-aucpr:0.97318
[37] validation-logloss:0.25657 validation-auc:0.96864 validation-aucpr:0.97322
[38] validation-logloss:0.25417 validation-auc:0.96863 validation-aucpr:0.97321
[39] validation-logloss:0.25178 validation-auc:0.96871 validation-aucpr:0.97327
[40] validation-logloss:0.24940 validation-auc:0.96871 validation-aucpr:0.97326
[41] validation-logloss:0.24668 validation-auc:0.96874 validation-aucpr:0.97327
[42] validation-logloss:0.24406 validation-auc:0.96879 validation-aucpr:0.97333
[43] validation-logloss:0.24173 validation-auc:0.96886 validation-aucpr:0.97342
[44] validation-logloss:0.23987 validation-auc:0.96887 validation-aucpr:0.97344
[45] validation-logloss:0.23770 validation-auc:0.96896 validation-aucpr:0.97355
[46] validation-logloss:0.23536 validation-auc:0.96912 validation-aucpr:0.97368
[47] validation-logloss:0.23324 validation-auc:0.96932 validation-aucpr:0.97387
[48] validation-logloss:0.23179 validation-auc:0.96935 validation-aucpr:0.97389
[49] validation-logloss:0.22986 validation-auc:0.96940 validation-aucpr:0.97392
[50] validation-logloss:0.22806 validation-auc:0.96955 validation-aucpr:0.97404
[51] validation-logloss:0.22632 validation-auc:0.96962 validation-aucpr:0.97408
[52] validation-logloss:0.22525 validation-auc:0.96957 validation-aucpr:0.97401
[53] validation-logloss:0.22381 validation-auc:0.96957 validation-aucpr:0.97398
[54] validation-logloss:0.22227 validation-auc:0.96972 validation-aucpr:0.97408
[55] validation-logloss:0.22096 validation-auc:0.96979 validation-aucpr:0.97412
[56] validation-logloss:0.21974 validation-auc:0.96982 validation-aucpr:0.97415
[57] validation-logloss:0.21837 validation-auc:0.96995 validation-aucpr:0.97426
[58] validation-logloss:0.21744 validation-auc:0.97002 validation-aucpr:0.97430
[59] validation-logloss:0.21631 validation-auc:0.97002 validation-aucpr:0.97429
[60] validation-logloss:0.21554 validation-auc:0.96997 validation-aucpr:0.97427
[61] validation-logloss:0.21465 validation-auc:0.96998 validation-aucpr:0.97426
[62] validation-logloss:0.21394 validation-auc:0.96993 validation-aucpr:0.97419
[63] validation-logloss:0.21308 validation-auc:0.96992 validation-aucpr:0.97416
[64] validation-logloss:0.21235 validation-auc:0.96997 validation-aucpr:0.97418
[65] validation-logloss:0.21180 validation-auc:0.96998 validation-aucpr:0.97418
[66] validation-logloss:0.21117 validation-auc:0.96998 validation-aucpr:0.97419
[67] validation-logloss:0.21033 validation-auc:0.97004 validation-aucpr:0.97420
[68] validation-logloss:0.20987 validation-auc:0.97001 validation-aucpr:0.97416
[69] validation-logloss:0.20912 validation-auc:0.97003 validation-aucpr:0.97416
[70] validation-logloss:0.20855 validation-auc:0.96999 validation-aucpr:0.97411
[71] validation-logloss:0.20782 validation-auc:0.97004 validation-aucpr:0.97414
[72] validation-logloss:0.20707 validation-auc:0.97015 validation-aucpr:0.97418
[73] validation-logloss:0.20640 validation-auc:0.97022 validation-aucpr:0.97420
[74] validation-logloss:0.20593 validation-auc:0.97024 validation-aucpr:0.97419
[75] validation-logloss:0.20544 validation-auc:0.97029 validation-aucpr:0.97426
[76] validation-logloss:0.20484 validation-auc:0.97036 validation-aucpr:0.97429
[77] validation-logloss:0.20445 validation-auc:0.97037 validation-aucpr:0.97430
[78] validation-logloss:0.20403 validation-auc:0.97038 validation-aucpr:0.97427
[79] validation-logloss:0.20365 validation-auc:0.97033 validation-aucpr:0.97423
{'best_iteration': '77', 'best_score': '0.9743001257171137'}
Trial 82, Fold 5: Log loss = 0.2036515239633219, Average precision = 0.9742372043288416, ROC-AUC = 0.9703341694071308, Elapsed Time = 22.53138409999883 seconds
Optimization Progress: 83%|########2 | 83/100 [3:44:08<35:11, 124.22s/it]
Trial 83, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 83, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.67223 validation-auc:0.91925 validation-aucpr:0.92320
[1] validation-logloss:0.65105 validation-auc:0.94335 validation-aucpr:0.94678
[2] validation-logloss:0.63256 validation-auc:0.94560 validation-aucpr:0.94919
[3] validation-logloss:0.61417 validation-auc:0.94817 validation-aucpr:0.95285
[4] validation-logloss:0.59744 validation-auc:0.95010 validation-aucpr:0.95527
[5] validation-logloss:0.58148 validation-auc:0.95143 validation-aucpr:0.95652
[6] validation-logloss:0.56842 validation-auc:0.95139 validation-aucpr:0.95635
[7] validation-logloss:0.55433 validation-auc:0.95200 validation-aucpr:0.95683
[8] validation-logloss:0.53832 validation-auc:0.95750 validation-aucpr:0.96241
[9] validation-logloss:0.52726 validation-auc:0.95686 validation-aucpr:0.96173
[10] validation-logloss:0.51634 validation-auc:0.95645 validation-aucpr:0.96134
[11] validation-logloss:0.50580 validation-auc:0.95617 validation-aucpr:0.96100
[12] validation-logloss:0.49557 validation-auc:0.95592 validation-aucpr:0.96066
[13] validation-logloss:0.48549 validation-auc:0.95596 validation-aucpr:0.96079
[14] validation-logloss:0.47593 validation-auc:0.95602 validation-aucpr:0.96065
[15] validation-logloss:0.46453 validation-auc:0.95798 validation-aucpr:0.96345
[16] validation-logloss:0.45637 validation-auc:0.95780 validation-aucpr:0.96330
[17] validation-logloss:0.44784 validation-auc:0.95833 validation-aucpr:0.96377
[18] validation-logloss:0.43932 validation-auc:0.95862 validation-aucpr:0.96401
[19] validation-logloss:0.43174 validation-auc:0.95864 validation-aucpr:0.96395
[20] validation-logloss:0.42202 validation-auc:0.95981 validation-aucpr:0.96522
[21] validation-logloss:0.41564 validation-auc:0.95973 validation-aucpr:0.96522
[22] validation-logloss:0.40837 validation-auc:0.96005 validation-aucpr:0.96560
[23] validation-logloss:0.39975 validation-auc:0.96060 validation-aucpr:0.96617
[24] validation-logloss:0.39368 validation-auc:0.96074 validation-aucpr:0.96632
[25] validation-logloss:0.38775 validation-auc:0.96083 validation-aucpr:0.96635
[26] validation-logloss:0.38179 validation-auc:0.96121 validation-aucpr:0.96679
[27] validation-logloss:0.37692 validation-auc:0.96119 validation-aucpr:0.96678
[28] validation-logloss:0.36912 validation-auc:0.96181 validation-aucpr:0.96739
[29] validation-logloss:0.36392 validation-auc:0.96209 validation-aucpr:0.96758
[30] validation-logloss:0.35898 validation-auc:0.96224 validation-aucpr:0.96773
[31] validation-logloss:0.35456 validation-auc:0.96233 validation-aucpr:0.96775
[32] validation-logloss:0.35042 validation-auc:0.96224 validation-aucpr:0.96765
[33] validation-logloss:0.34433 validation-auc:0.96256 validation-aucpr:0.96801
[34] validation-logloss:0.34098 validation-auc:0.96255 validation-aucpr:0.96798
[35] validation-logloss:0.33695 validation-auc:0.96260 validation-aucpr:0.96802
[36] validation-logloss:0.33302 validation-auc:0.96273 validation-aucpr:0.96814
[37] validation-logloss:0.32967 validation-auc:0.96280 validation-aucpr:0.96822
[38] validation-logloss:0.32629 validation-auc:0.96294 validation-aucpr:0.96830
[39] validation-logloss:0.32336 validation-auc:0.96295 validation-aucpr:0.96830
[40] validation-logloss:0.32027 validation-auc:0.96295 validation-aucpr:0.96827
[41] validation-logloss:0.31732 validation-auc:0.96304 validation-aucpr:0.96844
[42] validation-logloss:0.31478 validation-auc:0.96304 validation-aucpr:0.96841
[43] validation-logloss:0.31163 validation-auc:0.96308 validation-aucpr:0.96843
[44] validation-logloss:0.30930 validation-auc:0.96301 validation-aucpr:0.96833
{'best_iteration': '41', 'best_score': '0.9684363819628538'}
Trial 83, Fold 1: Log loss = 0.30929638720353547, Average precision = 0.9683288201497099, ROC-AUC = 0.963007398206698, Elapsed Time = 0.9747122000007948 seconds
Trial 83, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 83, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.67241 validation-auc:0.92366 validation-aucpr:0.92099
... (iterations [1]–[43] elided) ...
[44] validation-logloss:0.29833 validation-auc:0.96569 validation-aucpr:0.96901
{'best_iteration': '44', 'best_score': '0.9690120397681583'}
Trial 83, Fold 2: Log loss = 0.2983271238709228, Average precision = 0.9690123800135595, ROC-AUC = 0.965694833382449, Elapsed Time = 1.1981218000000808 seconds
Trial 83, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 83, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.67127 validation-auc:0.93340 validation-aucpr:0.93520
... (iterations [1]–[43] elided) ...
[44] validation-logloss:0.29763 validation-auc:0.96553 validation-aucpr:0.97015
{'best_iteration': '44', 'best_score': '0.9701487873135217'}
Trial 83, Fold 3: Log loss = 0.2976307423822404, Average precision = 0.9701537445481727, ROC-AUC = 0.9655256348055974, Elapsed Time = 1.1555424999969546 seconds
Trial 83, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 83, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.67158 validation-auc:0.92830 validation-aucpr:0.92897
... (iterations [1]–[43] elided) ...
[44] validation-logloss:0.29991 validation-auc:0.96413 validation-aucpr:0.97002
{'best_iteration': '44', 'best_score': '0.9700243644905334'}
Trial 83, Fold 4: Log loss = 0.29991398528608265, Average precision = 0.9700287573244741, ROC-AUC = 0.9641328140088843, Elapsed Time = 1.205062100001669 seconds
Trial 83, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 83, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.67158 validation-auc:0.92778 validation-aucpr:0.92848
... (iterations [1]–[43] elided) ...
[44] validation-logloss:0.30585 validation-auc:0.96184 validation-aucpr:0.96707
{'best_iteration': '44', 'best_score': '0.9670711622018966'}
Trial 83, Fold 5: Log loss = 0.30585460800323083, Average precision = 0.9670759297537443, ROC-AUC = 0.9618388421306877, Elapsed Time = 1.164583699999639 seconds
Optimization Progress: 84%|########4 | 84/100 [3:44:22<24:18, 91.18s/it]
Trial 84, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 84, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.68515 validation-auc:0.94861 validation-aucpr:0.92425
... (iterations [1]–[40] elided) ...
[41] validation-logloss:0.47073 validation-auc:0.96911 validation-aucpr:0.97222
{'best_iteration': '38', 'best_score': '0.9736331182222254'}
Trial 84, Fold 1: Log loss = 0.47073261328856786, Average precision = 0.9729380187269572, ROC-AUC = 0.9691089172796489, Elapsed Time = 1.6764398999985133 seconds
Trial 84, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 84, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.68546 validation-auc:0.94384 validation-aucpr:0.91530
... (iterations [1]–[40] elided) ...
[41] validation-logloss:0.47155 validation-auc:0.97039 validation-aucpr:0.97358
{'best_iteration': '40', 'best_score': '0.9736509735508068'}
Trial 84, Fold 2: Log loss = 0.47154738969745186, Average precision = 0.9735147670790396, ROC-AUC = 0.9703878612697465, Elapsed Time = 1.938225699999748 seconds
Trial 84, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 84, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.68526 validation-auc:0.94306 validation-aucpr:0.92744
... (iterations [1]–[40] elided) ...
[41] validation-logloss:0.47155 validation-auc:0.97103 validation-aucpr:0.97493
{'best_iteration': '39', 'best_score': '0.9750053475125442'}
Trial 84, Fold 3: Log loss = 0.4715512103153635, Average precision = 0.9749272533373887, ROC-AUC = 0.9710345116102128, Elapsed Time = 1.825174599998718 seconds
Trial 84, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 84, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.68530 validation-auc:0.93955 validation-aucpr:0.92656
... (iterations [1]–[40] elided) ...
[41] validation-logloss:0.47001 validation-auc:0.96977 validation-aucpr:0.97469
{'best_iteration': '34', 'best_score': '0.9750907393814487'}
Trial 84, Fold 4: Log loss = 0.4700130718515467, Average precision = 0.9746817828452765, ROC-AUC = 0.969767695897146, Elapsed Time = 1.8695074999995995 seconds
Trial 84, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 84, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.68533 validation-auc:0.93718 validation-aucpr:0.91104
... (iterations [1]–[36] elided) ...
[37] validation-logloss:0.48844 validation-auc:0.96799 validation-aucpr:0.97233
[38] validation-logloss:0.48495 validation-auc:0.96787 validation-aucpr:0.97219
[39] validation-logloss:0.48104 validation-auc:0.96789 validation-aucpr:0.97225
[40] validation-logloss:0.47710 validation-auc:0.96797 validation-aucpr:0.97232
[41] validation-logloss:0.47323 validation-auc:0.96813 validation-aucpr:0.97246
{'best_iteration': '41', 'best_score': '0.9724589335569357'}
Trial 84, Fold 5: Log loss = 0.4732269235506302, Average precision = 0.9724522509729793, ROC-AUC = 0.9681272318525538, Elapsed Time = 1.8348221000014746 seconds
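The `{'best_iteration': ..., 'best_score': ...}` dicts printed after each fold are string-valued, which is consistent with what xgboost's `Booster.attributes()` returns after training with early stopping. A minimal (hypothetical) helper to recover typed values from such a dict:

```python
def parse_booster_attributes(attrs):
    """Convert the string-valued attribute dict printed in the log
    (e.g. from xgboost's Booster.attributes()) into typed values."""
    return int(attrs["best_iteration"]), float(attrs["best_score"])

# The dict logged for Trial 84, Fold 5:
best_it, best_score = parse_booster_attributes(
    {"best_iteration": "41", "best_score": "0.9724589335569357"}
)
print(best_it, best_score)  # 41 0.9724589335569357
```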
Optimization Progress: 85%|########5 | 85/100 [3:44:40<17:15, 69.03s/it]
Trial 85, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 85, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.66495 validation-auc:0.95626 validation-aucpr:0.96087
[1] validation-logloss:0.63938 validation-auc:0.96148 validation-aucpr:0.96693
[2] validation-logloss:0.61518 validation-auc:0.96324 validation-aucpr:0.96821
[3] validation-logloss:0.59263 validation-auc:0.96340 validation-aucpr:0.96846
[4] validation-logloss:0.57147 validation-auc:0.96446 validation-aucpr:0.96972
[5] validation-logloss:0.55163 validation-auc:0.96551 validation-aucpr:0.97048
[6] validation-logloss:0.53320 validation-auc:0.96628 validation-aucpr:0.97106
[7] validation-logloss:0.51596 validation-auc:0.96735 validation-aucpr:0.97224
[8] validation-logloss:0.50137 validation-auc:0.96757 validation-aucpr:0.97228
[9] validation-logloss:0.48649 validation-auc:0.96820 validation-aucpr:0.97197
[10] validation-logloss:0.47192 validation-auc:0.96832 validation-aucpr:0.97204
[11] validation-logloss:0.45854 validation-auc:0.96853 validation-aucpr:0.97214
[12] validation-logloss:0.44623 validation-auc:0.96822 validation-aucpr:0.97198
[13] validation-logloss:0.43410 validation-auc:0.96828 validation-aucpr:0.97198
[14] validation-logloss:0.42250 validation-auc:0.96852 validation-aucpr:0.97217
[15] validation-logloss:0.41193 validation-auc:0.96863 validation-aucpr:0.97219
[16] validation-logloss:0.40194 validation-auc:0.96865 validation-aucpr:0.97225
[17] validation-logloss:0.39339 validation-auc:0.96874 validation-aucpr:0.97228
[18] validation-logloss:0.38428 validation-auc:0.96885 validation-aucpr:0.97236
[19] validation-logloss:0.37558 validation-auc:0.96886 validation-aucpr:0.97234
[20] validation-logloss:0.36790 validation-auc:0.96908 validation-aucpr:0.97248
[21] validation-logloss:0.36033 validation-auc:0.96892 validation-aucpr:0.97240
[22] validation-logloss:0.35253 validation-auc:0.96924 validation-aucpr:0.97256
[23] validation-logloss:0.34501 validation-auc:0.96957 validation-aucpr:0.97281
[24] validation-logloss:0.33803 validation-auc:0.96964 validation-aucpr:0.97283
[25] validation-logloss:0.33149 validation-auc:0.96974 validation-aucpr:0.97287
[26] validation-logloss:0.32532 validation-auc:0.96975 validation-aucpr:0.97288
[27] validation-logloss:0.32003 validation-auc:0.96979 validation-aucpr:0.97288
[28] validation-logloss:0.31434 validation-auc:0.96989 validation-aucpr:0.97295
[29] validation-logloss:0.30885 validation-auc:0.97003 validation-aucpr:0.97306
[30] validation-logloss:0.30372 validation-auc:0.97005 validation-aucpr:0.97320
[31] validation-logloss:0.29903 validation-auc:0.96991 validation-aucpr:0.97313
[32] validation-logloss:0.29433 validation-auc:0.97001 validation-aucpr:0.97342
[33] validation-logloss:0.29005 validation-auc:0.97001 validation-aucpr:0.97336
[34] validation-logloss:0.28634 validation-auc:0.96995 validation-aucpr:0.97329
[35] validation-logloss:0.28248 validation-auc:0.96988 validation-aucpr:0.97322
[36] validation-logloss:0.27825 validation-auc:0.97022 validation-aucpr:0.97348
[37] validation-logloss:0.27458 validation-auc:0.97027 validation-aucpr:0.97352
[38] validation-logloss:0.27171 validation-auc:0.97026 validation-aucpr:0.97476
[39] validation-logloss:0.26832 validation-auc:0.97017 validation-aucpr:0.97470
[40] validation-logloss:0.26503 validation-auc:0.97017 validation-aucpr:0.97474
[41] validation-logloss:0.26212 validation-auc:0.97023 validation-aucpr:0.97471
[42] validation-logloss:0.25969 validation-auc:0.97020 validation-aucpr:0.97467
[43] validation-logloss:0.25666 validation-auc:0.97036 validation-aucpr:0.97478
[44] validation-logloss:0.25389 validation-auc:0.97034 validation-aucpr:0.97474
[45] validation-logloss:0.25149 validation-auc:0.97041 validation-aucpr:0.97479
[46] validation-logloss:0.24909 validation-auc:0.97049 validation-aucpr:0.97483
[47] validation-logloss:0.24652 validation-auc:0.97061 validation-aucpr:0.97491
[48] validation-logloss:0.24396 validation-auc:0.97074 validation-aucpr:0.97498
[49] validation-logloss:0.24208 validation-auc:0.97060 validation-aucpr:0.97488
[50] validation-logloss:0.24015 validation-auc:0.97063 validation-aucpr:0.97488
[51] validation-logloss:0.23834 validation-auc:0.97059 validation-aucpr:0.97488
[52] validation-logloss:0.23641 validation-auc:0.97065 validation-aucpr:0.97491
[53] validation-logloss:0.23446 validation-auc:0.97073 validation-aucpr:0.97497
[54] validation-logloss:0.23260 validation-auc:0.97075 validation-aucpr:0.97499
[55] validation-logloss:0.23086 validation-auc:0.97088 validation-aucpr:0.97508
[56] validation-logloss:0.22904 validation-auc:0.97092 validation-aucpr:0.97511
[57] validation-logloss:0.22746 validation-auc:0.97104 validation-aucpr:0.97517
[58] validation-logloss:0.22607 validation-auc:0.97098 validation-aucpr:0.97512
[59] validation-logloss:0.22500 validation-auc:0.97094 validation-aucpr:0.97511
[60] validation-logloss:0.22365 validation-auc:0.97096 validation-aucpr:0.97511
[61] validation-logloss:0.22242 validation-auc:0.97096 validation-aucpr:0.97509
[62] validation-logloss:0.22136 validation-auc:0.97096 validation-aucpr:0.97507
[63] validation-logloss:0.22047 validation-auc:0.97099 validation-aucpr:0.97522
[64] validation-logloss:0.21908 validation-auc:0.97115 validation-aucpr:0.97533
[65] validation-logloss:0.21801 validation-auc:0.97118 validation-aucpr:0.97534
[66] validation-logloss:0.21733 validation-auc:0.97112 validation-aucpr:0.97529
[67] validation-logloss:0.21649 validation-auc:0.97106 validation-aucpr:0.97522
[68] validation-logloss:0.21561 validation-auc:0.97107 validation-aucpr:0.97523
[69] validation-logloss:0.21471 validation-auc:0.97105 validation-aucpr:0.97522
[70] validation-logloss:0.21387 validation-auc:0.97107 validation-aucpr:0.97521
[71] validation-logloss:0.21285 validation-auc:0.97112 validation-aucpr:0.97527
[72] validation-logloss:0.21230 validation-auc:0.97103 validation-aucpr:0.97517
[73] validation-logloss:0.21139 validation-auc:0.97113 validation-aucpr:0.97522
[74] validation-logloss:0.21082 validation-auc:0.97108 validation-aucpr:0.97519
[75] validation-logloss:0.21027 validation-auc:0.97112 validation-aucpr:0.97520
[76] validation-logloss:0.20969 validation-auc:0.97118 validation-aucpr:0.97543
[77] validation-logloss:0.20897 validation-auc:0.97120 validation-aucpr:0.97542
[78] validation-logloss:0.20833 validation-auc:0.97119 validation-aucpr:0.97540
[79] validation-logloss:0.20775 validation-auc:0.97120 validation-aucpr:0.97540
[80] validation-logloss:0.20720 validation-auc:0.97124 validation-aucpr:0.97542
[81] validation-logloss:0.20658 validation-auc:0.97128 validation-aucpr:0.97544
[82] validation-logloss:0.20614 validation-auc:0.97127 validation-aucpr:0.97540
[83] validation-logloss:0.20568 validation-auc:0.97129 validation-aucpr:0.97540
{'best_iteration': '81', 'best_score': '0.9754383400541304'}
Trial 85, Fold 1: Log loss = 0.2056844300096283, Average precision = 0.9754078842769631, ROC-AUC = 0.9712908744497497, Elapsed Time = 6.905310800000734 seconds
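Each fold header reports the split size, per-class counts, and the 0/1 ratio (e.g. 10533/10130 = 1.0397828… for Trial 85, Fold 1). A stdlib-only sketch of how such a summary line can be derived from a fold's label vector (the `fold_balance` helper is illustrative, not the notebook's actual code):

```python
from collections import Counter

def fold_balance(labels):
    """Class counts and the 0/1 ratio, as reported in each fold header."""
    c = Counter(labels)
    return c[0], c[1], c[0] / c[1]

# Reconstructing the Trial 85, Fold 1 train split from the logged counts:
labels = [0] * 10533 + [1] * 10130
n0, n1, ratio = fold_balance(labels)
print(f"Train size = {n0 + n1} where 0 = {n0}, 1 = {n1}, 0/1 = {ratio}")
```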
Trial 85, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 85, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.66535 validation-auc:0.95775 validation-aucpr:0.96024
[1] validation-logloss:0.64158 validation-auc:0.96415 validation-aucpr:0.96716
[2] validation-logloss:0.62009 validation-auc:0.96397 validation-aucpr:0.96656
[3] validation-logloss:0.59731 validation-auc:0.96644 validation-aucpr:0.96914
[4] validation-logloss:0.57560 validation-auc:0.96853 validation-aucpr:0.97184
[5] validation-logloss:0.55564 validation-auc:0.96901 validation-aucpr:0.97219
[6] validation-logloss:0.53747 validation-auc:0.96931 validation-aucpr:0.97248
[7] validation-logloss:0.52029 validation-auc:0.96960 validation-aucpr:0.97292
[8] validation-logloss:0.50561 validation-auc:0.96987 validation-aucpr:0.97309
[9] validation-logloss:0.48998 validation-auc:0.97028 validation-aucpr:0.97339
[10] validation-logloss:0.47713 validation-auc:0.97004 validation-aucpr:0.97282
[11] validation-logloss:0.46337 validation-auc:0.97027 validation-aucpr:0.97328
[12] validation-logloss:0.45010 validation-auc:0.97069 validation-aucpr:0.97362
[13] validation-logloss:0.43780 validation-auc:0.97063 validation-aucpr:0.97382
[14] validation-logloss:0.42574 validation-auc:0.97101 validation-aucpr:0.97413
[15] validation-logloss:0.41449 validation-auc:0.97097 validation-aucpr:0.97411
[16] validation-logloss:0.40397 validation-auc:0.97113 validation-aucpr:0.97407
[17] validation-logloss:0.39391 validation-auc:0.97114 validation-aucpr:0.97419
[18] validation-logloss:0.38453 validation-auc:0.97126 validation-aucpr:0.97418
[19] validation-logloss:0.37542 validation-auc:0.97155 validation-aucpr:0.97435
[20] validation-logloss:0.36663 validation-auc:0.97182 validation-aucpr:0.97458
[21] validation-logloss:0.35879 validation-auc:0.97167 validation-aucpr:0.97446
[22] validation-logloss:0.35154 validation-auc:0.97183 validation-aucpr:0.97484
[23] validation-logloss:0.34468 validation-auc:0.97208 validation-aucpr:0.97500
[24] validation-logloss:0.33746 validation-auc:0.97216 validation-aucpr:0.97510
[25] validation-logloss:0.33049 validation-auc:0.97237 validation-aucpr:0.97523
[26] validation-logloss:0.32400 validation-auc:0.97229 validation-aucpr:0.97518
[27] validation-logloss:0.31795 validation-auc:0.97219 validation-aucpr:0.97511
[28] validation-logloss:0.31207 validation-auc:0.97225 validation-aucpr:0.97519
[29] validation-logloss:0.30725 validation-auc:0.97214 validation-aucpr:0.97506
[30] validation-logloss:0.30238 validation-auc:0.97221 validation-aucpr:0.97509
[31] validation-logloss:0.29768 validation-auc:0.97197 validation-aucpr:0.97489
[32] validation-logloss:0.29338 validation-auc:0.97192 validation-aucpr:0.97483
[33] validation-logloss:0.28842 validation-auc:0.97211 validation-aucpr:0.97497
[34] validation-logloss:0.28462 validation-auc:0.97222 validation-aucpr:0.97500
[35] validation-logloss:0.28087 validation-auc:0.97232 validation-aucpr:0.97508
[36] validation-logloss:0.27727 validation-auc:0.97222 validation-aucpr:0.97503
[37] validation-logloss:0.27310 validation-auc:0.97232 validation-aucpr:0.97511
[38] validation-logloss:0.26937 validation-auc:0.97230 validation-aucpr:0.97509
[39] validation-logloss:0.26574 validation-auc:0.97231 validation-aucpr:0.97510
[40] validation-logloss:0.26232 validation-auc:0.97225 validation-aucpr:0.97505
[41] validation-logloss:0.25924 validation-auc:0.97219 validation-aucpr:0.97500
[42] validation-logloss:0.25654 validation-auc:0.97229 validation-aucpr:0.97506
[43] validation-logloss:0.25356 validation-auc:0.97217 validation-aucpr:0.97495
[44] validation-logloss:0.25084 validation-auc:0.97230 validation-aucpr:0.97505
[45] validation-logloss:0.24788 validation-auc:0.97242 validation-aucpr:0.97516
[46] validation-logloss:0.24522 validation-auc:0.97240 validation-aucpr:0.97515
[47] validation-logloss:0.24277 validation-auc:0.97238 validation-aucpr:0.97511
[48] validation-logloss:0.24030 validation-auc:0.97241 validation-aucpr:0.97515
[49] validation-logloss:0.23828 validation-auc:0.97238 validation-aucpr:0.97510
[50] validation-logloss:0.23624 validation-auc:0.97227 validation-aucpr:0.97501
[51] validation-logloss:0.23450 validation-auc:0.97228 validation-aucpr:0.97541
[52] validation-logloss:0.23285 validation-auc:0.97230 validation-aucpr:0.97541
[53] validation-logloss:0.23072 validation-auc:0.97239 validation-aucpr:0.97548
[54] validation-logloss:0.22861 validation-auc:0.97241 validation-aucpr:0.97551
[55] validation-logloss:0.22704 validation-auc:0.97243 validation-aucpr:0.97555
[56] validation-logloss:0.22563 validation-auc:0.97237 validation-aucpr:0.97554
[57] validation-logloss:0.22427 validation-auc:0.97237 validation-aucpr:0.97554
[58] validation-logloss:0.22239 validation-auc:0.97241 validation-aucpr:0.97556
[59] validation-logloss:0.22064 validation-auc:0.97244 validation-aucpr:0.97559
[60] validation-logloss:0.21918 validation-auc:0.97245 validation-aucpr:0.97559
[61] validation-logloss:0.21791 validation-auc:0.97253 validation-aucpr:0.97564
[62] validation-logloss:0.21635 validation-auc:0.97253 validation-aucpr:0.97564
[63] validation-logloss:0.21503 validation-auc:0.97254 validation-aucpr:0.97567
[64] validation-logloss:0.21369 validation-auc:0.97260 validation-aucpr:0.97565
[65] validation-logloss:0.21264 validation-auc:0.97259 validation-aucpr:0.97564
[66] validation-logloss:0.21134 validation-auc:0.97271 validation-aucpr:0.97573
[67] validation-logloss:0.21016 validation-auc:0.97277 validation-aucpr:0.97576
[68] validation-logloss:0.20918 validation-auc:0.97270 validation-aucpr:0.97573
[69] validation-logloss:0.20826 validation-auc:0.97278 validation-aucpr:0.97577
[70] validation-logloss:0.20738 validation-auc:0.97280 validation-aucpr:0.97577
[71] validation-logloss:0.20604 validation-auc:0.97295 validation-aucpr:0.97588
[72] validation-logloss:0.20491 validation-auc:0.97301 validation-aucpr:0.97595
[73] validation-logloss:0.20423 validation-auc:0.97304 validation-aucpr:0.97597
[74] validation-logloss:0.20355 validation-auc:0.97306 validation-aucpr:0.97594
[75] validation-logloss:0.20263 validation-auc:0.97311 validation-aucpr:0.97596
[76] validation-logloss:0.20193 validation-auc:0.97308 validation-aucpr:0.97593
[77] validation-logloss:0.20097 validation-auc:0.97311 validation-aucpr:0.97586
[78] validation-logloss:0.20015 validation-auc:0.97319 validation-aucpr:0.97591
[79] validation-logloss:0.19919 validation-auc:0.97326 validation-aucpr:0.97592
[80] validation-logloss:0.19848 validation-auc:0.97334 validation-aucpr:0.97601
[81] validation-logloss:0.19798 validation-auc:0.97333 validation-aucpr:0.97611
[82] validation-logloss:0.19712 validation-auc:0.97345 validation-aucpr:0.97621
[83] validation-logloss:0.19638 validation-auc:0.97355 validation-aucpr:0.97628
{'best_iteration': '83', 'best_score': '0.9762840686470307'}
Trial 85, Fold 2: Log loss = 0.19637594776935594, Average precision = 0.9762886215576242, ROC-AUC = 0.9735509679839487, Elapsed Time = 6.715063700001338 seconds
Trial 85, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 85, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.66523 validation-auc:0.95901 validation-aucpr:0.96350
[1] validation-logloss:0.63919 validation-auc:0.96513 validation-aucpr:0.96909
[2] validation-logloss:0.61473 validation-auc:0.96702 validation-aucpr:0.97074
[3] validation-logloss:0.59218 validation-auc:0.96777 validation-aucpr:0.97130
[4] validation-logloss:0.57326 validation-auc:0.96813 validation-aucpr:0.97176
[5] validation-logloss:0.55355 validation-auc:0.96841 validation-aucpr:0.97198
[6] validation-logloss:0.53491 validation-auc:0.96908 validation-aucpr:0.97251
[7] validation-logloss:0.51802 validation-auc:0.96881 validation-aucpr:0.97268
[8] validation-logloss:0.50192 validation-auc:0.96876 validation-aucpr:0.97266
[9] validation-logloss:0.48642 validation-auc:0.96923 validation-aucpr:0.97301
[10] validation-logloss:0.47154 validation-auc:0.96972 validation-aucpr:0.97336
[11] validation-logloss:0.45916 validation-auc:0.96982 validation-aucpr:0.97407
[12] validation-logloss:0.44652 validation-auc:0.96991 validation-aucpr:0.97413
[13] validation-logloss:0.43456 validation-auc:0.96991 validation-aucpr:0.97409
[14] validation-logloss:0.42321 validation-auc:0.97017 validation-aucpr:0.97420
[15] validation-logloss:0.41307 validation-auc:0.97045 validation-aucpr:0.97458
[16] validation-logloss:0.40250 validation-auc:0.97057 validation-aucpr:0.97468
[17] validation-logloss:0.39391 validation-auc:0.97055 validation-aucpr:0.97471
[18] validation-logloss:0.38446 validation-auc:0.97079 validation-aucpr:0.97485
[19] validation-logloss:0.37529 validation-auc:0.97094 validation-aucpr:0.97500
[20] validation-logloss:0.36765 validation-auc:0.97105 validation-aucpr:0.97500
[21] validation-logloss:0.36030 validation-auc:0.97104 validation-aucpr:0.97497
[22] validation-logloss:0.35343 validation-auc:0.97111 validation-aucpr:0.97505
[23] validation-logloss:0.34599 validation-auc:0.97141 validation-aucpr:0.97545
[24] validation-logloss:0.33916 validation-auc:0.97129 validation-aucpr:0.97535
[25] validation-logloss:0.33269 validation-auc:0.97122 validation-aucpr:0.97530
[26] validation-logloss:0.32702 validation-auc:0.97099 validation-aucpr:0.97515
[27] validation-logloss:0.32076 validation-auc:0.97097 validation-aucpr:0.97513
[28] validation-logloss:0.31486 validation-auc:0.97112 validation-aucpr:0.97524
[29] validation-logloss:0.31010 validation-auc:0.97090 validation-aucpr:0.97512
[30] validation-logloss:0.30468 validation-auc:0.97095 validation-aucpr:0.97515
[31] validation-logloss:0.30025 validation-auc:0.97093 validation-aucpr:0.97511
[32] validation-logloss:0.29546 validation-auc:0.97074 validation-aucpr:0.97497
[33] validation-logloss:0.29164 validation-auc:0.97059 validation-aucpr:0.97486
[34] validation-logloss:0.28796 validation-auc:0.97042 validation-aucpr:0.97473
[35] validation-logloss:0.28364 validation-auc:0.97039 validation-aucpr:0.97470
[36] validation-logloss:0.27940 validation-auc:0.97054 validation-aucpr:0.97485
[37] validation-logloss:0.27543 validation-auc:0.97069 validation-aucpr:0.97495
[38] validation-logloss:0.27155 validation-auc:0.97064 validation-aucpr:0.97493
[39] validation-logloss:0.26881 validation-auc:0.97046 validation-aucpr:0.97478
[40] validation-logloss:0.26511 validation-auc:0.97061 validation-aucpr:0.97493
[41] validation-logloss:0.26161 validation-auc:0.97074 validation-aucpr:0.97502
[42] validation-logloss:0.25864 validation-auc:0.97074 validation-aucpr:0.97501
[43] validation-logloss:0.25553 validation-auc:0.97089 validation-aucpr:0.97510
[44] validation-logloss:0.25262 validation-auc:0.97098 validation-aucpr:0.97518
[45] validation-logloss:0.24986 validation-auc:0.97099 validation-aucpr:0.97517
[46] validation-logloss:0.24707 validation-auc:0.97111 validation-aucpr:0.97529
[47] validation-logloss:0.24456 validation-auc:0.97128 validation-aucpr:0.97539
[48] validation-logloss:0.24205 validation-auc:0.97138 validation-aucpr:0.97546
[49] validation-logloss:0.24024 validation-auc:0.97140 validation-aucpr:0.97541
[50] validation-logloss:0.23820 validation-auc:0.97129 validation-aucpr:0.97532
[51] validation-logloss:0.23576 validation-auc:0.97143 validation-aucpr:0.97544
[52] validation-logloss:0.23370 validation-auc:0.97142 validation-aucpr:0.97544
[53] validation-logloss:0.23225 validation-auc:0.97131 validation-aucpr:0.97533
[54] validation-logloss:0.23084 validation-auc:0.97124 validation-aucpr:0.97525
[55] validation-logloss:0.22915 validation-auc:0.97112 validation-aucpr:0.97507
[56] validation-logloss:0.22740 validation-auc:0.97120 validation-aucpr:0.97510
[57] validation-logloss:0.22569 validation-auc:0.97132 validation-aucpr:0.97518
[58] validation-logloss:0.22392 validation-auc:0.97141 validation-aucpr:0.97538
[59] validation-logloss:0.22256 validation-auc:0.97134 validation-aucpr:0.97533
[60] validation-logloss:0.22110 validation-auc:0.97138 validation-aucpr:0.97534
[61] validation-logloss:0.22003 validation-auc:0.97140 validation-aucpr:0.97537
[62] validation-logloss:0.21862 validation-auc:0.97146 validation-aucpr:0.97540
[63] validation-logloss:0.21755 validation-auc:0.97143 validation-aucpr:0.97538
[64] validation-logloss:0.21629 validation-auc:0.97149 validation-aucpr:0.97544
[65] validation-logloss:0.21495 validation-auc:0.97154 validation-aucpr:0.97550
[66] validation-logloss:0.21404 validation-auc:0.97152 validation-aucpr:0.97547
[67] validation-logloss:0.21306 validation-auc:0.97158 validation-aucpr:0.97552
[68] validation-logloss:0.21214 validation-auc:0.97158 validation-aucpr:0.97551
[69] validation-logloss:0.21110 validation-auc:0.97168 validation-aucpr:0.97558
[70] validation-logloss:0.21021 validation-auc:0.97167 validation-aucpr:0.97556
[71] validation-logloss:0.20915 validation-auc:0.97173 validation-aucpr:0.97558
[72] validation-logloss:0.20830 validation-auc:0.97176 validation-aucpr:0.97560
[73] validation-logloss:0.20741 validation-auc:0.97179 validation-aucpr:0.97561
[74] validation-logloss:0.20669 validation-auc:0.97180 validation-aucpr:0.97564
[75] validation-logloss:0.20578 validation-auc:0.97185 validation-aucpr:0.97568
[76] validation-logloss:0.20476 validation-auc:0.97194 validation-aucpr:0.97573
[77] validation-logloss:0.20396 validation-auc:0.97212 validation-aucpr:0.97599
[78] validation-logloss:0.20312 validation-auc:0.97216 validation-aucpr:0.97605
[79] validation-logloss:0.20250 validation-auc:0.97221 validation-aucpr:0.97608
[80] validation-logloss:0.20198 validation-auc:0.97216 validation-aucpr:0.97601
[81] validation-logloss:0.20152 validation-auc:0.97211 validation-aucpr:0.97595
[82] validation-logloss:0.20090 validation-auc:0.97213 validation-aucpr:0.97594
[83] validation-logloss:0.20036 validation-auc:0.97210 validation-aucpr:0.97588
{'best_iteration': '79', 'best_score': '0.976080442116701'}
Trial 85, Fold 3: Log loss = 0.20036022393956576, Average precision = 0.9758860880754837, ROC-AUC = 0.9720970104063333, Elapsed Time = 6.468014099998982 seconds
Trial 85, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 85, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.66541 validation-auc:0.95157 validation-aucpr:0.95421
[1] validation-logloss:0.63949 validation-auc:0.95963 validation-aucpr:0.96707
[2] validation-logloss:0.61516 validation-auc:0.96363 validation-aucpr:0.96952
[3] validation-logloss:0.59546 validation-auc:0.96362 validation-aucpr:0.96963
[4] validation-logloss:0.57432 validation-auc:0.96463 validation-aucpr:0.97042
[5] validation-logloss:0.55469 validation-auc:0.96526 validation-aucpr:0.97097
[6] validation-logloss:0.53592 validation-auc:0.96579 validation-aucpr:0.97145
[7] validation-logloss:0.51877 validation-auc:0.96653 validation-aucpr:0.97192
[8] validation-logloss:0.50209 validation-auc:0.96768 validation-aucpr:0.97263
[9] validation-logloss:0.48678 validation-auc:0.96755 validation-aucpr:0.97256
[10] validation-logloss:0.47250 validation-auc:0.96763 validation-aucpr:0.97276
[11] validation-logloss:0.46030 validation-auc:0.96731 validation-aucpr:0.97255
[12] validation-logloss:0.44766 validation-auc:0.96730 validation-aucpr:0.97251
[13] validation-logloss:0.43514 validation-auc:0.96773 validation-aucpr:0.97284
[14] validation-logloss:0.42346 validation-auc:0.96800 validation-aucpr:0.97303
[15] validation-logloss:0.41215 validation-auc:0.96844 validation-aucpr:0.97340
[16] validation-logloss:0.40150 validation-auc:0.96880 validation-aucpr:0.97367
[17] validation-logloss:0.39161 validation-auc:0.96903 validation-aucpr:0.97386
[18] validation-logloss:0.38340 validation-auc:0.96897 validation-aucpr:0.97382
[19] validation-logloss:0.37540 validation-auc:0.96910 validation-aucpr:0.97399
[20] validation-logloss:0.36845 validation-auc:0.96888 validation-aucpr:0.97382
[21] validation-logloss:0.36019 validation-auc:0.96925 validation-aucpr:0.97409
[22] validation-logloss:0.35359 validation-auc:0.96910 validation-aucpr:0.97396
[23] validation-logloss:0.34598 validation-auc:0.96935 validation-aucpr:0.97417
[24] validation-logloss:0.33907 validation-auc:0.96952 validation-aucpr:0.97438
[25] validation-logloss:0.33250 validation-auc:0.96963 validation-aucpr:0.97445
[26] validation-logloss:0.32712 validation-auc:0.96967 validation-aucpr:0.97447
[27] validation-logloss:0.32108 validation-auc:0.96973 validation-aucpr:0.97452
[28] validation-logloss:0.31627 validation-auc:0.96951 validation-aucpr:0.97436
[29] validation-logloss:0.31081 validation-auc:0.96986 validation-aucpr:0.97459
[30] validation-logloss:0.30565 validation-auc:0.96996 validation-aucpr:0.97467
[31] validation-logloss:0.30067 validation-auc:0.97002 validation-aucpr:0.97474
[32] validation-logloss:0.29584 validation-auc:0.97007 validation-aucpr:0.97477
[33] validation-logloss:0.29106 validation-auc:0.97024 validation-aucpr:0.97487
[34] validation-logloss:0.28775 validation-auc:0.96989 validation-aucpr:0.97461
[35] validation-logloss:0.28349 validation-auc:0.97003 validation-aucpr:0.97470
[36] validation-logloss:0.27938 validation-auc:0.97011 validation-aucpr:0.97477
[37] validation-logloss:0.27568 validation-auc:0.97010 validation-aucpr:0.97476
[38] validation-logloss:0.27219 validation-auc:0.97015 validation-aucpr:0.97480
[39] validation-logloss:0.26937 validation-auc:0.97009 validation-aucpr:0.97477
[40] validation-logloss:0.26583 validation-auc:0.97023 validation-aucpr:0.97487
[41] validation-logloss:0.26266 validation-auc:0.97018 validation-aucpr:0.97487
[42] validation-logloss:0.25958 validation-auc:0.97014 validation-aucpr:0.97486
[43] validation-logloss:0.25642 validation-auc:0.97016 validation-aucpr:0.97489
[44] validation-logloss:0.25359 validation-auc:0.97024 validation-aucpr:0.97498
[45] validation-logloss:0.25084 validation-auc:0.97029 validation-aucpr:0.97499
[46] validation-logloss:0.24858 validation-auc:0.97019 validation-aucpr:0.97492
[47] validation-logloss:0.24659 validation-auc:0.97017 validation-aucpr:0.97491
[48] validation-logloss:0.24419 validation-auc:0.97017 validation-aucpr:0.97492
[49] validation-logloss:0.24212 validation-auc:0.97021 validation-aucpr:0.97496
[50] validation-logloss:0.23987 validation-auc:0.97028 validation-aucpr:0.97501
[51] validation-logloss:0.23833 validation-auc:0.97017 validation-aucpr:0.97493
[52] validation-logloss:0.23640 validation-auc:0.97019 validation-aucpr:0.97494
[53] validation-logloss:0.23447 validation-auc:0.97020 validation-aucpr:0.97495
[54] validation-logloss:0.23278 validation-auc:0.97015 validation-aucpr:0.97492
[55] validation-logloss:0.23094 validation-auc:0.97017 validation-aucpr:0.97493
[56] validation-logloss:0.22944 validation-auc:0.97011 validation-aucpr:0.97489
[57] validation-logloss:0.22765 validation-auc:0.97012 validation-aucpr:0.97491
[58] validation-logloss:0.22628 validation-auc:0.97004 validation-aucpr:0.97485
[59] validation-logloss:0.22503 validation-auc:0.97001 validation-aucpr:0.97483
[60] validation-logloss:0.22346 validation-auc:0.97004 validation-aucpr:0.97485
[61] validation-logloss:0.22208 validation-auc:0.97001 validation-aucpr:0.97486
[62] validation-logloss:0.22068 validation-auc:0.97004 validation-aucpr:0.97488
[63] validation-logloss:0.21967 validation-auc:0.96992 validation-aucpr:0.97480
[64] validation-logloss:0.21854 validation-auc:0.96987 validation-aucpr:0.97476
[65] validation-logloss:0.21752 validation-auc:0.96985 validation-aucpr:0.97473
[66] validation-logloss:0.21659 validation-auc:0.96988 validation-aucpr:0.97475
[67] validation-logloss:0.21580 validation-auc:0.96992 validation-aucpr:0.97476
[68] validation-logloss:0.21492 validation-auc:0.96986 validation-aucpr:0.97472
[69] validation-logloss:0.21392 validation-auc:0.96990 validation-aucpr:0.97474
[70] validation-logloss:0.21309 validation-auc:0.96986 validation-aucpr:0.97472
[71] validation-logloss:0.21224 validation-auc:0.96992 validation-aucpr:0.97477
[72] validation-logloss:0.21161 validation-auc:0.96980 validation-aucpr:0.97468
[73] validation-logloss:0.21064 validation-auc:0.96989 validation-aucpr:0.97475
[74] validation-logloss:0.20979 validation-auc:0.96997 validation-aucpr:0.97479
[75] validation-logloss:0.20932 validation-auc:0.96992 validation-aucpr:0.97476
[76] validation-logloss:0.20891 validation-auc:0.96984 validation-aucpr:0.97470
[77] validation-logloss:0.20819 validation-auc:0.96990 validation-aucpr:0.97474
[78] validation-logloss:0.20738 validation-auc:0.96997 validation-aucpr:0.97480
[79] validation-logloss:0.20662 validation-auc:0.97002 validation-aucpr:0.97485
[80] validation-logloss:0.20605 validation-auc:0.97004 validation-aucpr:0.97484
[81] validation-logloss:0.20540 validation-auc:0.97007 validation-aucpr:0.97487
[82] validation-logloss:0.20480 validation-auc:0.97011 validation-aucpr:0.97489
[83] validation-logloss:0.20433 validation-auc:0.97010 validation-aucpr:0.97488
{'best_iteration': '50', 'best_score': '0.9750077463971537'}
Trial 85, Fold 4: Log loss = 0.20432554100239475, Average precision = 0.9748823764951907, ROC-AUC = 0.9700994269071104, Elapsed Time = 6.552320899998449 seconds
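Note that `best_iteration` can lag well behind the last logged round: in Trial 85, Fold 4 the validation aucpr peaked at round 50 (0.97501) even though training continued to round 83. The bookkeeping amounts to an argmax over the per-round metric, sketched here on a few values taken from the log:

```python
def best_round(aucpr_by_round):
    """Round with the highest validation aucpr, mirroring how early
    stopping selects best_iteration rather than the final round."""
    best = max(aucpr_by_round, key=aucpr_by_round.get)
    return best, aucpr_by_round[best]

# Selected rounds from the Trial 85, Fold 4 log:
print(best_round({50: 0.97501, 82: 0.97489, 83: 0.97488}))  # (50, 0.97501)
```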
Trial 85, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 85, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.66500 validation-auc:0.95722 validation-aucpr:0.95513
[1] validation-logloss:0.63950 validation-auc:0.95967 validation-aucpr:0.96068
[2] validation-logloss:0.61596 validation-auc:0.96211 validation-aucpr:0.96738
[3] validation-logloss:0.59631 validation-auc:0.96338 validation-aucpr:0.96852
[4] validation-logloss:0.57777 validation-auc:0.96342 validation-aucpr:0.96844
[5] validation-logloss:0.55803 validation-auc:0.96414 validation-aucpr:0.96916
[6] validation-logloss:0.54156 validation-auc:0.96437 validation-aucpr:0.96915
[7] validation-logloss:0.52352 validation-auc:0.96608 validation-aucpr:0.97071
[8] validation-logloss:0.50701 validation-auc:0.96609 validation-aucpr:0.97082
[9] validation-logloss:0.49140 validation-auc:0.96686 validation-aucpr:0.97133
[10] validation-logloss:0.47655 validation-auc:0.96729 validation-aucpr:0.97172
[11] validation-logloss:0.46279 validation-auc:0.96773 validation-aucpr:0.97207
[12] validation-logloss:0.44967 validation-auc:0.96819 validation-aucpr:0.97227
[13] validation-logloss:0.43699 validation-auc:0.96855 validation-aucpr:0.97264
[14] validation-logloss:0.42547 validation-auc:0.96880 validation-aucpr:0.97293
[15] validation-logloss:0.41474 validation-auc:0.96909 validation-aucpr:0.97325
[16] validation-logloss:0.40442 validation-auc:0.96941 validation-aucpr:0.97345
[17] validation-logloss:0.39472 validation-auc:0.96950 validation-aucpr:0.97357
[18] validation-logloss:0.38528 validation-auc:0.96980 validation-aucpr:0.97387
[19] validation-logloss:0.37643 validation-auc:0.96977 validation-aucpr:0.97384
[20] validation-logloss:0.36924 validation-auc:0.96958 validation-aucpr:0.97346
[21] validation-logloss:0.36220 validation-auc:0.96948 validation-aucpr:0.97336
[22] validation-logloss:0.35551 validation-auc:0.96956 validation-aucpr:0.97347
[23] validation-logloss:0.34865 validation-auc:0.96950 validation-aucpr:0.97345
[24] validation-logloss:0.34173 validation-auc:0.96962 validation-aucpr:0.97354
[25] validation-logloss:0.33500 validation-auc:0.96971 validation-aucpr:0.97362
[26] validation-logloss:0.32866 validation-auc:0.96982 validation-aucpr:0.97370
[27] validation-logloss:0.32326 validation-auc:0.96987 validation-aucpr:0.97371
[28] validation-logloss:0.31759 validation-auc:0.96991 validation-aucpr:0.97378
[29] validation-logloss:0.31331 validation-auc:0.96965 validation-aucpr:0.97383
[30] validation-logloss:0.30814 validation-auc:0.96975 validation-aucpr:0.97390
[31] validation-logloss:0.30307 validation-auc:0.96980 validation-aucpr:0.97394
[32] validation-logloss:0.29867 validation-auc:0.96975 validation-aucpr:0.97391
[33] validation-logloss:0.29406 validation-auc:0.96978 validation-aucpr:0.97393
[34] validation-logloss:0.28974 validation-auc:0.96969 validation-aucpr:0.97391
[35] validation-logloss:0.28623 validation-auc:0.96958 validation-aucpr:0.97383
[36] validation-logloss:0.28243 validation-auc:0.96954 validation-aucpr:0.97382
[37] validation-logloss:0.27896 validation-auc:0.96963 validation-aucpr:0.97387
[38] validation-logloss:0.27601 validation-auc:0.96959 validation-aucpr:0.97386
[39] validation-logloss:0.27224 validation-auc:0.96974 validation-aucpr:0.97398
[40] validation-logloss:0.26951 validation-auc:0.96946 validation-aucpr:0.97378
[41] validation-logloss:0.26624 validation-auc:0.96957 validation-aucpr:0.97388
[42] validation-logloss:0.26330 validation-auc:0.96954 validation-aucpr:0.97383
[43] validation-logloss:0.26032 validation-auc:0.96954 validation-aucpr:0.97383
[44] validation-logloss:0.25768 validation-auc:0.96952 validation-aucpr:0.97380
[45] validation-logloss:0.25477 validation-auc:0.96986 validation-aucpr:0.97397
[46] validation-logloss:0.25230 validation-auc:0.96988 validation-aucpr:0.97399
[47] validation-logloss:0.24977 validation-auc:0.96996 validation-aucpr:0.97404
[48] validation-logloss:0.24706 validation-auc:0.97020 validation-aucpr:0.97424
[49] validation-logloss:0.24498 validation-auc:0.97021 validation-aucpr:0.97423
[50] validation-logloss:0.24267 validation-auc:0.97024 validation-aucpr:0.97427
[51] validation-logloss:0.24086 validation-auc:0.97017 validation-aucpr:0.97419
[52] validation-logloss:0.23876 validation-auc:0.97035 validation-aucpr:0.97434
[53] validation-logloss:0.23742 validation-auc:0.97024 validation-aucpr:0.97426
[54] validation-logloss:0.23543 validation-auc:0.97032 validation-aucpr:0.97431
[55] validation-logloss:0.23389 validation-auc:0.97039 validation-aucpr:0.97437
[56] validation-logloss:0.23212 validation-auc:0.97045 validation-aucpr:0.97445
[57] validation-logloss:0.23032 validation-auc:0.97051 validation-aucpr:0.97439
[58] validation-logloss:0.22839 validation-auc:0.97086 validation-aucpr:0.97465
[59] validation-logloss:0.22725 validation-auc:0.97071 validation-aucpr:0.97454
[60] validation-logloss:0.22578 validation-auc:0.97072 validation-aucpr:0.97453
[61] validation-logloss:0.22435 validation-auc:0.97079 validation-aucpr:0.97460
[62] validation-logloss:0.22316 validation-auc:0.97083 validation-aucpr:0.97454
[63] validation-logloss:0.22186 validation-auc:0.97087 validation-aucpr:0.97454
[64] validation-logloss:0.22047 validation-auc:0.97094 validation-aucpr:0.97460
[65] validation-logloss:0.21933 validation-auc:0.97091 validation-aucpr:0.97455
[66] validation-logloss:0.21814 validation-auc:0.97107 validation-aucpr:0.97467
[67] validation-logloss:0.21723 validation-auc:0.97104 validation-aucpr:0.97465
[68] validation-logloss:0.21630 validation-auc:0.97106 validation-aucpr:0.97463
[69] validation-logloss:0.21524 validation-auc:0.97107 validation-aucpr:0.97464
[70] validation-logloss:0.21436 validation-auc:0.97103 validation-aucpr:0.97454
[71] validation-logloss:0.21352 validation-auc:0.97095 validation-aucpr:0.97447
[72] validation-logloss:0.21269 validation-auc:0.97091 validation-aucpr:0.97445
[73] validation-logloss:0.21162 validation-auc:0.97101 validation-aucpr:0.97454
[74] validation-logloss:0.21098 validation-auc:0.97099 validation-aucpr:0.97451
[75] validation-logloss:0.21021 validation-auc:0.97099 validation-aucpr:0.97441
[76] validation-logloss:0.20960 validation-auc:0.97096 validation-aucpr:0.97448
[77] validation-logloss:0.20921 validation-auc:0.97090 validation-aucpr:0.97435
[78] validation-logloss:0.20891 validation-auc:0.97073 validation-aucpr:0.97421
[79] validation-logloss:0.20821 validation-auc:0.97078 validation-aucpr:0.97422
[80] validation-logloss:0.20735 validation-auc:0.97097 validation-aucpr:0.97440
[81] validation-logloss:0.20709 validation-auc:0.97078 validation-aucpr:0.97424
[82] validation-logloss:0.20644 validation-auc:0.97083 validation-aucpr:0.97434
[83] validation-logloss:0.20587 validation-auc:0.97086 validation-aucpr:0.97437
{'best_iteration': '66', 'best_score': '0.9746692647212228'}
Trial 85, Fold 5: Log loss = 0.20586828511529295, Average precision = 0.9743731212573705, ROC-AUC = 0.9708630882708136, Elapsed Time = 6.65404910000143 seconds
Optimization Progress: 86%|########6 | 86/100 [3:45:22<14:14, 61.02s/it]
Trial 86, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 86, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.68238 validation-auc:0.95468 validation-aucpr:0.94200
[1] validation-logloss:0.67214 validation-auc:0.96377 validation-aucpr:0.96689
[2] validation-logloss:0.66215 validation-auc:0.96635 validation-aucpr:0.97169
[3] validation-logloss:0.65219 validation-auc:0.96699 validation-aucpr:0.97231
[4] validation-logloss:0.64346 validation-auc:0.96786 validation-aucpr:0.97295
[5] validation-logloss:0.63442 validation-auc:0.96695 validation-aucpr:0.97228
[6] validation-logloss:0.62549 validation-auc:0.96680 validation-aucpr:0.97217
[7] validation-logloss:0.61751 validation-auc:0.96697 validation-aucpr:0.97218
[8] validation-logloss:0.60864 validation-auc:0.96780 validation-aucpr:0.97290
[9] validation-logloss:0.60004 validation-auc:0.96856 validation-aucpr:0.97349
[10] validation-logloss:0.59168 validation-auc:0.96890 validation-aucpr:0.97374
[11] validation-logloss:0.58369 validation-auc:0.96919 validation-aucpr:0.97400
[12] validation-logloss:0.57608 validation-auc:0.96917 validation-aucpr:0.97423
[13] validation-logloss:0.56913 validation-auc:0.96895 validation-aucpr:0.97402
[14] validation-logloss:0.56172 validation-auc:0.96891 validation-aucpr:0.97399
[15] validation-logloss:0.55456 validation-auc:0.96899 validation-aucpr:0.97398
[16] validation-logloss:0.54756 validation-auc:0.96892 validation-aucpr:0.97398
[17] validation-logloss:0.54063 validation-auc:0.96912 validation-aucpr:0.97407
[18] validation-logloss:0.53414 validation-auc:0.96895 validation-aucpr:0.97176
[19] validation-logloss:0.52825 validation-auc:0.96885 validation-aucpr:0.97163
[20] validation-logloss:0.52179 validation-auc:0.96891 validation-aucpr:0.97168
[21] validation-logloss:0.51549 validation-auc:0.96915 validation-aucpr:0.97184
[22] validation-logloss:0.50928 validation-auc:0.96928 validation-aucpr:0.97193
[23] validation-logloss:0.50358 validation-auc:0.96926 validation-aucpr:0.97231
[24] validation-logloss:0.49826 validation-auc:0.96936 validation-aucpr:0.97236
[25] validation-logloss:0.49247 validation-auc:0.96961 validation-aucpr:0.97448
[26] validation-logloss:0.48699 validation-auc:0.96962 validation-aucpr:0.97449
[27] validation-logloss:0.48139 validation-auc:0.96967 validation-aucpr:0.97456
[28] validation-logloss:0.47650 validation-auc:0.96960 validation-aucpr:0.97447
[29] validation-logloss:0.47115 validation-auc:0.96964 validation-aucpr:0.97449
[30] validation-logloss:0.46648 validation-auc:0.96957 validation-aucpr:0.97443
[31] validation-logloss:0.46157 validation-auc:0.96956 validation-aucpr:0.97443
[32] validation-logloss:0.45649 validation-auc:0.96960 validation-aucpr:0.97448
[33] validation-logloss:0.45168 validation-auc:0.96965 validation-aucpr:0.97454
[34] validation-logloss:0.44754 validation-auc:0.96966 validation-aucpr:0.97450
[35] validation-logloss:0.44278 validation-auc:0.96976 validation-aucpr:0.97456
[36] validation-logloss:0.43811 validation-auc:0.96976 validation-aucpr:0.97458
[37] validation-logloss:0.43363 validation-auc:0.96983 validation-aucpr:0.97462
[38] validation-logloss:0.42915 validation-auc:0.96996 validation-aucpr:0.97471
[39] validation-logloss:0.42494 validation-auc:0.96998 validation-aucpr:0.97472
[40] validation-logloss:0.42107 validation-auc:0.97002 validation-aucpr:0.97475
[41] validation-logloss:0.41738 validation-auc:0.96998 validation-aucpr:0.97470
[42] validation-logloss:0.41391 validation-auc:0.96993 validation-aucpr:0.97463
[43] validation-logloss:0.40992 validation-auc:0.97006 validation-aucpr:0.97473
[44] validation-logloss:0.40655 validation-auc:0.96998 validation-aucpr:0.97467
[45] validation-logloss:0.40311 validation-auc:0.96998 validation-aucpr:0.97464
[46] validation-logloss:0.39980 validation-auc:0.97003 validation-aucpr:0.97464
[47] validation-logloss:0.39655 validation-auc:0.96998 validation-aucpr:0.97460
[48] validation-logloss:0.39290 validation-auc:0.96992 validation-aucpr:0.97456
{'best_iteration': '40', 'best_score': '0.9747540236047042'}
Trial 86, Fold 1: Log loss = 0.39290141609750506, Average precision = 0.9745635372166829, ROC-AUC = 0.9699236878832633, Elapsed Time = 56.130409400000644 seconds
Trial 86, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 86, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.68258 validation-auc:0.95691 validation-aucpr:0.95557
[1] validation-logloss:0.67317 validation-auc:0.96309 validation-aucpr:0.96389
[2] validation-logloss:0.66402 validation-auc:0.96405 validation-aucpr:0.96728
[3] validation-logloss:0.65427 validation-auc:0.96643 validation-aucpr:0.96952
[4] validation-logloss:0.64592 validation-auc:0.96592 validation-aucpr:0.96893
[5] validation-logloss:0.63759 validation-auc:0.96605 validation-aucpr:0.96898
[6] validation-logloss:0.62865 validation-auc:0.96662 validation-aucpr:0.96968
[7] validation-logloss:0.61963 validation-auc:0.96732 validation-aucpr:0.97036
[8] validation-logloss:0.61080 validation-auc:0.96795 validation-aucpr:0.97096
[9] validation-logloss:0.60216 validation-auc:0.96867 validation-aucpr:0.97164
[10] validation-logloss:0.59405 validation-auc:0.96902 validation-aucpr:0.97189
[11] validation-logloss:0.58687 validation-auc:0.96899 validation-aucpr:0.97202
[12] validation-logloss:0.57899 validation-auc:0.96915 validation-aucpr:0.97222
[13] validation-logloss:0.57124 validation-auc:0.96935 validation-aucpr:0.97242
[14] validation-logloss:0.56392 validation-auc:0.96961 validation-aucpr:0.97262
[15] validation-logloss:0.55725 validation-auc:0.96971 validation-aucpr:0.97268
[16] validation-logloss:0.54994 validation-auc:0.97002 validation-aucpr:0.97298
[17] validation-logloss:0.54368 validation-auc:0.96982 validation-aucpr:0.97271
[18] validation-logloss:0.53699 validation-auc:0.96978 validation-aucpr:0.97272
[19] validation-logloss:0.53080 validation-auc:0.96971 validation-aucpr:0.97264
[20] validation-logloss:0.52509 validation-auc:0.96961 validation-aucpr:0.97240
[21] validation-logloss:0.51955 validation-auc:0.96922 validation-aucpr:0.97204
[22] validation-logloss:0.51404 validation-auc:0.96914 validation-aucpr:0.97196
[23] validation-logloss:0.50846 validation-auc:0.96918 validation-aucpr:0.97199
[24] validation-logloss:0.50314 validation-auc:0.96914 validation-aucpr:0.97199
[25] validation-logloss:0.49712 validation-auc:0.96922 validation-aucpr:0.97208
[26] validation-logloss:0.49138 validation-auc:0.96926 validation-aucpr:0.97209
[27] validation-logloss:0.48618 validation-auc:0.96932 validation-aucpr:0.97212
[28] validation-logloss:0.48078 validation-auc:0.96951 validation-aucpr:0.97226
[29] validation-logloss:0.47600 validation-auc:0.96951 validation-aucpr:0.97225
[30] validation-logloss:0.47142 validation-auc:0.96946 validation-aucpr:0.97222
[31] validation-logloss:0.46679 validation-auc:0.96945 validation-aucpr:0.97217
[32] validation-logloss:0.46153 validation-auc:0.96961 validation-aucpr:0.97233
[33] validation-logloss:0.45652 validation-auc:0.96964 validation-aucpr:0.97235
[34] validation-logloss:0.45157 validation-auc:0.96976 validation-aucpr:0.97247
[35] validation-logloss:0.44679 validation-auc:0.96985 validation-aucpr:0.97254
[36] validation-logloss:0.44212 validation-auc:0.97000 validation-aucpr:0.97263
[37] validation-logloss:0.43752 validation-auc:0.97014 validation-aucpr:0.97275
[38] validation-logloss:0.43302 validation-auc:0.97031 validation-aucpr:0.97286
[39] validation-logloss:0.42914 validation-auc:0.97028 validation-aucpr:0.97286
[40] validation-logloss:0.42530 validation-auc:0.97023 validation-aucpr:0.97282
[41] validation-logloss:0.42159 validation-auc:0.97020 validation-aucpr:0.97280
[42] validation-logloss:0.41725 validation-auc:0.97033 validation-aucpr:0.97287
[43] validation-logloss:0.41332 validation-auc:0.97033 validation-aucpr:0.97296
[44] validation-logloss:0.40983 validation-auc:0.97028 validation-aucpr:0.97292
[45] validation-logloss:0.40579 validation-auc:0.97053 validation-aucpr:0.97311
[46] validation-logloss:0.40189 validation-auc:0.97056 validation-aucpr:0.97315
[47] validation-logloss:0.39825 validation-auc:0.97062 validation-aucpr:0.97320
[48] validation-logloss:0.39500 validation-auc:0.97064 validation-aucpr:0.97320
{'best_iteration': '48', 'best_score': '0.9731980560776368'}
Trial 86, Fold 2: Log loss = 0.39500035361280744, Average precision = 0.9732003530395197, ROC-AUC = 0.9706408248281229, Elapsed Time = 56.37032370000088 seconds
Trial 86, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 86, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.68239 validation-auc:0.94580 validation-aucpr:0.92497
[1] validation-logloss:0.67194 validation-auc:0.96398 validation-aucpr:0.95526
[2] validation-logloss:0.66194 validation-auc:0.96750 validation-aucpr:0.96994
[3] validation-logloss:0.65197 validation-auc:0.96821 validation-aucpr:0.97268
[4] validation-logloss:0.64325 validation-auc:0.96979 validation-aucpr:0.97371
[5] validation-logloss:0.63491 validation-auc:0.96951 validation-aucpr:0.97344
[6] validation-logloss:0.62639 validation-auc:0.96986 validation-aucpr:0.97396
[7] validation-logloss:0.61829 validation-auc:0.96972 validation-aucpr:0.97367
[8] validation-logloss:0.60951 validation-auc:0.96980 validation-aucpr:0.97394
[9] validation-logloss:0.60181 validation-auc:0.96954 validation-aucpr:0.97371
[10] validation-logloss:0.59435 validation-auc:0.96917 validation-aucpr:0.97337
[11] validation-logloss:0.58726 validation-auc:0.96914 validation-aucpr:0.97327
[12] validation-logloss:0.57920 validation-auc:0.96940 validation-aucpr:0.97358
[13] validation-logloss:0.57214 validation-auc:0.96961 validation-aucpr:0.97369
[14] validation-logloss:0.56552 validation-auc:0.96965 validation-aucpr:0.97384
[15] validation-logloss:0.55889 validation-auc:0.96961 validation-aucpr:0.97383
[16] validation-logloss:0.55168 validation-auc:0.96976 validation-aucpr:0.97400
[17] validation-logloss:0.54465 validation-auc:0.96999 validation-aucpr:0.97431
[18] validation-logloss:0.53780 validation-auc:0.97008 validation-aucpr:0.97439
[19] validation-logloss:0.53178 validation-auc:0.96996 validation-aucpr:0.97429
[20] validation-logloss:0.52609 validation-auc:0.96980 validation-aucpr:0.97413
[21] validation-logloss:0.51956 validation-auc:0.96996 validation-aucpr:0.97429
[22] validation-logloss:0.51331 validation-auc:0.96994 validation-aucpr:0.97428
[23] validation-logloss:0.50723 validation-auc:0.97010 validation-aucpr:0.97440
[24] validation-logloss:0.50165 validation-auc:0.97034 validation-aucpr:0.97457
[25] validation-logloss:0.49566 validation-auc:0.97049 validation-aucpr:0.97470
[26] validation-logloss:0.48998 validation-auc:0.97065 validation-aucpr:0.97483
[27] validation-logloss:0.48432 validation-auc:0.97070 validation-aucpr:0.97487
[28] validation-logloss:0.47936 validation-auc:0.97068 validation-aucpr:0.97482
[29] validation-logloss:0.47379 validation-auc:0.97094 validation-aucpr:0.97504
[30] validation-logloss:0.46916 validation-auc:0.97097 validation-aucpr:0.97502
[31] validation-logloss:0.46389 validation-auc:0.97105 validation-aucpr:0.97512
[32] validation-logloss:0.45932 validation-auc:0.97111 validation-aucpr:0.97513
[33] validation-logloss:0.45428 validation-auc:0.97124 validation-aucpr:0.97525
[34] validation-logloss:0.45006 validation-auc:0.97123 validation-aucpr:0.97524
[35] validation-logloss:0.44600 validation-auc:0.97121 validation-aucpr:0.97523
[36] validation-logloss:0.44156 validation-auc:0.97124 validation-aucpr:0.97524
[37] validation-logloss:0.43693 validation-auc:0.97129 validation-aucpr:0.97532
[38] validation-logloss:0.43235 validation-auc:0.97132 validation-aucpr:0.97536
[39] validation-logloss:0.42781 validation-auc:0.97139 validation-aucpr:0.97544
[40] validation-logloss:0.42341 validation-auc:0.97145 validation-aucpr:0.97548
[41] validation-logloss:0.41932 validation-auc:0.97140 validation-aucpr:0.97547
[42] validation-logloss:0.41559 validation-auc:0.97140 validation-aucpr:0.97546
[43] validation-logloss:0.41204 validation-auc:0.97130 validation-aucpr:0.97536
[44] validation-logloss:0.40842 validation-auc:0.97130 validation-aucpr:0.97538
[45] validation-logloss:0.40508 validation-auc:0.97120 validation-aucpr:0.97528
[46] validation-logloss:0.40181 validation-auc:0.97112 validation-aucpr:0.97520
[47] validation-logloss:0.39836 validation-auc:0.97122 validation-aucpr:0.97527
[48] validation-logloss:0.39475 validation-auc:0.97128 validation-aucpr:0.97533
{'best_iteration': '40', 'best_score': '0.9754833863268657'}
Trial 86, Fold 3: Log loss = 0.3947504659735634, Average precision = 0.9753386299852591, ROC-AUC = 0.971284316581753, Elapsed Time = 52.0949412000009 seconds
Trial 86, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 86, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.68240 validation-auc:0.95348 validation-aucpr:0.94323
[1] validation-logloss:0.67235 validation-auc:0.96156 validation-aucpr:0.96249
[2] validation-logloss:0.66228 validation-auc:0.96536 validation-aucpr:0.97044
[3] validation-logloss:0.65339 validation-auc:0.96512 validation-aucpr:0.97043
[4] validation-logloss:0.64514 validation-auc:0.96469 validation-aucpr:0.97004
[5] validation-logloss:0.63685 validation-auc:0.96557 validation-aucpr:0.97062
[6] validation-logloss:0.62772 validation-auc:0.96670 validation-aucpr:0.97179
[7] validation-logloss:0.61874 validation-auc:0.96722 validation-aucpr:0.97235
[8] validation-logloss:0.61037 validation-auc:0.96784 validation-aucpr:0.97279
[9] validation-logloss:0.60194 validation-auc:0.96816 validation-aucpr:0.97308
[10] validation-logloss:0.59368 validation-auc:0.96834 validation-aucpr:0.97323
[11] validation-logloss:0.58573 validation-auc:0.96869 validation-aucpr:0.97350
[12] validation-logloss:0.57783 validation-auc:0.96891 validation-aucpr:0.97367
[13] validation-logloss:0.57040 validation-auc:0.96904 validation-aucpr:0.97376
[14] validation-logloss:0.56286 validation-auc:0.96928 validation-aucpr:0.97400
[15] validation-logloss:0.55632 validation-auc:0.96924 validation-aucpr:0.97398
[16] validation-logloss:0.55019 validation-auc:0.96889 validation-aucpr:0.97367
[17] validation-logloss:0.54361 validation-auc:0.96891 validation-aucpr:0.97369
[18] validation-logloss:0.53745 validation-auc:0.96902 validation-aucpr:0.97387
[19] validation-logloss:0.53066 validation-auc:0.96924 validation-aucpr:0.97406
[20] validation-logloss:0.52420 validation-auc:0.96929 validation-aucpr:0.97413
[21] validation-logloss:0.51790 validation-auc:0.96928 validation-aucpr:0.97416
[22] validation-logloss:0.51187 validation-auc:0.96939 validation-aucpr:0.97427
[23] validation-logloss:0.50574 validation-auc:0.96949 validation-aucpr:0.97442
[24] validation-logloss:0.50050 validation-auc:0.96932 validation-aucpr:0.97426
[25] validation-logloss:0.49460 validation-auc:0.96944 validation-aucpr:0.97437
[26] validation-logloss:0.48879 validation-auc:0.96960 validation-aucpr:0.97449
[27] validation-logloss:0.48385 validation-auc:0.96961 validation-aucpr:0.97447
[28] validation-logloss:0.47907 validation-auc:0.96955 validation-aucpr:0.97440
[29] validation-logloss:0.47425 validation-auc:0.96971 validation-aucpr:0.97451
[30] validation-logloss:0.46976 validation-auc:0.96958 validation-aucpr:0.97440
[31] validation-logloss:0.46513 validation-auc:0.96951 validation-aucpr:0.97433
[32] validation-logloss:0.46006 validation-auc:0.96958 validation-aucpr:0.97439
[33] validation-logloss:0.45500 validation-auc:0.96971 validation-aucpr:0.97451
[34] validation-logloss:0.45015 validation-auc:0.96976 validation-aucpr:0.97455
[35] validation-logloss:0.44534 validation-auc:0.96985 validation-aucpr:0.97460
[36] validation-logloss:0.44133 validation-auc:0.96966 validation-aucpr:0.97444
[37] validation-logloss:0.43667 validation-auc:0.96976 validation-aucpr:0.97452
[38] validation-logloss:0.43223 validation-auc:0.96968 validation-aucpr:0.97448
[39] validation-logloss:0.42835 validation-auc:0.96964 validation-aucpr:0.97443
[40] validation-logloss:0.42448 validation-auc:0.96964 validation-aucpr:0.97443
[41] validation-logloss:0.42085 validation-auc:0.96959 validation-aucpr:0.97438
[42] validation-logloss:0.41670 validation-auc:0.96966 validation-aucpr:0.97445
[43] validation-logloss:0.41249 validation-auc:0.96981 validation-aucpr:0.97457
[44] validation-logloss:0.40898 validation-auc:0.96985 validation-aucpr:0.97459
[45] validation-logloss:0.40564 validation-auc:0.96973 validation-aucpr:0.97449
[46] validation-logloss:0.40173 validation-auc:0.96991 validation-aucpr:0.97467
[47] validation-logloss:0.39821 validation-auc:0.96996 validation-aucpr:0.97471
[48] validation-logloss:0.39452 validation-auc:0.97001 validation-aucpr:0.97476
{'best_iteration': '48', 'best_score': '0.9747625947591851'}
Trial 86, Fold 4: Log loss = 0.39452383486087683, Average precision = 0.9747594994211715, ROC-AUC = 0.9700139607192333, Elapsed Time = 50.71849939999811 seconds
Trial 86, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 86, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.68288 validation-auc:0.94632 validation-aucpr:0.93265
[1] validation-logloss:0.67243 validation-auc:0.96482 validation-aucpr:0.96976
[2] validation-logloss:0.66338 validation-auc:0.96409 validation-aucpr:0.96862
[3] validation-logloss:0.65344 validation-auc:0.96560 validation-aucpr:0.97002
[4] validation-logloss:0.64390 validation-auc:0.96642 validation-aucpr:0.97075
[5] validation-logloss:0.63465 validation-auc:0.96676 validation-aucpr:0.97093
[6] validation-logloss:0.62646 validation-auc:0.96666 validation-aucpr:0.97083
[7] validation-logloss:0.61769 validation-auc:0.96701 validation-aucpr:0.97117
[8] validation-logloss:0.60998 validation-auc:0.96640 validation-aucpr:0.97056
[9] validation-logloss:0.60208 validation-auc:0.96633 validation-aucpr:0.97050
[10] validation-logloss:0.59372 validation-auc:0.96698 validation-aucpr:0.97104
[11] validation-logloss:0.58569 validation-auc:0.96741 validation-aucpr:0.97134
[12] validation-logloss:0.57868 validation-auc:0.96760 validation-aucpr:0.97159
[13] validation-logloss:0.57115 validation-auc:0.96781 validation-aucpr:0.97175
[14] validation-logloss:0.56439 validation-auc:0.96759 validation-aucpr:0.97158
[15] validation-logloss:0.55718 validation-auc:0.96769 validation-aucpr:0.97170
[16] validation-logloss:0.55008 validation-auc:0.96778 validation-aucpr:0.97180
[17] validation-logloss:0.54395 validation-auc:0.96767 validation-aucpr:0.97169
[18] validation-logloss:0.53751 validation-auc:0.96769 validation-aucpr:0.97175
[19] validation-logloss:0.53157 validation-auc:0.96755 validation-aucpr:0.97162
[20] validation-logloss:0.52516 validation-auc:0.96753 validation-aucpr:0.97162
[21] validation-logloss:0.51906 validation-auc:0.96770 validation-aucpr:0.97177
[22] validation-logloss:0.51274 validation-auc:0.96781 validation-aucpr:0.97185
[23] validation-logloss:0.50679 validation-auc:0.96797 validation-aucpr:0.97199
[24] validation-logloss:0.50167 validation-auc:0.96789 validation-aucpr:0.97192
[25] validation-logloss:0.49644 validation-auc:0.96792 validation-aucpr:0.97190
[26] validation-logloss:0.49094 validation-auc:0.96794 validation-aucpr:0.97192
[27] validation-logloss:0.48530 validation-auc:0.96807 validation-aucpr:0.97205
[28] validation-logloss:0.48056 validation-auc:0.96800 validation-aucpr:0.97198
[29] validation-logloss:0.47596 validation-auc:0.96781 validation-aucpr:0.97182
[30] validation-logloss:0.47056 validation-auc:0.96812 validation-aucpr:0.97207
[31] validation-logloss:0.46552 validation-auc:0.96821 validation-aucpr:0.97210
[32] validation-logloss:0.46083 validation-auc:0.96823 validation-aucpr:0.97211
[33] validation-logloss:0.45666 validation-auc:0.96815 validation-aucpr:0.97203
[34] validation-logloss:0.45176 validation-auc:0.96836 validation-aucpr:0.97221
[35] validation-logloss:0.44698 validation-auc:0.96857 validation-aucpr:0.97237
[36] validation-logloss:0.44255 validation-auc:0.96858 validation-aucpr:0.97236
[37] validation-logloss:0.43884 validation-auc:0.96836 validation-aucpr:0.97034
[38] validation-logloss:0.43492 validation-auc:0.96833 validation-aucpr:0.97058
[39] validation-logloss:0.43117 validation-auc:0.96823 validation-aucpr:0.97024
[40] validation-logloss:0.42692 validation-auc:0.96834 validation-aucpr:0.97058
[41] validation-logloss:0.42295 validation-auc:0.96839 validation-aucpr:0.97076
[42] validation-logloss:0.41936 validation-auc:0.96829 validation-aucpr:0.97066
[43] validation-logloss:0.41520 validation-auc:0.96848 validation-aucpr:0.97084
[44] validation-logloss:0.41170 validation-auc:0.96858 validation-aucpr:0.97090
[45] validation-logloss:0.40771 validation-auc:0.96867 validation-aucpr:0.97098
[46] validation-logloss:0.40385 validation-auc:0.96866 validation-aucpr:0.97099
[47] validation-logloss:0.40064 validation-auc:0.96860 validation-aucpr:0.97087
[48] validation-logloss:0.39730 validation-auc:0.96861 validation-aucpr:0.97091
{'best_iteration': '35', 'best_score': '0.9723703196133963'}
Trial 86, Fold 5: Log loss = 0.39729713898022145, Average precision = 0.9716444141942279, ROC-AUC = 0.9686050201586683, Elapsed Time = 53.13384550000046 seconds
Optimization Progress: 87%|########7 | 87/100 [3:49:59<27:13, 125.67s/it]
Trial 87, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 87, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.66841 validation-auc:0.91973 validation-aucpr:0.87848
[1] validation-logloss:0.64508 validation-auc:0.95583 validation-aucpr:0.94072
[2] validation-logloss:0.62419 validation-auc:0.96261 validation-aucpr:0.96576
[3] validation-logloss:0.60364 validation-auc:0.96458 validation-aucpr:0.96851
[4] validation-logloss:0.58423 validation-auc:0.96598 validation-aucpr:0.96908
[5] validation-logloss:0.56593 validation-auc:0.96685 validation-aucpr:0.96984
[6] validation-logloss:0.54867 validation-auc:0.96743 validation-aucpr:0.96912
[7] validation-logloss:0.53240 validation-auc:0.96778 validation-aucpr:0.96876
[8] validation-logloss:0.51698 validation-auc:0.96802 validation-aucpr:0.96898
[9] validation-logloss:0.50226 validation-auc:0.96867 validation-aucpr:0.97061
[10] validation-logloss:0.48878 validation-auc:0.96896 validation-aucpr:0.97042
[11] validation-logloss:0.47621 validation-auc:0.96886 validation-aucpr:0.97023
[12] validation-logloss:0.46388 validation-auc:0.96936 validation-aucpr:0.97278
[13] validation-logloss:0.45209 validation-auc:0.96940 validation-aucpr:0.97279
[14] validation-logloss:0.44108 validation-auc:0.96937 validation-aucpr:0.97398
[15] validation-logloss:0.43087 validation-auc:0.96972 validation-aucpr:0.97432
[16] validation-logloss:0.42079 validation-auc:0.96986 validation-aucpr:0.97436
[17] validation-logloss:0.41123 validation-auc:0.96982 validation-aucpr:0.97432
[18] validation-logloss:0.40217 validation-auc:0.96979 validation-aucpr:0.97222
[19] validation-logloss:0.39323 validation-auc:0.97004 validation-aucpr:0.97239
[20] validation-logloss:0.38504 validation-auc:0.97046 validation-aucpr:0.97496
[21] validation-logloss:0.37726 validation-auc:0.97047 validation-aucpr:0.97492
[22] validation-logloss:0.36956 validation-auc:0.97073 validation-aucpr:0.97514
[23] validation-logloss:0.36231 validation-auc:0.97065 validation-aucpr:0.97506
[24] validation-logloss:0.35534 validation-auc:0.97064 validation-aucpr:0.97504
[25] validation-logloss:0.34881 validation-auc:0.97083 validation-aucpr:0.97515
[26] validation-logloss:0.34228 validation-auc:0.97104 validation-aucpr:0.97534
[27] validation-logloss:0.33620 validation-auc:0.97112 validation-aucpr:0.97535
[28] validation-logloss:0.33030 validation-auc:0.97138 validation-aucpr:0.97564
[29] validation-logloss:0.32487 validation-auc:0.97133 validation-aucpr:0.97562
[30] validation-logloss:0.31951 validation-auc:0.97137 validation-aucpr:0.97564
[31] validation-logloss:0.31445 validation-auc:0.97145 validation-aucpr:0.97570
[32] validation-logloss:0.31010 validation-auc:0.97150 validation-aucpr:0.97576
[33] validation-logloss:0.30558 validation-auc:0.97137 validation-aucpr:0.97566
[34] validation-logloss:0.30094 validation-auc:0.97147 validation-aucpr:0.97574
[35] validation-logloss:0.29664 validation-auc:0.97157 validation-aucpr:0.97575
[36] validation-logloss:0.29248 validation-auc:0.97165 validation-aucpr:0.97581
[37] validation-logloss:0.28851 validation-auc:0.97172 validation-aucpr:0.97588
[38] validation-logloss:0.28479 validation-auc:0.97169 validation-aucpr:0.97586
[39] validation-logloss:0.28150 validation-auc:0.97174 validation-aucpr:0.97592
[40] validation-logloss:0.27796 validation-auc:0.97175 validation-aucpr:0.97592
[41] validation-logloss:0.27447 validation-auc:0.97187 validation-aucpr:0.97601
{'best_iteration': '41', 'best_score': '0.9760065105274797'}
Trial 87, Fold 1: Log loss = 0.2744740229206771, Average precision = 0.9760110559976684, ROC-AUC = 0.9718659934615219, Elapsed Time = 1.948451500000374 seconds
Trial 87, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 87, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.66807 validation-auc:0.92792 validation-aucpr:0.88731
... [iterations 1-40 truncated] ...
[41] validation-logloss:0.27316 validation-auc:0.97142 validation-aucpr:0.97465
{'best_iteration': '41', 'best_score': '0.9746504882779937'}
Trial 87, Fold 2: Log loss = 0.27315787486291515, Average precision = 0.97459003043263, ROC-AUC = 0.9714154972729405, Elapsed Time = 2.1508365999980015 seconds
Trial 87, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 87, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.66822 validation-auc:0.91816 validation-aucpr:0.87591
... [iterations 1-40 truncated] ...
[41] validation-logloss:0.27177 validation-auc:0.97190 validation-aucpr:0.97610
{'best_iteration': '40', 'best_score': '0.9761697992665844'}
Trial 87, Fold 3: Log loss = 0.27177006105110235, Average precision = 0.976105101470461, ROC-AUC = 0.9718986712783272, Elapsed Time = 2.32190780000019 seconds
Trial 87, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 87, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.66814 validation-auc:0.91552 validation-aucpr:0.88203
... [iterations 1-40 truncated] ...
[41] validation-logloss:0.27548 validation-auc:0.96993 validation-aucpr:0.97489
{'best_iteration': '39', 'best_score': '0.974910544350713'}
Trial 87, Fold 4: Log loss = 0.2754796669893886, Average precision = 0.9748833936863853, ROC-AUC = 0.9699267062240249, Elapsed Time = 2.1952992000005906 seconds
Trial 87, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 87, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.66835 validation-auc:0.91932 validation-aucpr:0.88248
... [iterations 1-40 truncated] ...
[41] validation-logloss:0.27934 validation-auc:0.97038 validation-aucpr:0.97296
{'best_iteration': '41', 'best_score': '0.9729616450654627'}
Trial 87, Fold 5: Log loss = 0.2793432269406746, Average precision = 0.9731349045573595, ROC-AUC = 0.9703788343273322, Elapsed Time = 2.298222099998384 seconds
Optimization Progress: 88%|########8 | 88/100 [3:50:18<18:44, 93.73s/it]
Trial 88, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 88, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.67879 validation-auc:0.94026 validation-aucpr:0.94310
... [iterations 1-45 truncated] ...
[46] validation-logloss:0.34134 validation-auc:0.96565 validation-aucpr:0.97138
{'best_iteration': '46', 'best_score': '0.9713768990895306'}
Trial 88, Fold 1: Log loss = 0.3413363609274223, Average precision = 0.9713801334846242, ROC-AUC = 0.9656482706251225, Elapsed Time = 1.1811795000030543 seconds
Trial 88, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 88, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.67841 validation-auc:0.94098 validation-aucpr:0.94155
... [iterations 1-45 truncated] ...
[46] validation-logloss:0.33617 validation-auc:0.96765 validation-aucpr:0.97138
{'best_iteration': '46', 'best_score': '0.9713759033775468'}
Trial 88, Fold 2: Log loss = 0.33617017571275093, Average precision = 0.9713869132696409, ROC-AUC = 0.967648434372573, Elapsed Time = 1.597771599997941 seconds
Trial 88, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 88, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.67819 validation-auc:0.94612 validation-aucpr:0.94908
... [iterations 1-45 truncated] ...
[46] validation-logloss:0.33934 validation-auc:0.96789 validation-aucpr:0.97254
{'best_iteration': '46', 'best_score': '0.9725354782972834'}
Trial 88, Fold 3: Log loss = 0.33934060727163606, Average precision = 0.9725391929748513, ROC-AUC = 0.9678902796371026, Elapsed Time = 1.4755301000004692 seconds
Trial 88, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 88, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.67847 validation-auc:0.93894 validation-aucpr:0.94389
... [iterations 1-45 truncated] ...
[46] validation-logloss:0.33759 validation-auc:0.96586 validation-aucpr:0.97163
{'best_iteration': '44', 'best_score': '0.9717392143244231'}
Trial 88, Fold 4: Log loss = 0.3375868960426251, Average precision = 0.9716230766946939, ROC-AUC = 0.9658556990970241, Elapsed Time = 1.6040769999999611 seconds
Trial 88, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 88, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.67878 validation-auc:0.93635 validation-aucpr:0.93760
... [iterations 1-31 truncated] ...
[32] validation-logloss:0.39767 validation-auc:0.96303 validation-aucpr:0.96804
[33] validation-logloss:0.39322 validation-auc:0.96306 validation-aucpr:0.96802
[34] validation-logloss:0.38900 validation-auc:0.96301 validation-aucpr:0.96794
[35] validation-logloss:0.38513 validation-auc:0.96294 validation-aucpr:0.96788
[36] validation-logloss:0.37973 validation-auc:0.96321 validation-aucpr:0.96812
[37] validation-logloss:0.37584 validation-auc:0.96326 validation-aucpr:0.96815
[38] validation-logloss:0.37208 validation-auc:0.96328 validation-aucpr:0.96812
[39] validation-logloss:0.36733 validation-auc:0.96343 validation-aucpr:0.96828
[40] validation-logloss:0.36311 validation-auc:0.96352 validation-aucpr:0.96877
[41] validation-logloss:0.35959 validation-auc:0.96365 validation-aucpr:0.96885
[42] validation-logloss:0.35602 validation-auc:0.96377 validation-aucpr:0.96892
[43] validation-logloss:0.35273 validation-auc:0.96383 validation-aucpr:0.96895
[44] validation-logloss:0.34956 validation-auc:0.96384 validation-aucpr:0.96893
[45] validation-logloss:0.34679 validation-auc:0.96376 validation-aucpr:0.96884
[46] validation-logloss:0.34271 validation-auc:0.96390 validation-aucpr:0.96899
{'best_iteration': '46', 'best_score': '0.968991717837239'}
Trial 88, Fold 5: Log loss = 0.34270535412207304, Average precision = 0.9689859086742532, ROC-AUC = 0.9639023465804581, Elapsed Time = 1.4417032999990624 seconds
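The per-fold summary lines above combine three sklearn metrics on the held-out fold's predicted probabilities. A minimal sketch of how such a line can be assembled is below; the helper name `report_fold` and the toy arrays are illustrative, not taken from the notebook.

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

def report_fold(trial, fold, y_true, y_prob, elapsed):
    """Format one fold summary in the style of the log lines above."""
    ll = log_loss(y_true, y_prob)
    ap = average_precision_score(y_true, y_prob)
    auc = roc_auc_score(y_true, y_prob)
    return (f"Trial {trial}, Fold {fold}: Log loss = {ll}, "
            f"Average precision = {ap}, ROC-AUC = {auc}, "
            f"Elapsed Time = {elapsed} seconds")

# Toy example: perfectly separated probabilities give AUC = AP = 1.0.
y_true = np.array([0, 0, 1, 1])
y_prob = np.array([0.1, 0.2, 0.8, 0.9])
print(report_fold(88, 5, y_true, y_prob, 1.44))
```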
Optimization Progress: 89%|########9 | 89/100 [3:50:33<12:51, 70.17s/it]
Trial 89, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 89, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.65288 validation-auc:0.95789 validation-aucpr:0.96334
[1] validation-logloss:0.61737 validation-auc:0.96161 validation-aucpr:0.96669
[2] validation-logloss:0.58472 validation-auc:0.96400 validation-aucpr:0.96993
[3] validation-logloss:0.55552 validation-auc:0.96476 validation-aucpr:0.97077
[4] validation-logloss:0.52917 validation-auc:0.96521 validation-aucpr:0.97137
[5] validation-logloss:0.50523 validation-auc:0.96574 validation-aucpr:0.97169
[6] validation-logloss:0.48312 validation-auc:0.96599 validation-aucpr:0.97131
[7] validation-logloss:0.46302 validation-auc:0.96659 validation-aucpr:0.97149
[8] validation-logloss:0.44414 validation-auc:0.96723 validation-aucpr:0.97191
[9] validation-logloss:0.42642 validation-auc:0.96779 validation-aucpr:0.97241
[10] validation-logloss:0.41255 validation-auc:0.96788 validation-aucpr:0.97247
[11] validation-logloss:0.39786 validation-auc:0.96819 validation-aucpr:0.97258
[12] validation-logloss:0.38413 validation-auc:0.96844 validation-aucpr:0.97283
[13] validation-logloss:0.37098 validation-auc:0.96881 validation-aucpr:0.97304
[14] validation-logloss:0.35929 validation-auc:0.96888 validation-aucpr:0.97312
[15] validation-logloss:0.34832 validation-auc:0.96906 validation-aucpr:0.97326
[16] validation-logloss:0.33861 validation-auc:0.96906 validation-aucpr:0.97326
[17] validation-logloss:0.32923 validation-auc:0.96936 validation-aucpr:0.97341
[18] validation-logloss:0.32011 validation-auc:0.96954 validation-aucpr:0.97390
[19] validation-logloss:0.31192 validation-auc:0.96963 validation-aucpr:0.97405
[20] validation-logloss:0.30384 validation-auc:0.97019 validation-aucpr:0.97441
[21] validation-logloss:0.29739 validation-auc:0.97022 validation-aucpr:0.97440
[22] validation-logloss:0.29085 validation-auc:0.97030 validation-aucpr:0.97449
[23] validation-logloss:0.28552 validation-auc:0.97027 validation-aucpr:0.97446
[24] validation-logloss:0.27984 validation-auc:0.97043 validation-aucpr:0.97524
[25] validation-logloss:0.27456 validation-auc:0.97047 validation-aucpr:0.97525
[26] validation-logloss:0.26977 validation-auc:0.97040 validation-aucpr:0.97523
[27] validation-logloss:0.26472 validation-auc:0.97053 validation-aucpr:0.97531
[28] validation-logloss:0.26056 validation-auc:0.97032 validation-aucpr:0.97517
[29] validation-logloss:0.25613 validation-auc:0.97055 validation-aucpr:0.97534
[30] validation-logloss:0.25199 validation-auc:0.97079 validation-aucpr:0.97548
[31] validation-logloss:0.24875 validation-auc:0.97052 validation-aucpr:0.97526
[32] validation-logloss:0.24518 validation-auc:0.97066 validation-aucpr:0.97534
[33] validation-logloss:0.24210 validation-auc:0.97060 validation-aucpr:0.97530
[34] validation-logloss:0.23881 validation-auc:0.97084 validation-aucpr:0.97546
[35] validation-logloss:0.23616 validation-auc:0.97073 validation-aucpr:0.97537
[36] validation-logloss:0.23347 validation-auc:0.97075 validation-aucpr:0.97543
[37] validation-logloss:0.23094 validation-auc:0.97092 validation-aucpr:0.97553
[38] validation-logloss:0.22854 validation-auc:0.97092 validation-aucpr:0.97552
[39] validation-logloss:0.22634 validation-auc:0.97110 validation-aucpr:0.97571
[40] validation-logloss:0.22474 validation-auc:0.97100 validation-aucpr:0.97562
[41] validation-logloss:0.22347 validation-auc:0.97079 validation-aucpr:0.97540
[42] validation-logloss:0.22140 validation-auc:0.97086 validation-aucpr:0.97543
[43] validation-logloss:0.21945 validation-auc:0.97101 validation-aucpr:0.97557
[44] validation-logloss:0.21773 validation-auc:0.97115 validation-aucpr:0.97565
[45] validation-logloss:0.21650 validation-auc:0.97113 validation-aucpr:0.97563
[46] validation-logloss:0.21513 validation-auc:0.97116 validation-aucpr:0.97564
[47] validation-logloss:0.21385 validation-auc:0.97124 validation-aucpr:0.97570
[48] validation-logloss:0.21293 validation-auc:0.97119 validation-aucpr:0.97574
[49] validation-logloss:0.21136 validation-auc:0.97131 validation-aucpr:0.97585
[50] validation-logloss:0.20978 validation-auc:0.97149 validation-aucpr:0.97599
[51] validation-logloss:0.20852 validation-auc:0.97156 validation-aucpr:0.97606
[52] validation-logloss:0.20741 validation-auc:0.97172 validation-aucpr:0.97618
[53] validation-logloss:0.20668 validation-auc:0.97163 validation-aucpr:0.97613
[54] validation-logloss:0.20607 validation-auc:0.97148 validation-aucpr:0.97598
[55] validation-logloss:0.20509 validation-auc:0.97156 validation-aucpr:0.97605
[56] validation-logloss:0.20398 validation-auc:0.97178 validation-aucpr:0.97618
{'best_iteration': '52', 'best_score': '0.976176684221807'}
Trial 89, Fold 1: Log loss = 0.2039760299158349, Average precision = 0.9761799411108688, ROC-AUC = 0.9717819025632937, Elapsed Time = 12.004220999999234 seconds
Trial 89, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 89, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.65220 validation-auc:0.95974 validation-aucpr:0.96371
[1] validation-logloss:0.61616 validation-auc:0.96514 validation-aucpr:0.96885
[2] validation-logloss:0.58510 validation-auc:0.96508 validation-aucpr:0.96870
[3] validation-logloss:0.55633 validation-auc:0.96588 validation-aucpr:0.96964
[4] validation-logloss:0.52896 validation-auc:0.96729 validation-aucpr:0.97087
[5] validation-logloss:0.50677 validation-auc:0.96826 validation-aucpr:0.97161
[6] validation-logloss:0.48432 validation-auc:0.96853 validation-aucpr:0.97176
[7] validation-logloss:0.46352 validation-auc:0.96885 validation-aucpr:0.97255
[8] validation-logloss:0.44485 validation-auc:0.96899 validation-aucpr:0.97262
[9] validation-logloss:0.42724 validation-auc:0.96924 validation-aucpr:0.97290
[10] validation-logloss:0.41329 validation-auc:0.96942 validation-aucpr:0.97302
[11] validation-logloss:0.39979 validation-auc:0.96983 validation-aucpr:0.97328
[12] validation-logloss:0.38577 validation-auc:0.97014 validation-aucpr:0.97357
[13] validation-logloss:0.37266 validation-auc:0.97043 validation-aucpr:0.97379
[14] validation-logloss:0.36108 validation-auc:0.97041 validation-aucpr:0.97377
[15] validation-logloss:0.35131 validation-auc:0.97065 validation-aucpr:0.97392
[16] validation-logloss:0.34075 validation-auc:0.97058 validation-aucpr:0.97378
[17] validation-logloss:0.33208 validation-auc:0.97061 validation-aucpr:0.97379
[18] validation-logloss:0.32282 validation-auc:0.97096 validation-aucpr:0.97406
[19] validation-logloss:0.31413 validation-auc:0.97111 validation-aucpr:0.97413
[20] validation-logloss:0.30582 validation-auc:0.97140 validation-aucpr:0.97430
[21] validation-logloss:0.29973 validation-auc:0.97113 validation-aucpr:0.97411
[22] validation-logloss:0.29394 validation-auc:0.97124 validation-aucpr:0.97419
[23] validation-logloss:0.28759 validation-auc:0.97116 validation-aucpr:0.97409
[24] validation-logloss:0.28124 validation-auc:0.97130 validation-aucpr:0.97420
[25] validation-logloss:0.27554 validation-auc:0.97120 validation-aucpr:0.97416
[26] validation-logloss:0.26983 validation-auc:0.97138 validation-aucpr:0.97432
[27] validation-logloss:0.26468 validation-auc:0.97138 validation-aucpr:0.97432
[28] validation-logloss:0.25982 validation-auc:0.97129 validation-aucpr:0.97433
[29] validation-logloss:0.25532 validation-auc:0.97125 validation-aucpr:0.97433
[30] validation-logloss:0.25155 validation-auc:0.97133 validation-aucpr:0.97441
[31] validation-logloss:0.24752 validation-auc:0.97137 validation-aucpr:0.97439
[32] validation-logloss:0.24386 validation-auc:0.97142 validation-aucpr:0.97441
[33] validation-logloss:0.24081 validation-auc:0.97167 validation-aucpr:0.97459
[34] validation-logloss:0.23747 validation-auc:0.97160 validation-aucpr:0.97455
[35] validation-logloss:0.23508 validation-auc:0.97168 validation-aucpr:0.97490
[36] validation-logloss:0.23257 validation-auc:0.97162 validation-aucpr:0.97486
[37] validation-logloss:0.22964 validation-auc:0.97166 validation-aucpr:0.97487
[38] validation-logloss:0.22697 validation-auc:0.97180 validation-aucpr:0.97497
[39] validation-logloss:0.22466 validation-auc:0.97195 validation-aucpr:0.97506
[40] validation-logloss:0.22219 validation-auc:0.97202 validation-aucpr:0.97508
[41] validation-logloss:0.21982 validation-auc:0.97192 validation-aucpr:0.97499
[42] validation-logloss:0.21823 validation-auc:0.97195 validation-aucpr:0.97497
[43] validation-logloss:0.21677 validation-auc:0.97195 validation-aucpr:0.97493
[44] validation-logloss:0.21478 validation-auc:0.97222 validation-aucpr:0.97515
[45] validation-logloss:0.21297 validation-auc:0.97216 validation-aucpr:0.97501
[46] validation-logloss:0.21145 validation-auc:0.97223 validation-aucpr:0.97505
[47] validation-logloss:0.20975 validation-auc:0.97228 validation-aucpr:0.97509
[48] validation-logloss:0.20793 validation-auc:0.97245 validation-aucpr:0.97521
[49] validation-logloss:0.20650 validation-auc:0.97238 validation-aucpr:0.97513
[50] validation-logloss:0.20518 validation-auc:0.97234 validation-aucpr:0.97503
[51] validation-logloss:0.20395 validation-auc:0.97231 validation-aucpr:0.97499
[52] validation-logloss:0.20237 validation-auc:0.97254 validation-aucpr:0.97517
[53] validation-logloss:0.20136 validation-auc:0.97249 validation-aucpr:0.97508
[54] validation-logloss:0.20000 validation-auc:0.97266 validation-aucpr:0.97520
[55] validation-logloss:0.19878 validation-auc:0.97278 validation-aucpr:0.97531
[56] validation-logloss:0.19779 validation-auc:0.97278 validation-aucpr:0.97531
{'best_iteration': '56', 'best_score': '0.9753079018133185'}
Trial 89, Fold 2: Log loss = 0.19778888192740518, Average precision = 0.975314491837502, ROC-AUC = 0.9727828206939062, Elapsed Time = 11.682237000000896 seconds
Trial 89, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 89, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.65183 validation-auc:0.96410 validation-aucpr:0.96861
[1] validation-logloss:0.61589 validation-auc:0.96421 validation-aucpr:0.96958
[2] validation-logloss:0.58678 validation-auc:0.96644 validation-aucpr:0.97093
[3] validation-logloss:0.55724 validation-auc:0.96706 validation-aucpr:0.97161
[4] validation-logloss:0.53337 validation-auc:0.96768 validation-aucpr:0.97198
[5] validation-logloss:0.50880 validation-auc:0.96797 validation-aucpr:0.97228
[6] validation-logloss:0.48597 validation-auc:0.96861 validation-aucpr:0.97261
[7] validation-logloss:0.46724 validation-auc:0.96905 validation-aucpr:0.97301
[8] validation-logloss:0.45022 validation-auc:0.96924 validation-aucpr:0.97352
[9] validation-logloss:0.43204 validation-auc:0.96919 validation-aucpr:0.97345
[10] validation-logloss:0.41515 validation-auc:0.96941 validation-aucpr:0.97356
[11] validation-logloss:0.40154 validation-auc:0.96924 validation-aucpr:0.97334
[12] validation-logloss:0.38915 validation-auc:0.96927 validation-aucpr:0.97335
[13] validation-logloss:0.37579 validation-auc:0.96958 validation-aucpr:0.97354
[14] validation-logloss:0.36525 validation-auc:0.96956 validation-aucpr:0.97351
[15] validation-logloss:0.35341 validation-auc:0.96972 validation-aucpr:0.97362
[16] validation-logloss:0.34239 validation-auc:0.97035 validation-aucpr:0.97406
[17] validation-logloss:0.33251 validation-auc:0.97046 validation-aucpr:0.97413
[18] validation-logloss:0.32270 validation-auc:0.97089 validation-aucpr:0.97449
[19] validation-logloss:0.31366 validation-auc:0.97116 validation-aucpr:0.97471
[20] validation-logloss:0.30569 validation-auc:0.97119 validation-aucpr:0.97472
[21] validation-logloss:0.29800 validation-auc:0.97127 validation-aucpr:0.97491
[22] validation-logloss:0.29141 validation-auc:0.97158 validation-aucpr:0.97512
[23] validation-logloss:0.28484 validation-auc:0.97176 validation-aucpr:0.97535
[24] validation-logloss:0.27937 validation-auc:0.97168 validation-aucpr:0.97526
[25] validation-logloss:0.27333 validation-auc:0.97175 validation-aucpr:0.97524
[26] validation-logloss:0.26864 validation-auc:0.97169 validation-aucpr:0.97517
[27] validation-logloss:0.26322 validation-auc:0.97202 validation-aucpr:0.97537
[28] validation-logloss:0.25820 validation-auc:0.97207 validation-aucpr:0.97541
[29] validation-logloss:0.25438 validation-auc:0.97209 validation-aucpr:0.97540
[30] validation-logloss:0.25076 validation-auc:0.97213 validation-aucpr:0.97469
[31] validation-logloss:0.24684 validation-auc:0.97203 validation-aucpr:0.97472
[32] validation-logloss:0.24296 validation-auc:0.97205 validation-aucpr:0.97475
[33] validation-logloss:0.23960 validation-auc:0.97196 validation-aucpr:0.97465
[34] validation-logloss:0.23671 validation-auc:0.97196 validation-aucpr:0.97464
[35] validation-logloss:0.23380 validation-auc:0.97197 validation-aucpr:0.97461
[36] validation-logloss:0.23067 validation-auc:0.97214 validation-aucpr:0.97480
[37] validation-logloss:0.22796 validation-auc:0.97219 validation-aucpr:0.97518
[38] validation-logloss:0.22504 validation-auc:0.97227 validation-aucpr:0.97523
[39] validation-logloss:0.22276 validation-auc:0.97222 validation-aucpr:0.97516
[40] validation-logloss:0.22057 validation-auc:0.97235 validation-aucpr:0.97525
[41] validation-logloss:0.21832 validation-auc:0.97230 validation-aucpr:0.97519
[42] validation-logloss:0.21607 validation-auc:0.97238 validation-aucpr:0.97524
[43] validation-logloss:0.21482 validation-auc:0.97227 validation-aucpr:0.97590
[44] validation-logloss:0.21324 validation-auc:0.97234 validation-aucpr:0.97593
[45] validation-logloss:0.21161 validation-auc:0.97226 validation-aucpr:0.97587
[46] validation-logloss:0.21051 validation-auc:0.97225 validation-aucpr:0.97584
[47] validation-logloss:0.20868 validation-auc:0.97241 validation-aucpr:0.97605
[48] validation-logloss:0.20670 validation-auc:0.97270 validation-aucpr:0.97628
[49] validation-logloss:0.20550 validation-auc:0.97269 validation-aucpr:0.97626
[50] validation-logloss:0.20409 validation-auc:0.97273 validation-aucpr:0.97625
[51] validation-logloss:0.20264 validation-auc:0.97279 validation-aucpr:0.97634
[52] validation-logloss:0.20146 validation-auc:0.97287 validation-aucpr:0.97640
[53] validation-logloss:0.20027 validation-auc:0.97293 validation-aucpr:0.97645
[54] validation-logloss:0.19940 validation-auc:0.97298 validation-aucpr:0.97646
[55] validation-logloss:0.19850 validation-auc:0.97290 validation-aucpr:0.97639
[56] validation-logloss:0.19759 validation-auc:0.97294 validation-aucpr:0.97639
{'best_iteration': '54', 'best_score': '0.9764551443514747'}
Trial 89, Fold 3: Log loss = 0.19758623619424412, Average precision = 0.9763968490881044, ROC-AUC = 0.9729429613988114, Elapsed Time = 10.560710200003086 seconds
Trial 89, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 89, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.65225 validation-auc:0.95540 validation-aucpr:0.95811
[1] validation-logloss:0.61561 validation-auc:0.96157 validation-aucpr:0.96507
[2] validation-logloss:0.58339 validation-auc:0.96360 validation-aucpr:0.96762
[3] validation-logloss:0.55708 validation-auc:0.96406 validation-aucpr:0.96979
[4] validation-logloss:0.52991 validation-auc:0.96531 validation-aucpr:0.97079
[5] validation-logloss:0.50509 validation-auc:0.96687 validation-aucpr:0.97220
[6] validation-logloss:0.48227 validation-auc:0.96739 validation-aucpr:0.97261
[7] validation-logloss:0.46261 validation-auc:0.96778 validation-aucpr:0.97287
[8] validation-logloss:0.44365 validation-auc:0.96797 validation-aucpr:0.97301
[9] validation-logloss:0.42648 validation-auc:0.96851 validation-aucpr:0.97344
[10] validation-logloss:0.41246 validation-auc:0.96883 validation-aucpr:0.97357
[11] validation-logloss:0.39939 validation-auc:0.96829 validation-aucpr:0.97318
[12] validation-logloss:0.38724 validation-auc:0.96827 validation-aucpr:0.97314
[13] validation-logloss:0.37425 validation-auc:0.96837 validation-aucpr:0.97329
[14] validation-logloss:0.36229 validation-auc:0.96863 validation-aucpr:0.97351
[15] validation-logloss:0.35095 validation-auc:0.96890 validation-aucpr:0.97369
[16] validation-logloss:0.34195 validation-auc:0.96873 validation-aucpr:0.97356
[17] validation-logloss:0.33226 validation-auc:0.96899 validation-aucpr:0.97384
[18] validation-logloss:0.32354 validation-auc:0.96899 validation-aucpr:0.97385
[19] validation-logloss:0.31471 validation-auc:0.96921 validation-aucpr:0.97406
[20] validation-logloss:0.30803 validation-auc:0.96954 validation-aucpr:0.97436
[21] validation-logloss:0.30076 validation-auc:0.96948 validation-aucpr:0.97432
[22] validation-logloss:0.29405 validation-auc:0.96950 validation-aucpr:0.97431
[23] validation-logloss:0.28809 validation-auc:0.96957 validation-aucpr:0.97436
[24] validation-logloss:0.28198 validation-auc:0.96971 validation-aucpr:0.97448
[25] validation-logloss:0.27607 validation-auc:0.96981 validation-aucpr:0.97456
[26] validation-logloss:0.27111 validation-auc:0.96986 validation-aucpr:0.97461
[27] validation-logloss:0.26569 validation-auc:0.97013 validation-aucpr:0.97481
[28] validation-logloss:0.26117 validation-auc:0.96996 validation-aucpr:0.97470
[29] validation-logloss:0.25733 validation-auc:0.96998 validation-aucpr:0.97467
[30] validation-logloss:0.25292 validation-auc:0.97025 validation-aucpr:0.97487
[31] validation-logloss:0.24986 validation-auc:0.97007 validation-aucpr:0.97470
[32] validation-logloss:0.24660 validation-auc:0.97020 validation-aucpr:0.97479
[33] validation-logloss:0.24295 validation-auc:0.97051 validation-aucpr:0.97505
[34] validation-logloss:0.23945 validation-auc:0.97069 validation-aucpr:0.97522
[35] validation-logloss:0.23636 validation-auc:0.97073 validation-aucpr:0.97528
[36] validation-logloss:0.23332 validation-auc:0.97094 validation-aucpr:0.97543
[37] validation-logloss:0.23088 validation-auc:0.97093 validation-aucpr:0.97541
[38] validation-logloss:0.22898 validation-auc:0.97084 validation-aucpr:0.97536
[39] validation-logloss:0.22634 validation-auc:0.97107 validation-aucpr:0.97552
[40] validation-logloss:0.22398 validation-auc:0.97124 validation-aucpr:0.97562
[41] validation-logloss:0.22257 validation-auc:0.97111 validation-aucpr:0.97554
[42] validation-logloss:0.22034 validation-auc:0.97129 validation-aucpr:0.97568
[43] validation-logloss:0.21859 validation-auc:0.97130 validation-aucpr:0.97570
[44] validation-logloss:0.21661 validation-auc:0.97148 validation-aucpr:0.97583
[45] validation-logloss:0.21468 validation-auc:0.97156 validation-aucpr:0.97590
[46] validation-logloss:0.21352 validation-auc:0.97146 validation-aucpr:0.97584
[47] validation-logloss:0.21203 validation-auc:0.97139 validation-aucpr:0.97583
[48] validation-logloss:0.21031 validation-auc:0.97157 validation-aucpr:0.97595
[49] validation-logloss:0.20906 validation-auc:0.97153 validation-aucpr:0.97592
[50] validation-logloss:0.20791 validation-auc:0.97149 validation-aucpr:0.97589
[51] validation-logloss:0.20664 validation-auc:0.97174 validation-aucpr:0.97605
[52] validation-logloss:0.20545 validation-auc:0.97183 validation-aucpr:0.97609
[53] validation-logloss:0.20491 validation-auc:0.97169 validation-aucpr:0.97599
[54] validation-logloss:0.20397 validation-auc:0.97160 validation-aucpr:0.97595
[55] validation-logloss:0.20282 validation-auc:0.97164 validation-aucpr:0.97599
[56] validation-logloss:0.20183 validation-auc:0.97157 validation-aucpr:0.97595
{'best_iteration': '52', 'best_score': '0.9760938284206304'}
Trial 89, Fold 4: Log loss = 0.20183458227168163, Average precision = 0.9759502113850549, ROC-AUC = 0.9715703096871415, Elapsed Time = 10.825298800002201 seconds
Trial 89, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 89, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.65276 validation-auc:0.95644 validation-aucpr:0.96185
[1] validation-logloss:0.62152 validation-auc:0.95868 validation-aucpr:0.96409
[2] validation-logloss:0.59298 validation-auc:0.95954 validation-aucpr:0.96431
[3] validation-logloss:0.56285 validation-auc:0.96259 validation-aucpr:0.96699
[4] validation-logloss:0.53503 validation-auc:0.96485 validation-aucpr:0.96886
[5] validation-logloss:0.51198 validation-auc:0.96489 validation-aucpr:0.96878
[6] validation-logloss:0.48924 validation-auc:0.96562 validation-aucpr:0.96947
[7] validation-logloss:0.46824 validation-auc:0.96644 validation-aucpr:0.97020
[8] validation-logloss:0.45220 validation-auc:0.96629 validation-aucpr:0.96987
[9] validation-logloss:0.43381 validation-auc:0.96712 validation-aucpr:0.97046
[10] validation-logloss:0.41953 validation-auc:0.96696 validation-aucpr:0.97030
[11] validation-logloss:0.40397 validation-auc:0.96768 validation-aucpr:0.97087
[12] validation-logloss:0.39028 validation-auc:0.96815 validation-aucpr:0.97148
[13] validation-logloss:0.37725 validation-auc:0.96809 validation-aucpr:0.97029
[14] validation-logloss:0.36693 validation-auc:0.96791 validation-aucpr:0.97011
[15] validation-logloss:0.35704 validation-auc:0.96823 validation-aucpr:0.97223
[16] validation-logloss:0.34606 validation-auc:0.96880 validation-aucpr:0.97277
[17] validation-logloss:0.33817 validation-auc:0.96842 validation-aucpr:0.97228
[18] validation-logloss:0.33036 validation-auc:0.96823 validation-aucpr:0.97208
[19] validation-logloss:0.32150 validation-auc:0.96855 validation-aucpr:0.97230
[20] validation-logloss:0.31339 validation-auc:0.96881 validation-aucpr:0.97250
[21] validation-logloss:0.30692 validation-auc:0.96874 validation-aucpr:0.97247
[22] validation-logloss:0.29955 validation-auc:0.96898 validation-aucpr:0.97264
[23] validation-logloss:0.29436 validation-auc:0.96890 validation-aucpr:0.97282
[24] validation-logloss:0.28795 validation-auc:0.96909 validation-aucpr:0.97292
[25] validation-logloss:0.28333 validation-auc:0.96901 validation-aucpr:0.97283
[26] validation-logloss:0.27758 validation-auc:0.96919 validation-aucpr:0.97298
[27] validation-logloss:0.27353 validation-auc:0.96904 validation-aucpr:0.97306
[28] validation-logloss:0.26843 validation-auc:0.96929 validation-aucpr:0.97320
[29] validation-logloss:0.26456 validation-auc:0.96932 validation-aucpr:0.97321
[30] validation-logloss:0.26005 validation-auc:0.96955 validation-aucpr:0.97341
[31] validation-logloss:0.25677 validation-auc:0.96946 validation-aucpr:0.97331
[32] validation-logloss:0.25262 validation-auc:0.96982 validation-aucpr:0.97365
[33] validation-logloss:0.24985 validation-auc:0.96966 validation-aucpr:0.97350
[34] validation-logloss:0.24712 validation-auc:0.96966 validation-aucpr:0.97347
[35] validation-logloss:0.24399 validation-auc:0.96955 validation-aucpr:0.97344
[36] validation-logloss:0.24083 validation-auc:0.96971 validation-aucpr:0.97354
[37] validation-logloss:0.23755 validation-auc:0.97002 validation-aucpr:0.97382
[38] validation-logloss:0.23485 validation-auc:0.97013 validation-aucpr:0.97390
[39] validation-logloss:0.23222 validation-auc:0.97032 validation-aucpr:0.97403
[40] validation-logloss:0.23036 validation-auc:0.97023 validation-aucpr:0.97393
[41] validation-logloss:0.22854 validation-auc:0.97036 validation-aucpr:0.97406
[42] validation-logloss:0.22604 validation-auc:0.97055 validation-aucpr:0.97419
[43] validation-logloss:0.22468 validation-auc:0.97039 validation-aucpr:0.97404
[44] validation-logloss:0.22264 validation-auc:0.97054 validation-aucpr:0.97415
[45] validation-logloss:0.22064 validation-auc:0.97064 validation-aucpr:0.97422
[46] validation-logloss:0.21937 validation-auc:0.97071 validation-aucpr:0.97425
[47] validation-logloss:0.21739 validation-auc:0.97095 validation-aucpr:0.97438
[48] validation-logloss:0.21560 validation-auc:0.97111 validation-aucpr:0.97446
[49] validation-logloss:0.21391 validation-auc:0.97135 validation-aucpr:0.97465
[50] validation-logloss:0.21264 validation-auc:0.97131 validation-aucpr:0.97461
[51] validation-logloss:0.21130 validation-auc:0.97140 validation-aucpr:0.97473
[52] validation-logloss:0.20998 validation-auc:0.97157 validation-aucpr:0.97492
[53] validation-logloss:0.20877 validation-auc:0.97167 validation-aucpr:0.97497
[54] validation-logloss:0.20782 validation-auc:0.97171 validation-aucpr:0.97492
[55] validation-logloss:0.20683 validation-auc:0.97173 validation-aucpr:0.97493
[56] validation-logloss:0.20575 validation-auc:0.97182 validation-aucpr:0.97479
{'best_iteration': '53', 'best_score': '0.9749698823576478'}
Trial 89, Fold 5: Log loss = 0.20574513483962278, Average precision = 0.9748008098075681, ROC-AUC = 0.9718218605428905, Elapsed Time = 11.012917900003231 seconds
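The "Train size = ... where 0 = ..., 1 = ..., 0/1 = ..." lines that open each fold can be produced from the fold's label array with `collections.Counter`. A small sketch, with a hypothetical helper name `fold_balance` and counts copied from the Fold 5 line above:

```python
from collections import Counter

def fold_balance(name, trial, fold, labels):
    """Summarize class counts and the 0/1 ratio for one CV fold."""
    c = Counter(labels)
    ratio = c[0] / c[1]
    return (f"Trial {trial}, Fold {fold}: {name} size = {len(labels)} "
            f"where 0 = {c[0]}, 1 = {c[1]}, 0/1 = {ratio}")

# Counts match the Fold 5 train split logged above (10500/10150 = 30/29).
labels = [0] * 10500 + [1] * 10150
print(fold_balance("Train", 89, 5, labels))
```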
Optimization Progress: 90%|######### | 90/100 [3:51:38<11:27, 68.71s/it]
Trial 90, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 90, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[21:50:33] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[0] validation-logloss:0.68354 validation-auc:0.92709 validation-aucpr:0.92888
[1] validation-logloss:0.67487 validation-auc:0.95282 validation-aucpr:0.94079
[2] validation-logloss:0.66557 validation-auc:0.96054 validation-aucpr:0.95758
[3] validation-logloss:0.65668 validation-auc:0.96288 validation-aucpr:0.96216
[4] validation-logloss:0.64834 validation-auc:0.96400 validation-aucpr:0.96566
[5] validation-logloss:0.63979 validation-auc:0.96555 validation-aucpr:0.97077
[6] validation-logloss:0.63156 validation-auc:0.96603 validation-aucpr:0.97110
[7] validation-logloss:0.62432 validation-auc:0.96592 validation-aucpr:0.97096
[8] validation-logloss:0.61660 validation-auc:0.96646 validation-aucpr:0.97132
[9] validation-logloss:0.60976 validation-auc:0.96623 validation-aucpr:0.97109
[10] validation-logloss:0.60202 validation-auc:0.96670 validation-aucpr:0.97103
[11] validation-logloss:0.59529 validation-auc:0.96675 validation-aucpr:0.97129
[12] validation-logloss:0.58876 validation-auc:0.96668 validation-aucpr:0.97118
[13] validation-logloss:0.58162 validation-auc:0.96673 validation-aucpr:0.97120
[14] validation-logloss:0.57436 validation-auc:0.96723 validation-aucpr:0.97022
[15] validation-logloss:0.56836 validation-auc:0.96765 validation-aucpr:0.97103
[16] validation-logloss:0.56162 validation-auc:0.96798 validation-aucpr:0.97305
[17] validation-logloss:0.55592 validation-auc:0.96783 validation-aucpr:0.97289
[18] validation-logloss:0.54955 validation-auc:0.96822 validation-aucpr:0.97321
[19] validation-logloss:0.54312 validation-auc:0.96861 validation-aucpr:0.97346
[20] validation-logloss:0.53768 validation-auc:0.96851 validation-aucpr:0.97336
[21] validation-logloss:0.53159 validation-auc:0.96872 validation-aucpr:0.97357
[21:50:35] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[22] validation-logloss:0.52656 validation-auc:0.96852 validation-aucpr:0.97339
[21:50:35] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[23] validation-logloss:0.52076 validation-auc:0.96870 validation-aucpr:0.97354
[21:50:35] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[24] validation-logloss:0.51577 validation-auc:0.96864 validation-aucpr:0.97345
[21:50:35] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[25] validation-logloss:0.51095 validation-auc:0.96845 validation-aucpr:0.97329
[21:50:35] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[26] validation-logloss:0.50561 validation-auc:0.96839 validation-aucpr:0.97325
[21:50:36] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[27] validation-logloss:0.50098 validation-auc:0.96833 validation-aucpr:0.97317
[21:50:36] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[28] validation-logloss:0.49566 validation-auc:0.96840 validation-aucpr:0.97322
[21:50:36] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[29] validation-logloss:0.49130 validation-auc:0.96818 validation-aucpr:0.97306
[21:50:36] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[30] validation-logloss:0.48694 validation-auc:0.96796 validation-aucpr:0.97286
[21:50:36] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[31] validation-logloss:0.48257 validation-auc:0.96796 validation-aucpr:0.97284
[21:50:36] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[32] validation-logloss:0.47842 validation-auc:0.96776 validation-aucpr:0.97269
[21:50:37] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[33] validation-logloss:0.47353 validation-auc:0.96797 validation-aucpr:0.97288
[21:50:37] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[34] validation-logloss:0.46935 validation-auc:0.96797 validation-aucpr:0.97289
[21:50:37] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[35] validation-logloss:0.46466 validation-auc:0.96811 validation-aucpr:0.97301
[21:50:37] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[36] validation-logloss:0.46080 validation-auc:0.96792 validation-aucpr:0.97286
[21:50:37] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[37] validation-logloss:0.45619 validation-auc:0.96814 validation-aucpr:0.97305
[21:50:38] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[38] validation-logloss:0.45172 validation-auc:0.96828 validation-aucpr:0.97315
[21:50:38] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[39] validation-logloss:0.44734 validation-auc:0.96829 validation-aucpr:0.97319
[21:50:38] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[40] validation-logloss:0.44310 validation-auc:0.96844 validation-aucpr:0.97339
[21:50:38] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[41] validation-logloss:0.43887 validation-auc:0.96872 validation-aucpr:0.97361
[21:50:38] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[42] validation-logloss:0.43468 validation-auc:0.96901 validation-aucpr:0.97381
[21:50:39] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[43] validation-logloss:0.43078 validation-auc:0.96897 validation-aucpr:0.97379
{'best_iteration': '42', 'best_score': '0.9738097543498954'}
Trial 90, Fold 1: Log loss = 0.430776386850286, Average precision = 0.9737941693851264, ROC-AUC = 0.9689733711159862, Elapsed Time = 6.182111800000712 seconds
Trial 90, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 90, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.68314 validation-auc:0.93332 validation-aucpr:0.93694
[1] validation-logloss:0.67376 validation-auc:0.95557 validation-aucpr:0.94053
[2] validation-logloss:0.66562 validation-auc:0.96129 validation-aucpr:0.95678
[3] validation-logloss:0.65646 validation-auc:0.96455 validation-aucpr:0.96381
[4] validation-logloss:0.64867 validation-auc:0.96572 validation-aucpr:0.96888
[5] validation-logloss:0.63997 validation-auc:0.96678 validation-aucpr:0.97098
[6] validation-logloss:0.63256 validation-auc:0.96687 validation-aucpr:0.97096
[7] validation-logloss:0.62427 validation-auc:0.96759 validation-aucpr:0.97164
[8] validation-logloss:0.61730 validation-auc:0.96759 validation-aucpr:0.97148
[9] validation-logloss:0.61039 validation-auc:0.96742 validation-aucpr:0.97125
[10] validation-logloss:0.60353 validation-auc:0.96722 validation-aucpr:0.97102
[11] validation-logloss:0.59681 validation-auc:0.96745 validation-aucpr:0.97112
[12] validation-logloss:0.58935 validation-auc:0.96774 validation-aucpr:0.97148
[13] validation-logloss:0.58212 validation-auc:0.96801 validation-aucpr:0.97167
[14] validation-logloss:0.57526 validation-auc:0.96837 validation-aucpr:0.97198
[15] validation-logloss:0.56850 validation-auc:0.96853 validation-aucpr:0.97210
[16] validation-logloss:0.56171 validation-auc:0.96867 validation-aucpr:0.97230
[17] validation-logloss:0.55550 validation-auc:0.96859 validation-aucpr:0.97223
[18] validation-logloss:0.54988 validation-auc:0.96842 validation-aucpr:0.97193
[19] validation-logloss:0.54354 validation-auc:0.96854 validation-aucpr:0.97202
[20] validation-logloss:0.53726 validation-auc:0.96894 validation-aucpr:0.97231
[21] validation-logloss:0.53186 validation-auc:0.96876 validation-aucpr:0.97224
[22] validation-logloss:0.52656 validation-auc:0.96868 validation-aucpr:0.97214
[23] validation-logloss:0.52155 validation-auc:0.96863 validation-aucpr:0.97204
[24] validation-logloss:0.51588 validation-auc:0.96859 validation-aucpr:0.97196
[25] validation-logloss:0.51013 validation-auc:0.96890 validation-aucpr:0.97221
[26] validation-logloss:0.50525 validation-auc:0.96880 validation-aucpr:0.97212
[27] validation-logloss:0.50035 validation-auc:0.96898 validation-aucpr:0.97227
[28] validation-logloss:0.49581 validation-auc:0.96917 validation-aucpr:0.97236
[29] validation-logloss:0.49056 validation-auc:0.96934 validation-aucpr:0.97250
[30] validation-logloss:0.48597 validation-auc:0.96937 validation-aucpr:0.97245
[31] validation-logloss:0.48173 validation-auc:0.96945 validation-aucpr:0.97251
[32] validation-logloss:0.47746 validation-auc:0.96930 validation-aucpr:0.97241
[33] validation-logloss:0.47315 validation-auc:0.96950 validation-aucpr:0.97253
[34] validation-logloss:0.46909 validation-auc:0.96949 validation-aucpr:0.97241
[35] validation-logloss:0.46441 validation-auc:0.96963 validation-aucpr:0.97353
[36] validation-logloss:0.45973 validation-auc:0.96976 validation-aucpr:0.97367
[37] validation-logloss:0.45586 validation-auc:0.96978 validation-aucpr:0.97365
[38] validation-logloss:0.45133 validation-auc:0.96992 validation-aucpr:0.97377
[39] validation-logloss:0.44703 validation-auc:0.96998 validation-aucpr:0.97382
[40] validation-logloss:0.44321 validation-auc:0.97006 validation-aucpr:0.97386
[41] validation-logloss:0.43898 validation-auc:0.97013 validation-aucpr:0.97390
[42] validation-logloss:0.43478 validation-auc:0.97035 validation-aucpr:0.97408
[43] validation-logloss:0.43077 validation-auc:0.97038 validation-aucpr:0.97411
{'best_iteration': '43', 'best_score': '0.9741106364338346'}
Trial 90, Fold 2: Log loss = 0.43076507082274706, Average precision = 0.9741145879228299, ROC-AUC = 0.9703752662035527, Elapsed Time = 6.197583100001793 seconds
Trial 90, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 90, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.68327 validation-auc:0.93318 validation-aucpr:0.94252
[1] validation-logloss:0.67480 validation-auc:0.95431 validation-aucpr:0.93696
[2] validation-logloss:0.66570 validation-auc:0.96271 validation-aucpr:0.95550
[3] validation-logloss:0.65667 validation-auc:0.96478 validation-aucpr:0.96069
[4] validation-logloss:0.64886 validation-auc:0.96514 validation-aucpr:0.96758
[5] validation-logloss:0.64028 validation-auc:0.96608 validation-aucpr:0.96940
[6] validation-logloss:0.63192 validation-auc:0.96657 validation-aucpr:0.97014
[7] validation-logloss:0.62368 validation-auc:0.96688 validation-aucpr:0.97021
[8] validation-logloss:0.61653 validation-auc:0.96736 validation-aucpr:0.96978
[9] validation-logloss:0.60849 validation-auc:0.96858 validation-aucpr:0.97065
[10] validation-logloss:0.60163 validation-auc:0.96898 validation-aucpr:0.97107
[11] validation-logloss:0.59481 validation-auc:0.96932 validation-aucpr:0.97121
[12] validation-logloss:0.58772 validation-auc:0.96964 validation-aucpr:0.97144
[13] validation-logloss:0.58129 validation-auc:0.96967 validation-aucpr:0.97148
[14] validation-logloss:0.57427 validation-auc:0.96969 validation-aucpr:0.97096
[15] validation-logloss:0.56741 validation-auc:0.97002 validation-aucpr:0.97094
[16] validation-logloss:0.56137 validation-auc:0.96998 validation-aucpr:0.97085
[17] validation-logloss:0.55562 validation-auc:0.96998 validation-aucpr:0.97125
[18] validation-logloss:0.54918 validation-auc:0.97008 validation-aucpr:0.97160
[19] validation-logloss:0.54356 validation-auc:0.97043 validation-aucpr:0.97363
[20] validation-logloss:0.53805 validation-auc:0.97042 validation-aucpr:0.97353
[21] validation-logloss:0.53201 validation-auc:0.97039 validation-aucpr:0.97342
[22] validation-logloss:0.52677 validation-auc:0.97045 validation-aucpr:0.97337
[23] validation-logloss:0.52095 validation-auc:0.97062 validation-aucpr:0.97436
[24] validation-logloss:0.51588 validation-auc:0.97047 validation-aucpr:0.97335
[25] validation-logloss:0.51030 validation-auc:0.97053 validation-aucpr:0.97340
[26] validation-logloss:0.50515 validation-auc:0.97051 validation-aucpr:0.97407
[27] validation-logloss:0.49958 validation-auc:0.97073 validation-aucpr:0.97423
[28] validation-logloss:0.49435 validation-auc:0.97062 validation-aucpr:0.97414
[29] validation-logloss:0.48971 validation-auc:0.97055 validation-aucpr:0.97407
[30] validation-logloss:0.48441 validation-auc:0.97059 validation-aucpr:0.97349
[31] validation-logloss:0.47994 validation-auc:0.97067 validation-aucpr:0.97339
[32] validation-logloss:0.47596 validation-auc:0.97047 validation-aucpr:0.97314
[33] validation-logloss:0.47157 validation-auc:0.97044 validation-aucpr:0.97314
[34] validation-logloss:0.46679 validation-auc:0.97068 validation-aucpr:0.97359
[35] validation-logloss:0.46251 validation-auc:0.97086 validation-aucpr:0.97372
[36] validation-logloss:0.45795 validation-auc:0.97082 validation-aucpr:0.97367
[37] validation-logloss:0.45336 validation-auc:0.97099 validation-aucpr:0.97400
[38] validation-logloss:0.44901 validation-auc:0.97113 validation-aucpr:0.97521
[39] validation-logloss:0.44506 validation-auc:0.97108 validation-aucpr:0.97518
[40] validation-logloss:0.44146 validation-auc:0.97110 validation-aucpr:0.97519
[41] validation-logloss:0.43724 validation-auc:0.97116 validation-aucpr:0.97526
[42] validation-logloss:0.43318 validation-auc:0.97120 validation-aucpr:0.97531
[43] validation-logloss:0.42915 validation-auc:0.97126 validation-aucpr:0.97536
{'best_iteration': '43', 'best_score': '0.9753572218732132'}
Trial 90, Fold 3: Log loss = 0.42914536504407685, Average precision = 0.9753615635108435, ROC-AUC = 0.9712575302655276, Elapsed Time = 6.2764660000029835 seconds
Trial 90, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 90, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.68309 validation-auc:0.92522 validation-aucpr:0.91691
[1] validation-logloss:0.67472 validation-auc:0.95181 validation-aucpr:0.94043
[2] validation-logloss:0.66553 validation-auc:0.95816 validation-aucpr:0.95189
[3] validation-logloss:0.65765 validation-auc:0.96004 validation-aucpr:0.95955
[4] validation-logloss:0.64888 validation-auc:0.96237 validation-aucpr:0.96545
[5] validation-logloss:0.64020 validation-auc:0.96373 validation-aucpr:0.96673
[6] validation-logloss:0.63268 validation-auc:0.96432 validation-aucpr:0.96678
[7] validation-logloss:0.62543 validation-auc:0.96487 validation-aucpr:0.97023
[8] validation-logloss:0.61842 validation-auc:0.96460 validation-aucpr:0.96989
[9] validation-logloss:0.61055 validation-auc:0.96503 validation-aucpr:0.97049
[10] validation-logloss:0.60274 validation-auc:0.96561 validation-aucpr:0.97107
[11] validation-logloss:0.59510 validation-auc:0.96584 validation-aucpr:0.97128
[12] validation-logloss:0.58775 validation-auc:0.96621 validation-aucpr:0.97162
[13] validation-logloss:0.58155 validation-auc:0.96587 validation-aucpr:0.97134
[14] validation-logloss:0.57447 validation-auc:0.96625 validation-aucpr:0.97160
[15] validation-logloss:0.56840 validation-auc:0.96655 validation-aucpr:0.97190
[16] validation-logloss:0.56165 validation-auc:0.96700 validation-aucpr:0.97225
[17] validation-logloss:0.55580 validation-auc:0.96722 validation-aucpr:0.97241
[18] validation-logloss:0.54934 validation-auc:0.96726 validation-aucpr:0.97245
[19] validation-logloss:0.54309 validation-auc:0.96748 validation-aucpr:0.97263
[20] validation-logloss:0.53701 validation-auc:0.96758 validation-aucpr:0.97272
[21] validation-logloss:0.53165 validation-auc:0.96760 validation-aucpr:0.97271
[22] validation-logloss:0.52629 validation-auc:0.96775 validation-aucpr:0.97281
[23] validation-logloss:0.52036 validation-auc:0.96801 validation-aucpr:0.97300
[24] validation-logloss:0.51513 validation-auc:0.96815 validation-aucpr:0.97313
[25] validation-logloss:0.51019 validation-auc:0.96802 validation-aucpr:0.97300
[26] validation-logloss:0.50538 validation-auc:0.96797 validation-aucpr:0.97292
[27] validation-logloss:0.49998 validation-auc:0.96812 validation-aucpr:0.97308
[28] validation-logloss:0.49476 validation-auc:0.96830 validation-aucpr:0.97330
[29] validation-logloss:0.49034 validation-auc:0.96818 validation-aucpr:0.97319
[30] validation-logloss:0.48579 validation-auc:0.96821 validation-aucpr:0.97322
[31] validation-logloss:0.48071 validation-auc:0.96839 validation-aucpr:0.97337
[32] validation-logloss:0.47636 validation-auc:0.96841 validation-aucpr:0.97338
[33] validation-logloss:0.47151 validation-auc:0.96858 validation-aucpr:0.97351
[34] validation-logloss:0.46675 validation-auc:0.96867 validation-aucpr:0.97365
[35] validation-logloss:0.46282 validation-auc:0.96857 validation-aucpr:0.97356
[36] validation-logloss:0.45816 validation-auc:0.96869 validation-aucpr:0.97364
[37] validation-logloss:0.45415 validation-auc:0.96868 validation-aucpr:0.97362
[38] validation-logloss:0.45045 validation-auc:0.96851 validation-aucpr:0.97350
[39] validation-logloss:0.44606 validation-auc:0.96851 validation-aucpr:0.97353
[40] validation-logloss:0.44240 validation-auc:0.96855 validation-aucpr:0.97357
[41] validation-logloss:0.43830 validation-auc:0.96858 validation-aucpr:0.97359
[42] validation-logloss:0.43406 validation-auc:0.96870 validation-aucpr:0.97368
[43] validation-logloss:0.43003 validation-auc:0.96879 validation-aucpr:0.97378
{'best_iteration': '43', 'best_score': '0.9737783677406094'}
Trial 90, Fold 4: Log loss = 0.4300305917885512, Average precision = 0.9737804054317507, ROC-AUC = 0.9687914291198428, Elapsed Time = 6.178327800000261 seconds
Trial 90, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 90, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.68339 validation-auc:0.93084 validation-aucpr:0.92778
[1] validation-logloss:0.67511 validation-auc:0.95050 validation-aucpr:0.93833
[2] validation-logloss:0.66581 validation-auc:0.95928 validation-aucpr:0.95885
[3] validation-logloss:0.65679 validation-auc:0.96248 validation-aucpr:0.96825
[4] validation-logloss:0.64912 validation-auc:0.96295 validation-aucpr:0.96555
[5] validation-logloss:0.64146 validation-auc:0.96371 validation-aucpr:0.96758
[6] validation-logloss:0.63312 validation-auc:0.96474 validation-aucpr:0.96983
[7] validation-logloss:0.62501 validation-auc:0.96503 validation-aucpr:0.96929
[8] validation-logloss:0.61720 validation-auc:0.96575 validation-aucpr:0.97081
[9] validation-logloss:0.61011 validation-auc:0.96632 validation-aucpr:0.97131
[10] validation-logloss:0.60262 validation-auc:0.96657 validation-aucpr:0.97153
[11] validation-logloss:0.59530 validation-auc:0.96642 validation-aucpr:0.97157
[12] validation-logloss:0.58891 validation-auc:0.96656 validation-aucpr:0.97166
[13] validation-logloss:0.58163 validation-auc:0.96705 validation-aucpr:0.97204
[14] validation-logloss:0.57459 validation-auc:0.96745 validation-aucpr:0.97235
[15] validation-logloss:0.56772 validation-auc:0.96807 validation-aucpr:0.97271
[16] validation-logloss:0.56197 validation-auc:0.96793 validation-aucpr:0.97251
[17] validation-logloss:0.55629 validation-auc:0.96797 validation-aucpr:0.97248
[18] validation-logloss:0.55004 validation-auc:0.96777 validation-aucpr:0.97238
[19] validation-logloss:0.54456 validation-auc:0.96758 validation-aucpr:0.97220
[20] validation-logloss:0.53833 validation-auc:0.96766 validation-aucpr:0.97227
[21] validation-logloss:0.53304 validation-auc:0.96743 validation-aucpr:0.97207
[22] validation-logloss:0.52802 validation-auc:0.96721 validation-aucpr:0.97190
[23] validation-logloss:0.52291 validation-auc:0.96701 validation-aucpr:0.97171
[24] validation-logloss:0.51717 validation-auc:0.96715 validation-aucpr:0.97183
[25] validation-logloss:0.51222 validation-auc:0.96709 validation-aucpr:0.97176
[26] validation-logloss:0.50671 validation-auc:0.96737 validation-aucpr:0.97197
[27] validation-logloss:0.50172 validation-auc:0.96733 validation-aucpr:0.97199
[28] validation-logloss:0.49724 validation-auc:0.96712 validation-aucpr:0.97184
[29] validation-logloss:0.49264 validation-auc:0.96714 validation-aucpr:0.97186
[30] validation-logloss:0.48747 validation-auc:0.96747 validation-aucpr:0.97215
[31] validation-logloss:0.48312 validation-auc:0.96738 validation-aucpr:0.97196
[32] validation-logloss:0.47811 validation-auc:0.96757 validation-aucpr:0.97210
[33] validation-logloss:0.47324 validation-auc:0.96776 validation-aucpr:0.97226
[34] validation-logloss:0.46913 validation-auc:0.96783 validation-aucpr:0.97231
[35] validation-logloss:0.46449 validation-auc:0.96797 validation-aucpr:0.97243
[36] validation-logloss:0.45998 validation-auc:0.96801 validation-aucpr:0.97246
[37] validation-logloss:0.45613 validation-auc:0.96800 validation-aucpr:0.97245
[38] validation-logloss:0.45181 validation-auc:0.96820 validation-aucpr:0.97259
[39] validation-logloss:0.44759 validation-auc:0.96828 validation-aucpr:0.97268
[40] validation-logloss:0.44329 validation-auc:0.96833 validation-aucpr:0.97275
[41] validation-logloss:0.43971 validation-auc:0.96832 validation-aucpr:0.97274
[42] validation-logloss:0.43615 validation-auc:0.96837 validation-aucpr:0.97281
[43] validation-logloss:0.43208 validation-auc:0.96858 validation-aucpr:0.97299
{'best_iteration': '43', 'best_score': '0.9729927994550973'}
Trial 90, Fold 5: Log loss = 0.43207551940990796, Average precision = 0.9729978877294114, ROC-AUC = 0.9685836909871245, Elapsed Time = 6.390422000000399 seconds
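The per-fold summary lines above report log loss, average precision, and ROC-AUC on the held-out fold. These three metrics can be computed from validation labels and predicted probabilities with the `sklearn.metrics` functions imported at the top of the notebook; the sketch below uses small illustrative arrays (`y_val`, `proba` are hypothetical, not the notebook's actual fold data):

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

# Illustrative labels and predicted probabilities for one validation fold
y_val = np.array([0, 0, 0, 1, 1, 1, 0, 1])
proba = np.array([0.10, 0.20, 0.35, 0.80, 0.65, 0.90, 0.55, 0.40])

ll = log_loss(y_val, proba)                  # penalizes confident wrong predictions
ap = average_precision_score(y_val, proba)   # area under the precision-recall curve
auc = roc_auc_score(y_val, proba)            # ranking quality across all thresholds

print(f"Log loss = {ll}, Average precision = {ap}, ROC-AUC = {auc}")
```

All three take raw probabilities, so no threshold needs to be chosen at this stage; thresholding only matters later for the confusion-matrix outputs.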
Optimization Progress: 91%|#########1| 91/100 [3:52:18<08:59, 59.98s/it]
Trial 91, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 91, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[21:51:13] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[0] validation-logloss:0.68534 validation-auc:0.95976 validation-aucpr:0.96326
[21:51:14] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[1] validation-logloss:0.67859 validation-auc:0.96192 validation-aucpr:0.96653
[21:51:15] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[2] validation-logloss:0.67123 validation-auc:0.96368 validation-aucpr:0.96761
[21:51:16] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[3] validation-logloss:0.66410 validation-auc:0.96501 validation-aucpr:0.96989
[21:51:17] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[4] validation-logloss:0.65765 validation-auc:0.96517 validation-aucpr:0.96987
[21:51:18] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[5] validation-logloss:0.65063 validation-auc:0.96550 validation-aucpr:0.97027
[21:51:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[6] validation-logloss:0.64455 validation-auc:0.96508 validation-aucpr:0.96975
[21:51:21] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[7] validation-logloss:0.63845 validation-auc:0.96501 validation-aucpr:0.96999
[21:51:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[8] validation-logloss:0.63181 validation-auc:0.96566 validation-aucpr:0.97066
[21:51:23] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[9] validation-logloss:0.62605 validation-auc:0.96561 validation-aucpr:0.97054
[21:51:24] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[10] validation-logloss:0.62051 validation-auc:0.96524 validation-aucpr:0.97017
[21:51:25] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[11] validation-logloss:0.61508 validation-auc:0.96516 validation-aucpr:0.97011
[21:51:26] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[12] validation-logloss:0.60994 validation-auc:0.96489 validation-aucpr:0.96987
[21:51:27] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[13] validation-logloss:0.60474 validation-auc:0.96469 validation-aucpr:0.96974
[21:51:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[14] validation-logloss:0.59873 validation-auc:0.96485 validation-aucpr:0.96999
[21:51:30] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[15] validation-logloss:0.59354 validation-auc:0.96493 validation-aucpr:0.97003
[21:51:31] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[16] validation-logloss:0.58782 validation-auc:0.96522 validation-aucpr:0.97033
[21:51:32] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[17] validation-logloss:0.58203 validation-auc:0.96567 validation-aucpr:0.97071
[21:51:33] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[18] validation-logloss:0.57715 validation-auc:0.96577 validation-aucpr:0.97075
[21:51:34] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[19] validation-logloss:0.57167 validation-auc:0.96612 validation-aucpr:0.97120
[21:51:35] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[20] validation-logloss:0.56626 validation-auc:0.96626 validation-aucpr:0.97133
[21:51:37] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[21] validation-logloss:0.56096 validation-auc:0.96639 validation-aucpr:0.97147
[21:51:38] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[22] validation-logloss:0.55584 validation-auc:0.96644 validation-aucpr:0.97161
[21:51:39] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[23] validation-logloss:0.55078 validation-auc:0.96661 validation-aucpr:0.97230
[21:51:40] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[24] validation-logloss:0.54588 validation-auc:0.96670 validation-aucpr:0.97237
[21:51:41] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[25] validation-logloss:0.54104 validation-auc:0.96688 validation-aucpr:0.97246
[26] validation-logloss:0.53619 validation-auc:0.96709 validation-aucpr:0.97266
[27] validation-logloss:0.53146 validation-auc:0.96728 validation-aucpr:0.97280
[28] validation-logloss:0.52680 validation-auc:0.96731 validation-aucpr:0.97283
[29] validation-logloss:0.52228 validation-auc:0.96742 validation-aucpr:0.97298
[30] validation-logloss:0.51851 validation-auc:0.96735 validation-aucpr:0.97283
[31] validation-logloss:0.51415 validation-auc:0.96738 validation-aucpr:0.97288
[32] validation-logloss:0.51023 validation-auc:0.96736 validation-aucpr:0.97279
[33] validation-logloss:0.50652 validation-auc:0.96727 validation-aucpr:0.97267
[34] validation-logloss:0.50265 validation-auc:0.96736 validation-aucpr:0.97273
[35] validation-logloss:0.49893 validation-auc:0.96740 validation-aucpr:0.97276
[36] validation-logloss:0.49487 validation-auc:0.96748 validation-aucpr:0.97283
[37] validation-logloss:0.49070 validation-auc:0.96773 validation-aucpr:0.97302
[38] validation-logloss:0.48711 validation-auc:0.96784 validation-aucpr:0.97306
[39] validation-logloss:0.48309 validation-auc:0.96787 validation-aucpr:0.97311
[40] validation-logloss:0.47914 validation-auc:0.96801 validation-aucpr:0.97325
[41] validation-logloss:0.47539 validation-auc:0.96810 validation-aucpr:0.97334
[42] validation-logloss:0.47161 validation-auc:0.96810 validation-aucpr:0.97336
[43] validation-logloss:0.46834 validation-auc:0.96810 validation-aucpr:0.97333
{'best_iteration': '42', 'best_score': '0.9733574134621086'}
Trial 91, Fold 1: Log loss = 0.4683359849337515, Average precision = 0.9733385803221706, ROC-AUC = 0.9681011707603847, Elapsed Time = 51.88532340000165 seconds
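The per-fold summary above reports log loss, average precision, and ROC-AUC. A minimal sketch of how these can be computed with scikit-learn; `y_val` and `p_val` are hypothetical stand-ins for a fold's labels and predicted positive-class probabilities, not the notebook's actual data:

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

# Hypothetical fold labels and predicted P(class=1)
y_val = np.array([0, 0, 1, 1, 1, 0])
p_val = np.array([0.1, 0.4, 0.8, 0.9, 0.6, 0.2])

lloss = log_loss(y_val, p_val)              # penalizes confident wrong probabilities
ap = average_precision_score(y_val, p_val)  # area under the precision-recall curve
auc = roc_auc_score(y_val, p_val)           # area under the ROC curve
print(f"Log loss = {lloss}, Average precision = {ap}, ROC-AUC = {auc}")
```

Here every positive outscores every negative, so AP and ROC-AUC are both 1.0 while log loss stays nonzero, which is why the logs above track all three.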
Trial 91, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 91, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.68550 validation-auc:0.96132 validation-aucpr:0.96534
[1] validation-logloss:0.67810 validation-auc:0.96501 validation-aucpr:0.96922
[2] validation-logloss:0.67070 validation-auc:0.96661 validation-aucpr:0.97081
[3] validation-logloss:0.66416 validation-auc:0.96751 validation-aucpr:0.97149
[4] validation-logloss:0.65707 validation-auc:0.96772 validation-aucpr:0.97172
[5] validation-logloss:0.65085 validation-auc:0.96768 validation-aucpr:0.97168
[6] validation-logloss:0.64462 validation-auc:0.96776 validation-aucpr:0.97164
[7] validation-logloss:0.63780 validation-auc:0.96829 validation-aucpr:0.97202
[8] validation-logloss:0.63113 validation-auc:0.96837 validation-aucpr:0.97210
[9] validation-logloss:0.62530 validation-auc:0.96858 validation-aucpr:0.97215
[10] validation-logloss:0.61892 validation-auc:0.96934 validation-aucpr:0.97274
[11] validation-logloss:0.61328 validation-auc:0.96963 validation-aucpr:0.97291
[12] validation-logloss:0.60700 validation-auc:0.97004 validation-aucpr:0.97329
[13] validation-logloss:0.60169 validation-auc:0.96959 validation-aucpr:0.97285
[14] validation-logloss:0.59664 validation-auc:0.96923 validation-aucpr:0.97253
[15] validation-logloss:0.59145 validation-auc:0.96913 validation-aucpr:0.97246
[16] validation-logloss:0.58579 validation-auc:0.96941 validation-aucpr:0.97273
[17] validation-logloss:0.58017 validation-auc:0.96944 validation-aucpr:0.97276
[18] validation-logloss:0.57477 validation-auc:0.96953 validation-aucpr:0.97284
[19] validation-logloss:0.57001 validation-auc:0.96932 validation-aucpr:0.97268
[20] validation-logloss:0.56456 validation-auc:0.96963 validation-aucpr:0.97292
[21] validation-logloss:0.55988 validation-auc:0.96969 validation-aucpr:0.97294
[22] validation-logloss:0.55477 validation-auc:0.96967 validation-aucpr:0.97297
[23] validation-logloss:0.55032 validation-auc:0.96962 validation-aucpr:0.97292
[24] validation-logloss:0.54588 validation-auc:0.96958 validation-aucpr:0.97286
[25] validation-logloss:0.54098 validation-auc:0.96966 validation-aucpr:0.97297
[26] validation-logloss:0.53674 validation-auc:0.96949 validation-aucpr:0.97269
[27] validation-logloss:0.53193 validation-auc:0.96952 validation-aucpr:0.97276
[28] validation-logloss:0.52717 validation-auc:0.96967 validation-aucpr:0.97290
[29] validation-logloss:0.52242 validation-auc:0.96980 validation-aucpr:0.97302
[30] validation-logloss:0.51848 validation-auc:0.96976 validation-aucpr:0.97296
[31] validation-logloss:0.51381 validation-auc:0.97003 validation-aucpr:0.97321
[32] validation-logloss:0.51004 validation-auc:0.96987 validation-aucpr:0.97305
[33] validation-logloss:0.50585 validation-auc:0.96986 validation-aucpr:0.97307
[34] validation-logloss:0.50151 validation-auc:0.96995 validation-aucpr:0.97316
[35] validation-logloss:0.49782 validation-auc:0.97001 validation-aucpr:0.97277
[36] validation-logloss:0.49422 validation-auc:0.96996 validation-aucpr:0.97272
[37] validation-logloss:0.49066 validation-auc:0.96987 validation-aucpr:0.97264
[38] validation-logloss:0.48705 validation-auc:0.96992 validation-aucpr:0.97268
[39] validation-logloss:0.48357 validation-auc:0.96991 validation-aucpr:0.97265
[40] validation-logloss:0.47958 validation-auc:0.97004 validation-aucpr:0.97278
[41] validation-logloss:0.47636 validation-auc:0.96999 validation-aucpr:0.97275
[42] validation-logloss:0.47245 validation-auc:0.97010 validation-aucpr:0.97283
[43] validation-logloss:0.46912 validation-auc:0.97005 validation-aucpr:0.97279
{'best_iteration': '12', 'best_score': '0.9732866427723065'}
Trial 91, Fold 2: Log loss = 0.469120277958387, Average precision = 0.9727538310090776, ROC-AUC = 0.9700462011307639, Elapsed Time = 50.572803199997 seconds
Trial 91, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 91, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.68535 validation-auc:0.96075 validation-aucpr:0.96245
[1] validation-logloss:0.67866 validation-auc:0.96244 validation-aucpr:0.96594
[2] validation-logloss:0.67133 validation-auc:0.96557 validation-aucpr:0.96968
[3] validation-logloss:0.66397 validation-auc:0.96694 validation-aucpr:0.97038
[4] validation-logloss:0.65708 validation-auc:0.96715 validation-aucpr:0.97051
[5] validation-logloss:0.65020 validation-auc:0.96761 validation-aucpr:0.97109
[6] validation-logloss:0.64349 validation-auc:0.96723 validation-aucpr:0.97087
[7] validation-logloss:0.63665 validation-auc:0.96774 validation-aucpr:0.97129
[8] validation-logloss:0.63011 validation-auc:0.96757 validation-aucpr:0.97131
[9] validation-logloss:0.62363 validation-auc:0.96714 validation-aucpr:0.97135
[10] validation-logloss:0.61802 validation-auc:0.96746 validation-aucpr:0.97168
[11] validation-logloss:0.61255 validation-auc:0.96742 validation-aucpr:0.97158
[12] validation-logloss:0.60652 validation-auc:0.96778 validation-aucpr:0.97180
[13] validation-logloss:0.60050 validation-auc:0.96842 validation-aucpr:0.97197
[14] validation-logloss:0.59530 validation-auc:0.96843 validation-aucpr:0.97192
[15] validation-logloss:0.59008 validation-auc:0.96865 validation-aucpr:0.97201
[16] validation-logloss:0.58447 validation-auc:0.96863 validation-aucpr:0.97185
[17] validation-logloss:0.57895 validation-auc:0.96871 validation-aucpr:0.97190
[18] validation-logloss:0.57421 validation-auc:0.96870 validation-aucpr:0.97342
[19] validation-logloss:0.56936 validation-auc:0.96882 validation-aucpr:0.97352
[20] validation-logloss:0.56463 validation-auc:0.96862 validation-aucpr:0.97336
[21] validation-logloss:0.55947 validation-auc:0.96865 validation-aucpr:0.97343
[22] validation-logloss:0.55444 validation-auc:0.96852 validation-aucpr:0.97332
[23] validation-logloss:0.54919 validation-auc:0.96897 validation-aucpr:0.97368
[24] validation-logloss:0.54473 validation-auc:0.96889 validation-aucpr:0.97359
[25] validation-logloss:0.54042 validation-auc:0.96888 validation-aucpr:0.97357
[26] validation-logloss:0.53552 validation-auc:0.96909 validation-aucpr:0.97372
[27] validation-logloss:0.53073 validation-auc:0.96925 validation-aucpr:0.97387
[28] validation-logloss:0.52643 validation-auc:0.96938 validation-aucpr:0.97399
[29] validation-logloss:0.52180 validation-auc:0.96956 validation-aucpr:0.97415
[30] validation-logloss:0.51775 validation-auc:0.96953 validation-aucpr:0.97411
[31] validation-logloss:0.51384 validation-auc:0.96944 validation-aucpr:0.97404
[32] validation-logloss:0.50960 validation-auc:0.96940 validation-aucpr:0.97399
[33] validation-logloss:0.50590 validation-auc:0.96932 validation-aucpr:0.97390
[34] validation-logloss:0.50196 validation-auc:0.96946 validation-aucpr:0.97401
[35] validation-logloss:0.49766 validation-auc:0.96960 validation-aucpr:0.97413
[36] validation-logloss:0.49416 validation-auc:0.96947 validation-aucpr:0.97398
[37] validation-logloss:0.49068 validation-auc:0.96949 validation-aucpr:0.97396
[38] validation-logloss:0.48651 validation-auc:0.96969 validation-aucpr:0.97414
[39] validation-logloss:0.48232 validation-auc:0.96997 validation-aucpr:0.97435
[40] validation-logloss:0.47912 validation-auc:0.96990 validation-aucpr:0.97428
[41] validation-logloss:0.47584 validation-auc:0.96986 validation-aucpr:0.97421
[42] validation-logloss:0.47191 validation-auc:0.97000 validation-aucpr:0.97435
[43] validation-logloss:0.46860 validation-auc:0.97007 validation-aucpr:0.97441
{'best_iteration': '43', 'best_score': '0.9744061651856183'}
Trial 91, Fold 3: Log loss = 0.4686027267876157, Average precision = 0.9744106591318193, ROC-AUC = 0.9700656896784198, Elapsed Time = 50.93511260000014 seconds
Trial 91, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 91, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[21:53:48] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[0] validation-logloss:0.68538 validation-auc:0.95451 validation-aucpr:0.95174
[21:53:49] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[1] validation-logloss:0.67861 validation-auc:0.96242 validation-aucpr:0.96761
[21:53:50] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[2] validation-logloss:0.67120 validation-auc:0.96456 validation-aucpr:0.97007
[21:53:51] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[3] validation-logloss:0.66400 validation-auc:0.96577 validation-aucpr:0.97157
[21:53:52] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[4] validation-logloss:0.65773 validation-auc:0.96489 validation-aucpr:0.97086
[21:53:53] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[5] validation-logloss:0.65065 validation-auc:0.96518 validation-aucpr:0.97125
[21:53:54] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[6] validation-logloss:0.64453 validation-auc:0.96506 validation-aucpr:0.97099
[21:53:55] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[7] validation-logloss:0.63779 validation-auc:0.96587 validation-aucpr:0.97173
[21:53:56] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[8] validation-logloss:0.63123 validation-auc:0.96651 validation-aucpr:0.97228
[21:53:57] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[9] validation-logloss:0.62539 validation-auc:0.96680 validation-aucpr:0.97228
[21:53:59] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[10] validation-logloss:0.61905 validation-auc:0.96684 validation-aucpr:0.97234
[21:54:00] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[11] validation-logloss:0.61302 validation-auc:0.96725 validation-aucpr:0.97271
[21:54:01] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[12] validation-logloss:0.60685 validation-auc:0.96748 validation-aucpr:0.97299
[13]	validation-logloss:0.60171	validation-auc:0.96696	validation-aucpr:0.97256
[14]	validation-logloss:0.59576	validation-auc:0.96704	validation-aucpr:0.97267
[15]	validation-logloss:0.59002	validation-auc:0.96728	validation-aucpr:0.97284
[16]	validation-logloss:0.58489	validation-auc:0.96723	validation-aucpr:0.97280
[17]	validation-logloss:0.57987	validation-auc:0.96723	validation-aucpr:0.97273
[18]	validation-logloss:0.57529	validation-auc:0.96689	validation-aucpr:0.97245
[19]	validation-logloss:0.57048	validation-auc:0.96683	validation-aucpr:0.97238
[20]	validation-logloss:0.56510	validation-auc:0.96697	validation-aucpr:0.97253
[21]	validation-logloss:0.55993	validation-auc:0.96708	validation-aucpr:0.97265
[22]	validation-logloss:0.55501	validation-auc:0.96721	validation-aucpr:0.97275
[23]	validation-logloss:0.55052	validation-auc:0.96709	validation-aucpr:0.97265
[24]	validation-logloss:0.54615	validation-auc:0.96699	validation-aucpr:0.97253
[25]	validation-logloss:0.54128	validation-auc:0.96696	validation-aucpr:0.97253
[26]	validation-logloss:0.53644	validation-auc:0.96712	validation-aucpr:0.97268
[27]	validation-logloss:0.53221	validation-auc:0.96707	validation-aucpr:0.97262
[28]	validation-logloss:0.52816	validation-auc:0.96703	validation-aucpr:0.97256
[29]	validation-logloss:0.52350	validation-auc:0.96732	validation-aucpr:0.97281
[30]	validation-logloss:0.51949	validation-auc:0.96726	validation-aucpr:0.97274
[31]	validation-logloss:0.51520	validation-auc:0.96726	validation-aucpr:0.97274
[32]	validation-logloss:0.51089	validation-auc:0.96734	validation-aucpr:0.97281
[33]	validation-logloss:0.50696	validation-auc:0.96735	validation-aucpr:0.97281
[34]	validation-logloss:0.50271	validation-auc:0.96737	validation-aucpr:0.97284
[35]	validation-logloss:0.49847	validation-auc:0.96736	validation-aucpr:0.97285
[36]	validation-logloss:0.49489	validation-auc:0.96730	validation-aucpr:0.97278
[37]	validation-logloss:0.49083	validation-auc:0.96740	validation-aucpr:0.97287
[38]	validation-logloss:0.48739	validation-auc:0.96739	validation-aucpr:0.97285
[39]	validation-logloss:0.48348	validation-auc:0.96742	validation-aucpr:0.97288
[40]	validation-logloss:0.47952	validation-auc:0.96759	validation-aucpr:0.97302
[41]	validation-logloss:0.47573	validation-auc:0.96764	validation-aucpr:0.97307
[42]	validation-logloss:0.47246	validation-auc:0.96752	validation-aucpr:0.97298
[43]	validation-logloss:0.46871	validation-auc:0.96771	validation-aucpr:0.97314
{'best_iteration': '43', 'best_score': '0.9731358137383671'}
Trial 91, Fold 4: Log loss = 0.46871306628357956, Average precision = 0.9731291862601925, ROC-AUC = 0.967713228824653, Elapsed Time = 50.220442699999694 seconds
Trial 91, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 91, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0]	validation-logloss:0.68537	validation-auc:0.95408	validation-aucpr:0.95892
[1]	validation-logloss:0.67777	validation-auc:0.96220	validation-aucpr:0.96781
[2]	validation-logloss:0.67069	validation-auc:0.96237	validation-aucpr:0.96809
[3]	validation-logloss:0.66347	validation-auc:0.96512	validation-aucpr:0.97000
[4]	validation-logloss:0.65654	validation-auc:0.96510	validation-aucpr:0.96975
[5]	validation-logloss:0.64959	validation-auc:0.96544	validation-aucpr:0.97009
[6]	validation-logloss:0.64344	validation-auc:0.96634	validation-aucpr:0.97102
[7]	validation-logloss:0.63701	validation-auc:0.96590	validation-aucpr:0.97096
[8]	validation-logloss:0.63054	validation-auc:0.96633	validation-aucpr:0.97125
[9]	validation-logloss:0.62486	validation-auc:0.96644	validation-aucpr:0.97131
[10]	validation-logloss:0.61852	validation-auc:0.96666	validation-aucpr:0.97149
[11]	validation-logloss:0.61320	validation-auc:0.96647	validation-aucpr:0.97128
[12]	validation-logloss:0.60776	validation-auc:0.96679	validation-aucpr:0.97156
[13]	validation-logloss:0.60259	validation-auc:0.96615	validation-aucpr:0.97103
[14]	validation-logloss:0.59728	validation-auc:0.96645	validation-aucpr:0.97112
[15]	validation-logloss:0.59146	validation-auc:0.96702	validation-aucpr:0.97165
[16]	validation-logloss:0.58584	validation-auc:0.96717	validation-aucpr:0.97180
[17]	validation-logloss:0.58048	validation-auc:0.96706	validation-aucpr:0.97172
[18]	validation-logloss:0.57512	validation-auc:0.96709	validation-aucpr:0.97181
[19]	validation-logloss:0.56974	validation-auc:0.96736	validation-aucpr:0.97203
[20]	validation-logloss:0.56443	validation-auc:0.96746	validation-aucpr:0.97213
[21]	validation-logloss:0.55926	validation-auc:0.96761	validation-aucpr:0.97223
[22]	validation-logloss:0.55413	validation-auc:0.96774	validation-aucpr:0.97233
[23]	validation-logloss:0.54914	validation-auc:0.96782	validation-aucpr:0.97243
[24]	validation-logloss:0.54408	validation-auc:0.96801	validation-aucpr:0.97256
[25]	validation-logloss:0.53985	validation-auc:0.96774	validation-aucpr:0.97234
[26]	validation-logloss:0.53498	validation-auc:0.96795	validation-aucpr:0.97253
[27]	validation-logloss:0.53036	validation-auc:0.96817	validation-aucpr:0.97269
[28]	validation-logloss:0.52563	validation-auc:0.96840	validation-aucpr:0.97284
[29]	validation-logloss:0.52102	validation-auc:0.96853	validation-aucpr:0.97295
[30]	validation-logloss:0.51648	validation-auc:0.96873	validation-aucpr:0.97308
[31]	validation-logloss:0.51208	validation-auc:0.96869	validation-aucpr:0.97304
[32]	validation-logloss:0.50828	validation-auc:0.96871	validation-aucpr:0.97303
[33]	validation-logloss:0.50406	validation-auc:0.96875	validation-aucpr:0.97307
[34]	validation-logloss:0.49979	validation-auc:0.96888	validation-aucpr:0.97319
[35]	validation-logloss:0.49576	validation-auc:0.96890	validation-aucpr:0.97320
[36]	validation-logloss:0.49208	validation-auc:0.96882	validation-aucpr:0.97314
[37]	validation-logloss:0.48805	validation-auc:0.96889	validation-aucpr:0.97318
[38]	validation-logloss:0.48457	validation-auc:0.96868	validation-aucpr:0.97301
[39]	validation-logloss:0.48065	validation-auc:0.96876	validation-aucpr:0.97308
[40]	validation-logloss:0.47679	validation-auc:0.96890	validation-aucpr:0.97317
[41]	validation-logloss:0.47291	validation-auc:0.96908	validation-aucpr:0.97331
[42]	validation-logloss:0.46930	validation-auc:0.96918	validation-aucpr:0.97338
[43]	validation-logloss:0.46572	validation-auc:0.96916	validation-aucpr:0.97336
{'best_iteration': '42', 'best_score': '0.9733813409437337'}
Trial 91, Fold 5: Log loss = 0.46572245851236477, Average precision = 0.9733665370275528, ROC-AUC = 0.9691603961131858, Elapsed Time = 46.25501720000102 seconds
Optimization Progress: 92%|#########2| 92/100 [3:56:36<15:55, 119.39s/it]
Trial 92, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 92, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.65897 validation-auc:0.94764 validation-aucpr:0.95389
[1] validation-logloss:0.63078 validation-auc:0.91919 validation-aucpr:0.88950
[2] validation-logloss:0.60254 validation-auc:0.94006 validation-aucpr:0.92849
[3] validation-logloss:0.57999 validation-auc:0.94494 validation-aucpr:0.93579
[4] validation-logloss:0.55751 validation-auc:0.95116 validation-aucpr:0.94720
[5] validation-logloss:0.53479 validation-auc:0.95516 validation-aucpr:0.95929
[6] validation-logloss:0.51695 validation-auc:0.95573 validation-aucpr:0.95946
[7] validation-logloss:0.50079 validation-auc:0.95573 validation-aucpr:0.95886
[8] validation-logloss:0.48499 validation-auc:0.95665 validation-aucpr:0.96060
[9] validation-logloss:0.47000 validation-auc:0.95739 validation-aucpr:0.96080
[10] validation-logloss:0.45316 validation-auc:0.95893 validation-aucpr:0.96156
[11] validation-logloss:0.43847 validation-auc:0.96026 validation-aucpr:0.96592
[12] validation-logloss:0.42416 validation-auc:0.96069 validation-aucpr:0.96610
[13] validation-logloss:0.41086 validation-auc:0.96117 validation-aucpr:0.96653
[14] validation-logloss:0.39812 validation-auc:0.96184 validation-aucpr:0.96676
[15] validation-logloss:0.38629 validation-auc:0.96258 validation-aucpr:0.96721
[16] validation-logloss:0.37577 validation-auc:0.96290 validation-aucpr:0.96743
[17] validation-logloss:0.36596 validation-auc:0.96319 validation-aucpr:0.96726
[18] validation-logloss:0.35640 validation-auc:0.96345 validation-aucpr:0.96755
[19] validation-logloss:0.34712 validation-auc:0.96398 validation-aucpr:0.96793
[20] validation-logloss:0.33839 validation-auc:0.96403 validation-aucpr:0.96771
[21] validation-logloss:0.33024 validation-auc:0.96426 validation-aucpr:0.96790
[22] validation-logloss:0.32393 validation-auc:0.96446 validation-aucpr:0.96660
[23] validation-logloss:0.31726 validation-auc:0.96461 validation-aucpr:0.96691
[24] validation-logloss:0.31245 validation-auc:0.96442 validation-aucpr:0.96650
[25] validation-logloss:0.30621 validation-auc:0.96468 validation-aucpr:0.96690
[26] validation-logloss:0.30102 validation-auc:0.96493 validation-aucpr:0.96693
[27] validation-logloss:0.29609 validation-auc:0.96512 validation-aucpr:0.96692
[28] validation-logloss:0.29205 validation-auc:0.96515 validation-aucpr:0.96679
[29] validation-logloss:0.28713 validation-auc:0.96539 validation-aucpr:0.96698
[30] validation-logloss:0.28339 validation-auc:0.96539 validation-aucpr:0.96688
[31] validation-logloss:0.27897 validation-auc:0.96570 validation-aucpr:0.96844
[32] validation-logloss:0.27461 validation-auc:0.96572 validation-aucpr:0.96851
[33] validation-logloss:0.27162 validation-auc:0.96552 validation-aucpr:0.96817
[34] validation-logloss:0.26780 validation-auc:0.96542 validation-aucpr:0.96842
[35] validation-logloss:0.26434 validation-auc:0.96550 validation-aucpr:0.96821
[36] validation-logloss:0.26134 validation-auc:0.96572 validation-aucpr:0.96808
[37] validation-logloss:0.25876 validation-auc:0.96573 validation-aucpr:0.96799
[38] validation-logloss:0.25553 validation-auc:0.96591 validation-aucpr:0.96814
[39] validation-logloss:0.25316 validation-auc:0.96585 validation-aucpr:0.96805
[40] validation-logloss:0.25023 validation-auc:0.96592 validation-aucpr:0.96811
[41] validation-logloss:0.24720 validation-auc:0.96633 validation-aucpr:0.97090
[42] validation-logloss:0.24514 validation-auc:0.96646 validation-aucpr:0.97105
[43] validation-logloss:0.24307 validation-auc:0.96651 validation-aucpr:0.97105
[44] validation-logloss:0.24016 validation-auc:0.96682 validation-aucpr:0.97131
[45] validation-logloss:0.23822 validation-auc:0.96701 validation-aucpr:0.97150
[46] validation-logloss:0.23642 validation-auc:0.96687 validation-aucpr:0.97142
[47] validation-logloss:0.23500 validation-auc:0.96685 validation-aucpr:0.97146
[48] validation-logloss:0.23342 validation-auc:0.96681 validation-aucpr:0.97150
[49] validation-logloss:0.23129 validation-auc:0.96690 validation-aucpr:0.97156
[50] validation-logloss:0.22998 validation-auc:0.96702 validation-aucpr:0.97088
[51] validation-logloss:0.22835 validation-auc:0.96720 validation-aucpr:0.97110
[52] validation-logloss:0.22684 validation-auc:0.96735 validation-aucpr:0.97120
[53] validation-logloss:0.22578 validation-auc:0.96739 validation-aucpr:0.97121
[54] validation-logloss:0.22493 validation-auc:0.96743 validation-aucpr:0.97114
[55] validation-logloss:0.22333 validation-auc:0.96776 validation-aucpr:0.97150
[56] validation-logloss:0.22185 validation-auc:0.96797 validation-aucpr:0.97161
[57] validation-logloss:0.22115 validation-auc:0.96798 validation-aucpr:0.97165
[58] validation-logloss:0.22002 validation-auc:0.96813 validation-aucpr:0.97309
[59] validation-logloss:0.21893 validation-auc:0.96823 validation-aucpr:0.97314
[60] validation-logloss:0.21784 validation-auc:0.96831 validation-aucpr:0.97323
[61] validation-logloss:0.21713 validation-auc:0.96828 validation-aucpr:0.97323
[62] validation-logloss:0.21680 validation-auc:0.96820 validation-aucpr:0.97315
[63] validation-logloss:0.21602 validation-auc:0.96831 validation-aucpr:0.97325
[64] validation-logloss:0.21541 validation-auc:0.96837 validation-aucpr:0.97325
[65] validation-logloss:0.21435 validation-auc:0.96855 validation-aucpr:0.97337
[66] validation-logloss:0.21366 validation-auc:0.96858 validation-aucpr:0.97333
[67] validation-logloss:0.21278 validation-auc:0.96867 validation-aucpr:0.97338
[68] validation-logloss:0.21236 validation-auc:0.96876 validation-aucpr:0.97340
[69] validation-logloss:0.21219 validation-auc:0.96869 validation-aucpr:0.97331
[70] validation-logloss:0.21202 validation-auc:0.96870 validation-aucpr:0.97344
[71] validation-logloss:0.21100 validation-auc:0.96884 validation-aucpr:0.97353
[72] validation-logloss:0.21072 validation-auc:0.96876 validation-aucpr:0.97347
[73] validation-logloss:0.21026 validation-auc:0.96896 validation-aucpr:0.97369
[74] validation-logloss:0.21004 validation-auc:0.96895 validation-aucpr:0.97367
[75] validation-logloss:0.20955 validation-auc:0.96906 validation-aucpr:0.97373
[76] validation-logloss:0.20924 validation-auc:0.96903 validation-aucpr:0.97373
[77] validation-logloss:0.20900 validation-auc:0.96919 validation-aucpr:0.97386
[78] validation-logloss:0.20851 validation-auc:0.96930 validation-aucpr:0.97399
[79] validation-logloss:0.20822 validation-auc:0.96928 validation-aucpr:0.97390
[80] validation-logloss:0.20784 validation-auc:0.96930 validation-aucpr:0.97391
[81] validation-logloss:0.20813 validation-auc:0.96909 validation-aucpr:0.97383
[82] validation-logloss:0.20794 validation-auc:0.96907 validation-aucpr:0.97375
[83] validation-logloss:0.20795 validation-auc:0.96889 validation-aucpr:0.97362
[84] validation-logloss:0.20751 validation-auc:0.96899 validation-aucpr:0.97367
[85] validation-logloss:0.20741 validation-auc:0.96900 validation-aucpr:0.97362
[86] validation-logloss:0.20704 validation-auc:0.96910 validation-aucpr:0.97361
[87] validation-logloss:0.20675 validation-auc:0.96911 validation-aucpr:0.97358
[88] validation-logloss:0.20663 validation-auc:0.96911 validation-aucpr:0.97356
[89] validation-logloss:0.20621 validation-auc:0.96925 validation-aucpr:0.97368
[90] validation-logloss:0.20562 validation-auc:0.96947 validation-aucpr:0.97384
[91] validation-logloss:0.20572 validation-auc:0.96949 validation-aucpr:0.97390
[92] validation-logloss:0.20559 validation-auc:0.96957 validation-aucpr:0.97399
[93] validation-logloss:0.20570 validation-auc:0.96955 validation-aucpr:0.97396
{'best_iteration': '92', 'best_score': '0.9739895790503507'}
Trial 92, Fold 1: Log loss = 0.20569699593105253, Average precision = 0.9739613171371575, ROC-AUC = 0.9695516267331985, Elapsed Time = 2.5761713000028976 seconds
Trial 92, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 92, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.65786 validation-auc:0.94682 validation-aucpr:0.95072
[1] validation-logloss:0.62898 validation-auc:0.93550 validation-aucpr:0.91127
[2] validation-logloss:0.60339 validation-auc:0.94895 validation-aucpr:0.93533
[3] validation-logloss:0.57644 validation-auc:0.95450 validation-aucpr:0.94925
[4] validation-logloss:0.55491 validation-auc:0.95707 validation-aucpr:0.95339
[5] validation-logloss:0.53235 validation-auc:0.96108 validation-aucpr:0.96298
[6] validation-logloss:0.51335 validation-auc:0.96108 validation-aucpr:0.96320
[7] validation-logloss:0.49679 validation-auc:0.96208 validation-aucpr:0.96479
[8] validation-logloss:0.48172 validation-auc:0.96118 validation-aucpr:0.96413
[9] validation-logloss:0.46307 validation-auc:0.96381 validation-aucpr:0.96646
[10] validation-logloss:0.44678 validation-auc:0.96448 validation-aucpr:0.96710
[11] validation-logloss:0.43405 validation-auc:0.96423 validation-aucpr:0.96675
[12] validation-logloss:0.42295 validation-auc:0.96408 validation-aucpr:0.96641
[13] validation-logloss:0.41134 validation-auc:0.96408 validation-aucpr:0.96608
[14] validation-logloss:0.39876 validation-auc:0.96479 validation-aucpr:0.96676
[15] validation-logloss:0.38868 validation-auc:0.96497 validation-aucpr:0.96695
[16] validation-logloss:0.37963 validation-auc:0.96481 validation-aucpr:0.96652
[17] validation-logloss:0.37132 validation-auc:0.96478 validation-aucpr:0.96645
[18] validation-logloss:0.36318 validation-auc:0.96517 validation-aucpr:0.96735
[19] validation-logloss:0.35367 validation-auc:0.96535 validation-aucpr:0.96771
[20] validation-logloss:0.34492 validation-auc:0.96582 validation-aucpr:0.96862
[21] validation-logloss:0.33803 validation-auc:0.96582 validation-aucpr:0.96855
[22] validation-logloss:0.32967 validation-auc:0.96631 validation-aucpr:0.97024
[23] validation-logloss:0.32176 validation-auc:0.96661 validation-aucpr:0.97057
[24] validation-logloss:0.31463 validation-auc:0.96670 validation-aucpr:0.97065
[25] validation-logloss:0.30956 validation-auc:0.96655 validation-aucpr:0.97057
[26] validation-logloss:0.30426 validation-auc:0.96652 validation-aucpr:0.97050
[27] validation-logloss:0.29948 validation-auc:0.96661 validation-aucpr:0.97057
[28] validation-logloss:0.29361 validation-auc:0.96697 validation-aucpr:0.97088
[29] validation-logloss:0.28852 validation-auc:0.96680 validation-aucpr:0.97075
[30] validation-logloss:0.28335 validation-auc:0.96687 validation-aucpr:0.97083
[31] validation-logloss:0.27919 validation-auc:0.96715 validation-aucpr:0.97098
[32] validation-logloss:0.27437 validation-auc:0.96734 validation-aucpr:0.97113
[33] validation-logloss:0.26996 validation-auc:0.96747 validation-aucpr:0.97127
[34] validation-logloss:0.26532 validation-auc:0.96773 validation-aucpr:0.97155
[35] validation-logloss:0.26264 validation-auc:0.96753 validation-aucpr:0.97137
[36] validation-logloss:0.25956 validation-auc:0.96735 validation-aucpr:0.97125
[37] validation-logloss:0.25710 validation-auc:0.96729 validation-aucpr:0.97116
[38] validation-logloss:0.25470 validation-auc:0.96716 validation-aucpr:0.97108
[39] validation-logloss:0.25148 validation-auc:0.96720 validation-aucpr:0.97112
[40] validation-logloss:0.24856 validation-auc:0.96729 validation-aucpr:0.97119
[41] validation-logloss:0.24529 validation-auc:0.96755 validation-aucpr:0.97141
[42] validation-logloss:0.24270 validation-auc:0.96757 validation-aucpr:0.97140
[43] validation-logloss:0.23986 validation-auc:0.96775 validation-aucpr:0.97156
[44] validation-logloss:0.23742 validation-auc:0.96775 validation-aucpr:0.97159
[45] validation-logloss:0.23514 validation-auc:0.96788 validation-aucpr:0.97162
[46] validation-logloss:0.23300 validation-auc:0.96814 validation-aucpr:0.97182
[47] validation-logloss:0.23157 validation-auc:0.96831 validation-aucpr:0.97191
[48] validation-logloss:0.23056 validation-auc:0.96819 validation-aucpr:0.97183
[49] validation-logloss:0.22820 validation-auc:0.96840 validation-aucpr:0.97202
[50] validation-logloss:0.22642 validation-auc:0.96839 validation-aucpr:0.97201
[51] validation-logloss:0.22499 validation-auc:0.96846 validation-aucpr:0.97202
[52] validation-logloss:0.22399 validation-auc:0.96849 validation-aucpr:0.97199
[53] validation-logloss:0.22287 validation-auc:0.96851 validation-aucpr:0.97199
[54] validation-logloss:0.22114 validation-auc:0.96851 validation-aucpr:0.97201
[55] validation-logloss:0.21963 validation-auc:0.96857 validation-aucpr:0.97201
[56] validation-logloss:0.21838 validation-auc:0.96866 validation-aucpr:0.97215
[57] validation-logloss:0.21732 validation-auc:0.96870 validation-aucpr:0.97218
[58] validation-logloss:0.21621 validation-auc:0.96886 validation-aucpr:0.97231
[59] validation-logloss:0.21481 validation-auc:0.96904 validation-aucpr:0.97242
[60] validation-logloss:0.21375 validation-auc:0.96893 validation-aucpr:0.97236
[61] validation-logloss:0.21268 validation-auc:0.96909 validation-aucpr:0.97249
[62] validation-logloss:0.21166 validation-auc:0.96907 validation-aucpr:0.97255
[63] validation-logloss:0.21003 validation-auc:0.96930 validation-aucpr:0.97281
[64] validation-logloss:0.20913 validation-auc:0.96933 validation-aucpr:0.97281
[65] validation-logloss:0.20862 validation-auc:0.96943 validation-aucpr:0.97287
[66] validation-logloss:0.20790 validation-auc:0.96952 validation-aucpr:0.97290
[67] validation-logloss:0.20736 validation-auc:0.96946 validation-aucpr:0.97288
[68] validation-logloss:0.20688 validation-auc:0.96943 validation-aucpr:0.97309
[69] validation-logloss:0.20601 validation-auc:0.96958 validation-aucpr:0.97324
[70] validation-logloss:0.20536 validation-auc:0.96956 validation-aucpr:0.97313
[71] validation-logloss:0.20472 validation-auc:0.96968 validation-aucpr:0.97319
[72] validation-logloss:0.20409 validation-auc:0.96979 validation-aucpr:0.97325
[73] validation-logloss:0.20336 validation-auc:0.96993 validation-aucpr:0.97339
[74] validation-logloss:0.20279 validation-auc:0.96999 validation-aucpr:0.97331
[75] validation-logloss:0.20250 validation-auc:0.97002 validation-aucpr:0.97332
[76] validation-logloss:0.20248 validation-auc:0.96991 validation-aucpr:0.97322
[77] validation-logloss:0.20220 validation-auc:0.96989 validation-aucpr:0.97331
[78] validation-logloss:0.20192 validation-auc:0.96994 validation-aucpr:0.97334
[79] validation-logloss:0.20179 validation-auc:0.96996 validation-aucpr:0.97335
[80] validation-logloss:0.20137 validation-auc:0.96998 validation-aucpr:0.97333
[81] validation-logloss:0.20077 validation-auc:0.97010 validation-aucpr:0.97344
[82] validation-logloss:0.20011 validation-auc:0.97012 validation-aucpr:0.97341
[83] validation-logloss:0.19931 validation-auc:0.97026 validation-aucpr:0.97352
[84] validation-logloss:0.19866 validation-auc:0.97039 validation-aucpr:0.97373
[85] validation-logloss:0.19850 validation-auc:0.97041 validation-aucpr:0.97369
[86] validation-logloss:0.19866 validation-auc:0.97031 validation-aucpr:0.97362
[87] validation-logloss:0.19824 validation-auc:0.97042 validation-aucpr:0.97366
[88] validation-logloss:0.19778 validation-auc:0.97044 validation-aucpr:0.97367
[89] validation-logloss:0.19746 validation-auc:0.97044 validation-aucpr:0.97365
[90] validation-logloss:0.19752 validation-auc:0.97038 validation-aucpr:0.97364
[91] validation-logloss:0.19731 validation-auc:0.97039 validation-aucpr:0.97362
[92] validation-logloss:0.19745 validation-auc:0.97026 validation-aucpr:0.97357
[93] validation-logloss:0.19740 validation-auc:0.97028 validation-aucpr:0.97344
{'best_iteration': '84', 'best_score': '0.9737342955246244'}
Trial 92, Fold 2: Log loss = 0.19739752275694591, Average precision = 0.9734464038101037, ROC-AUC = 0.9702782234947456, Elapsed Time = 3.030435299999226 seconds
Trial 92, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 92, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.65806 validation-auc:0.95150 validation-aucpr:0.95838
[1] validation-logloss:0.62595 validation-auc:0.93582 validation-aucpr:0.91588
[2] validation-logloss:0.60060 validation-auc:0.94687 validation-aucpr:0.93455
[3] validation-logloss:0.57280 validation-auc:0.95316 validation-aucpr:0.94356
[4] validation-logloss:0.54954 validation-auc:0.95733 validation-aucpr:0.95671
[5] validation-logloss:0.52884 validation-auc:0.96035 validation-aucpr:0.96441
[6] validation-logloss:0.51051 validation-auc:0.96097 validation-aucpr:0.96329
[7] validation-logloss:0.49281 validation-auc:0.96233 validation-aucpr:0.96561
[8] validation-logloss:0.47540 validation-auc:0.96236 validation-aucpr:0.96586
[9] validation-logloss:0.46066 validation-auc:0.96268 validation-aucpr:0.96590
[10] validation-logloss:0.44741 validation-auc:0.96266 validation-aucpr:0.96567
[11] validation-logloss:0.43436 validation-auc:0.96308 validation-aucpr:0.96683
[12] validation-logloss:0.42042 validation-auc:0.96354 validation-aucpr:0.96733
[13] validation-logloss:0.40775 validation-auc:0.96422 validation-aucpr:0.96749
[14] validation-logloss:0.39649 validation-auc:0.96475 validation-aucpr:0.96771
[15] validation-logloss:0.38641 validation-auc:0.96485 validation-aucpr:0.96742
[16] validation-logloss:0.37466 validation-auc:0.96519 validation-aucpr:0.96728
[17] validation-logloss:0.36418 validation-auc:0.96536 validation-aucpr:0.96733
[18] validation-logloss:0.35620 validation-auc:0.96588 validation-aucpr:0.96717
[19] validation-logloss:0.34672 validation-auc:0.96626 validation-aucpr:0.96751
[20] validation-logloss:0.33830 validation-auc:0.96717 validation-aucpr:0.96868
[21] validation-logloss:0.32976 validation-auc:0.96725 validation-aucpr:0.96868
[22] validation-logloss:0.32188 validation-auc:0.96718 validation-aucpr:0.96839
[23] validation-logloss:0.31452 validation-auc:0.96725 validation-aucpr:0.96869
[24] validation-logloss:0.30731 validation-auc:0.96756 validation-aucpr:0.96889
[25] validation-logloss:0.30047 validation-auc:0.96791 validation-aucpr:0.96840
[26] validation-logloss:0.29561 validation-auc:0.96781 validation-aucpr:0.96838
[27] validation-logloss:0.29094 validation-auc:0.96776 validation-aucpr:0.96817
[28] validation-logloss:0.28507 validation-auc:0.96796 validation-aucpr:0.96832
[29] validation-logloss:0.27989 validation-auc:0.96815 validation-aucpr:0.96843
[30] validation-logloss:0.27597 validation-auc:0.96827 validation-aucpr:0.97076
[31] validation-logloss:0.27178 validation-auc:0.96823 validation-aucpr:0.97071
[32] validation-logloss:0.26830 validation-auc:0.96830 validation-aucpr:0.97067
[33] validation-logloss:0.26442 validation-auc:0.96821 validation-aucpr:0.97056
[34] validation-logloss:0.26110 validation-auc:0.96836 validation-aucpr:0.97060
[35] validation-logloss:0.25783 validation-auc:0.96843 validation-aucpr:0.97045
[36] validation-logloss:0.25490 validation-auc:0.96847 validation-aucpr:0.97042
[37] validation-logloss:0.25188 validation-auc:0.96879 validation-aucpr:0.97079
[38] validation-logloss:0.24820 validation-auc:0.96890 validation-aucpr:0.97076
[39] validation-logloss:0.24605 validation-auc:0.96884 validation-aucpr:0.97169
[40] validation-logloss:0.24308 validation-auc:0.96884 validation-aucpr:0.97169
[41] validation-logloss:0.24045 validation-auc:0.96879 validation-aucpr:0.97167
[42] validation-logloss:0.23764 validation-auc:0.96905 validation-aucpr:0.97326
[43] validation-logloss:0.23497 validation-auc:0.96928 validation-aucpr:0.97366
[44] validation-logloss:0.23320 validation-auc:0.96932 validation-aucpr:0.97375
[45] validation-logloss:0.23145 validation-auc:0.96922 validation-aucpr:0.97365
[46] validation-logloss:0.22915 validation-auc:0.96938 validation-aucpr:0.97378
[47] validation-logloss:0.22713 validation-auc:0.96942 validation-aucpr:0.97377
[48] validation-logloss:0.22538 validation-auc:0.96944 validation-aucpr:0.97369
[49] validation-logloss:0.22387 validation-auc:0.96961 validation-aucpr:0.97389
[50] validation-logloss:0.22213 validation-auc:0.96953 validation-aucpr:0.97385
[51] validation-logloss:0.22070 validation-auc:0.96947 validation-aucpr:0.97382
[52] validation-logloss:0.21948 validation-auc:0.96952 validation-aucpr:0.97380
[53] validation-logloss:0.21815 validation-auc:0.96970 validation-aucpr:0.97388
[54] validation-logloss:0.21731 validation-auc:0.96959 validation-aucpr:0.97378
[55] validation-logloss:0.21673 validation-auc:0.96944 validation-aucpr:0.97364
[56] validation-logloss:0.21554 validation-auc:0.96956 validation-aucpr:0.97383
[57] validation-logloss:0.21422 validation-auc:0.96977 validation-aucpr:0.97264
[58] validation-logloss:0.21324 validation-auc:0.96989 validation-aucpr:0.97269
[59] validation-logloss:0.21249 validation-auc:0.96998 validation-aucpr:0.97407
[60] validation-logloss:0.21153 validation-auc:0.97009 validation-aucpr:0.97416
[61] validation-logloss:0.21049 validation-auc:0.97021 validation-aucpr:0.97425
[62] validation-logloss:0.20957 validation-auc:0.97039 validation-aucpr:0.97440
[63] validation-logloss:0.20870 validation-auc:0.97041 validation-aucpr:0.97437
[64] validation-logloss:0.20815 validation-auc:0.97028 validation-aucpr:0.97428
[65] validation-logloss:0.20753 validation-auc:0.97033 validation-aucpr:0.97426
[66] validation-logloss:0.20691 validation-auc:0.97031 validation-aucpr:0.97412
[67] validation-logloss:0.20600 validation-auc:0.97046 validation-aucpr:0.97422
[68] validation-logloss:0.20559 validation-auc:0.97040 validation-aucpr:0.97420
[69] validation-logloss:0.20519 validation-auc:0.97036 validation-aucpr:0.97417
[70] validation-logloss:0.20435 validation-auc:0.97050 validation-aucpr:0.97405
[71] validation-logloss:0.20372 validation-auc:0.97059 validation-aucpr:0.97413
[72] validation-logloss:0.20364 validation-auc:0.97048 validation-aucpr:0.97400
[73] validation-logloss:0.20346 validation-auc:0.97040 validation-aucpr:0.97402
[74] validation-logloss:0.20317 validation-auc:0.97034 validation-aucpr:0.97394
[75] validation-logloss:0.20315 validation-auc:0.97017 validation-aucpr:0.97376
[76] validation-logloss:0.20270 validation-auc:0.97023 validation-aucpr:0.97377
[77] validation-logloss:0.20225 validation-auc:0.97023 validation-aucpr:0.97375
[78] validation-logloss:0.20193 validation-auc:0.97028 validation-aucpr:0.97361
[79] validation-logloss:0.20169 validation-auc:0.97029 validation-aucpr:0.97360
[80] validation-logloss:0.20089 validation-auc:0.97050 validation-aucpr:0.97374
[81] validation-logloss:0.20081 validation-auc:0.97052 validation-aucpr:0.97367
[82] validation-logloss:0.20059 validation-auc:0.97050 validation-aucpr:0.97361
[83] validation-logloss:0.20015 validation-auc:0.97053 validation-aucpr:0.97361
[84] validation-logloss:0.19965 validation-auc:0.97063 validation-aucpr:0.97412
[85] validation-logloss:0.19945 validation-auc:0.97061 validation-aucpr:0.97409
[86] validation-logloss:0.19947 validation-auc:0.97061 validation-aucpr:0.97429
[87] validation-logloss:0.19900 validation-auc:0.97067 validation-aucpr:0.97433
[88] validation-logloss:0.19925 validation-auc:0.97060 validation-aucpr:0.97421
[89] validation-logloss:0.19932 validation-auc:0.97046 validation-aucpr:0.97407
[90] validation-logloss:0.19930 validation-auc:0.97043 validation-aucpr:0.97407
[91] validation-logloss:0.19913 validation-auc:0.97051 validation-aucpr:0.97412
[92] validation-logloss:0.19883 validation-auc:0.97062 validation-aucpr:0.97417
[93] validation-logloss:0.19853 validation-auc:0.97075 validation-aucpr:0.97425
{'best_iteration': '62', 'best_score': '0.9743978942246921'}
Trial 92, Fold 3: Log loss = 0.19853437848993327, Average precision = 0.9742558475354604, ROC-AUC = 0.9707452795889475, Elapsed Time = 3.047268400001485 seconds
Trial 92, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 92, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.65883 validation-auc:0.94222 validation-aucpr:0.94345
[1] validation-logloss:0.63025 validation-auc:0.92396 validation-aucpr:0.90743
[2] validation-logloss:0.60046 validation-auc:0.94528 validation-aucpr:0.93742
[3] validation-logloss:0.57344 validation-auc:0.95397 validation-aucpr:0.95632
[4] validation-logloss:0.54915 validation-auc:0.95825 validation-aucpr:0.96297
[5] validation-logloss:0.52948 validation-auc:0.95884 validation-aucpr:0.96315
[6] validation-logloss:0.51112 validation-auc:0.95977 validation-aucpr:0.96139
[7] validation-logloss:0.49180 validation-auc:0.96052 validation-aucpr:0.96205
[8] validation-logloss:0.47487 validation-auc:0.96095 validation-aucpr:0.96420
[9] validation-logloss:0.45835 validation-auc:0.96175 validation-aucpr:0.96449
[10] validation-logloss:0.44416 validation-auc:0.96284 validation-aucpr:0.96510
[11] validation-logloss:0.43141 validation-auc:0.96255 validation-aucpr:0.96483
[12] validation-logloss:0.41914 validation-auc:0.96271 validation-aucpr:0.96632
[13] validation-logloss:0.40800 validation-auc:0.96283 validation-aucpr:0.96617
[14] validation-logloss:0.39789 validation-auc:0.96359 validation-aucpr:0.96666
[15] validation-logloss:0.38799 validation-auc:0.96414 validation-aucpr:0.96988
[16] validation-logloss:0.37700 validation-auc:0.96456 validation-aucpr:0.97024
[17] validation-logloss:0.36799 validation-auc:0.96513 validation-aucpr:0.97065
[18] validation-logloss:0.35797 validation-auc:0.96581 validation-aucpr:0.97116
[19] validation-logloss:0.34997 validation-auc:0.96594 validation-aucpr:0.97124
[20] validation-logloss:0.34169 validation-auc:0.96614 validation-aucpr:0.97142
[21] validation-logloss:0.33346 validation-auc:0.96642 validation-aucpr:0.97164
[22] validation-logloss:0.32697 validation-auc:0.96641 validation-aucpr:0.97159
[23] validation-logloss:0.32174 validation-auc:0.96589 validation-aucpr:0.97132
[24] validation-logloss:0.31443 validation-auc:0.96630 validation-aucpr:0.97158
[25] validation-logloss:0.30950 validation-auc:0.96608 validation-aucpr:0.97142
[26] validation-logloss:0.30400 validation-auc:0.96586 validation-aucpr:0.97128
[27] validation-logloss:0.29982 validation-auc:0.96573 validation-aucpr:0.97116
[28] validation-logloss:0.29378 validation-auc:0.96602 validation-aucpr:0.97146
[29] validation-logloss:0.28977 validation-auc:0.96584 validation-aucpr:0.97128
[30] validation-logloss:0.28444 validation-auc:0.96608 validation-aucpr:0.97147
[31] validation-logloss:0.28020 validation-auc:0.96588 validation-aucpr:0.97130
[32] validation-logloss:0.27539 validation-auc:0.96627 validation-aucpr:0.97158
[33] validation-logloss:0.27085 validation-auc:0.96648 validation-aucpr:0.97174
[34] validation-logloss:0.26674 validation-auc:0.96665 validation-aucpr:0.97188
[35] validation-logloss:0.26392 validation-auc:0.96651 validation-aucpr:0.97170
[36] validation-logloss:0.26099 validation-auc:0.96666 validation-aucpr:0.97179
[37] validation-logloss:0.25869 validation-auc:0.96662 validation-aucpr:0.97178
[38] validation-logloss:0.25569 validation-auc:0.96647 validation-aucpr:0.97168
[39] validation-logloss:0.25216 validation-auc:0.96684 validation-aucpr:0.97194
[40] validation-logloss:0.24950 validation-auc:0.96704 validation-aucpr:0.97209
[41] validation-logloss:0.24734 validation-auc:0.96706 validation-aucpr:0.97205
[42] validation-logloss:0.24560 validation-auc:0.96698 validation-aucpr:0.97202
[43] validation-logloss:0.24310 validation-auc:0.96725 validation-aucpr:0.97221
[44] validation-logloss:0.24011 validation-auc:0.96757 validation-aucpr:0.97245
[45] validation-logloss:0.23805 validation-auc:0.96755 validation-aucpr:0.97244
[46] validation-logloss:0.23591 validation-auc:0.96759 validation-aucpr:0.97248
[47] validation-logloss:0.23413 validation-auc:0.96740 validation-aucpr:0.97238
[48] validation-logloss:0.23154 validation-auc:0.96770 validation-aucpr:0.97266
[49] validation-logloss:0.23015 validation-auc:0.96758 validation-aucpr:0.97252
[50] validation-logloss:0.22850 validation-auc:0.96764 validation-aucpr:0.97258
[51] validation-logloss:0.22674 validation-auc:0.96766 validation-aucpr:0.97258
[52] validation-logloss:0.22505 validation-auc:0.96772 validation-aucpr:0.97264
[53] validation-logloss:0.22425 validation-auc:0.96762 validation-aucpr:0.97257
[54] validation-logloss:0.22350 validation-auc:0.96763 validation-aucpr:0.97256
[55] validation-logloss:0.22213 validation-auc:0.96761 validation-aucpr:0.97258
[56] validation-logloss:0.22134 validation-auc:0.96735 validation-aucpr:0.97240
[57] validation-logloss:0.22029 validation-auc:0.96752 validation-aucpr:0.97250
[58] validation-logloss:0.21929 validation-auc:0.96769 validation-aucpr:0.97262
[59] validation-logloss:0.21853 validation-auc:0.96756 validation-aucpr:0.97252
[60] validation-logloss:0.21784 validation-auc:0.96767 validation-aucpr:0.97262
[61] validation-logloss:0.21760 validation-auc:0.96748 validation-aucpr:0.97253
[62] validation-logloss:0.21661 validation-auc:0.96768 validation-aucpr:0.97268
[63] validation-logloss:0.21559 validation-auc:0.96782 validation-aucpr:0.97280
[64] validation-logloss:0.21434 validation-auc:0.96802 validation-aucpr:0.97294
[65] validation-logloss:0.21325 validation-auc:0.96807 validation-aucpr:0.97299
[66] validation-logloss:0.21252 validation-auc:0.96828 validation-aucpr:0.97314
[67] validation-logloss:0.21195 validation-auc:0.96832 validation-aucpr:0.97317
[68] validation-logloss:0.21133 validation-auc:0.96830 validation-aucpr:0.97319
[69] validation-logloss:0.21055 validation-auc:0.96844 validation-aucpr:0.97328
[70] validation-logloss:0.20974 validation-auc:0.96853 validation-aucpr:0.97336
[71] validation-logloss:0.20872 validation-auc:0.96876 validation-aucpr:0.97352
[72] validation-logloss:0.20825 validation-auc:0.96884 validation-aucpr:0.97358
[73] validation-logloss:0.20796 validation-auc:0.96885 validation-aucpr:0.97357
[74] validation-logloss:0.20747 validation-auc:0.96895 validation-aucpr:0.97363
[75] validation-logloss:0.20696 validation-auc:0.96886 validation-aucpr:0.97356
[76] validation-logloss:0.20670 validation-auc:0.96889 validation-aucpr:0.97360
[77] validation-logloss:0.20627 validation-auc:0.96888 validation-aucpr:0.97362
[78] validation-logloss:0.20627 validation-auc:0.96880 validation-aucpr:0.97357
[79] validation-logloss:0.20597 validation-auc:0.96885 validation-aucpr:0.97363
[80] validation-logloss:0.20588 validation-auc:0.96882 validation-aucpr:0.97360
[81] validation-logloss:0.20513 validation-auc:0.96903 validation-aucpr:0.97375
[82] validation-logloss:0.20475 validation-auc:0.96921 validation-aucpr:0.97385
[83] validation-logloss:0.20477 validation-auc:0.96919 validation-aucpr:0.97382
[84] validation-logloss:0.20484 validation-auc:0.96910 validation-aucpr:0.97375
[85] validation-logloss:0.20515 validation-auc:0.96906 validation-aucpr:0.97370
[86] validation-logloss:0.20475 validation-auc:0.96908 validation-aucpr:0.97373
[87] validation-logloss:0.20439 validation-auc:0.96922 validation-aucpr:0.97382
[88] validation-logloss:0.20413 validation-auc:0.96929 validation-aucpr:0.97385
[89] validation-logloss:0.20389 validation-auc:0.96935 validation-aucpr:0.97391
[90] validation-logloss:0.20397 validation-auc:0.96929 validation-aucpr:0.97390
[91] validation-logloss:0.20370 validation-auc:0.96948 validation-aucpr:0.97404
[92] validation-logloss:0.20376 validation-auc:0.96945 validation-aucpr:0.97401
[93] validation-logloss:0.20354 validation-auc:0.96953 validation-aucpr:0.97407
{'best_iteration': '93', 'best_score': '0.9740735088385705'}
Trial 92, Fold 4: Log loss = 0.2035398037214788, Average precision = 0.9740776692600727, ROC-AUC = 0.9695270195354693, Elapsed Time = 2.9379043000008096 seconds
Trial 92, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 92, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.65963 validation-auc:0.93324 validation-aucpr:0.93882
[1] validation-logloss:0.62811 validation-auc:0.92946 validation-aucpr:0.89337
[2] validation-logloss:0.60297 validation-auc:0.93860 validation-aucpr:0.92625
[3] validation-logloss:0.57963 validation-auc:0.94628 validation-aucpr:0.94370
[4] validation-logloss:0.55489 validation-auc:0.95300 validation-aucpr:0.95294
[5] validation-logloss:0.53307 validation-auc:0.95457 validation-aucpr:0.95439
[6] validation-logloss:0.51465 validation-auc:0.95653 validation-aucpr:0.95882
[7] validation-logloss:0.49524 validation-auc:0.95835 validation-aucpr:0.96185
[8] validation-logloss:0.47709 validation-auc:0.95907 validation-aucpr:0.96305
[9] validation-logloss:0.45976 validation-auc:0.96133 validation-aucpr:0.96504
[10] validation-logloss:0.44577 validation-auc:0.96230 validation-aucpr:0.96667
[11] validation-logloss:0.43105 validation-auc:0.96279 validation-aucpr:0.96725
[12] validation-logloss:0.41809 validation-auc:0.96312 validation-aucpr:0.96711
[13] validation-logloss:0.40710 validation-auc:0.96311 validation-aucpr:0.96705
[14] validation-logloss:0.39695 validation-auc:0.96308 validation-aucpr:0.96692
[15] validation-logloss:0.38748 validation-auc:0.96286 validation-aucpr:0.96677
[16] validation-logloss:0.37668 validation-auc:0.96302 validation-aucpr:0.96689
[17] validation-logloss:0.36744 validation-auc:0.96362 validation-aucpr:0.96718
[18] validation-logloss:0.35783 validation-auc:0.96383 validation-aucpr:0.96721
[19] validation-logloss:0.34907 validation-auc:0.96368 validation-aucpr:0.96711
[20] validation-logloss:0.34101 validation-auc:0.96384 validation-aucpr:0.96727
[21] validation-logloss:0.33271 validation-auc:0.96412 validation-aucpr:0.96749
[22] validation-logloss:0.32511 validation-auc:0.96464 validation-aucpr:0.96921
[23] validation-logloss:0.31960 validation-auc:0.96454 validation-aucpr:0.96900
[24] validation-logloss:0.31529 validation-auc:0.96416 validation-aucpr:0.96862
[25] validation-logloss:0.30935 validation-auc:0.96434 validation-aucpr:0.96878
[26] validation-logloss:0.30332 validation-auc:0.96493 validation-aucpr:0.96910
[27] validation-logloss:0.29860 validation-auc:0.96490 validation-aucpr:0.96909
[28] validation-logloss:0.29322 validation-auc:0.96494 validation-aucpr:0.96915
[29] validation-logloss:0.28849 validation-auc:0.96561 validation-aucpr:0.96944
[30] validation-logloss:0.28400 validation-auc:0.96552 validation-aucpr:0.96943
[31] validation-logloss:0.27918 validation-auc:0.96572 validation-aucpr:0.96958
[32] validation-logloss:0.27451 validation-auc:0.96605 validation-aucpr:0.96970
[33] validation-logloss:0.27018 validation-auc:0.96617 validation-aucpr:0.96947
[34] validation-logloss:0.26600 validation-auc:0.96624 validation-aucpr:0.96955
[35] validation-logloss:0.26224 validation-auc:0.96661 validation-aucpr:0.96976
[36] validation-logloss:0.25914 validation-auc:0.96664 validation-aucpr:0.96978
[37] validation-logloss:0.25582 validation-auc:0.96667 validation-aucpr:0.96983
[38] validation-logloss:0.25332 validation-auc:0.96671 validation-aucpr:0.96980
[39] validation-logloss:0.25141 validation-auc:0.96638 validation-aucpr:0.96957
[40] validation-logloss:0.24965 validation-auc:0.96615 validation-aucpr:0.96938
[41] validation-logloss:0.24845 validation-auc:0.96589 validation-aucpr:0.96890
[42] validation-logloss:0.24674 validation-auc:0.96570 validation-aucpr:0.96868
[43] validation-logloss:0.24445 validation-auc:0.96576 validation-aucpr:0.96878
[44] validation-logloss:0.24187 validation-auc:0.96592 validation-aucpr:0.96897
[45] validation-logloss:0.24005 validation-auc:0.96606 validation-aucpr:0.96903
[46] validation-logloss:0.23808 validation-auc:0.96606 validation-aucpr:0.96902
[47] validation-logloss:0.23622 validation-auc:0.96604 validation-aucpr:0.96889
[48] validation-logloss:0.23492 validation-auc:0.96582 validation-aucpr:0.96866
[49] validation-logloss:0.23305 validation-auc:0.96612 validation-aucpr:0.97033
[50] validation-logloss:0.23195 validation-auc:0.96622 validation-aucpr:0.97033
[51] validation-logloss:0.23047 validation-auc:0.96640 validation-aucpr:0.97037
[52] validation-logloss:0.22896 validation-auc:0.96645 validation-aucpr:0.97043
[53] validation-logloss:0.22733 validation-auc:0.96672 validation-aucpr:0.97071
[54] validation-logloss:0.22602 validation-auc:0.96667 validation-aucpr:0.97074
[55] validation-logloss:0.22517 validation-auc:0.96683 validation-aucpr:0.97083
[56] validation-logloss:0.22372 validation-auc:0.96711 validation-aucpr:0.97106
[57] validation-logloss:0.22310 validation-auc:0.96704 validation-aucpr:0.97091
[58] validation-logloss:0.22217 validation-auc:0.96711 validation-aucpr:0.97097
[59] validation-logloss:0.22131 validation-auc:0.96710 validation-aucpr:0.97085
[60] validation-logloss:0.22071 validation-auc:0.96712 validation-aucpr:0.97081
[61] validation-logloss:0.21939 validation-auc:0.96726 validation-aucpr:0.97092
[62] validation-logloss:0.21898 validation-auc:0.96722 validation-aucpr:0.97091
[63] validation-logloss:0.21842 validation-auc:0.96718 validation-aucpr:0.97092
[64] validation-logloss:0.21779 validation-auc:0.96731 validation-aucpr:0.97096
[65] validation-logloss:0.21666 validation-auc:0.96737 validation-aucpr:0.97103
[66] validation-logloss:0.21628 validation-auc:0.96740 validation-aucpr:0.97101
[67] validation-logloss:0.21578 validation-auc:0.96739 validation-aucpr:0.97098
[68] validation-logloss:0.21468 validation-auc:0.96751 validation-aucpr:0.97111
[69] validation-logloss:0.21406 validation-auc:0.96759 validation-aucpr:0.97143
[70] validation-logloss:0.21365 validation-auc:0.96759 validation-aucpr:0.97172
[71] validation-logloss:0.21305 validation-auc:0.96761 validation-aucpr:0.97175
[72] validation-logloss:0.21310 validation-auc:0.96747 validation-aucpr:0.97154
[73] validation-logloss:0.21231 validation-auc:0.96767 validation-aucpr:0.97170
[74] validation-logloss:0.21183 validation-auc:0.96767 validation-aucpr:0.97169
[75] validation-logloss:0.21161 validation-auc:0.96767 validation-aucpr:0.97187
[76] validation-logloss:0.21082 validation-auc:0.96789 validation-aucpr:0.97189
[77] validation-logloss:0.21112 validation-auc:0.96770 validation-aucpr:0.97173
[78] validation-logloss:0.21121 validation-auc:0.96765 validation-aucpr:0.97164
[79] validation-logloss:0.21103 validation-auc:0.96766 validation-aucpr:0.97203
[80] validation-logloss:0.21044 validation-auc:0.96787 validation-aucpr:0.97215
[81] validation-logloss:0.21048 validation-auc:0.96771 validation-aucpr:0.97207
[82] validation-logloss:0.21019 validation-auc:0.96777 validation-aucpr:0.97215
[83] validation-logloss:0.21025 validation-auc:0.96766 validation-aucpr:0.97219
[84] validation-logloss:0.21046 validation-auc:0.96768 validation-aucpr:0.97222
[85] validation-logloss:0.21004 validation-auc:0.96784 validation-aucpr:0.97229
[86] validation-logloss:0.21040 validation-auc:0.96772 validation-aucpr:0.97213
[87] validation-logloss:0.21030 validation-auc:0.96771 validation-aucpr:0.97214
[88] validation-logloss:0.20992 validation-auc:0.96781 validation-aucpr:0.97230
[89] validation-logloss:0.21021 validation-auc:0.96773 validation-aucpr:0.97227
[90] validation-logloss:0.21028 validation-auc:0.96768 validation-aucpr:0.97225
[91] validation-logloss:0.21062 validation-auc:0.96760 validation-aucpr:0.97220
[92] validation-logloss:0.21034 validation-auc:0.96771 validation-aucpr:0.97226
[93] validation-logloss:0.21025 validation-auc:0.96775 validation-aucpr:0.97225
{'best_iteration': '88', 'best_score': '0.9722974589530706'}
Trial 92, Fold 5: Log loss = 0.21024819867975683, Average precision = 0.9722566510920048, ROC-AUC = 0.9677525221559555, Elapsed Time = 3.7220796999972663 seconds
Optimization Progress: 93% (93/100 trials) [elapsed 3:56:59, remaining 10:34, 90.62 s/trial]
Trial 93, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 93, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.67575 validation-auc:0.93460 validation-aucpr:0.93800
[1] validation-logloss:0.65965 validation-auc:0.93703 validation-aucpr:0.93943
[2] validation-logloss:0.64457 validation-auc:0.94314 validation-aucpr:0.94762
[3] validation-logloss:0.63090 validation-auc:0.94449 validation-aucpr:0.94784
[4] validation-logloss:0.61637 validation-auc:0.94767 validation-aucpr:0.95120
[5] validation-logloss:0.59997 validation-auc:0.95603 validation-aucpr:0.96103
[6] validation-logloss:0.58508 validation-auc:0.95694 validation-aucpr:0.96212
[7] validation-logloss:0.57333 validation-auc:0.95698 validation-aucpr:0.96369
[8] validation-logloss:0.55955 validation-auc:0.95805 validation-aucpr:0.96472
[9] validation-logloss:0.54840 validation-auc:0.95826 validation-aucpr:0.96488
[10] validation-logloss:0.53508 validation-auc:0.95968 validation-aucpr:0.96614
[11] validation-logloss:0.52519 validation-auc:0.95954 validation-aucpr:0.96584
[12] validation-logloss:0.51566 validation-auc:0.95980 validation-aucpr:0.96604
[13] validation-logloss:0.50712 validation-auc:0.95945 validation-aucpr:0.96557
[14] validation-logloss:0.49760 validation-auc:0.95993 validation-aucpr:0.96593
[15] validation-logloss:0.48704 validation-auc:0.96048 validation-aucpr:0.96685
[16] validation-logloss:0.47644 validation-auc:0.96104 validation-aucpr:0.96734
[17] validation-logloss:0.46889 validation-auc:0.96102 validation-aucpr:0.96723
[18] validation-logloss:0.46130 validation-auc:0.96094 validation-aucpr:0.96708
[19] validation-logloss:0.45403 validation-auc:0.96093 validation-aucpr:0.96697
[20] validation-logloss:0.44503 validation-auc:0.96119 validation-aucpr:0.96724
[21] validation-logloss:0.43896 validation-auc:0.96120 validation-aucpr:0.96718
[22] validation-logloss:0.43254 validation-auc:0.96112 validation-aucpr:0.96720
[23] validation-logloss:0.42405 validation-auc:0.96143 validation-aucpr:0.96745
[24] validation-logloss:0.41801 validation-auc:0.96163 validation-aucpr:0.96762
[25] validation-logloss:0.41218 validation-auc:0.96157 validation-aucpr:0.96755
[26] validation-logloss:0.40492 validation-auc:0.96186 validation-aucpr:0.96783
[27] validation-logloss:0.39955 validation-auc:0.96184 validation-aucpr:0.96777
[28] validation-logloss:0.39321 validation-auc:0.96198 validation-aucpr:0.96798
[29] validation-logloss:0.38867 validation-auc:0.96207 validation-aucpr:0.96802
[30] validation-logloss:0.38450 validation-auc:0.96191 validation-aucpr:0.96783
[31] validation-logloss:0.37859 validation-auc:0.96231 validation-aucpr:0.96822
[32] validation-logloss:0.37464 validation-auc:0.96225 validation-aucpr:0.96813
[33] validation-logloss:0.37083 validation-auc:0.96224 validation-aucpr:0.96813
[34] validation-logloss:0.36682 validation-auc:0.96219 validation-aucpr:0.96804
[35] validation-logloss:0.36285 validation-auc:0.96213 validation-aucpr:0.96800
[36] validation-logloss:0.35731 validation-auc:0.96238 validation-aucpr:0.96828
[37] validation-logloss:0.35363 validation-auc:0.96240 validation-aucpr:0.96827
[38] validation-logloss:0.34861 validation-auc:0.96251 validation-aucpr:0.96838
[39] validation-logloss:0.34446 validation-auc:0.96265 validation-aucpr:0.96852
[40] validation-logloss:0.34107 validation-auc:0.96277 validation-aucpr:0.96867
[41] validation-logloss:0.33793 validation-auc:0.96279 validation-aucpr:0.96865
[42] validation-logloss:0.33492 validation-auc:0.96290 validation-aucpr:0.96874
[43] validation-logloss:0.33197 validation-auc:0.96285 validation-aucpr:0.96864
[44] validation-logloss:0.32773 validation-auc:0.96311 validation-aucpr:0.96892
[45] validation-logloss:0.32496 validation-auc:0.96307 validation-aucpr:0.96881
[46] validation-logloss:0.32238 validation-auc:0.96310 validation-aucpr:0.96880
[47] validation-logloss:0.31998 validation-auc:0.96308 validation-aucpr:0.96877
[48] validation-logloss:0.31758 validation-auc:0.96318 validation-aucpr:0.96888
... ([49]–[73] per-round eval log condensed: validation-logloss 0.31513 → 0.26105, validation-auc 0.96329 → 0.96526) ...
[74] validation-logloss:0.25909 validation-auc:0.96536 validation-aucpr:0.97081
{'best_iteration': '74', 'best_score': '0.9708118262327414'}
Trial 93, Fold 1: Log loss = 0.2590858465904124, Average precision = 0.9707767569767338, ROC-AUC = 0.9653620927192518, Elapsed Time = 7.22263490000114 seconds
Trial 93, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 93, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.67552 validation-auc:0.93147 validation-aucpr:0.93164
... ([1]–[73] per-round eval log condensed: validation-logloss 0.65929 → 0.26364, validation-auc 0.93648 → 0.96515) ...
[74] validation-logloss:0.26242 validation-auc:0.96521 validation-aucpr:0.96882
{'best_iteration': '74', 'best_score': '0.9688176110599449'}
Trial 93, Fold 2: Log loss = 0.26241678949069214, Average precision = 0.9687837016793314, ROC-AUC = 0.9652092404689492, Elapsed Time = 7.36766390000048 seconds
Trial 93, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 93, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.67562 validation-auc:0.93375 validation-aucpr:0.93331
... ([1]–[73] per-round eval log condensed: validation-logloss 0.65914 → 0.25820, validation-auc 0.94029 → 0.96693) ...
[74] validation-logloss:0.25698 validation-auc:0.96698 validation-aucpr:0.97171
{'best_iteration': '74', 'best_score': '0.971711828125484'}
Trial 93, Fold 3: Log loss = 0.256981879734254, Average precision = 0.971697582692562, ROC-AUC = 0.9669770618842157, Elapsed Time = 7.480115800000931 seconds
Trial 93, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 93, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.67552 validation-auc:0.93217 validation-aucpr:0.93573
... ([1]–[73] per-round eval log condensed: validation-logloss 0.65921 → 0.26116, validation-auc 0.93524 → 0.96463) ...
[74] validation-logloss:0.25995 validation-auc:0.96471 validation-aucpr:0.97053
{'best_iteration': '74', 'best_score': '0.970527887052778'}
Trial 93, Fold 4: Log loss = 0.2599548907235923, Average precision = 0.9704986765151592, ROC-AUC = 0.9647127471738784, Elapsed Time = 7.700011400000221 seconds
Trial 93, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 93, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.67591 validation-auc:0.92772 validation-aucpr:0.92825
... ([1]–[73] per-round eval log condensed: validation-logloss 0.65975 → 0.26413, validation-auc 0.93253 → 0.96361) ...
[74] validation-logloss:0.26266 validation-auc:0.96378 validation-aucpr:0.96890
{'best_iteration': '74', 'best_score': '0.9689014911357019'}
Trial 93, Fold 5: Log loss = 0.2626649744827682, Average precision = 0.9689000073615628, ROC-AUC = 0.9637839585307395, Elapsed Time = 8.099983399999473 seconds
Optimization Progress: 94%|#########3| 94/100 [3:57:46<07:43, 77.27s/it]
Trial 94, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 94, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.68325 validation-auc:0.94038 validation-aucpr:0.94328
... ([1]–[54] per-round eval log condensed: validation-logloss 0.67350 → 0.36994, validation-auc 0.94740 → 0.96490) ...
[55] validation-logloss:0.36737 validation-auc:0.96490 validation-aucpr:0.97026
{'best_iteration': '49', 'best_score': '0.9703483133017742'}
Trial 94, Fold 1: Log loss = 0.3673695350323967, Average precision = 0.9698315584733633, ROC-AUC = 0.9649038496006654, Elapsed Time = 1.0445719999988796 seconds
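In this fold the log ends at round 55 while `best_iteration` is 49: early stopping halted training once the monitored metric stopped improving. The bookkeeping behind lines like `{'best_iteration': '49', 'best_score': '0.970…'}` can be sketched as below; this is a generic, illustrative reimplementation (XGBoost performs it internally via `early_stopping_rounds`), and the `history` values are made up.

```python
# Sketch: early-stopping bookkeeping for a higher-is-better metric (e.g. aucpr).
def early_stop(scores, patience):
    """Return (best_iteration, best_score), stopping once `patience`
    consecutive rounds pass without improvement."""
    best_score = float("-inf")
    best_iteration = -1
    for i, s in enumerate(scores):
        if s > best_score:
            best_score, best_iteration = s, i
        elif i - best_iteration >= patience:
            break  # no improvement for `patience` rounds: stop training
    return best_iteration, best_score

# Toy eval history: improves until round 4, then plateaus.
history = [0.90, 0.93, 0.95, 0.96, 0.97, 0.969, 0.968, 0.9685, 0.967]
print(early_stop(history, patience=3))  # → (4, 0.97)
```

With this scheme the final logged round can exceed `best_iteration` by up to the patience window, which matches the gap between round 55 and best iteration 49 above.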
Trial 94, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 94, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.68219 validation-auc:0.95006 validation-aucpr:0.94543
... ([1]–[35] per-round eval log condensed: validation-logloss 0.67232 → 0.44096, validation-auc 0.95474 → 0.96540) ...
[36] validation-logloss:0.43630 validation-auc:0.96553 validation-aucpr:0.96934
[37] validation-logloss:0.43243 validation-auc:0.96548 validation-aucpr:0.96928
[38] validation-logloss:0.42879 validation-auc:0.96544 validation-aucpr:0.96911
[39] validation-logloss:0.42499 validation-auc:0.96546 validation-aucpr:0.96909
[40] validation-logloss:0.42161 validation-auc:0.96537 validation-aucpr:0.96902
[41] validation-logloss:0.41722 validation-auc:0.96544 validation-aucpr:0.96911
[42] validation-logloss:0.41289 validation-auc:0.96545 validation-aucpr:0.96913
[43] validation-logloss:0.40868 validation-auc:0.96545 validation-aucpr:0.96913
[44] validation-logloss:0.40455 validation-auc:0.96548 validation-aucpr:0.96918
[45] validation-logloss:0.40130 validation-auc:0.96549 validation-aucpr:0.96917
[46] validation-logloss:0.39802 validation-auc:0.96557 validation-aucpr:0.96922
[47] validation-logloss:0.39421 validation-auc:0.96567 validation-aucpr:0.96932
[48] validation-logloss:0.39034 validation-auc:0.96574 validation-aucpr:0.96940
[49] validation-logloss:0.38654 validation-auc:0.96582 validation-aucpr:0.96945
[50] validation-logloss:0.38368 validation-auc:0.96587 validation-aucpr:0.96950
[51] validation-logloss:0.38074 validation-auc:0.96580 validation-aucpr:0.96939
[52] validation-logloss:0.37734 validation-auc:0.96590 validation-aucpr:0.96949
[53] validation-logloss:0.37451 validation-auc:0.96584 validation-aucpr:0.96941
[54] validation-logloss:0.37098 validation-auc:0.96595 validation-aucpr:0.96951
[55] validation-logloss:0.36825 validation-auc:0.96597 validation-aucpr:0.96953
{'best_iteration': '55', 'best_score': '0.9695323887159993'}
Trial 94, Fold 2: Log loss = 0.3682469709595792, Average precision = 0.9692905878067982, ROC-AUC = 0.9659652479361542, Elapsed Time = 1.2049114000001282 seconds
Trial 94, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 94, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.68261 validation-auc:0.94370 validation-aucpr:0.93423
[1] validation-logloss:0.67270 validation-auc:0.95496 validation-aucpr:0.95408
[2] validation-logloss:0.66175 validation-auc:0.96100 validation-aucpr:0.96347
[3] validation-logloss:0.65249 validation-auc:0.96148 validation-aucpr:0.96356
[4] validation-logloss:0.64201 validation-auc:0.96287 validation-aucpr:0.96673
[5] validation-logloss:0.63347 validation-auc:0.96293 validation-aucpr:0.96672
[6] validation-logloss:0.62390 validation-auc:0.96422 validation-aucpr:0.96796
[7] validation-logloss:0.61428 validation-auc:0.96461 validation-aucpr:0.96834
[8] validation-logloss:0.60650 validation-auc:0.96487 validation-aucpr:0.96877
[9] validation-logloss:0.59873 validation-auc:0.96458 validation-aucpr:0.96841
[10] validation-logloss:0.59016 validation-auc:0.96495 validation-aucpr:0.96889
[11] validation-logloss:0.58160 validation-auc:0.96495 validation-aucpr:0.96891
[12] validation-logloss:0.57443 validation-auc:0.96475 validation-aucpr:0.96869
[13] validation-logloss:0.56776 validation-auc:0.96458 validation-aucpr:0.96852
[14] validation-logloss:0.55981 validation-auc:0.96463 validation-aucpr:0.96858
[15] validation-logloss:0.55199 validation-auc:0.96487 validation-aucpr:0.96912
[16] validation-logloss:0.54431 validation-auc:0.96513 validation-aucpr:0.96939
[17] validation-logloss:0.53742 validation-auc:0.96520 validation-aucpr:0.96980
[18] validation-logloss:0.53170 validation-auc:0.96500 validation-aucpr:0.96963
[19] validation-logloss:0.52455 validation-auc:0.96520 validation-aucpr:0.96978
[20] validation-logloss:0.51806 validation-auc:0.96543 validation-aucpr:0.96996
[21] validation-logloss:0.51271 validation-auc:0.96512 validation-aucpr:0.96967
[22] validation-logloss:0.50720 validation-auc:0.96501 validation-aucpr:0.96955
[23] validation-logloss:0.50168 validation-auc:0.96490 validation-aucpr:0.96945
[24] validation-logloss:0.49634 validation-auc:0.96496 validation-aucpr:0.96946
[25] validation-logloss:0.49109 validation-auc:0.96510 validation-aucpr:0.96962
[26] validation-logloss:0.48608 validation-auc:0.96517 validation-aucpr:0.96971
[27] validation-logloss:0.48041 validation-auc:0.96527 validation-aucpr:0.97023
[28] validation-logloss:0.47460 validation-auc:0.96532 validation-aucpr:0.97025
[29] validation-logloss:0.46980 validation-auc:0.96544 validation-aucpr:0.97031
[30] validation-logloss:0.46521 validation-auc:0.96537 validation-aucpr:0.97020
[31] validation-logloss:0.46058 validation-auc:0.96532 validation-aucpr:0.97016
[32] validation-logloss:0.45538 validation-auc:0.96534 validation-aucpr:0.97018
[33] validation-logloss:0.45009 validation-auc:0.96542 validation-aucpr:0.97026
[34] validation-logloss:0.44511 validation-auc:0.96555 validation-aucpr:0.97041
[35] validation-logloss:0.44020 validation-auc:0.96563 validation-aucpr:0.97051
[36] validation-logloss:0.43544 validation-auc:0.96567 validation-aucpr:0.97057
[37] validation-logloss:0.43154 validation-auc:0.96573 validation-aucpr:0.97062
[38] validation-logloss:0.42789 validation-auc:0.96571 validation-aucpr:0.97064
[39] validation-logloss:0.42405 validation-auc:0.96564 validation-aucpr:0.97056
[40] validation-logloss:0.42061 validation-auc:0.96550 validation-aucpr:0.97046
[41] validation-logloss:0.41616 validation-auc:0.96565 validation-aucpr:0.97061
[42] validation-logloss:0.41180 validation-auc:0.96577 validation-aucpr:0.97073
[43] validation-logloss:0.40752 validation-auc:0.96587 validation-aucpr:0.97080
[44] validation-logloss:0.40319 validation-auc:0.96601 validation-aucpr:0.97091
[45] validation-logloss:0.39978 validation-auc:0.96613 validation-aucpr:0.97098
[46] validation-logloss:0.39575 validation-auc:0.96620 validation-aucpr:0.97105
[47] validation-logloss:0.39274 validation-auc:0.96621 validation-aucpr:0.97103
[48] validation-logloss:0.38890 validation-auc:0.96639 validation-aucpr:0.97120
[49] validation-logloss:0.38515 validation-auc:0.96639 validation-aucpr:0.97122
[50] validation-logloss:0.38228 validation-auc:0.96640 validation-aucpr:0.97121
[51] validation-logloss:0.37940 validation-auc:0.96644 validation-aucpr:0.97122
[52] validation-logloss:0.37584 validation-auc:0.96651 validation-aucpr:0.97130
[53] validation-logloss:0.37301 validation-auc:0.96660 validation-aucpr:0.97135
[54] validation-logloss:0.37023 validation-auc:0.96665 validation-aucpr:0.97139
[55] validation-logloss:0.36678 validation-auc:0.96674 validation-aucpr:0.97149
{'best_iteration': '55', 'best_score': '0.9714851566138424'}
Trial 94, Fold 3: Log loss = 0.36677932299082194, Average precision = 0.971296260368586, ROC-AUC = 0.9667447131636988, Elapsed Time = 1.1741581999995105 seconds
Trial 94, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 94, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.68171 validation-auc:0.95216 validation-aucpr:0.94627
[1] validation-logloss:0.67187 validation-auc:0.95852 validation-aucpr:0.96196
[2] validation-logloss:0.66088 validation-auc:0.96030 validation-aucpr:0.96401
[3] validation-logloss:0.65158 validation-auc:0.96063 validation-aucpr:0.96492
[4] validation-logloss:0.64159 validation-auc:0.96117 validation-aucpr:0.96443
[5] validation-logloss:0.63308 validation-auc:0.96155 validation-aucpr:0.96729
[6] validation-logloss:0.62359 validation-auc:0.96292 validation-aucpr:0.96869
[7] validation-logloss:0.61399 validation-auc:0.96356 validation-aucpr:0.96931
[8] validation-logloss:0.60637 validation-auc:0.96324 validation-aucpr:0.96878
[9] validation-logloss:0.59852 validation-auc:0.96321 validation-aucpr:0.96876
[10] validation-logloss:0.59094 validation-auc:0.96308 validation-aucpr:0.96858
[11] validation-logloss:0.58245 validation-auc:0.96351 validation-aucpr:0.96903
[12] validation-logloss:0.57394 validation-auc:0.96372 validation-aucpr:0.96923
[13] validation-logloss:0.56578 validation-auc:0.96376 validation-aucpr:0.96941
[14] validation-logloss:0.55927 validation-auc:0.96367 validation-aucpr:0.96936
[15] validation-logloss:0.55267 validation-auc:0.96361 validation-aucpr:0.96924
[16] validation-logloss:0.54623 validation-auc:0.96368 validation-aucpr:0.96934
[17] validation-logloss:0.53994 validation-auc:0.96367 validation-aucpr:0.96923
[18] validation-logloss:0.53435 validation-auc:0.96329 validation-aucpr:0.96892
[19] validation-logloss:0.52839 validation-auc:0.96327 validation-aucpr:0.96881
[20] validation-logloss:0.52255 validation-auc:0.96318 validation-aucpr:0.96867
[21] validation-logloss:0.51693 validation-auc:0.96342 validation-aucpr:0.96883
[22] validation-logloss:0.51024 validation-auc:0.96354 validation-aucpr:0.96899
[23] validation-logloss:0.50503 validation-auc:0.96349 validation-aucpr:0.96897
[24] validation-logloss:0.49910 validation-auc:0.96356 validation-aucpr:0.96921
[25] validation-logloss:0.49387 validation-auc:0.96358 validation-aucpr:0.96915
[26] validation-logloss:0.48759 validation-auc:0.96391 validation-aucpr:0.96947
[27] validation-logloss:0.48150 validation-auc:0.96407 validation-aucpr:0.96963
[28] validation-logloss:0.47705 validation-auc:0.96400 validation-aucpr:0.96956
[29] validation-logloss:0.47235 validation-auc:0.96399 validation-aucpr:0.96950
[30] validation-logloss:0.46815 validation-auc:0.96370 validation-aucpr:0.96927
[31] validation-logloss:0.46275 validation-auc:0.96391 validation-aucpr:0.96951
[32] validation-logloss:0.45831 validation-auc:0.96394 validation-aucpr:0.96951
[33] validation-logloss:0.45295 validation-auc:0.96408 validation-aucpr:0.96966
[34] validation-logloss:0.44881 validation-auc:0.96413 validation-aucpr:0.96967
[35] validation-logloss:0.44473 validation-auc:0.96417 validation-aucpr:0.96967
[36] validation-logloss:0.44003 validation-auc:0.96431 validation-aucpr:0.96990
[37] validation-logloss:0.43514 validation-auc:0.96443 validation-aucpr:0.97005
[38] validation-logloss:0.43044 validation-auc:0.96455 validation-aucpr:0.97017
[39] validation-logloss:0.42587 validation-auc:0.96470 validation-aucpr:0.97028
[40] validation-logloss:0.42159 validation-auc:0.96477 validation-aucpr:0.97036
[41] validation-logloss:0.41721 validation-auc:0.96486 validation-aucpr:0.97043
[42] validation-logloss:0.41302 validation-auc:0.96480 validation-aucpr:0.97042
[43] validation-logloss:0.40869 validation-auc:0.96483 validation-aucpr:0.97047
[44] validation-logloss:0.40560 validation-auc:0.96486 validation-aucpr:0.97049
[45] validation-logloss:0.40161 validation-auc:0.96501 validation-aucpr:0.97065
[46] validation-logloss:0.39794 validation-auc:0.96490 validation-aucpr:0.97062
[47] validation-logloss:0.39487 validation-auc:0.96482 validation-aucpr:0.97053
[48] validation-logloss:0.39181 validation-auc:0.96475 validation-aucpr:0.97044
[49] validation-logloss:0.38896 validation-auc:0.96471 validation-aucpr:0.97041
[50] validation-logloss:0.38539 validation-auc:0.96470 validation-aucpr:0.97042
[51] validation-logloss:0.38167 validation-auc:0.96480 validation-aucpr:0.97054
[52] validation-logloss:0.37867 validation-auc:0.96486 validation-aucpr:0.97055
[53] validation-logloss:0.37512 validation-auc:0.96503 validation-aucpr:0.97072
[54] validation-logloss:0.37190 validation-auc:0.96505 validation-aucpr:0.97079
[55] validation-logloss:0.36850 validation-auc:0.96516 validation-aucpr:0.97089
{'best_iteration': '55', 'best_score': '0.9708877542981628'}
Trial 94, Fold 4: Log loss = 0.3685043068161344, Average precision = 0.9707699630531544, ROC-AUC = 0.9651570670329119, Elapsed Time = 1.1587060000019846 seconds
Trial 94, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 94, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.68254 validation-auc:0.94775 validation-aucpr:0.94739
[1] validation-logloss:0.67277 validation-auc:0.95156 validation-aucpr:0.95367
[2] validation-logloss:0.66178 validation-auc:0.95803 validation-aucpr:0.96181
[3] validation-logloss:0.65278 validation-auc:0.95746 validation-aucpr:0.96125
[4] validation-logloss:0.64278 validation-auc:0.95835 validation-aucpr:0.96195
[5] validation-logloss:0.63412 validation-auc:0.95922 validation-aucpr:0.96406
[6] validation-logloss:0.62476 validation-auc:0.96033 validation-aucpr:0.96525
[7] validation-logloss:0.61521 validation-auc:0.96062 validation-aucpr:0.96575
[8] validation-logloss:0.60764 validation-auc:0.96039 validation-aucpr:0.96528
[9] validation-logloss:0.59995 validation-auc:0.96039 validation-aucpr:0.96506
[10] validation-logloss:0.59142 validation-auc:0.96095 validation-aucpr:0.96557
[11] validation-logloss:0.58288 validation-auc:0.96130 validation-aucpr:0.96584
[12] validation-logloss:0.57568 validation-auc:0.96161 validation-aucpr:0.96605
[13] validation-logloss:0.56751 validation-auc:0.96177 validation-aucpr:0.96629
[14] validation-logloss:0.56083 validation-auc:0.96189 validation-aucpr:0.96661
[15] validation-logloss:0.55341 validation-auc:0.96192 validation-aucpr:0.96697
[16] validation-logloss:0.54587 validation-auc:0.96192 validation-aucpr:0.96700
[17] validation-logloss:0.53896 validation-auc:0.96196 validation-aucpr:0.96705
[18] validation-logloss:0.53303 validation-auc:0.96192 validation-aucpr:0.96701
[19] validation-logloss:0.52716 validation-auc:0.96195 validation-aucpr:0.96700
[20] validation-logloss:0.52159 validation-auc:0.96190 validation-aucpr:0.96693
[21] validation-logloss:0.51483 validation-auc:0.96202 validation-aucpr:0.96704
[22] validation-logloss:0.50949 validation-auc:0.96192 validation-aucpr:0.96700
[23] validation-logloss:0.50299 validation-auc:0.96202 validation-aucpr:0.96713
[24] validation-logloss:0.49681 validation-auc:0.96206 validation-aucpr:0.96720
[25] validation-logloss:0.49223 validation-auc:0.96178 validation-aucpr:0.96695
[26] validation-logloss:0.48627 validation-auc:0.96191 validation-aucpr:0.96710
[27] validation-logloss:0.48164 validation-auc:0.96189 validation-aucpr:0.96707
[28] validation-logloss:0.47588 validation-auc:0.96191 validation-aucpr:0.96708
[29] validation-logloss:0.47153 validation-auc:0.96177 validation-aucpr:0.96699
[30] validation-logloss:0.46627 validation-auc:0.96177 validation-aucpr:0.96697
[31] validation-logloss:0.46102 validation-auc:0.96194 validation-aucpr:0.96712
[32] validation-logloss:0.45563 validation-auc:0.96207 validation-aucpr:0.96725
[33] validation-logloss:0.45147 validation-auc:0.96203 validation-aucpr:0.96717
[34] validation-logloss:0.44760 validation-auc:0.96206 validation-aucpr:0.96717
[35] validation-logloss:0.44346 validation-auc:0.96207 validation-aucpr:0.96716
[36] validation-logloss:0.43976 validation-auc:0.96206 validation-aucpr:0.96718
[37] validation-logloss:0.43591 validation-auc:0.96210 validation-aucpr:0.96721
[38] validation-logloss:0.43203 validation-auc:0.96223 validation-aucpr:0.96727
[39] validation-logloss:0.42753 validation-auc:0.96232 validation-aucpr:0.96735
[40] validation-logloss:0.42402 validation-auc:0.96234 validation-aucpr:0.96735
[41] validation-logloss:0.42048 validation-auc:0.96227 validation-aucpr:0.96730
[42] validation-logloss:0.41703 validation-auc:0.96234 validation-aucpr:0.96733
[43] validation-logloss:0.41286 validation-auc:0.96244 validation-aucpr:0.96741
[44] validation-logloss:0.40972 validation-auc:0.96234 validation-aucpr:0.96732
[45] validation-logloss:0.40572 validation-auc:0.96250 validation-aucpr:0.96752
[46] validation-logloss:0.40232 validation-auc:0.96249 validation-aucpr:0.96753
[47] validation-logloss:0.39931 validation-auc:0.96245 validation-aucpr:0.96749
[48] validation-logloss:0.39548 validation-auc:0.96249 validation-aucpr:0.96755
[49] validation-logloss:0.39177 validation-auc:0.96256 validation-aucpr:0.96762
[50] validation-logloss:0.38813 validation-auc:0.96270 validation-aucpr:0.96776
[51] validation-logloss:0.38447 validation-auc:0.96287 validation-aucpr:0.96790
[52] validation-logloss:0.38095 validation-auc:0.96296 validation-aucpr:0.96798
[53] validation-logloss:0.37832 validation-auc:0.96300 validation-aucpr:0.96798
[54] validation-logloss:0.37496 validation-auc:0.96308 validation-aucpr:0.96807
[55] validation-logloss:0.37261 validation-auc:0.96304 validation-aucpr:0.96803
{'best_iteration': '54', 'best_score': '0.968069327592028'}
Trial 94, Fold 5: Log loss = 0.3726133849620998, Average precision = 0.967905009267924, ROC-AUC = 0.9630352823142523, Elapsed Time = 1.158533500001795 seconds
Optimization Progress: 95%|#########5| 95/100 [3:57:59<04:51, 58.20s/it]
Trial 95, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 95, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.66971 validation-auc:0.93383 validation-aucpr:0.90166
[1] validation-logloss:0.64720 validation-auc:0.95771 validation-aucpr:0.95242
[2] validation-logloss:0.62615 validation-auc:0.96304 validation-aucpr:0.96622
[3] validation-logloss:0.60645 validation-auc:0.96451 validation-aucpr:0.96875
[4] validation-logloss:0.58754 validation-auc:0.96530 validation-aucpr:0.96909
[5] validation-logloss:0.56988 validation-auc:0.96587 validation-aucpr:0.96997
[6] validation-logloss:0.55326 validation-auc:0.96596 validation-aucpr:0.97020
[7] validation-logloss:0.53816 validation-auc:0.96587 validation-aucpr:0.96927
[8] validation-logloss:0.52323 validation-auc:0.96686 validation-aucpr:0.97025
[9] validation-logloss:0.50994 validation-auc:0.96737 validation-aucpr:0.97291
[10] validation-logloss:0.49618 validation-auc:0.96782 validation-aucpr:0.97323
[11] validation-logloss:0.48370 validation-auc:0.96760 validation-aucpr:0.97308
[12] validation-logloss:0.47155 validation-auc:0.96753 validation-aucpr:0.97303
[13] validation-logloss:0.46132 validation-auc:0.96767 validation-aucpr:0.97322
[14] validation-logloss:0.45019 validation-auc:0.96793 validation-aucpr:0.97336
[15] validation-logloss:0.43988 validation-auc:0.96801 validation-aucpr:0.97345
[16] validation-logloss:0.42974 validation-auc:0.96806 validation-aucpr:0.97355
[17] validation-logloss:0.42043 validation-auc:0.96814 validation-aucpr:0.97356
[18] validation-logloss:0.41199 validation-auc:0.96843 validation-aucpr:0.97388
{'best_iteration': '18', 'best_score': '0.9738836899662131'}
Trial 95, Fold 1: Log loss = 0.41199033206300667, Average precision = 0.9738432506185263, ROC-AUC = 0.968429170072124, Elapsed Time = 1.558035199999722 seconds
Trial 95, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 95, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.67021 validation-auc:0.93318 validation-aucpr:0.90261
[1] validation-logloss:0.64763 validation-auc:0.95870 validation-aucpr:0.95098
[2] validation-logloss:0.62867 validation-auc:0.96013 validation-aucpr:0.96485
[3] validation-logloss:0.60860 validation-auc:0.96347 validation-aucpr:0.96772
[4] validation-logloss:0.59048 validation-auc:0.96454 validation-aucpr:0.96876
[5] validation-logloss:0.57283 validation-auc:0.96522 validation-aucpr:0.96957
[6] validation-logloss:0.55635 validation-auc:0.96588 validation-aucpr:0.96786
[7] validation-logloss:0.54047 validation-auc:0.96648 validation-aucpr:0.96858
[8] validation-logloss:0.52704 validation-auc:0.96655 validation-aucpr:0.96837
[9] validation-logloss:0.51461 validation-auc:0.96694 validation-aucpr:0.96877
[10] validation-logloss:0.50246 validation-auc:0.96691 validation-aucpr:0.97035
[11] validation-logloss:0.49007 validation-auc:0.96708 validation-aucpr:0.97046
[12] validation-logloss:0.47778 validation-auc:0.96763 validation-aucpr:0.97091
[13] validation-logloss:0.46588 validation-auc:0.96786 validation-aucpr:0.96931
[14] validation-logloss:0.45537 validation-auc:0.96822 validation-aucpr:0.97142
[15] validation-logloss:0.44450 validation-auc:0.96842 validation-aucpr:0.97170
[16] validation-logloss:0.43455 validation-auc:0.96907 validation-aucpr:0.97230
[17] validation-logloss:0.42465 validation-auc:0.96941 validation-aucpr:0.97255
[18] validation-logloss:0.41603 validation-auc:0.96964 validation-aucpr:0.97270
{'best_iteration': '18', 'best_score': '0.9727045585634997'}
Trial 95, Fold 2: Log loss = 0.4160298962320222, Average precision = 0.9725908240025432, ROC-AUC = 0.9696389859484585, Elapsed Time = 1.7594687999990128 seconds
Trial 95, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 95, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.67059 validation-auc:0.92465 validation-aucpr:0.87755
[1] validation-logloss:0.64834 validation-auc:0.95652 validation-aucpr:0.94821
[2] validation-logloss:0.62705 validation-auc:0.96279 validation-aucpr:0.96143
[3] validation-logloss:0.60711 validation-auc:0.96491 validation-aucpr:0.96733
[4] validation-logloss:0.58878 validation-auc:0.96567 validation-aucpr:0.97121
[5] validation-logloss:0.57091 validation-auc:0.96607 validation-aucpr:0.97160
[6] validation-logloss:0.55419 validation-auc:0.96724 validation-aucpr:0.97252
[7] validation-logloss:0.53819 validation-auc:0.96767 validation-aucpr:0.97286
[8] validation-logloss:0.52335 validation-auc:0.96806 validation-aucpr:0.97308
[9] validation-logloss:0.50879 validation-auc:0.96848 validation-aucpr:0.97332
[10] validation-logloss:0.49531 validation-auc:0.96870 validation-aucpr:0.97356
[11] validation-logloss:0.48245 validation-auc:0.96891 validation-aucpr:0.97376
[12] validation-logloss:0.47038 validation-auc:0.96865 validation-aucpr:0.97356
[13] validation-logloss:0.45897 validation-auc:0.96903 validation-aucpr:0.97387
[14] validation-logloss:0.44787 validation-auc:0.96908 validation-aucpr:0.97391
[15] validation-logloss:0.43733 validation-auc:0.96935 validation-aucpr:0.97406
[16] validation-logloss:0.42764 validation-auc:0.96970 validation-aucpr:0.97423
[17] validation-logloss:0.41846 validation-auc:0.96952 validation-aucpr:0.97413
[18] validation-logloss:0.40935 validation-auc:0.96984 validation-aucpr:0.97438
{'best_iteration': '18', 'best_score': '0.974380799626537'}
Trial 95, Fold 3: Log loss = 0.40934568100738855, Average precision = 0.9743398942778332, ROC-AUC = 0.9698392098698848, Elapsed Time = 1.8469833000017388 seconds
Trial 95, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 95, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.66999 validation-auc:0.93093 validation-aucpr:0.91191
[1] validation-logloss:0.64894 validation-auc:0.95747 validation-aucpr:0.95670
[2] validation-logloss:0.62772 validation-auc:0.96340 validation-aucpr:0.96424
[3] validation-logloss:0.60792 validation-auc:0.96536 validation-aucpr:0.97088
[4] validation-logloss:0.58997 validation-auc:0.96549 validation-aucpr:0.97109
[5] validation-logloss:0.57242 validation-auc:0.96663 validation-aucpr:0.97201
[6] validation-logloss:0.55560 validation-auc:0.96702 validation-aucpr:0.97243
[7] validation-logloss:0.53998 validation-auc:0.96754 validation-aucpr:0.97284
[8] validation-logloss:0.52548 validation-auc:0.96762 validation-aucpr:0.97291
[9] validation-logloss:0.51100 validation-auc:0.96805 validation-aucpr:0.97322
[10] validation-logloss:0.49800 validation-auc:0.96792 validation-aucpr:0.97315
[11] validation-logloss:0.48483 validation-auc:0.96846 validation-aucpr:0.97360
[12] validation-logloss:0.47264 validation-auc:0.96897 validation-aucpr:0.97391
[13] validation-logloss:0.46065 validation-auc:0.96928 validation-aucpr:0.97415
[14] validation-logloss:0.45103 validation-auc:0.96935 validation-aucpr:0.97417
[15] validation-logloss:0.44033 validation-auc:0.96935 validation-aucpr:0.97423
[16] validation-logloss:0.43028 validation-auc:0.96944 validation-aucpr:0.97433
[17] validation-logloss:0.42070 validation-auc:0.96969 validation-aucpr:0.97452
[18] validation-logloss:0.41148 validation-auc:0.96961 validation-aucpr:0.97450
{'best_iteration': '17', 'best_score': '0.974523158907407'}
Trial 95, Fold 4: Log loss = 0.4114788840127488, Average precision = 0.9744901295180513, ROC-AUC = 0.969611815108097, Elapsed Time = 1.8002224000010756 seconds
Trial 95, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 95, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.66984 validation-auc:0.92772 validation-aucpr:0.90006
[1] validation-logloss:0.64844 validation-auc:0.95360 validation-aucpr:0.94183
[2] validation-logloss:0.62801 validation-auc:0.95915 validation-aucpr:0.95796
[3] validation-logloss:0.61014 validation-auc:0.96151 validation-aucpr:0.96392
[4] validation-logloss:0.59133 validation-auc:0.96352 validation-aucpr:0.96524
[5] validation-logloss:0.57388 validation-auc:0.96433 validation-aucpr:0.96541
[6] validation-logloss:0.55786 validation-auc:0.96487 validation-aucpr:0.96603
[7] validation-logloss:0.54292 validation-auc:0.96506 validation-aucpr:0.96654
[8] validation-logloss:0.52797 validation-auc:0.96524 validation-aucpr:0.96763
[9] validation-logloss:0.51360 validation-auc:0.96594 validation-aucpr:0.96824
[10] validation-logloss:0.50009 validation-auc:0.96632 validation-aucpr:0.96853
[11] validation-logloss:0.48888 validation-auc:0.96624 validation-aucpr:0.97111
[12] validation-logloss:0.47679 validation-auc:0.96649 validation-aucpr:0.97128
[13] validation-logloss:0.46527 validation-auc:0.96644 validation-aucpr:0.97130
[14] validation-logloss:0.45418 validation-auc:0.96691 validation-aucpr:0.97171
[15] validation-logloss:0.44343 validation-auc:0.96740 validation-aucpr:0.97206
[16] validation-logloss:0.43343 validation-auc:0.96774 validation-aucpr:0.97232
[17] validation-logloss:0.42382 validation-auc:0.96786 validation-aucpr:0.97262
[18] validation-logloss:0.41446 validation-auc:0.96809 validation-aucpr:0.97280
{'best_iteration': '18', 'best_score': '0.9727953837135653'}
Trial 95, Fold 5: Log loss = 0.4144598869762319, Average precision = 0.9727891260068996, ROC-AUC = 0.9680885123460231, Elapsed Time = 1.8103469999987283 seconds
Optimization Progress: 96%|#########6| 96/100 [3:58:16<03:03, 45.79s/it]
Trial 96, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 96, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[21:57:23] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[0] validation-logloss:0.62898 validation-auc:0.92845 validation-aucpr:0.86784
[21:57:23] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[1] validation-logloss:0.57505 validation-auc:0.95687 validation-aucpr:0.93799
[21:57:24] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[2] validation-logloss:0.52993 validation-auc:0.96239 validation-aucpr:0.95987
[21:57:24] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[3] validation-logloss:0.49115 validation-auc:0.96512 validation-aucpr:0.97003
[21:57:24] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[4] validation-logloss:0.45687 validation-auc:0.96635 validation-aucpr:0.97089
[21:57:24] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[5] validation-logloss:0.42798 validation-auc:0.96647 validation-aucpr:0.96934
[21:57:24] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[6] validation-logloss:0.40235 validation-auc:0.96705 validation-aucpr:0.97122
[21:57:24] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[7] validation-logloss:0.37945 validation-auc:0.96804 validation-aucpr:0.97221
[8] validation-logloss:0.36011 validation-auc:0.96831 validation-aucpr:0.97283
[9] validation-logloss:0.34476 validation-auc:0.96885 validation-aucpr:0.97334
[10] validation-logloss:0.32943 validation-auc:0.96865 validation-aucpr:0.97322
[11] validation-logloss:0.31572 validation-auc:0.96909 validation-aucpr:0.97378
[12] validation-logloss:0.30299 validation-auc:0.96932 validation-aucpr:0.97399
[13] validation-logloss:0.29170 validation-auc:0.96966 validation-aucpr:0.97423
[14] validation-logloss:0.28176 validation-auc:0.96966 validation-aucpr:0.97339
[15] validation-logloss:0.27283 validation-auc:0.97016 validation-aucpr:0.97369
[16] validation-logloss:0.26482 validation-auc:0.97009 validation-aucpr:0.97365
[17] validation-logloss:0.25797 validation-auc:0.96999 validation-aucpr:0.97359
[18] validation-logloss:0.25221 validation-auc:0.97009 validation-aucpr:0.97370
[19] validation-logloss:0.24652 validation-auc:0.97007 validation-aucpr:0.97402
[20] validation-logloss:0.24136 validation-auc:0.97014 validation-aucpr:0.97474
[21] validation-logloss:0.23666 validation-auc:0.97009 validation-aucpr:0.97473
[22] validation-logloss:0.23231 validation-auc:0.97018 validation-aucpr:0.97479
[23] validation-logloss:0.22895 validation-auc:0.97015 validation-aucpr:0.97475
[24] validation-logloss:0.22533 validation-auc:0.97037 validation-aucpr:0.97489
[25] validation-logloss:0.22221 validation-auc:0.97050 validation-aucpr:0.97497
[26] validation-logloss:0.21949 validation-auc:0.97068 validation-aucpr:0.97510
[27] validation-logloss:0.21704 validation-auc:0.97080 validation-aucpr:0.97522
[28] validation-logloss:0.21484 validation-auc:0.97089 validation-aucpr:0.97528
[29] validation-logloss:0.21329 validation-auc:0.97078 validation-aucpr:0.97520
[30] validation-logloss:0.21159 validation-auc:0.97083 validation-aucpr:0.97523
[31] validation-logloss:0.21008 validation-auc:0.97092 validation-aucpr:0.97533
[32] validation-logloss:0.20848 validation-auc:0.97097 validation-aucpr:0.97538
[33] validation-logloss:0.20707 validation-auc:0.97112 validation-aucpr:0.97549
[34] validation-logloss:0.20581 validation-auc:0.97122 validation-aucpr:0.97555
[35] validation-logloss:0.20488 validation-auc:0.97135 validation-aucpr:0.97565
[36] validation-logloss:0.20417 validation-auc:0.97134 validation-aucpr:0.97563
[37] validation-logloss:0.20362 validation-auc:0.97126 validation-aucpr:0.97556
[38] validation-logloss:0.20317 validation-auc:0.97126 validation-aucpr:0.97558
[39] validation-logloss:0.20304 validation-auc:0.97128 validation-aucpr:0.97555
[40] validation-logloss:0.20285 validation-auc:0.97125 validation-aucpr:0.97553
[41] validation-logloss:0.20225 validation-auc:0.97144 validation-aucpr:0.97573
[42] validation-logloss:0.20190 validation-auc:0.97153 validation-aucpr:0.97578
[43] validation-logloss:0.20163 validation-auc:0.97153 validation-aucpr:0.97576
[44] validation-logloss:0.20201 validation-auc:0.97151 validation-aucpr:0.97590
[45] validation-logloss:0.20150 validation-auc:0.97169 validation-aucpr:0.97592
[46] validation-logloss:0.20141 validation-auc:0.97170 validation-aucpr:0.97597
[47] validation-logloss:0.20115 validation-auc:0.97188 validation-aucpr:0.97636
[48] validation-logloss:0.20116 validation-auc:0.97195 validation-aucpr:0.97644
[49] validation-logloss:0.20118 validation-auc:0.97204 validation-aucpr:0.97651
[50] validation-logloss:0.20172 validation-auc:0.97192 validation-aucpr:0.97641
[51] validation-logloss:0.20206 validation-auc:0.97195 validation-aucpr:0.97642
[52] validation-logloss:0.20201 validation-auc:0.97208 validation-aucpr:0.97651
[53] validation-logloss:0.20252 validation-auc:0.97204 validation-aucpr:0.97642
[54] validation-logloss:0.20308 validation-auc:0.97207 validation-aucpr:0.97642
[55] validation-logloss:0.20359 validation-auc:0.97204 validation-aucpr:0.97637
[56] validation-logloss:0.20420 validation-auc:0.97201 validation-aucpr:0.97639
[57] validation-logloss:0.20423 validation-auc:0.97213 validation-aucpr:0.97646
[58] validation-logloss:0.20441 validation-auc:0.97222 validation-aucpr:0.97657
[59] validation-logloss:0.20509 validation-auc:0.97218 validation-aucpr:0.97652
[60] validation-logloss:0.20573 validation-auc:0.97213 validation-aucpr:0.97648
{'best_iteration': '58', 'best_score': '0.976570721942885'}
Trial 96, Fold 1: Log loss = 0.20572811828880266, Average precision = 0.9764804945313532, ROC-AUC = 0.9721327542860967, Elapsed Time = 27.369219000000157 seconds
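The three numbers reported in each fold summary above can be reproduced from the held-out predictions with the sklearn metrics already imported at the top of this notebook. A minimal sketch, using toy labels and probabilities rather than the trial's actual fold predictions:

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

# Toy stand-ins for a fold's validation labels and predicted P(class=1).
y_val = np.array([0, 0, 1, 1])
p_val = np.array([0.1, 0.4, 0.35, 0.8])

# The same three metrics logged per fold above.
lloss = log_loss(y_val, p_val)                 # penalizes confident mistakes
ap    = average_precision_score(y_val, p_val)  # area under the PR curve
auc   = roc_auc_score(y_val, p_val)            # area under the ROC curve

print(f"Log loss = {lloss}, Average precision = {ap}, ROC-AUC = {auc}")
```

Note that `log_loss` uses the probability of the true class for each row, so it drops sharply over the early boosting rounds while AUC and AUCPR, which depend only on ranking, move much less.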
Trial 96, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 96, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.62831 validation-auc:0.93587 validation-aucpr:0.89205
[1] validation-logloss:0.57441 validation-auc:0.95918 validation-aucpr:0.95031
[2] validation-logloss:0.52866 validation-auc:0.96432 validation-aucpr:0.96373
[3] validation-logloss:0.49263 validation-auc:0.96662 validation-aucpr:0.97075
[4] validation-logloss:0.46241 validation-auc:0.96662 validation-aucpr:0.97057
[5] validation-logloss:0.43262 validation-auc:0.96697 validation-aucpr:0.97100
[6] validation-logloss:0.40614 validation-auc:0.96731 validation-aucpr:0.97136
[7] validation-logloss:0.38442 validation-auc:0.96831 validation-aucpr:0.97204
[8] validation-logloss:0.36339 validation-auc:0.96896 validation-aucpr:0.97258
[9] validation-logloss:0.34686 validation-auc:0.96926 validation-aucpr:0.97276
[10] validation-logloss:0.33063 validation-auc:0.96945 validation-aucpr:0.97292
[11] validation-logloss:0.31610 validation-auc:0.96943 validation-aucpr:0.97303
[12] validation-logloss:0.30338 validation-auc:0.96930 validation-aucpr:0.97298
[13] validation-logloss:0.29096 validation-auc:0.97006 validation-aucpr:0.97358
[14] validation-logloss:0.28065 validation-auc:0.97018 validation-aucpr:0.97366
[15] validation-logloss:0.27226 validation-auc:0.97035 validation-aucpr:0.97372
[16] validation-logloss:0.26480 validation-auc:0.97053 validation-aucpr:0.97379
[17] validation-logloss:0.25711 validation-auc:0.97060 validation-aucpr:0.97385
[18] validation-logloss:0.25078 validation-auc:0.97074 validation-aucpr:0.97398
[19] validation-logloss:0.24541 validation-auc:0.97073 validation-aucpr:0.97394
[20] validation-logloss:0.23953 validation-auc:0.97084 validation-aucpr:0.97398
[21] validation-logloss:0.23407 validation-auc:0.97096 validation-aucpr:0.97404
[22] validation-logloss:0.22953 validation-auc:0.97085 validation-aucpr:0.97397
[23] validation-logloss:0.22547 validation-auc:0.97080 validation-aucpr:0.97390
[24] validation-logloss:0.22134 validation-auc:0.97101 validation-aucpr:0.97409
[25] validation-logloss:0.21862 validation-auc:0.97092 validation-aucpr:0.97405
[26] validation-logloss:0.21540 validation-auc:0.97110 validation-aucpr:0.97442
[27] validation-logloss:0.21330 validation-auc:0.97107 validation-aucpr:0.97439
[28] validation-logloss:0.21082 validation-auc:0.97105 validation-aucpr:0.97439
[29] validation-logloss:0.20811 validation-auc:0.97129 validation-aucpr:0.97457
[30] validation-logloss:0.20601 validation-auc:0.97126 validation-aucpr:0.97451
[31] validation-logloss:0.20387 validation-auc:0.97137 validation-aucpr:0.97463
[32] validation-logloss:0.20179 validation-auc:0.97161 validation-aucpr:0.97477
[33] validation-logloss:0.20015 validation-auc:0.97167 validation-aucpr:0.97478
[34] validation-logloss:0.19895 validation-auc:0.97179 validation-aucpr:0.97490
[35] validation-logloss:0.19818 validation-auc:0.97176 validation-aucpr:0.97485
[36] validation-logloss:0.19718 validation-auc:0.97179 validation-aucpr:0.97484
[37] validation-logloss:0.19581 validation-auc:0.97194 validation-aucpr:0.97496
[38] validation-logloss:0.19497 validation-auc:0.97199 validation-aucpr:0.97501
[39] validation-logloss:0.19419 validation-auc:0.97212 validation-aucpr:0.97505
[40] validation-logloss:0.19376 validation-auc:0.97200 validation-aucpr:0.97491
[41] validation-logloss:0.19364 validation-auc:0.97192 validation-aucpr:0.97486
[42] validation-logloss:0.19339 validation-auc:0.97192 validation-aucpr:0.97485
[43] validation-logloss:0.19278 validation-auc:0.97199 validation-aucpr:0.97488
[44] validation-logloss:0.19220 validation-auc:0.97210 validation-aucpr:0.97501
[45] validation-logloss:0.19166 validation-auc:0.97213 validation-aucpr:0.97504
[46] validation-logloss:0.19107 validation-auc:0.97227 validation-aucpr:0.97506
[47] validation-logloss:0.19066 validation-auc:0.97237 validation-aucpr:0.97511
[48] validation-logloss:0.19054 validation-auc:0.97244 validation-aucpr:0.97527
[49] validation-logloss:0.19061 validation-auc:0.97227 validation-aucpr:0.97513
[50] validation-logloss:0.19042 validation-auc:0.97234 validation-aucpr:0.97516
[51] validation-logloss:0.19050 validation-auc:0.97231 validation-aucpr:0.97485
[52] validation-logloss:0.19038 validation-auc:0.97233 validation-aucpr:0.97484
[53] validation-logloss:0.19033 validation-auc:0.97233 validation-aucpr:0.97482
[54] validation-logloss:0.19070 validation-auc:0.97230 validation-aucpr:0.97433
[55] validation-logloss:0.19102 validation-auc:0.97228 validation-aucpr:0.97429
[56] validation-logloss:0.19081 validation-auc:0.97232 validation-aucpr:0.97433
[57] validation-logloss:0.19116 validation-auc:0.97240 validation-aucpr:0.97426
[58] validation-logloss:0.19138 validation-auc:0.97238 validation-aucpr:0.97422
[59] validation-logloss:0.19171 validation-auc:0.97244 validation-aucpr:0.97412
[60] validation-logloss:0.19156 validation-auc:0.97249 validation-aucpr:0.97413
{'best_iteration': '48', 'best_score': '0.9752665860623102'}
Trial 96, Fold 2: Log loss = 0.19155986050609108, Average precision = 0.9742061023984951, ROC-AUC = 0.9724941205320521, Elapsed Time = 27.559297799998603 seconds
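The `{'best_iteration': ..., 'best_score': ...}` dicts above record the round at which the monitored metric (aucpr) peaked, with training halted once a patience window passes with no improvement. The bookkeeping can be sketched generically; `early_stop` and its arguments are illustrative names, not the notebook's actual callback:

```python
def early_stop(scores, patience):
    """Return (best_iteration, best_score) for a maximize-metric history,
    stopping once `patience` rounds pass without a new best."""
    best_i, best = 0, float("-inf")
    for i, s in enumerate(scores):
        if s > best:
            best_i, best = i, s          # new best round
        elif i - best_i >= patience:
            break                        # patience exhausted, stop training
    return best_i, best

# The metric peaks at round 2 in this toy history, so later rounds only
# consume the patience budget before training halts.
print(early_stop([0.90, 0.95, 0.97, 0.96, 0.965, 0.96], patience=3))
```

This is why the reported final-round metrics (e.g. round 60) can be slightly worse than `best_score`: the model keeps training for `patience` rounds past the peak before stopping.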
Trial 96, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 96, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
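The fold-header lines above (size, per-class counts, 0/1 ratio) can be produced with `collections.Counter`, which is imported at the top of this notebook. A minimal sketch with a hypothetical helper, shown on labels that match the Fold 3 train counts above:

```python
from collections import Counter

def fold_summary(y):
    """Class counts and 0/1 ratio as printed in the fold headers."""
    c = Counter(y)
    return len(y), c[0], c[1], c[0] / c[1]

# Toy labels standing in for a fold's target column.
y_train = [0] * 10517 + [1] * 10165
n, n0, n1, ratio = fold_summary(y_train)
print(f"Train size = {n} where 0 = {n0}, 1 = {n1}, 0/1 = {ratio}")
```

Logging the 0/1 ratio per fold is a quick sanity check that the stratified splitter kept the class balance roughly constant across folds.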
[0] validation-logloss:0.62854 validation-auc:0.93540 validation-aucpr:0.88474
[1] validation-logloss:0.57500 validation-auc:0.95578 validation-aucpr:0.93764
[2] validation-logloss:0.52904 validation-auc:0.96384 validation-aucpr:0.96444
[3] validation-logloss:0.49038 validation-auc:0.96541 validation-aucpr:0.96817
[4] validation-logloss:0.45652 validation-auc:0.96621 validation-aucpr:0.96876
[5] validation-logloss:0.42740 validation-auc:0.96648 validation-aucpr:0.96873
[6] validation-logloss:0.40194 validation-auc:0.96676 validation-aucpr:0.96886
[7] validation-logloss:0.38083 validation-auc:0.96736 validation-aucpr:0.96932
[8] validation-logloss:0.36028 validation-auc:0.96849 validation-aucpr:0.97213
[9] validation-logloss:0.34389 validation-auc:0.96909 validation-aucpr:0.97342
[10] validation-logloss:0.32781 validation-auc:0.96912 validation-aucpr:0.97350
[11] validation-logloss:0.31350 validation-auc:0.96944 validation-aucpr:0.97375
[12] validation-logloss:0.30032 validation-auc:0.96981 validation-aucpr:0.97406
[13] validation-logloss:0.29026 validation-auc:0.96994 validation-aucpr:0.97406
[14] validation-logloss:0.28002 validation-auc:0.97009 validation-aucpr:0.97423
[15] validation-logloss:0.27121 validation-auc:0.97002 validation-aucpr:0.97415
[16] validation-logloss:0.26270 validation-auc:0.97040 validation-aucpr:0.97438
[17] validation-logloss:0.25534 validation-auc:0.97050 validation-aucpr:0.97452
[18] validation-logloss:0.24836 validation-auc:0.97073 validation-aucpr:0.97474
[19] validation-logloss:0.24211 validation-auc:0.97080 validation-aucpr:0.97482
[20] validation-logloss:0.23651 validation-auc:0.97122 validation-aucpr:0.97512
[21:58:23] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[21] validation-logloss:0.23182 validation-auc:0.97106 validation-aucpr:0.97509
[21:58:23] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[22] validation-logloss:0.22748 validation-auc:0.97122 validation-aucpr:0.97522
[21:58:23] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[23] validation-logloss:0.22372 validation-auc:0.97140 validation-aucpr:0.97528
[21:58:23] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[24] validation-logloss:0.22066 validation-auc:0.97142 validation-aucpr:0.97528
[21:58:23] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[25] validation-logloss:0.21760 validation-auc:0.97133 validation-aucpr:0.97526
[21:58:24] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[26] validation-logloss:0.21453 validation-auc:0.97147 validation-aucpr:0.97535
[21:58:24] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[27] validation-logloss:0.21188 validation-auc:0.97150 validation-aucpr:0.97543
[21:58:24] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[28] validation-logloss:0.20981 validation-auc:0.97151 validation-aucpr:0.97541
[21:58:24] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[29] validation-logloss:0.20792 validation-auc:0.97158 validation-aucpr:0.97575
[21:58:25] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[30] validation-logloss:0.20574 validation-auc:0.97180 validation-aucpr:0.97591
[21:58:25] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[31] validation-logloss:0.20442 validation-auc:0.97167 validation-aucpr:0.97582
[21:58:25] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[32] validation-logloss:0.20297 validation-auc:0.97176 validation-aucpr:0.97588
[21:58:25] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[33] validation-logloss:0.20209 validation-auc:0.97167 validation-aucpr:0.97581
[21:58:25] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[34] validation-logloss:0.20057 validation-auc:0.97190 validation-aucpr:0.97604
[21:58:26] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[35] validation-logloss:0.19954 validation-auc:0.97202 validation-aucpr:0.97611
[21:58:26] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[36] validation-logloss:0.19862 validation-auc:0.97211 validation-aucpr:0.97612
[21:58:26] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[37] validation-logloss:0.19766 validation-auc:0.97230 validation-aucpr:0.97617
[21:58:27] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[38] validation-logloss:0.19688 validation-auc:0.97236 validation-aucpr:0.97618
[21:58:27] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[39] validation-logloss:0.19656 validation-auc:0.97238 validation-aucpr:0.97618
[21:58:27] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[40] validation-logloss:0.19588 validation-auc:0.97251 validation-aucpr:0.97624
[21:58:27] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[41] validation-logloss:0.19546 validation-auc:0.97249 validation-aucpr:0.97621
[21:58:28] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[42] validation-logloss:0.19532 validation-auc:0.97250 validation-aucpr:0.97622
[21:58:28] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[43] validation-logloss:0.19526 validation-auc:0.97249 validation-aucpr:0.97618
[21:58:28] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[44] validation-logloss:0.19488 validation-auc:0.97251 validation-aucpr:0.97613
[21:58:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[45] validation-logloss:0.19493 validation-auc:0.97243 validation-aucpr:0.97608
[21:58:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[46] validation-logloss:0.19438 validation-auc:0.97255 validation-aucpr:0.97609
[21:58:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[47] validation-logloss:0.19498 validation-auc:0.97243 validation-aucpr:0.97598
[21:58:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[48] validation-logloss:0.19492 validation-auc:0.97250 validation-aucpr:0.97599
[21:58:30] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[49] validation-logloss:0.19502 validation-auc:0.97247 validation-aucpr:0.97593
[21:58:30] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[50] validation-logloss:0.19527 validation-auc:0.97252 validation-aucpr:0.97612
[21:58:30] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[51] validation-logloss:0.19560 validation-auc:0.97250 validation-aucpr:0.97613
[21:58:31] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[52] validation-logloss:0.19595 validation-auc:0.97254 validation-aucpr:0.97608
[21:58:31] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[53] validation-logloss:0.19647 validation-auc:0.97245 validation-aucpr:0.97596
[21:58:31] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[54] validation-logloss:0.19678 validation-auc:0.97246 validation-aucpr:0.97599
[21:58:32] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[55] validation-logloss:0.19660 validation-auc:0.97256 validation-aucpr:0.97606
[21:58:32] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[56] validation-logloss:0.19673 validation-auc:0.97264 validation-aucpr:0.97612
[21:58:32] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[57] validation-logloss:0.19687 validation-auc:0.97275 validation-aucpr:0.97621
[21:58:33] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[58] validation-logloss:0.19717 validation-auc:0.97275 validation-aucpr:0.97613
[21:58:33] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[59] validation-logloss:0.19796 validation-auc:0.97272 validation-aucpr:0.97626
[21:58:33] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[60] validation-logloss:0.19811 validation-auc:0.97276 validation-aucpr:0.97623
{'best_iteration': '59', 'best_score': '0.9762626557997987'}
Trial 96, Fold 3: Log loss = 0.19811367177543784, Average precision = 0.9762333554119768, ROC-AUC = 0.972757263004305, Elapsed Time = 27.47824279999986 seconds
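The per-fold summary line above reports log loss, average precision, and ROC-AUC on the held-out validation split. A minimal sketch of how these three metrics are obtained with the `sklearn.metrics` functions imported at the top of the notebook (`y_val` and `proba` here are synthetic stand-ins, not the notebook's actual variables):

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

# Hypothetical validation labels and predicted probabilities for one fold.
rng = np.random.default_rng(42)
y_val = rng.integers(0, 2, size=1000)
# Noisy probabilities correlated with the labels, clipped into (0, 1).
proba = np.clip(y_val * 0.7 + rng.normal(0.15, 0.2, size=1000), 0.01, 0.99)

lloss = log_loss(y_val, proba)              # lower is better
ap = average_precision_score(y_val, proba)  # area under the PR curve (aucpr)
auc = roc_auc_score(y_val, proba)           # area under the ROC curve
print(f"Log loss = {lloss:.4f}, Average precision = {ap:.4f}, ROC-AUC = {auc:.4f}")
```

All three take the positive-class probability, not the hard label, which is why the model's `predict_proba`-style output is scored directly.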
Trial 96, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 96, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.62844 validation-auc:0.93617 validation-aucpr:0.88997
[1] validation-logloss:0.57530 validation-auc:0.96105 validation-aucpr:0.95438
[2] validation-logloss:0.52950 validation-auc:0.96479 validation-aucpr:0.96434
[3] validation-logloss:0.48986 validation-auc:0.96819 validation-aucpr:0.97292
[4] validation-logloss:0.45614 validation-auc:0.96935 validation-aucpr:0.97402
[5] validation-logloss:0.42734 validation-auc:0.96886 validation-aucpr:0.97372
[6] validation-logloss:0.40166 validation-auc:0.96878 validation-aucpr:0.97374
[7] validation-logloss:0.37911 validation-auc:0.96889 validation-aucpr:0.97389
[8] validation-logloss:0.36122 validation-auc:0.96896 validation-aucpr:0.97394
[9] validation-logloss:0.34385 validation-auc:0.96920 validation-aucpr:0.97411
[10] validation-logloss:0.32770 validation-auc:0.96989 validation-aucpr:0.97454
[11] validation-logloss:0.31396 validation-auc:0.96993 validation-aucpr:0.97454
[12] validation-logloss:0.30194 validation-auc:0.97043 validation-aucpr:0.97492
[13] validation-logloss:0.29033 validation-auc:0.97090 validation-aucpr:0.97527
[14] validation-logloss:0.28015 validation-auc:0.97092 validation-aucpr:0.97528
[15] validation-logloss:0.27242 validation-auc:0.97066 validation-aucpr:0.97516
[16] validation-logloss:0.26367 validation-auc:0.97080 validation-aucpr:0.97531
[17] validation-logloss:0.25596 validation-auc:0.97106 validation-aucpr:0.97549
[18] validation-logloss:0.24916 validation-auc:0.97124 validation-aucpr:0.97560
[19] validation-logloss:0.24322 validation-auc:0.97136 validation-aucpr:0.97570
[20] validation-logloss:0.23784 validation-auc:0.97153 validation-aucpr:0.97582
[21] validation-logloss:0.23272 validation-auc:0.97171 validation-aucpr:0.97601
[22] validation-logloss:0.22845 validation-auc:0.97165 validation-aucpr:0.97600
[23] validation-logloss:0.22476 validation-auc:0.97162 validation-aucpr:0.97599
[24] validation-logloss:0.22102 validation-auc:0.97168 validation-aucpr:0.97606
[25] validation-logloss:0.21834 validation-auc:0.97150 validation-aucpr:0.97594
[26] validation-logloss:0.21518 validation-auc:0.97172 validation-aucpr:0.97607
[27] validation-logloss:0.21267 validation-auc:0.97174 validation-aucpr:0.97607
[28] validation-logloss:0.21011 validation-auc:0.97193 validation-aucpr:0.97618
[29] validation-logloss:0.20854 validation-auc:0.97174 validation-aucpr:0.97607
[30] validation-logloss:0.20717 validation-auc:0.97173 validation-aucpr:0.97605
[31] validation-logloss:0.20533 validation-auc:0.97175 validation-aucpr:0.97608
[32] validation-logloss:0.20430 validation-auc:0.97156 validation-aucpr:0.97595
[33] validation-logloss:0.20294 validation-auc:0.97159 validation-aucpr:0.97597
[34] validation-logloss:0.20175 validation-auc:0.97165 validation-aucpr:0.97602
[35] validation-logloss:0.20075 validation-auc:0.97173 validation-aucpr:0.97606
[36] validation-logloss:0.19985 validation-auc:0.97174 validation-aucpr:0.97606
[37] validation-logloss:0.19920 validation-auc:0.97178 validation-aucpr:0.97607
[38] validation-logloss:0.19827 validation-auc:0.97193 validation-aucpr:0.97618
[39] validation-logloss:0.19832 validation-auc:0.97170 validation-aucpr:0.97601
[40] validation-logloss:0.19779 validation-auc:0.97173 validation-aucpr:0.97603
[41] validation-logloss:0.19740 validation-auc:0.97178 validation-aucpr:0.97610
[42] validation-logloss:0.19753 validation-auc:0.97178 validation-aucpr:0.97606
[43] validation-logloss:0.19697 validation-auc:0.97188 validation-aucpr:0.97614
[44] validation-logloss:0.19634 validation-auc:0.97200 validation-aucpr:0.97623
[45] validation-logloss:0.19627 validation-auc:0.97203 validation-aucpr:0.97624
[46] validation-logloss:0.19603 validation-auc:0.97218 validation-aucpr:0.97634
[47] validation-logloss:0.19614 validation-auc:0.97222 validation-aucpr:0.97638
[48] validation-logloss:0.19644 validation-auc:0.97220 validation-aucpr:0.97637
[49] validation-logloss:0.19654 validation-auc:0.97220 validation-aucpr:0.97635
[50] validation-logloss:0.19653 validation-auc:0.97217 validation-aucpr:0.97632
[51] validation-logloss:0.19668 validation-auc:0.97212 validation-aucpr:0.97629
[52] validation-logloss:0.19677 validation-auc:0.97214 validation-aucpr:0.97631
[53] validation-logloss:0.19705 validation-auc:0.97209 validation-aucpr:0.97627
[54] validation-logloss:0.19765 validation-auc:0.97196 validation-aucpr:0.97618
[55] validation-logloss:0.19801 validation-auc:0.97197 validation-aucpr:0.97619
[56] validation-logloss:0.19842 validation-auc:0.97192 validation-aucpr:0.97617
[57] validation-logloss:0.19873 validation-auc:0.97190 validation-aucpr:0.97617
[58] validation-logloss:0.19920 validation-auc:0.97196 validation-aucpr:0.97618
[59] validation-logloss:0.19961 validation-auc:0.97193 validation-aucpr:0.97615
[60] validation-logloss:0.20026 validation-auc:0.97191 validation-aucpr:0.97612
{'best_iteration': '47', 'best_score': '0.9763758129206098'}
Trial 96, Fold 4: Log loss = 0.20025557918199285, Average precision = 0.9761245672747666, ROC-AUC = 0.9719141862843981, Elapsed Time = 27.80174470000202 seconds
Trial 96, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 96, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.62839 validation-auc:0.93519 validation-aucpr:0.89789
[1] validation-logloss:0.57898 validation-auc:0.95845 validation-aucpr:0.95465
[2] validation-logloss:0.53731 validation-auc:0.96030 validation-aucpr:0.95839
[3] validation-logloss:0.50217 validation-auc:0.96126 validation-aucpr:0.96048
[4] validation-logloss:0.46734 validation-auc:0.96368 validation-aucpr:0.96446
[5] validation-logloss:0.43656 validation-auc:0.96572 validation-aucpr:0.96920
[6] validation-logloss:0.40993 validation-auc:0.96666 validation-aucpr:0.97110
[7] validation-logloss:0.38940 validation-auc:0.96721 validation-aucpr:0.97134
[8] validation-logloss:0.36860 validation-auc:0.96778 validation-aucpr:0.97199
[9] validation-logloss:0.35279 validation-auc:0.96756 validation-aucpr:0.97173
[10] validation-logloss:0.33625 validation-auc:0.96782 validation-aucpr:0.97193
[11] validation-logloss:0.32411 validation-auc:0.96761 validation-aucpr:0.97159
[12] validation-logloss:0.31280 validation-auc:0.96772 validation-aucpr:0.97156
[13] validation-logloss:0.30078 validation-auc:0.96806 validation-aucpr:0.97185
[14] validation-logloss:0.28999 validation-auc:0.96819 validation-aucpr:0.97223
[15] validation-logloss:0.28010 validation-auc:0.96840 validation-aucpr:0.97247
[16] validation-logloss:0.27136 validation-auc:0.96857 validation-aucpr:0.97265
[17] validation-logloss:0.26364 validation-auc:0.96877 validation-aucpr:0.97228
[18] validation-logloss:0.25789 validation-auc:0.96888 validation-aucpr:0.97228
[19] validation-logloss:0.25197 validation-auc:0.96863 validation-aucpr:0.97214
[21:59:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[20] validation-logloss:0.24590 validation-auc:0.96918 validation-aucpr:0.97317
[21:59:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[21] validation-logloss:0.24144 validation-auc:0.96919 validation-aucpr:0.97317
[21:59:19] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[22] validation-logloss:0.23699 validation-auc:0.96914 validation-aucpr:0.97344
[21:59:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[23] validation-logloss:0.23361 validation-auc:0.96923 validation-aucpr:0.97343
[21:59:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[24] validation-logloss:0.22980 validation-auc:0.96946 validation-aucpr:0.97355
[21:59:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[25] validation-logloss:0.22724 validation-auc:0.96939 validation-aucpr:0.97343
[21:59:20] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[26] validation-logloss:0.22416 validation-auc:0.96956 validation-aucpr:0.97354
[21:59:21] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[27] validation-logloss:0.22090 validation-auc:0.96982 validation-aucpr:0.97376
[21:59:21] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[28] validation-logloss:0.21810 validation-auc:0.96998 validation-aucpr:0.97389
[21:59:21] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[29] validation-logloss:0.21607 validation-auc:0.97000 validation-aucpr:0.97390
[21:59:21] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[30] validation-logloss:0.21424 validation-auc:0.97009 validation-aucpr:0.97395
[21:59:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[31] validation-logloss:0.21229 validation-auc:0.97030 validation-aucpr:0.97412
[21:59:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[32] validation-logloss:0.21049 validation-auc:0.97051 validation-aucpr:0.97440
[21:59:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[33] validation-logloss:0.20905 validation-auc:0.97053 validation-aucpr:0.97448
[21:59:22] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[34] validation-logloss:0.20765 validation-auc:0.97069 validation-aucpr:0.97458
[21:59:23] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[35] validation-logloss:0.20723 validation-auc:0.97056 validation-aucpr:0.97443
[21:59:23] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[36] validation-logloss:0.20677 validation-auc:0.97055 validation-aucpr:0.97436
[21:59:23] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[37] validation-logloss:0.20554 validation-auc:0.97072 validation-aucpr:0.97451
[21:59:23] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[38] validation-logloss:0.20505 validation-auc:0.97066 validation-aucpr:0.97444
[21:59:24] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[39] validation-logloss:0.20449 validation-auc:0.97077 validation-aucpr:0.97470
[21:59:24] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[40] validation-logloss:0.20403 validation-auc:0.97081 validation-aucpr:0.97471
[21:59:24] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[41] validation-logloss:0.20342 validation-auc:0.97087 validation-aucpr:0.97480
[21:59:25] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[42] validation-logloss:0.20301 validation-auc:0.97095 validation-aucpr:0.97486
[21:59:25] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[43] validation-logloss:0.20243 validation-auc:0.97106 validation-aucpr:0.97490
[21:59:25] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[44] validation-logloss:0.20235 validation-auc:0.97109 validation-aucpr:0.97488
[21:59:25] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[45] validation-logloss:0.20210 validation-auc:0.97118 validation-aucpr:0.97494
[21:59:26] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[46] validation-logloss:0.20198 validation-auc:0.97124 validation-aucpr:0.97493
[21:59:26] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[47] validation-logloss:0.20192 validation-auc:0.97125 validation-aucpr:0.97495
[21:59:26] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[48] validation-logloss:0.20199 validation-auc:0.97121 validation-aucpr:0.97482
[21:59:27] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[49] validation-logloss:0.20212 validation-auc:0.97118 validation-aucpr:0.97478
[21:59:27] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[50] validation-logloss:0.20192 validation-auc:0.97131 validation-aucpr:0.97489
[21:59:27] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[51] validation-logloss:0.20181 validation-auc:0.97134 validation-aucpr:0.97492
[21:59:28] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[52] validation-logloss:0.20187 validation-auc:0.97145 validation-aucpr:0.97512
[21:59:28] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[53] validation-logloss:0.20251 validation-auc:0.97136 validation-aucpr:0.97501
[21:59:28] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[54] validation-logloss:0.20274 validation-auc:0.97143 validation-aucpr:0.97504
[21:59:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[55] validation-logloss:0.20294 validation-auc:0.97145 validation-aucpr:0.97506
[21:59:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[56] validation-logloss:0.20285 validation-auc:0.97156 validation-aucpr:0.97494
[21:59:29] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[57] validation-logloss:0.20294 validation-auc:0.97162 validation-aucpr:0.97496
[21:59:30] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[58] validation-logloss:0.20318 validation-auc:0.97163 validation-aucpr:0.97491
[21:59:30] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[59] validation-logloss:0.20374 validation-auc:0.97159 validation-aucpr:0.97474
[21:59:31] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[60] validation-logloss:0.20419 validation-auc:0.97162 validation-aucpr:0.97477
{'best_iteration': '52', 'best_score': '0.9751215601216484'}
Trial 96, Fold 5: Log loss = 0.2041886193925169, Average precision = 0.9747760514099615, ROC-AUC = 0.9716203853371236, Elapsed Time = 28.233958600001642 seconds
Optimization Progress: 97%|#########7| 97/100 [4:00:43<03:47, 76.00s/it]
Trial 97, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 97, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.64867 validation-auc:0.95670 validation-aucpr:0.96258
[... iterations 1-48 and repeated XGBoost INFO lines ("drop 0 trees, weight = 1") trimmed ...]
[49] validation-logloss:0.20408 validation-auc:0.97258 validation-aucpr:0.97649
{'best_iteration': '49', 'best_score': '0.9764873601218161'}
Trial 97, Fold 1: Log loss = 0.20407838779868182, Average precision = 0.9764898931492989, ROC-AUC = 0.9725778535342673, Elapsed Time = 5.513618299999507 seconds
Trial 97, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 97, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.64888 validation-auc:0.95914 validation-aucpr:0.95894
[... iterations 1-26 and repeated XGBoost INFO lines ("drop 0 trees, weight = 1") trimmed ...]
[27] validation-logloss:0.24962 validation-auc:0.97113 validation-aucpr:0.97414
[28] validation-logloss:0.24538 validation-auc:0.97123 validation-aucpr:0.97422
[21:59:46] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[29] validation-logloss:0.24131 validation-auc:0.97120 validation-aucpr:0.97419
[21:59:46] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[30] validation-logloss:0.23747 validation-auc:0.97146 validation-aucpr:0.97439
[21:59:46] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[31] validation-logloss:0.23397 validation-auc:0.97163 validation-aucpr:0.97452
[21:59:46] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[32] validation-logloss:0.23092 validation-auc:0.97200 validation-aucpr:0.97477
[21:59:46] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[33] validation-logloss:0.22804 validation-auc:0.97195 validation-aucpr:0.97469
[21:59:46] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[34] validation-logloss:0.22526 validation-auc:0.97187 validation-aucpr:0.97463
[21:59:46] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[35] validation-logloss:0.22267 validation-auc:0.97214 validation-aucpr:0.97483
[21:59:46] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[36] validation-logloss:0.22086 validation-auc:0.97213 validation-aucpr:0.97519
[21:59:47] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[37] validation-logloss:0.21832 validation-auc:0.97220 validation-aucpr:0.97521
[21:59:47] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[38] validation-logloss:0.21595 validation-auc:0.97222 validation-aucpr:0.97522
[21:59:47] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[39] validation-logloss:0.21371 validation-auc:0.97232 validation-aucpr:0.97530
[21:59:47] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[40] validation-logloss:0.21160 validation-auc:0.97248 validation-aucpr:0.97540
[21:59:47] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[41] validation-logloss:0.20973 validation-auc:0.97247 validation-aucpr:0.97539
[21:59:47] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[42] validation-logloss:0.20802 validation-auc:0.97240 validation-aucpr:0.97533
[21:59:48] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[43] validation-logloss:0.20638 validation-auc:0.97250 validation-aucpr:0.97540
[21:59:48] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[44] validation-logloss:0.20478 validation-auc:0.97251 validation-aucpr:0.97538
[21:59:48] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[45] validation-logloss:0.20331 validation-auc:0.97258 validation-aucpr:0.97545
[21:59:48] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[46] validation-logloss:0.20178 validation-auc:0.97263 validation-aucpr:0.97548
[21:59:48] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[47] validation-logloss:0.20034 validation-auc:0.97273 validation-aucpr:0.97559
[21:59:48] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[48] validation-logloss:0.19908 validation-auc:0.97287 validation-aucpr:0.97565
[21:59:49] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[49] validation-logloss:0.19801 validation-auc:0.97283 validation-aucpr:0.97561
{'best_iteration': '48', 'best_score': '0.975645861673028'}
Trial 97, Fold 2: Log loss = 0.19801219240028642, Average precision = 0.9755632458595235, ROC-AUC = 0.972825461821623, Elapsed Time = 5.8320394000002125 seconds
Trial 97, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 97, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.64838 validation-auc:0.96097 validation-aucpr:0.96283
[1] validation-logloss:0.60883 validation-auc:0.96486 validation-aucpr:0.96844
[2] validation-logloss:0.57343 validation-auc:0.96625 validation-aucpr:0.96988
[3] validation-logloss:0.54120 validation-auc:0.96950 validation-aucpr:0.97233
[4] validation-logloss:0.51260 validation-auc:0.96927 validation-aucpr:0.97209
[5] validation-logloss:0.48677 validation-auc:0.97006 validation-aucpr:0.97259
[6] validation-logloss:0.46385 validation-auc:0.96987 validation-aucpr:0.97202
[7] validation-logloss:0.44239 validation-auc:0.97010 validation-aucpr:0.97299
[8] validation-logloss:0.42575 validation-auc:0.97042 validation-aucpr:0.97321
[9] validation-logloss:0.40816 validation-auc:0.97024 validation-aucpr:0.97312
[10] validation-logloss:0.39209 validation-auc:0.97014 validation-aucpr:0.97305
[11] validation-logloss:0.37705 validation-auc:0.97041 validation-aucpr:0.97322
[12] validation-logloss:0.36479 validation-auc:0.97031 validation-aucpr:0.97326
[13] validation-logloss:0.35187 validation-auc:0.97049 validation-aucpr:0.97340
[14] validation-logloss:0.34143 validation-auc:0.97043 validation-aucpr:0.97342
[15] validation-logloss:0.33037 validation-auc:0.97067 validation-aucpr:0.97361
[16] validation-logloss:0.32055 validation-auc:0.97066 validation-aucpr:0.97378
[17] validation-logloss:0.31226 validation-auc:0.97086 validation-aucpr:0.97393
[18] validation-logloss:0.30338 validation-auc:0.97095 validation-aucpr:0.97401
[19] validation-logloss:0.29539 validation-auc:0.97060 validation-aucpr:0.97273
[20] validation-logloss:0.28877 validation-auc:0.97077 validation-aucpr:0.97321
[21] validation-logloss:0.28252 validation-auc:0.97102 validation-aucpr:0.97336
[22] validation-logloss:0.27570 validation-auc:0.97129 validation-aucpr:0.97359
[23] validation-logloss:0.26972 validation-auc:0.97145 validation-aucpr:0.97550
[24] validation-logloss:0.26388 validation-auc:0.97141 validation-aucpr:0.97547
[25] validation-logloss:0.25876 validation-auc:0.97140 validation-aucpr:0.97546
[26] validation-logloss:0.25394 validation-auc:0.97148 validation-aucpr:0.97552
[27] validation-logloss:0.25000 validation-auc:0.97155 validation-aucpr:0.97563
[28] validation-logloss:0.24648 validation-auc:0.97162 validation-aucpr:0.97566
[29] validation-logloss:0.24217 validation-auc:0.97176 validation-aucpr:0.97573
[30] validation-logloss:0.23862 validation-auc:0.97199 validation-aucpr:0.97588
[31] validation-logloss:0.23512 validation-auc:0.97202 validation-aucpr:0.97590
[32] validation-logloss:0.23244 validation-auc:0.97208 validation-aucpr:0.97585
[33] validation-logloss:0.22980 validation-auc:0.97205 validation-aucpr:0.97579
[34] validation-logloss:0.22733 validation-auc:0.97208 validation-aucpr:0.97577
[35] validation-logloss:0.22464 validation-auc:0.97212 validation-aucpr:0.97579
[36] validation-logloss:0.22217 validation-auc:0.97205 validation-aucpr:0.97571
[37] validation-logloss:0.21975 validation-auc:0.97214 validation-aucpr:0.97577
[38] validation-logloss:0.21761 validation-auc:0.97212 validation-aucpr:0.97576
[39] validation-logloss:0.21561 validation-auc:0.97215 validation-aucpr:0.97578
[40] validation-logloss:0.21386 validation-auc:0.97230 validation-aucpr:0.97606
[41] validation-logloss:0.21162 validation-auc:0.97247 validation-aucpr:0.97621
[42] validation-logloss:0.20973 validation-auc:0.97258 validation-aucpr:0.97636
[43] validation-logloss:0.20819 validation-auc:0.97256 validation-aucpr:0.97632
[44] validation-logloss:0.20656 validation-auc:0.97265 validation-aucpr:0.97640
[45] validation-logloss:0.20502 validation-auc:0.97272 validation-aucpr:0.97646
[46] validation-logloss:0.20365 validation-auc:0.97272 validation-aucpr:0.97645
[47] validation-logloss:0.20246 validation-auc:0.97283 validation-aucpr:0.97653
[48] validation-logloss:0.20146 validation-auc:0.97279 validation-aucpr:0.97650
[49] validation-logloss:0.20023 validation-auc:0.97291 validation-aucpr:0.97657
{'best_iteration': '49', 'best_score': '0.9765698410238164'}
Trial 97, Fold 3: Log loss = 0.20022721945089825, Average precision = 0.9765723226286877, ROC-AUC = 0.9729139930512083, Elapsed Time = 7.785095100000035 seconds
Trial 97, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 97, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[21:59:58] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[0] validation-logloss:0.64852 validation-auc:0.95775 validation-aucpr:0.95917
[21:59:58] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[1] validation-logloss:0.60866 validation-auc:0.96676 validation-aucpr:0.96997
[21:59:58] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[2] validation-logloss:0.57367 validation-auc:0.96795 validation-aucpr:0.97094
[21:59:58] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[3] validation-logloss:0.54182 validation-auc:0.96924 validation-aucpr:0.97401
[21:59:58] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[4] validation-logloss:0.51680 validation-auc:0.96821 validation-aucpr:0.97303
[21:59:58] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[5] validation-logloss:0.49102 validation-auc:0.96826 validation-aucpr:0.97310
[21:59:58] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[6] validation-logloss:0.46785 validation-auc:0.96834 validation-aucpr:0.97341
[21:59:58] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[7] validation-logloss:0.44666 validation-auc:0.96850 validation-aucpr:0.97370
[21:59:58] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[8] validation-logloss:0.42706 validation-auc:0.96885 validation-aucpr:0.97398
[21:59:58] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[9] validation-logloss:0.40939 validation-auc:0.96900 validation-aucpr:0.97412
[21:59:59] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[10] validation-logloss:0.39326 validation-auc:0.96917 validation-aucpr:0.97425
[21:59:59] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[11] validation-logloss:0.37855 validation-auc:0.96917 validation-aucpr:0.97427
[21:59:59] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[12] validation-logloss:0.36467 validation-auc:0.96949 validation-aucpr:0.97453
[21:59:59] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[13] validation-logloss:0.35205 validation-auc:0.96994 validation-aucpr:0.97487
[21:59:59] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[14] validation-logloss:0.34049 validation-auc:0.97003 validation-aucpr:0.97490
[21:59:59] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[15] validation-logloss:0.33047 validation-auc:0.97035 validation-aucpr:0.97515
[21:59:59] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[16] validation-logloss:0.32075 validation-auc:0.97027 validation-aucpr:0.97514
[21:59:59] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[17] validation-logloss:0.31266 validation-auc:0.97038 validation-aucpr:0.97526
[21:59:59] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[18] validation-logloss:0.30524 validation-auc:0.97050 validation-aucpr:0.97536
[21:59:59] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[19] validation-logloss:0.29699 validation-auc:0.97069 validation-aucpr:0.97549
[22:00:00] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[20] validation-logloss:0.28937 validation-auc:0.97075 validation-aucpr:0.97552
[22:00:00] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[21] validation-logloss:0.28228 validation-auc:0.97089 validation-aucpr:0.97563
[22:00:00] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[22] validation-logloss:0.27684 validation-auc:0.97061 validation-aucpr:0.97541
[22:00:00] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[23] validation-logloss:0.27060 validation-auc:0.97059 validation-aucpr:0.97538
[22:00:00] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[24] validation-logloss:0.26491 validation-auc:0.97063 validation-aucpr:0.97538
[22:00:00] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[25] validation-logloss:0.25937 validation-auc:0.97075 validation-aucpr:0.97548
[22:00:00] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[26] validation-logloss:0.25437 validation-auc:0.97090 validation-aucpr:0.97558
[22:00:01] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[27] validation-logloss:0.24977 validation-auc:0.97098 validation-aucpr:0.97563
[22:00:01] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[28] validation-logloss:0.24538 validation-auc:0.97092 validation-aucpr:0.97561
[22:00:01] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[29] validation-logloss:0.24167 validation-auc:0.97087 validation-aucpr:0.97555
[22:00:01] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[30] validation-logloss:0.23820 validation-auc:0.97075 validation-aucpr:0.97545
[22:00:01] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[31] validation-logloss:0.23533 validation-auc:0.97054 validation-aucpr:0.97529
[22:00:01] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[32] validation-logloss:0.23236 validation-auc:0.97047 validation-aucpr:0.97526
[22:00:02] INFO: C:\buildkite-agent\builds\buildkite-windows-cpu-autoscaling-group-i-0b3782d1791676daf-1\xgboost\xgboost-ci-windows\src\gbm\gbtree.cc:887: drop 0 trees, weight = 1
[33] validation-logloss:0.22965 validation-auc:0.97046 validation-aucpr:0.97525
[34]	validation-logloss:0.22683	validation-auc:0.97052	validation-aucpr:0.97528
[35]	validation-logloss:0.22451	validation-auc:0.97066	validation-aucpr:0.97537
[36]	validation-logloss:0.22199	validation-auc:0.97082	validation-aucpr:0.97547
[37]	validation-logloss:0.21962	validation-auc:0.97099	validation-aucpr:0.97560
[38]	validation-logloss:0.21745	validation-auc:0.97103	validation-aucpr:0.97563
[39]	validation-logloss:0.21588	validation-auc:0.97105	validation-aucpr:0.97565
[40]	validation-logloss:0.21385	validation-auc:0.97119	validation-aucpr:0.97574
[41]	validation-logloss:0.21231	validation-auc:0.97105	validation-aucpr:0.97563
[42]	validation-logloss:0.21059	validation-auc:0.97114	validation-aucpr:0.97570
[43]	validation-logloss:0.20902	validation-auc:0.97126	validation-aucpr:0.97579
[44]	validation-logloss:0.20771	validation-auc:0.97114	validation-aucpr:0.97570
[45]	validation-logloss:0.20642	validation-auc:0.97124	validation-aucpr:0.97579
[46]	validation-logloss:0.20477	validation-auc:0.97150	validation-aucpr:0.97596
[47]	validation-logloss:0.20348	validation-auc:0.97154	validation-aucpr:0.97599
[48]	validation-logloss:0.20223	validation-auc:0.97162	validation-aucpr:0.97604
[49]	validation-logloss:0.20124	validation-auc:0.97161	validation-aucpr:0.97604
{'best_iteration': '48', 'best_score': '0.9760388955289463'}
Trial 97, Fold 4: Log loss = 0.20123741217815116, Average precision = 0.9760271817156592, ROC-AUC = 0.9716141232167597, Elapsed Time = 7.109914699998626 seconds
Trial 97, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 97, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0]	validation-logloss:0.64889	validation-auc:0.95644	validation-aucpr:0.96122
[1]	validation-logloss:0.60933	validation-auc:0.96394	validation-aucpr:0.96845
[2]	validation-logloss:0.57441	validation-auc:0.96582	validation-aucpr:0.97007
[3]	validation-logloss:0.54731	validation-auc:0.96355	validation-aucpr:0.96828
[4]	validation-logloss:0.51876	validation-auc:0.96506	validation-aucpr:0.96967
[5]	validation-logloss:0.49593	validation-auc:0.96520	validation-aucpr:0.96963
[6]	validation-logloss:0.47211	validation-auc:0.96584	validation-aucpr:0.97025
[7]	validation-logloss:0.45092	validation-auc:0.96602	validation-aucpr:0.97038
[8]	validation-logloss:0.43155	validation-auc:0.96675	validation-aucpr:0.97088
[9]	validation-logloss:0.41429	validation-auc:0.96716	validation-aucpr:0.97126
[10]	validation-logloss:0.39916	validation-auc:0.96746	validation-aucpr:0.97141
[11]	validation-logloss:0.38598	validation-auc:0.96724	validation-aucpr:0.97111
[12]	validation-logloss:0.37172	validation-auc:0.96784	validation-aucpr:0.97166
[13]	validation-logloss:0.35967	validation-auc:0.96785	validation-aucpr:0.97176
[14]	validation-logloss:0.34969	validation-auc:0.96773	validation-aucpr:0.97161
[15]	validation-logloss:0.33863	validation-auc:0.96797	validation-aucpr:0.97181
[16]	validation-logloss:0.32863	validation-auc:0.96818	validation-aucpr:0.97197
[17]	validation-logloss:0.32070	validation-auc:0.96803	validation-aucpr:0.97179
[18]	validation-logloss:0.31323	validation-auc:0.96804	validation-aucpr:0.97171
[19]	validation-logloss:0.30485	validation-auc:0.96831	validation-aucpr:0.97193
[20]	validation-logloss:0.29723	validation-auc:0.96871	validation-aucpr:0.97225
[21]	validation-logloss:0.29003	validation-auc:0.96888	validation-aucpr:0.97241
[22]	validation-logloss:0.28468	validation-auc:0.96872	validation-aucpr:0.97229
[23]	validation-logloss:0.27832	validation-auc:0.96904	validation-aucpr:0.97262
[24]	validation-logloss:0.27266	validation-auc:0.96907	validation-aucpr:0.97264
[25]	validation-logloss:0.26737	validation-auc:0.96925	validation-aucpr:0.97282
[26]	validation-logloss:0.26238	validation-auc:0.96937	validation-aucpr:0.97291
[27]	validation-logloss:0.25755	validation-auc:0.96961	validation-aucpr:0.97311
[28]	validation-logloss:0.25348	validation-auc:0.96985	validation-aucpr:0.97323
[29]	validation-logloss:0.24905	validation-auc:0.97013	validation-aucpr:0.97349
[30]	validation-logloss:0.24522	validation-auc:0.97025	validation-aucpr:0.97363
[31]	validation-logloss:0.24230	validation-auc:0.97016	validation-aucpr:0.97353
[32]	validation-logloss:0.23978	validation-auc:0.97010	validation-aucpr:0.97347
[33]	validation-logloss:0.23681	validation-auc:0.97008	validation-aucpr:0.97343
[34]	validation-logloss:0.23389	validation-auc:0.97015	validation-aucpr:0.97349
[35]	validation-logloss:0.23120	validation-auc:0.97017	validation-aucpr:0.97349
[36]	validation-logloss:0.22894	validation-auc:0.97026	validation-aucpr:0.97341
[37]	validation-logloss:0.22674	validation-auc:0.97026	validation-aucpr:0.97341
[38]	validation-logloss:0.22480	validation-auc:0.97037	validation-aucpr:0.97352
[39]	validation-logloss:0.22316	validation-auc:0.97028	validation-aucpr:0.97341
[40]	validation-logloss:0.22100	validation-auc:0.97046	validation-aucpr:0.97361
[41]	validation-logloss:0.21895	validation-auc:0.97067	validation-aucpr:0.97409
[42]	validation-logloss:0.21693	validation-auc:0.97090	validation-aucpr:0.97427
[43]	validation-logloss:0.21514	validation-auc:0.97095	validation-aucpr:0.97429
[44]	validation-logloss:0.21363	validation-auc:0.97112	validation-aucpr:0.97445
[45]	validation-logloss:0.21242	validation-auc:0.97120	validation-aucpr:0.97451
[46]	validation-logloss:0.21067	validation-auc:0.97142	validation-aucpr:0.97469
[47]	validation-logloss:0.20927	validation-auc:0.97149	validation-aucpr:0.97478
[48]	validation-logloss:0.20784	validation-auc:0.97162	validation-aucpr:0.97485
[49]	validation-logloss:0.20652	validation-auc:0.97173	validation-aucpr:0.97498
{'best_iteration': '49', 'best_score': '0.9749812712453916'}
Trial 97, Fold 5: Log loss = 0.20652109625057113, Average precision = 0.9749294362547176, ROC-AUC = 0.9717331995615258, Elapsed Time = 7.842935299999226 seconds
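Each fold summary above reports log loss, average precision, and ROC-AUC on the held-out validation probabilities. A minimal sketch of how such a line could be computed with the sklearn metrics imported at the top of the notebook; the logistic regression and synthetic data below are illustrative stand-ins for the tuned booster and the loan-level features:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

# Synthetic data; the last 200 rows act as a validation fold
X, y = make_classification(n_samples=1000, random_state=42)
clf = LogisticRegression(max_iter=1000).fit(X[:800], y[:800])

# All three metrics consume predicted probabilities for the positive class
proba = clf.predict_proba(X[800:])[:, 1]
lloss = log_loss(y[800:], proba)
ap = average_precision_score(y[800:], proba)
auc = roc_auc_score(y[800:], proba)
print(f"Log loss = {lloss}, Average precision = {ap}, ROC-AUC = {auc}")
```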
Optimization Progress: 98%|#########8| 98/100 [4:01:25<02:12, 66.03s/it]
Trial 98, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 98, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.66884 validation-auc:0.94051 validation-aucpr:0.94034
[1] validation-logloss:0.64799 validation-auc:0.95079 validation-aucpr:0.95078
[2] validation-logloss:0.62482 validation-auc:0.95751 validation-aucpr:0.95860
[3] validation-logloss:0.60614 validation-auc:0.95857 validation-aucpr:0.96514
[4] validation-logloss:0.58710 validation-auc:0.95926 validation-aucpr:0.96610
[5] validation-logloss:0.57165 validation-auc:0.96036 validation-aucpr:0.96675
[6] validation-logloss:0.55371 validation-auc:0.96162 validation-aucpr:0.96842
[7] validation-logloss:0.53786 validation-auc:0.96269 validation-aucpr:0.96927
[8] validation-logloss:0.52480 validation-auc:0.96279 validation-aucpr:0.96925
[9] validation-logloss:0.50960 validation-auc:0.96317 validation-aucpr:0.96967
[10] validation-logloss:0.49487 validation-auc:0.96389 validation-aucpr:0.97028
[11] validation-logloss:0.48348 validation-auc:0.96459 validation-aucpr:0.97087
[12] validation-logloss:0.47409 validation-auc:0.96409 validation-aucpr:0.97044
[13] validation-logloss:0.46168 validation-auc:0.96412 validation-aucpr:0.97067
[14] validation-logloss:0.45211 validation-auc:0.96373 validation-aucpr:0.97028
[15] validation-logloss:0.44119 validation-auc:0.96388 validation-aucpr:0.97045
[16] validation-logloss:0.43201 validation-auc:0.96395 validation-aucpr:0.97052
[17] validation-logloss:0.42372 validation-auc:0.96375 validation-aucpr:0.97036
[18] validation-logloss:0.41592 validation-auc:0.96355 validation-aucpr:0.97012
[19] validation-logloss:0.40686 validation-auc:0.96354 validation-aucpr:0.97013
[20] validation-logloss:0.39940 validation-auc:0.96360 validation-aucpr:0.97009
[21] validation-logloss:0.39086 validation-auc:0.96375 validation-aucpr:0.97029
[22] validation-logloss:0.38466 validation-auc:0.96361 validation-aucpr:0.97019
[23] validation-logloss:0.37694 validation-auc:0.96374 validation-aucpr:0.97036
[24] validation-logloss:0.37103 validation-auc:0.96374 validation-aucpr:0.97031
[25] validation-logloss:0.36507 validation-auc:0.96399 validation-aucpr:0.97047
[26] validation-logloss:0.35950 validation-auc:0.96379 validation-aucpr:0.97037
[27] validation-logloss:0.35412 validation-auc:0.96366 validation-aucpr:0.97028
[28] validation-logloss:0.34911 validation-auc:0.96360 validation-aucpr:0.97020
[29] validation-logloss:0.34321 validation-auc:0.96368 validation-aucpr:0.97027
[30] validation-logloss:0.33869 validation-auc:0.96365 validation-aucpr:0.97021
[31] validation-logloss:0.33474 validation-auc:0.96387 validation-aucpr:0.97029
[32] validation-logloss:0.33052 validation-auc:0.96389 validation-aucpr:0.97029
[33] validation-logloss:0.32658 validation-auc:0.96375 validation-aucpr:0.97010
[34] validation-logloss:0.32243 validation-auc:0.96376 validation-aucpr:0.97013
[35] validation-logloss:0.31939 validation-auc:0.96354 validation-aucpr:0.96992
[36] validation-logloss:0.31576 validation-auc:0.96353 validation-aucpr:0.96985
[37] validation-logloss:0.31113 validation-auc:0.96393 validation-aucpr:0.97022
[38] validation-logloss:0.30638 validation-auc:0.96428 validation-aucpr:0.97051
[39] validation-logloss:0.30234 validation-auc:0.96429 validation-aucpr:0.97052
[40] validation-logloss:0.29918 validation-auc:0.96440 validation-aucpr:0.97055
[41] validation-logloss:0.29528 validation-auc:0.96468 validation-aucpr:0.97081
[42] validation-logloss:0.29164 validation-auc:0.96483 validation-aucpr:0.97096
[43] validation-logloss:0.28894 validation-auc:0.96478 validation-aucpr:0.97090
[44] validation-logloss:0.28677 validation-auc:0.96463 validation-aucpr:0.97075
{'best_iteration': '42', 'best_score': '0.9709633010271024'}
Trial 98, Fold 1: Log loss = 0.2867742175548801, Average precision = 0.9707397207243473, ROC-AUC = 0.9646308902462923, Elapsed Time = 0.839261200002511 seconds
Trial 98, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 98, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.67039 validation-auc:0.93017 validation-aucpr:0.91660
[1] validation-logloss:0.64871 validation-auc:0.95007 validation-aucpr:0.95231
[2] validation-logloss:0.62938 validation-auc:0.95374 validation-aucpr:0.95551
[3] validation-logloss:0.60925 validation-auc:0.95643 validation-aucpr:0.96082
[4] validation-logloss:0.59263 validation-auc:0.95691 validation-aucpr:0.96111
[5] validation-logloss:0.57408 validation-auc:0.95835 validation-aucpr:0.96276
[6] validation-logloss:0.55616 validation-auc:0.95922 validation-aucpr:0.96345
[7] validation-logloss:0.54222 validation-auc:0.95911 validation-aucpr:0.96359
[8] validation-logloss:0.52724 validation-auc:0.95964 validation-aucpr:0.96396
[9] validation-logloss:0.51176 validation-auc:0.96018 validation-aucpr:0.96461
[10] validation-logloss:0.49829 validation-auc:0.96090 validation-aucpr:0.96574
[11] validation-logloss:0.48504 validation-auc:0.96133 validation-aucpr:0.96633
[12] validation-logloss:0.47425 validation-auc:0.96134 validation-aucpr:0.96626
[13] validation-logloss:0.46392 validation-auc:0.96193 validation-aucpr:0.96661
[14] validation-logloss:0.45399 validation-auc:0.96206 validation-aucpr:0.96676
[15] validation-logloss:0.44308 validation-auc:0.96231 validation-aucpr:0.96696
[16] validation-logloss:0.43509 validation-auc:0.96218 validation-aucpr:0.96669
[17] validation-logloss:0.42479 validation-auc:0.96248 validation-aucpr:0.96685
[18] validation-logloss:0.41703 validation-auc:0.96264 validation-aucpr:0.96688
[19] validation-logloss:0.40777 validation-auc:0.96331 validation-aucpr:0.96742
[20] validation-logloss:0.39873 validation-auc:0.96391 validation-aucpr:0.96794
[21] validation-logloss:0.39183 validation-auc:0.96395 validation-aucpr:0.96772
[22] validation-logloss:0.38513 validation-auc:0.96396 validation-aucpr:0.96758
[23] validation-logloss:0.37734 validation-auc:0.96422 validation-aucpr:0.96783
[24] validation-logloss:0.36972 validation-auc:0.96465 validation-aucpr:0.96829
[25] validation-logloss:0.36281 validation-auc:0.96491 validation-aucpr:0.96845
[26] validation-logloss:0.35695 validation-auc:0.96501 validation-aucpr:0.96851
[27] validation-logloss:0.35008 validation-auc:0.96540 validation-aucpr:0.96893
[28] validation-logloss:0.34507 validation-auc:0.96559 validation-aucpr:0.96914
[29] validation-logloss:0.33898 validation-auc:0.96559 validation-aucpr:0.96921
[30] validation-logloss:0.33309 validation-auc:0.96577 validation-aucpr:0.96936
[31] validation-logloss:0.32756 validation-auc:0.96596 validation-aucpr:0.96954
[32] validation-logloss:0.32204 validation-auc:0.96629 validation-aucpr:0.96984
[33] validation-logloss:0.31816 validation-auc:0.96642 validation-aucpr:0.96991
[34] validation-logloss:0.31330 validation-auc:0.96659 validation-aucpr:0.97017
[35] validation-logloss:0.30971 validation-auc:0.96645 validation-aucpr:0.97010
[36] validation-logloss:0.30558 validation-auc:0.96661 validation-aucpr:0.97032
[37] validation-logloss:0.30148 validation-auc:0.96665 validation-aucpr:0.97034
[38] validation-logloss:0.29827 validation-auc:0.96668 validation-aucpr:0.97034
[39] validation-logloss:0.29526 validation-auc:0.96672 validation-aucpr:0.97040
[40] validation-logloss:0.29280 validation-auc:0.96673 validation-aucpr:0.97073
[41] validation-logloss:0.28926 validation-auc:0.96685 validation-aucpr:0.97076
[42] validation-logloss:0.28663 validation-auc:0.96683 validation-aucpr:0.97074
[43] validation-logloss:0.28313 validation-auc:0.96696 validation-aucpr:0.97083
[44] validation-logloss:0.27967 validation-auc:0.96714 validation-aucpr:0.97099
{'best_iteration': '44', 'best_score': '0.9709925090997215'}
Trial 98, Fold 2: Log loss = 0.2796652386858184, Average precision = 0.9709801210620819, ROC-AUC = 0.9671362855966161, Elapsed Time = 1.1067725000029895 seconds
Trial 98, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 98, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.66926 validation-auc:0.93339 validation-aucpr:0.91513
[1] validation-logloss:0.64911 validation-auc:0.94741 validation-aucpr:0.94176
[2] validation-logloss:0.62933 validation-auc:0.95574 validation-aucpr:0.95849
[3] validation-logloss:0.60876 validation-auc:0.95955 validation-aucpr:0.96526
[4] validation-logloss:0.59157 validation-auc:0.96022 validation-aucpr:0.96597
[5] validation-logloss:0.57577 validation-auc:0.95997 validation-aucpr:0.96580
[6] validation-logloss:0.55910 validation-auc:0.95986 validation-aucpr:0.96602
[7] validation-logloss:0.54490 validation-auc:0.96022 validation-aucpr:0.96609
[8] validation-logloss:0.52932 validation-auc:0.96093 validation-aucpr:0.96658
[9] validation-logloss:0.51402 validation-auc:0.96148 validation-aucpr:0.96713
[10] validation-logloss:0.50185 validation-auc:0.96141 validation-aucpr:0.96731
[11] validation-logloss:0.48753 validation-auc:0.96242 validation-aucpr:0.96822
[12] validation-logloss:0.47425 validation-auc:0.96278 validation-aucpr:0.96854
[13] validation-logloss:0.46183 validation-auc:0.96284 validation-aucpr:0.96874
[14] validation-logloss:0.45221 validation-auc:0.96289 validation-aucpr:0.96866
[15] validation-logloss:0.44315 validation-auc:0.96287 validation-aucpr:0.96860
[16] validation-logloss:0.43438 validation-auc:0.96334 validation-aucpr:0.96871
[17] validation-logloss:0.42426 validation-auc:0.96357 validation-aucpr:0.96901
[18] validation-logloss:0.41664 validation-auc:0.96364 validation-aucpr:0.96900
[19] validation-logloss:0.40722 validation-auc:0.96369 validation-aucpr:0.96917
[20] validation-logloss:0.39961 validation-auc:0.96397 validation-aucpr:0.96936
[21] validation-logloss:0.39028 validation-auc:0.96478 validation-aucpr:0.97005
[22] validation-logloss:0.38212 validation-auc:0.96487 validation-aucpr:0.97011
[23] validation-logloss:0.37402 validation-auc:0.96515 validation-aucpr:0.97047
[24] validation-logloss:0.36725 validation-auc:0.96515 validation-aucpr:0.97060
[25] validation-logloss:0.36014 validation-auc:0.96541 validation-aucpr:0.97078
[26] validation-logloss:0.35408 validation-auc:0.96537 validation-aucpr:0.97081
[27] validation-logloss:0.34867 validation-auc:0.96557 validation-aucpr:0.97092
[28] validation-logloss:0.34361 validation-auc:0.96555 validation-aucpr:0.97088
[29] validation-logloss:0.33740 validation-auc:0.96588 validation-aucpr:0.97111
[30] validation-logloss:0.33286 validation-auc:0.96584 validation-aucpr:0.97101
[31] validation-logloss:0.32842 validation-auc:0.96589 validation-aucpr:0.97108
[32] validation-logloss:0.32427 validation-auc:0.96607 validation-aucpr:0.97120
[33] validation-logloss:0.32029 validation-auc:0.96605 validation-aucpr:0.97115
[34] validation-logloss:0.31545 validation-auc:0.96607 validation-aucpr:0.97121
[35] validation-logloss:0.31171 validation-auc:0.96607 validation-aucpr:0.97122
[36] validation-logloss:0.30840 validation-auc:0.96601 validation-aucpr:0.97114
[37] validation-logloss:0.30413 validation-auc:0.96618 validation-aucpr:0.97128
[38] validation-logloss:0.29924 validation-auc:0.96633 validation-aucpr:0.97141
[39] validation-logloss:0.29613 validation-auc:0.96648 validation-aucpr:0.97149
[40] validation-logloss:0.29314 validation-auc:0.96653 validation-aucpr:0.97156
[41] validation-logloss:0.28938 validation-auc:0.96647 validation-aucpr:0.97153
[42] validation-logloss:0.28565 validation-auc:0.96650 validation-aucpr:0.97153
[43] validation-logloss:0.28295 validation-auc:0.96672 validation-aucpr:0.97172
[44] validation-logloss:0.27939 validation-auc:0.96678 validation-aucpr:0.97179
{'best_iteration': '44', 'best_score': '0.9717887583922987'}
Trial 98, Fold 3: Log loss = 0.2793851269635118, Average precision = 0.9717884446967304, ROC-AUC = 0.9667844411832689, Elapsed Time = 1.0817418000006 seconds
Trial 98, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 98, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.66794 validation-auc:0.93771 validation-aucpr:0.92727
[1] validation-logloss:0.64542 validation-auc:0.95095 validation-aucpr:0.95402
[2] validation-logloss:0.62701 validation-auc:0.95216 validation-aucpr:0.95924
[3] validation-logloss:0.60977 validation-auc:0.95322 validation-aucpr:0.96004
[4] validation-logloss:0.59292 validation-auc:0.95518 validation-aucpr:0.96135
[5] validation-logloss:0.57694 validation-auc:0.95516 validation-aucpr:0.96170
[6] validation-logloss:0.55957 validation-auc:0.95681 validation-aucpr:0.96365
[7] validation-logloss:0.54466 validation-auc:0.95690 validation-aucpr:0.96394
[8] validation-logloss:0.53135 validation-auc:0.95738 validation-aucpr:0.96429
[9] validation-logloss:0.51857 validation-auc:0.95879 validation-aucpr:0.96504
[10] validation-logloss:0.50427 validation-auc:0.95977 validation-aucpr:0.96613
[11] validation-logloss:0.49322 validation-auc:0.95992 validation-aucpr:0.96620
[12] validation-logloss:0.48257 validation-auc:0.95986 validation-aucpr:0.96613
[13] validation-logloss:0.47021 validation-auc:0.96038 validation-aucpr:0.96676
[14] validation-logloss:0.45778 validation-auc:0.96110 validation-aucpr:0.96735
[15] validation-logloss:0.44832 validation-auc:0.96162 validation-aucpr:0.96771
[16] validation-logloss:0.43948 validation-auc:0.96199 validation-aucpr:0.96799
[17] validation-logloss:0.43147 validation-auc:0.96180 validation-aucpr:0.96788
[18] validation-logloss:0.42168 validation-auc:0.96230 validation-aucpr:0.96841
[19] validation-logloss:0.41173 validation-auc:0.96261 validation-aucpr:0.96871
[20] validation-logloss:0.40307 validation-auc:0.96273 validation-aucpr:0.96892
[21] validation-logloss:0.39647 validation-auc:0.96250 validation-aucpr:0.96867
[22] validation-logloss:0.38995 validation-auc:0.96266 validation-aucpr:0.96884
[23] validation-logloss:0.38394 validation-auc:0.96266 validation-aucpr:0.96883
[24] validation-logloss:0.37825 validation-auc:0.96270 validation-aucpr:0.96882
[25] validation-logloss:0.37215 validation-auc:0.96281 validation-aucpr:0.96886
[26] validation-logloss:0.36671 validation-auc:0.96275 validation-aucpr:0.96879
[27] validation-logloss:0.35998 validation-auc:0.96295 validation-aucpr:0.96896
[28] validation-logloss:0.35288 validation-auc:0.96305 validation-aucpr:0.96911
[29] validation-logloss:0.34753 validation-auc:0.96306 validation-aucpr:0.96914
[30] validation-logloss:0.34088 validation-auc:0.96362 validation-aucpr:0.96962
[31] validation-logloss:0.33598 validation-auc:0.96379 validation-aucpr:0.96975
[32] validation-logloss:0.32998 validation-auc:0.96409 validation-aucpr:0.97001
[33] validation-logloss:0.32595 validation-auc:0.96421 validation-aucpr:0.97004
[34] validation-logloss:0.32041 validation-auc:0.96462 validation-aucpr:0.97039
[35] validation-logloss:0.31555 validation-auc:0.96467 validation-aucpr:0.97046
[36] validation-logloss:0.31219 validation-auc:0.96469 validation-aucpr:0.97041
[37] validation-logloss:0.30742 validation-auc:0.96497 validation-aucpr:0.97066
[38] validation-logloss:0.30273 validation-auc:0.96506 validation-aucpr:0.97074
[39] validation-logloss:0.29897 validation-auc:0.96497 validation-aucpr:0.97070
[40] validation-logloss:0.29483 validation-auc:0.96517 validation-aucpr:0.97084
[41] validation-logloss:0.29102 validation-auc:0.96527 validation-aucpr:0.97093
[42] validation-logloss:0.28748 validation-auc:0.96540 validation-aucpr:0.97104
[43] validation-logloss:0.28405 validation-auc:0.96550 validation-aucpr:0.97111
[44] validation-logloss:0.28180 validation-auc:0.96549 validation-aucpr:0.97117
{'best_iteration': '44', 'best_score': '0.9711655496123021'}
Trial 98, Fold 4: Log loss = 0.2818028283500285, Average precision = 0.9711585620081205, ROC-AUC = 0.965494088452065, Elapsed Time = 1.1227743000017654 seconds
Trial 98, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 98, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.66844 validation-auc:0.93178 validation-aucpr:0.90280
[... iterations 1-43 truncated: validation logloss falls steadily from 0.646 to 0.286 while AUC/AUCPR climb to ~0.964/0.969 ...]
[44] validation-logloss:0.28312 validation-auc:0.96394 validation-aucpr:0.96925
{'best_iteration': '44', 'best_score': '0.9692529772877284'}
Trial 98, Fold 5: Log loss = 0.2831176401223858, Average precision = 0.9692541584306964, ROC-AUC = 0.9639441131114951, Elapsed Time = 1.0847797999995237 seconds
Optimization Progress: 99%|#########9| 99/100 [4:01:39<00:50, 50.25s/it]
Trial 99, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 99, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[0] validation-logloss:0.68213 validation-auc:0.92188 validation-aucpr:0.91559
[... iterations 1-83 truncated: validation logloss falls steadily from 0.671 to 0.290 while AUC/AUCPR climb to ~0.9688/0.9733 ...]
[84] validation-logloss:0.28804 validation-auc:0.96880 validation-aucpr:0.97327
{'best_iteration': '83', 'best_score': '0.9732988522297122'}
Trial 99, Fold 1: Log loss = 0.28804067052732774, Average precision = 0.9732776156643419, ROC-AUC = 0.9688033222924822, Elapsed Time = 3.208025500000076 seconds
Trial 99, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 99, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[0] validation-logloss:0.68171 validation-auc:0.92325 validation-aucpr:0.91102
[... iterations 1-79 truncated: validation logloss falls steadily from 0.671 to 0.287 while AUC/AUCPR peak at ~0.9708/0.9738 around iteration 31 ...]
[80] validation-logloss:0.28532 validation-auc:0.97171 validation-aucpr:0.97347
{'best_iteration': '31', 'best_score': '0.973848252770876'}
Trial 99, Fold 2: Log loss = 0.28372156805296517, Average precision = 0.9737019709783394, ROC-AUC = 0.9716679297140677, Elapsed Time = 3.3942110999996657 seconds
Trial 99, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 99, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[0] validation-logloss:0.68185 validation-auc:0.91807 validation-aucpr:0.89452
[... iterations 1-83 truncated: validation logloss falls steadily from 0.669 to 0.287 while AUC/AUCPR climb to ~0.9717/0.9758 ...]
[84] validation-logloss:0.27746 validation-auc:0.97167 validation-aucpr:0.97575
{'best_iteration': '83', 'best_score': '0.9757566802452886'}
Trial 99, Fold 3: Log loss = 0.27745504517041203, Average precision = 0.9757531703086391, ROC-AUC = 0.9716715142876405, Elapsed Time = 3.325000699998782 seconds
Trial 99, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 99, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[0] validation-logloss:0.68215 validation-auc:0.90771 validation-aucpr:0.88290
[... iterations 1-83 truncated: validation logloss falls steadily from 0.671 to 0.289 while AUC/AUCPR climb to ~0.9699/0.9747 ...]
[84] validation-logloss:0.28049 validation-auc:0.96990 validation-aucpr:0.97471
{'best_iteration': '83', 'best_score': '0.9747392883128421'}
Trial 99, Fold 4: Log loss = 0.2804945606615627, Average precision = 0.9747112338846573, ROC-AUC = 0.9699017444341914, Elapsed Time = 3.4602881999999227 seconds
Trial 99, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 99, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[0] validation-logloss:0.68206 validation-auc:0.91551 validation-aucpr:0.89662
[1] validation-logloss:0.66970 validation-auc:0.95149 validation-aucpr:0.94137
[2] validation-logloss:0.65907 validation-auc:0.95651 validation-aucpr:0.95462
[3] validation-logloss:0.64886 validation-auc:0.95838 validation-aucpr:0.96078
[4] validation-logloss:0.63917 validation-auc:0.95898 validation-aucpr:0.96250
[5] validation-logloss:0.62949 validation-auc:0.95902 validation-aucpr:0.96220
[6] validation-logloss:0.61915 validation-auc:0.96088 validation-aucpr:0.96426
[7] validation-logloss:0.61081 validation-auc:0.96048 validation-aucpr:0.96355
[8] validation-logloss:0.60224 validation-auc:0.96072 validation-aucpr:0.96455
[9] validation-logloss:0.59219 validation-auc:0.96196 validation-aucpr:0.96575
[10] validation-logloss:0.58382 validation-auc:0.96249 validation-aucpr:0.96615
[11] validation-logloss:0.57577 validation-auc:0.96232 validation-aucpr:0.96606
[12] validation-logloss:0.56800 validation-auc:0.96246 validation-aucpr:0.96613
[13] validation-logloss:0.56054 validation-auc:0.96255 validation-aucpr:0.96627
[14] validation-logloss:0.55189 validation-auc:0.96330 validation-aucpr:0.96711
[15] validation-logloss:0.54496 validation-auc:0.96316 validation-aucpr:0.96718
[16] validation-logloss:0.53800 validation-auc:0.96303 validation-aucpr:0.96704
[17] validation-logloss:0.53004 validation-auc:0.96373 validation-aucpr:0.96731
[18] validation-logloss:0.52336 validation-auc:0.96385 validation-aucpr:0.96737
[19] validation-logloss:0.51584 validation-auc:0.96399 validation-aucpr:0.96752
[20] validation-logloss:0.50934 validation-auc:0.96426 validation-aucpr:0.96774
[21] validation-logloss:0.50343 validation-auc:0.96446 validation-aucpr:0.96807
[22] validation-logloss:0.49636 validation-auc:0.96461 validation-aucpr:0.96824
[23] validation-logloss:0.49038 validation-auc:0.96467 validation-aucpr:0.96825
[24] validation-logloss:0.48493 validation-auc:0.96469 validation-aucpr:0.96834
[25] validation-logloss:0.47837 validation-auc:0.96496 validation-aucpr:0.96877
[26] validation-logloss:0.47196 validation-auc:0.96552 validation-aucpr:0.96921
[27] validation-logloss:0.46670 validation-auc:0.96545 validation-aucpr:0.96912
[28] validation-logloss:0.46044 validation-auc:0.96583 validation-aucpr:0.96963
[29] validation-logloss:0.45532 validation-auc:0.96595 validation-aucpr:0.96974
[30] validation-logloss:0.45053 validation-auc:0.96587 validation-aucpr:0.96953
[31] validation-logloss:0.44472 validation-auc:0.96606 validation-aucpr:0.96970
[32] validation-logloss:0.44022 validation-auc:0.96602 validation-aucpr:0.96967
[33] validation-logloss:0.43583 validation-auc:0.96593 validation-aucpr:0.96959
[34] validation-logloss:0.43155 validation-auc:0.96600 validation-aucpr:0.96966
[35] validation-logloss:0.42633 validation-auc:0.96622 validation-aucpr:0.96987
[36] validation-logloss:0.42200 validation-auc:0.96631 validation-aucpr:0.96994
[37] validation-logloss:0.41805 validation-auc:0.96632 validation-aucpr:0.96993
[38] validation-logloss:0.41418 validation-auc:0.96630 validation-aucpr:0.96992
[39] validation-logloss:0.40949 validation-auc:0.96625 validation-aucpr:0.96997
[40] validation-logloss:0.40601 validation-auc:0.96609 validation-aucpr:0.96981
[41] validation-logloss:0.40234 validation-auc:0.96609 validation-aucpr:0.96981
[42] validation-logloss:0.39867 validation-auc:0.96611 validation-aucpr:0.96983
[43] validation-logloss:0.39499 validation-auc:0.96624 validation-aucpr:0.96992
[44] validation-logloss:0.39069 validation-auc:0.96636 validation-aucpr:0.97005
[45] validation-logloss:0.38723 validation-auc:0.96638 validation-aucpr:0.97005
[46] validation-logloss:0.38304 validation-auc:0.96652 validation-aucpr:0.97018
[47] validation-logloss:0.37892 validation-auc:0.96661 validation-aucpr:0.97027
[48] validation-logloss:0.37599 validation-auc:0.96651 validation-aucpr:0.97017
[49] validation-logloss:0.37203 validation-auc:0.96671 validation-aucpr:0.97034
[50] validation-logloss:0.36886 validation-auc:0.96676 validation-aucpr:0.97040
[51] validation-logloss:0.36573 validation-auc:0.96677 validation-aucpr:0.97039
[52] validation-logloss:0.36281 validation-auc:0.96677 validation-aucpr:0.97035
[53] validation-logloss:0.35980 validation-auc:0.96683 validation-aucpr:0.97041
[54] validation-logloss:0.35621 validation-auc:0.96693 validation-aucpr:0.97052
[55] validation-logloss:0.35350 validation-auc:0.96688 validation-aucpr:0.97049
[56] validation-logloss:0.35120 validation-auc:0.96681 validation-aucpr:0.97039
[57] validation-logloss:0.34854 validation-auc:0.96686 validation-aucpr:0.97054
[58] validation-logloss:0.34523 validation-auc:0.96701 validation-aucpr:0.97067
[59] validation-logloss:0.34197 validation-auc:0.96711 validation-aucpr:0.97075
[60] validation-logloss:0.33902 validation-auc:0.96710 validation-aucpr:0.97074
[61] validation-logloss:0.33671 validation-auc:0.96713 validation-aucpr:0.97069
[62] validation-logloss:0.33426 validation-auc:0.96721 validation-aucpr:0.97074
[63] validation-logloss:0.33216 validation-auc:0.96723 validation-aucpr:0.97145
[64] validation-logloss:0.32993 validation-auc:0.96726 validation-aucpr:0.97147
[65] validation-logloss:0.32699 validation-auc:0.96733 validation-aucpr:0.97154
[66] validation-logloss:0.32485 validation-auc:0.96736 validation-aucpr:0.97156
[67] validation-logloss:0.32262 validation-auc:0.96738 validation-aucpr:0.97158
[68] validation-logloss:0.32065 validation-auc:0.96732 validation-aucpr:0.97154
[69] validation-logloss:0.31869 validation-auc:0.96733 validation-aucpr:0.97153
[70] validation-logloss:0.31662 validation-auc:0.96731 validation-aucpr:0.97151
[71] validation-logloss:0.31391 validation-auc:0.96743 validation-aucpr:0.97162
[72] validation-logloss:0.31198 validation-auc:0.96741 validation-aucpr:0.97160
[73] validation-logloss:0.31024 validation-auc:0.96735 validation-aucpr:0.97155
[74] validation-logloss:0.30853 validation-auc:0.96733 validation-aucpr:0.97152
[75] validation-logloss:0.30668 validation-auc:0.96735 validation-aucpr:0.97153
[76] validation-logloss:0.30431 validation-auc:0.96742 validation-aucpr:0.97160
[77] validation-logloss:0.30184 validation-auc:0.96755 validation-aucpr:0.97172
[78] validation-logloss:0.30006 validation-auc:0.96752 validation-aucpr:0.97171
[79] validation-logloss:0.29835 validation-auc:0.96753 validation-aucpr:0.97170
[80] validation-logloss:0.29670 validation-auc:0.96756 validation-aucpr:0.97172
[81] validation-logloss:0.29449 validation-auc:0.96763 validation-aucpr:0.97179
[82] validation-logloss:0.29248 validation-auc:0.96771 validation-aucpr:0.97190
[83] validation-logloss:0.29091 validation-auc:0.96777 validation-aucpr:0.97193
[84] validation-logloss:0.28872 validation-auc:0.96794 validation-aucpr:0.97207
{'best_iteration': '84', 'best_score': '0.9720716435454212'}
Trial 99, Fold 5: Log loss = 0.28871626715298626, Average precision = 0.9720776131216773, ROC-AUC = 0.9679444103821786, Elapsed Time = 3.337888300000486 seconds
Optimization Progress: 100%|##########| 100/100 [4:02:03<00:00, 42.55s/it]
Optuna Optimization Elapsed Time: 14523.8524339 seconds
Training with Best Trial 65
Full_params: {'objective': 'binary:logistic', 'device': 'cpu', 'verbosity': 1, 'validate_parameters': True, 'eval_metric': ['logloss', 'auc', 'aucpr'], 'seed': 42, 'max_depth': 0, 'sampling_method': 'uniform', 'num_parallel_tree': 1, 'booster': 'gbtree', 'eta': 0.08759379181951674, 'gamma': 4.671021255322478e-06, 'min_child_weight': 2.6427694911817074e-05, 'max_delta_step': 65.5415383777442, 'subsample': 0.5614298448675313, 'colsample_bytree': 0.8129876656630327, 'colsample_bylevel': 0.9184442500782993, 'colsample_bynode': 0.6255082544056032, 'lambda': 7.025644820629422, 'alpha': 0.03727945540630586, 'tree_method': 'auto', 'scale_pos_weight': 1, 'grow_policy': 'depthwise', 'max_leaves': 128, 'max_bin': 47, 'num_boost_round': 97}
Training Elapsed Time: 2.7084125000001222 seconds
Log loss: (Train) 0.18729623942177334 vs (Test) 0.1927517325225934
PR-AUC: (Train) 0.976406633670879 vs (Test) 0.975962677739389
ROC-AUC: (Train) 0.9731661086371626 vs (Test) 0.9725283483506415
Training with Best Trial 76
Full_params: {'objective': 'binary:logistic', 'device': 'cpu', 'verbosity': 1, 'validate_parameters': True, 'eval_metric': ['logloss', 'auc', 'aucpr'], 'seed': 42, 'max_depth': 0, 'sampling_method': 'uniform', 'num_parallel_tree': 1, 'booster': 'gbtree', 'eta': 0.06489083823042531, 'gamma': 1.733278388607598, 'min_child_weight': 0.0015768729934482134, 'max_delta_step': 37.67348822307136, 'subsample': 0.8942728317991482, 'colsample_bytree': 0.570634316092556, 'colsample_bylevel': 0.8376499027022023, 'colsample_bynode': 0.9166953679419032, 'lambda': 0.005622055563920644, 'alpha': 4.06229817623886e-08, 'tree_method': 'hist', 'scale_pos_weight': 1, 'grow_policy': 'depthwise', 'max_leaves': 196, 'max_bin': 92, 'num_boost_round': 73}
Training Elapsed Time: 3.324780799997825 seconds
Log loss: (Train) 0.18865200821954037 vs (Test) 0.19412531851812243
PR-AUC: (Train) 0.9765158675564066 vs (Test) 0.9760041164203919
ROC-AUC: (Train) 0.9730942189230347 vs (Test) 0.9721837774893096
save_results(clf_name = "XGBClassifier",
best_trials = best_trials_xgb,
exec_time = exec_time_xgb,
lloss_auc_train = lloss_auc_train_xgb,
lloss_auc_test = lloss_auc_test_xgb,
df_metrics = df_metrics_xgb,
cm_final = cm_final_xgb,
cm_all = cm_xgb_all,
cm_labels = cm_labels_xgb_all)
Optuna with CatBoostClassifier¶
gc.collect();
# Aggregated loan level data
X_df = clean_df.drop(columns = ["target", "anon_ssn"])
# https://catboost.ai/docs/en/concepts/algorithm-missing-values-processing
# Replace <NA> with NaN
X_df = X_df.fillna(np.nan)
# Identify categorical columns
cat_cols = X_df.select_dtypes(include = "category").columns.tolist()
# Cast category columns to str; missing NaN becomes the literal string "nan" (object dtype), which CatBoost treats as its own category
for col in cat_cols:
X_df[col] = X_df[col].astype(str)
# Convert <NA> in nullable boolean and nullable Int columns to float NaN
X_df = X_df.apply(lambda x: x.map(lambda z: np.nan if pd.isna(z) else z))
y_df = clean_df.target
anon_ssn = clean_df.anon_ssn;
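The dtype handling above is subtle, so here is a minimal sketch on a toy frame (hypothetical columns, not from `clean_df`) showing what each step produces: category values become plain strings (missing entries turn into the token `"nan"`), while nullable `<NA>` values become float `NaN`.

```python
import numpy as np
import pandas as pd

toy = pd.DataFrame({
    "cat_col": pd.Categorical(["a", None, "b"]),          # category dtype with a missing value
    "int_col": pd.array([1, pd.NA, 3], dtype="Int64"),    # nullable integer dtype
})

# Categorical -> plain strings; missing values become the literal token "nan"
toy["cat_col"] = toy["cat_col"].astype(str)

# Nullable <NA> -> float NaN so numeric missing-value handling applies
toy = toy.apply(lambda s: s.map(lambda z: np.nan if pd.isna(z) else z))

print(toy["cat_col"].tolist())  # ['a', 'nan', 'b']
```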
# A single 80%-20% train-test split using GroupShuffleSplit, grouped by anon_ssn so that no borrower appears in both sets
gss = GroupShuffleSplit(n_splits = 1, test_size = 0.2, random_state = 42)
train_idx, test_idx = next(gss.split(X_df, y_df, groups = anon_ssn))
X_train, X_test = X_df.iloc[train_idx], X_df.iloc[test_idx]
y_train, y_test = y_df.iloc[train_idx], y_df.iloc[test_idx]
# Keep track of anon_ssn for cross-validation
anon_ssn_train = anon_ssn[train_idx]
del X_df, y_df, gss, train_idx, test_idx;
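As a quick sanity check on the split logic, a sketch with synthetic data (all names illustrative) confirms the guarantee GroupShuffleSplit provides: every group lands entirely in train or entirely in test.

```python
import numpy as np
from sklearn.model_selection import GroupShuffleSplit

rng = np.random.default_rng(42)
X = rng.normal(size=(100, 3))
y = rng.integers(0, 2, size=100)
groups = rng.integers(0, 20, size=100)  # 20 synthetic "borrowers"

gss = GroupShuffleSplit(n_splits=1, test_size=0.2, random_state=42)
train_idx, test_idx = next(gss.split(X, y, groups=groups))

# No group may appear on both sides of the split
overlap = set(groups[train_idx]) & set(groups[test_idx])
print(f"shared groups: {len(overlap)}")
```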
# Define the objective function
def objective(trial):
# Error: bayesian bootstrap doesn't support 'subsample' option
bootstrap_type = trial.suggest_categorical("bootstrap_type", ["Bayesian", "Bernoulli",
#"Poisson" # CPU doesn't support
]
)
if bootstrap_type == "Bayesian":
# Do not suggest subsample for Bayesian
subsample = None
else:
subsample = trial.suggest_float("subsample", 1e-1, 1e0)
grow_policy = trial.suggest_categorical("grow_policy", ["SymmetricTree",
"Lossguide", # Only for 'Ordered' and CPU
"Depthwise",
#"Region" # GrowPolicy Region is unimplemented for CPU
]
)
# Conditionally set sampling_frequency based on grow_policy
if grow_policy == "Lossguide":
sampling_frequency = "PerTree"
else:
sampling_frequency = trial.suggest_categorical("sampling_frequency", ["PerTree", "PerTreeLevel"])
posterior_sampling = trial.suggest_categorical("posterior_sampling", [True, False]) # CPU only
if posterior_sampling:
model_shrink_mode = "Constant"
else:
model_shrink_mode = trial.suggest_categorical("model_shrink_mode", ["Constant", "Decreasing"]) # CPU only
# https://catboost.ai/docs/en/references/training-parameters/
# https://catboost.ai/docs/en/concepts/python-reference_catboostclassifier
# https://catboost.ai/docs/en/concepts/speed-up-training
params = {"task_type": "CPU",
"objective": "Logloss",
"eval_metric": "Logloss",
"custom_metric": ["AUC", "PRAUC"],
"iterations": trial.suggest_int("iterations", 5, 100),
"learning_rate": trial.suggest_float("learning_rate", 1e-2, 1e-1, log = True),
"random_seed": seed,
"l2_leaf_reg": trial.suggest_float("l2_leaf_reg", 1e-8, 1e1, log = True),
"bootstrap_type": bootstrap_type,
"subsample": subsample,
"sampling_frequency": sampling_frequency,
"random_strength": trial.suggest_float("random_strength", 1e-2, 10.0, log = True),
"depth": trial.suggest_int("depth", 1, 16),
"grow_policy": grow_policy,
"min_data_in_leaf": trial.suggest_int("min_data_in_leaf", 5, 200), # Minimum data in one leaf
"has_time": trial.suggest_categorical("has_time", [True, False]),
"rsm": trial.suggest_float("rsm", 1e-1, 1e0),
"leaf_estimation_method": trial.suggest_categorical("leaf_estimation_method", ["Newton", "Gradient"]),
"leaf_estimation_backtracking": trial.suggest_categorical("leaf_estimation_backtracking", ["No", "AnyImprovement"]),
"fold_len_multiplier": trial.suggest_float("fold_len_multiplier", 1e0, 1e1),
"approx_on_full_history": False, # Can't use approx-on-full-history with Plain boosting-type
"auto_class_weights": trial.suggest_categorical("auto_class_weights", ["None", "Balanced", "SqrtBalanced"]),
"boosting_type": "Plain", # for CPU
"boost_from_average": trial.suggest_categorical("boost_from_average", [True, False]),
"posterior_sampling": posterior_sampling,
"allow_const_label": trial.suggest_categorical("allow_const_label", [True, False]),
"score_function": trial.suggest_categorical("score_function", ["Cosine", "L2", # CPU only
]
),
"model_shrink_mode": model_shrink_mode,
"thread_count": -1,
"border_count": trial.suggest_int("border_count", 40, 255),
"verbose": True, # Keeps live per-iteration training logs visible in the output cell
"allow_writing_files": False # Avoid CatBoost "Error 32" on Windows (file-lock in catboost_info); disables writing into catboost_info/ (so no file lock issues)
}
# Ensure max_leaves (# Number of leaves in one tree) is only used when grow_policy is Lossguide
if params["grow_policy"] == "Lossguide":
params["max_leaves"] = trial.suggest_int("max_leaves", 2, 256)
sgkf = StratifiedGroupKFold(n_splits = 5, shuffle = True, random_state = seed)
lloss_scores, pr_auc_scores, roc_auc_scores = [], [], []
for fold_idx, (train_index, valid_index) in enumerate(sgkf.split(X_train, y_train, groups = anon_ssn_train), start = 1):
# Split data into training and validation sets
X_train_fold, X_valid_fold = X_train.iloc[train_index], X_train.iloc[valid_index]
y_train_fold, y_valid_fold = y_train.iloc[train_index], y_train.iloc[valid_index]
# Summarize the composition of classes in the train and validation sets
train_0, train_1 = len(y_train_fold[y_train_fold == 0]), len(y_train_fold[y_train_fold == 1])
valid_0, valid_1 = len(y_valid_fold[y_valid_fold == 0]), len(y_valid_fold[y_valid_fold == 1])
print(f'Trial {trial.number}, Fold {fold_idx}: Train size = {len(train_index)} where 0 = {train_0}, 1 = {train_1}, 0/1 = {train_0/train_1}')
print(f'Trial {trial.number}, Fold {fold_idx}: Validation size = {len(valid_index)} where 0 = {valid_0}, 1 = {valid_1}, 0/1 = {valid_0/valid_1}')
# Create Pool objects for training and validation
train_pool_fold = Pool(X_train_fold, y_train_fold, cat_features = cat_cols)
valid_pool_fold = Pool(X_valid_fold, y_valid_fold, cat_features = cat_cols)
start_fold = time.perf_counter()
clf = CatBoostClassifier(**params).fit(train_pool_fold, eval_set = valid_pool_fold)
end_fold = time.perf_counter()
# Get predicted probabilities for the positive class
y_prob_fold = clf.predict_proba(X_valid_fold)[:, 1]
y_pred_fold = clf.predict(X_valid_fold) # Class prediction
# Compute the fold metrics once, then log and store them
fold_lloss = log_loss(y_valid_fold, y_prob_fold)
fold_pr_auc = average_precision_score(y_valid_fold, y_prob_fold)
fold_roc_auc = roc_auc_score(y_valid_fold, y_prob_fold)
print(f'Trial {trial.number}, Fold {fold_idx}: '
f'Log loss = {fold_lloss}, '
f'Average precision = {fold_pr_auc}, '
f'ROC-AUC = {fold_roc_auc}, '
f'Elapsed Time = {end_fold - start_fold} seconds')
lloss_scores.append(fold_lloss)
pr_auc_scores.append(fold_pr_auc)
roc_auc_scores.append(fold_roc_auc)
del X_train_fold, X_valid_fold, y_train_fold, y_valid_fold, train_pool_fold, valid_pool_fold, clf, start_fold, end_fold
gc.collect()
mean_lloss = np.mean(lloss_scores)
mean_pr_auc = np.mean(pr_auc_scores)
mean_roc_auc = np.mean(roc_auc_scores)
del lloss_scores, pr_auc_scores, roc_auc_scores
gc.collect()
return mean_lloss, mean_pr_auc, mean_roc_auc
trial_progress = tqdm(total = n_trials, desc = "Optimization Progress", leave = True,
ascii = True, # Plain text mode
dynamic_ncols = True # Auto-fit width
)
def update_progress(study_cbc, trial):
trial_progress.update(1)
# Disable Optuna's stdout handler so notebook isn’t spammed
optuna.logging.disable_default_handler()
# Enable propagation to Python’s logging
optuna.logging.enable_propagation()
optuna.logging.set_verbosity(optuna.logging.DEBUG)
# Configure Python logging
logging.basicConfig(filename = "optuna_debug_CatBoostClassifier.log", filemode = "w", level = logging.DEBUG, format="%(asctime)s %(levelname)s %(message)s")
study_cbc = optuna.create_study(study_name = "Optuna for CatBoostClassifier",
directions = ["minimize", "maximize", "maximize"],
sampler = module.AutoSampler(seed = seed)
)
start_optuna = time.perf_counter()
study_cbc.optimize(objective, n_trials = n_trials, n_jobs = 1, callbacks = [update_progress])
end_optuna = time.perf_counter()
print(f'Optuna Optimization Elapsed Time: {end_optuna - start_optuna} seconds')
gc.collect()
fig = plot_pareto_front(study_cbc, target_names = ["Log loss", "PR-AUC", "ROC-AUC"])
fig.update_layout(width = 900, height = 400)
fig.show()
trial_progress.close()
# Plot optimization history for each objective
metrics = ["Log loss", "PR-AUC", "ROC-AUC"]
for i, obj in enumerate(metrics):
optuna.visualization.plot_optimization_history(study_cbc,
target = lambda t, i = i: t.values[i], # Bind i at definition time so each plot targets its own objective
target_name = obj).show()
best_trials = study_cbc.best_trials
best_trials_cbc = {}
exec_time_cbc, lloss_auc_train_cbc, lloss_auc_test_cbc, all_metrics = [],[], [], []
cm_cbc_all, cm_labels_cbc_all = [],[]
# Prepare the data - 80% training set and 20% test set
train_pool_all = Pool(X_train, y_train, cat_features = cat_cols)
test_pool_all = Pool(X_test, y_test, cat_features = cat_cols)
for i, trial in enumerate(best_trials):
display(Markdown(f"<span style = 'font-size: 18px; font-weight: bold;'> Training with Best Trial {trial.number} </span>"))
best_params = trial.params
display(HTML(repr(best_params)))
# Non-optimized and best Optuna optimized parameters
full_params = {"objective": "Logloss",
"eval_metric": "Logloss",
"custom_metric": ["AUC", "PRAUC"],
"random_seed": seed,
"thread_count": -1,
"verbose": True,
"allow_writing_files": False,
**best_params
}
print("Full_params:", full_params)
best_trials_cbc[trial.number] = full_params
# https://catboost.ai/docs/en/references/training-parameters/common#auto_class_weights
# https://catboost.ai/docs/en/references/training-parameters/performance
final_cbc = CatBoostClassifier(**full_params)
start_train = time.perf_counter()
final_cbc.fit(train_pool_all)
end_train = time.perf_counter()
print(f'Training Elapsed Time: {end_train - start_train} seconds')
y_prob_all = final_cbc.predict_proba(X_test)[:, 1]
y_pred_all = final_cbc.predict(X_test)
print(f'Log loss: (Train) {trial.values[0]} vs (Test) {log_loss(y_test, y_prob_all)}')
print(f'PR-AUC: (Train) {trial.values[1]} vs (Test) {average_precision_score(y_test, y_prob_all)}')
print(f'ROC-AUC: (Train) {trial.values[2]} vs (Test) {roc_auc_score(y_test, y_prob_all)}')
exec_time_cbc.append({"Classifier": "CatBoostClassifier",
"Best Trial": trial.number,
"Optimization Elapsed Time (s)": end_optuna - start_optuna,
"Training Elapsed Time (s)": end_train - start_train})
lloss_auc_train_cbc.append({"Classifier": "CatBoostClassifier",
"Best Trial": trial.number,
"Set": "Training",
"Log loss": trial.values[0],
"PR-AUC": trial.values[1],
"ROC-AUC": trial.values[2]})
lloss_auc_test_cbc.append({"Classifier": "CatBoostClassifier",
"Best Trial": trial.number,
"Set": "Test",
"Log loss": log_loss(y_test, y_prob_all),
"PR-AUC": average_precision_score(y_test, y_prob_all),
"ROC-AUC": roc_auc_score(y_test, y_prob_all)})
report = classification_report(y_test, y_pred_all, target_names = ["Safe", "Risky"], output_dict = True)
all_metrics.append({"Classifier": "CatBoostClassifier",
"Trial": trial.number,
"Accuracy": accuracy_score(y_test, y_pred_all),
"Precision (Safe)": report["Safe"]["precision"],
"Recall (Safe)": report["Safe"]["recall"],
"F1-score (Safe)": report["Safe"]["f1-score"],
"Precision (Risky)": report["Risky"]["precision"],
"Recall (Risky)": report["Risky"]["recall"],
"F1-score (Risky)": report["Risky"]["f1-score"],
"Precision (Macro avg)": report["macro avg"]["precision"],
"Recall (Macro avg)": report["macro avg"]["recall"],
"F1-score (Macro avg)": report["macro avg"]["f1-score"],
"Precision (Weighted avg)": report["weighted avg"]["precision"],
"Recall (Weighted avg)": report["weighted avg"]["recall"],
"F1-score (Weighted avg)": report["weighted avg"]["f1-score"]})
# Store confusion matrix
cm_final_cbc = confusion_matrix(y_test, y_pred_all)
cm_cbc_all.append(cm_final_cbc)
cm_labels_cbc_all.append(f'CatBoostClassifier Confusion Matrix for Best Trial {trial.number}') # Store label for subplots
df_metrics_cbc = pd.DataFrame(all_metrics)
gc.collect();
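The metrics table above is built by indexing into the dict that `classification_report(..., output_dict=True)` returns: each target name maps to its precision/recall/f1-score/support, alongside "accuracy", "macro avg", and "weighted avg" entries. A small sketch with toy labels shows the structure being consumed:

```python
from sklearn.metrics import classification_report

y_true = [0, 0, 1, 1, 1]
y_pred = [0, 1, 1, 1, 0]

report = classification_report(y_true, y_pred,
                               target_names=["Safe", "Risky"],
                               output_dict=True)

print(sorted(report.keys()))
print(report["Risky"]["recall"])  # 2 of the 3 risky cases were recovered
```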
Optimization Progress: 0%| | 0/100 [00:00<?, ?it/s]
Trial 0, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 0, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
0: learn: 0.6749843 test: 0.6749868 best: 0.6749868 (0) total: 427ms remaining: 8.54s
1: learn: 0.6566303 test: 0.6566865 best: 0.6566865 (1) total: 639ms remaining: 6.07s
2: learn: 0.6411094 test: 0.6411880 best: 0.6411880 (2) total: 906ms remaining: 5.43s
3: learn: 0.6222098 test: 0.6219594 best: 0.6219594 (3) total: 1.16s remaining: 4.92s
4: learn: 0.6095943 test: 0.6094700 best: 0.6094700 (4) total: 1.36s remaining: 4.37s
5: learn: 0.5962167 test: 0.5963761 best: 0.5963761 (5) total: 1.61s remaining: 4.03s
6: learn: 0.5836222 test: 0.5838488 best: 0.5838488 (6) total: 1.86s remaining: 3.72s
7: learn: 0.5704152 test: 0.5706220 best: 0.5706220 (7) total: 2.12s remaining: 3.44s
8: learn: 0.5562966 test: 0.5564756 best: 0.5564756 (8) total: 2.31s remaining: 3.08s
9: learn: 0.5448487 test: 0.5450990 best: 0.5450990 (9) total: 2.5s remaining: 2.75s
10: learn: 0.5335356 test: 0.5336698 best: 0.5336698 (10) total: 2.69s remaining: 2.45s
11: learn: 0.5222337 test: 0.5223484 best: 0.5223484 (11) total: 2.89s remaining: 2.17s
12: learn: 0.5117320 test: 0.5119540 best: 0.5119540 (12) total: 3.12s remaining: 1.92s
13: learn: 0.5072429 test: 0.5074012 best: 0.5074012 (13) total: 3.31s remaining: 1.66s
14: learn: 0.5001136 test: 0.5004308 best: 0.5004308 (14) total: 3.56s remaining: 1.42s
15: learn: 0.4888222 test: 0.4892473 best: 0.4892473 (15) total: 3.79s remaining: 1.19s
16: learn: 0.4802413 test: 0.4807960 best: 0.4807960 (16) total: 3.98s remaining: 937ms
17: learn: 0.4703788 test: 0.4709560 best: 0.4709560 (17) total: 4.19s remaining: 698ms
18: learn: 0.4640860 test: 0.4645814 best: 0.4645814 (18) total: 4.39s remaining: 462ms
19: learn: 0.4585899 test: 0.4592058 best: 0.4592058 (19) total: 4.63s remaining: 231ms
20: learn: 0.4498745 test: 0.4506160 best: 0.4506160 (20) total: 4.85s remaining: 0us
bestTest = 0.4506159933
bestIteration = 20
Trial 0, Fold 1: Log loss = 0.45061599328166135, Average precision = 0.9576893964281532, ROC-AUC = 0.9511977800002867, Elapsed Time = 6.380537100001675 seconds
Trial 0, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 0, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
0: learn: 0.6800830 test: 0.6798725 best: 0.6798725 (0) total: 215ms remaining: 4.29s
1: learn: 0.6683618 test: 0.6681444 best: 0.6681444 (1) total: 415ms remaining: 3.94s
2: learn: 0.6575797 test: 0.6575538 best: 0.6575538 (2) total: 667ms remaining: 4s
3: learn: 0.6419279 test: 0.6419122 best: 0.6419122 (3) total: 840ms remaining: 3.57s
4: learn: 0.6187766 test: 0.6189370 best: 0.6189370 (4) total: 1.06s remaining: 3.41s
5: learn: 0.5979403 test: 0.5981629 best: 0.5981629 (5) total: 1.28s remaining: 3.19s
6: learn: 0.5824561 test: 0.5828292 best: 0.5828292 (6) total: 1.51s remaining: 3.02s
7: learn: 0.5684660 test: 0.5688134 best: 0.5688134 (7) total: 1.75s remaining: 2.84s
8: learn: 0.5536580 test: 0.5542112 best: 0.5542112 (8) total: 1.96s remaining: 2.61s
9: learn: 0.5398503 test: 0.5404290 best: 0.5404290 (9) total: 2.19s remaining: 2.41s
10: learn: 0.5308526 test: 0.5315586 best: 0.5315586 (10) total: 2.39s remaining: 2.18s
11: learn: 0.5185952 test: 0.5193456 best: 0.5193456 (11) total: 2.64s remaining: 1.98s
12: learn: 0.5084936 test: 0.5095141 best: 0.5095141 (12) total: 2.89s remaining: 1.78s
13: learn: 0.4975025 test: 0.4987796 best: 0.4987796 (13) total: 3.08s remaining: 1.54s
14: learn: 0.4884608 test: 0.4896609 best: 0.4896609 (14) total: 3.31s remaining: 1.32s
15: learn: 0.4798213 test: 0.4810037 best: 0.4810037 (15) total: 3.49s remaining: 1.09s
16: learn: 0.4702601 test: 0.4712615 best: 0.4712615 (16) total: 3.69s remaining: 870ms
17: learn: 0.4645747 test: 0.4655337 best: 0.4655337 (17) total: 3.87s remaining: 644ms
18: learn: 0.4580701 test: 0.4591078 best: 0.4591078 (18) total: 4.08s remaining: 430ms
19: learn: 0.4515486 test: 0.4524031 best: 0.4524031 (19) total: 4.31s remaining: 215ms
20: learn: 0.4440383 test: 0.4451925 best: 0.4451925 (20) total: 4.52s remaining: 0us
bestTest = 0.4451924663
bestIteration = 20
Trial 0, Fold 2: Log loss = 0.44519246627841624, Average precision = 0.9594557994686177, ROC-AUC = 0.9531840630833753, Elapsed Time = 4.633024899998418 seconds
Trial 0, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 0, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
0: learn: 0.6732959 test: 0.6732722 best: 0.6732722 (0) total: 206ms remaining: 4.11s
1: learn: 0.6557122 test: 0.6558222 best: 0.6558222 (1) total: 406ms remaining: 3.85s
2: learn: 0.6400488 test: 0.6400794 best: 0.6400794 (2) total: 645ms remaining: 3.87s
3: learn: 0.6246623 test: 0.6246744 best: 0.6246744 (3) total: 838ms remaining: 3.56s
4: learn: 0.6103934 test: 0.6103715 best: 0.6103715 (4) total: 1.08s remaining: 3.44s
5: learn: 0.5925337 test: 0.5923637 best: 0.5923637 (5) total: 1.32s remaining: 3.31s
6: learn: 0.5739656 test: 0.5738763 best: 0.5738763 (6) total: 1.54s remaining: 3.08s
7: learn: 0.5614604 test: 0.5612742 best: 0.5612742 (7) total: 1.74s remaining: 2.83s
8: learn: 0.5434673 test: 0.5429382 best: 0.5429382 (8) total: 1.99s remaining: 2.65s
9: learn: 0.5324413 test: 0.5318961 best: 0.5318961 (9) total: 2.16s remaining: 2.38s
10: learn: 0.5236273 test: 0.5230635 best: 0.5230635 (10) total: 2.41s remaining: 2.19s
11: learn: 0.5139083 test: 0.5133198 best: 0.5133198 (11) total: 2.65s remaining: 1.98s
12: learn: 0.5060153 test: 0.5054326 best: 0.5054326 (12) total: 2.89s remaining: 1.78s
13: learn: 0.4982009 test: 0.4975340 best: 0.4975340 (13) total: 3.09s remaining: 1.54s
14: learn: 0.4910292 test: 0.4903469 best: 0.4903469 (14) total: 3.27s remaining: 1.31s
15: learn: 0.4797889 test: 0.4790524 best: 0.4790524 (15) total: 3.42s remaining: 1.07s
16: learn: 0.4746006 test: 0.4737270 best: 0.4737270 (16) total: 3.58s remaining: 843ms
17: learn: 0.4697465 test: 0.4688606 best: 0.4688606 (17) total: 3.76s remaining: 626ms
18: learn: 0.4631186 test: 0.4618613 best: 0.4618613 (18) total: 3.97s remaining: 418ms
19: learn: 0.4559915 test: 0.4547173 best: 0.4547173 (19) total: 4.13s remaining: 206ms
20: learn: 0.4526034 test: 0.4512378 best: 0.4512378 (20) total: 4.31s remaining: 0us
bestTest = 0.4512377735
bestIteration = 20
Trial 0, Fold 3: Log loss = 0.4512377735099868, Average precision = 0.9558717887144259, ROC-AUC = 0.950449904773141, Elapsed Time = 4.419320900000457 seconds
Trial 0, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 0, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
0: learn: 0.6736458 test: 0.6734979 best: 0.6734979 (0) total: 243ms remaining: 4.85s
1: learn: 0.6607966 test: 0.6606037 best: 0.6606037 (1) total: 407ms remaining: 3.87s
2: learn: 0.6435052 test: 0.6435020 best: 0.6435020 (2) total: 652ms remaining: 3.91s
3: learn: 0.6286524 test: 0.6286423 best: 0.6286423 (3) total: 824ms remaining: 3.5s
4: learn: 0.6180952 test: 0.6181924 best: 0.6181924 (4) total: 990ms remaining: 3.17s
5: learn: 0.6021332 test: 0.6022106 best: 0.6022106 (5) total: 1.2s remaining: 3s
6: learn: 0.5861475 test: 0.5861958 best: 0.5861958 (6) total: 1.44s remaining: 2.88s
7: learn: 0.5719001 test: 0.5720800 best: 0.5720800 (7) total: 1.7s remaining: 2.75s
8: learn: 0.5545450 test: 0.5550844 best: 0.5550844 (8) total: 1.91s remaining: 2.55s
9: learn: 0.5439840 test: 0.5445666 best: 0.5445666 (9) total: 2.1s remaining: 2.31s
10: learn: 0.5348864 test: 0.5355960 best: 0.5355960 (10) total: 2.32s remaining: 2.1s
11: learn: 0.5239347 test: 0.5246174 best: 0.5246174 (11) total: 2.54s remaining: 1.9s
12: learn: 0.5145042 test: 0.5150943 best: 0.5150943 (12) total: 2.72s remaining: 1.67s
13: learn: 0.5054846 test: 0.5060748 best: 0.5060748 (13) total: 2.96s remaining: 1.48s
14: learn: 0.4957169 test: 0.4963797 best: 0.4963797 (14) total: 3.13s remaining: 1.25s
15: learn: 0.4893386 test: 0.4898711 best: 0.4898711 (15) total: 3.38s remaining: 1.05s
16: learn: 0.4801160 test: 0.4805847 best: 0.4805847 (16) total: 3.57s remaining: 840ms
17: learn: 0.4722899 test: 0.4726789 best: 0.4726789 (17) total: 3.75s remaining: 624ms
18: learn: 0.4597518 test: 0.4602520 best: 0.4602520 (18) total: 3.96s remaining: 417ms
19: learn: 0.4527845 test: 0.4533538 best: 0.4533538 (19) total: 4.12s remaining: 206ms
20: learn: 0.4466671 test: 0.4472081 best: 0.4472081 (20) total: 4.35s remaining: 0us
bestTest = 0.4472080578
bestIteration = 20
Trial 0, Fold 4: Log loss = 0.4472080578107475, Average precision = 0.9574408749777041, ROC-AUC = 0.952917146529134, Elapsed Time = 4.461729999999079 seconds
Trial 0, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 0, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
0: learn: 0.6752415 test: 0.6759679 best: 0.6759679 (0) total: 202ms remaining: 4.03s
1: learn: 0.6621128 test: 0.6632393 best: 0.6632393 (1) total: 411ms remaining: 3.91s
2: learn: 0.6458959 test: 0.6472579 best: 0.6472579 (2) total: 659ms remaining: 3.96s
3: learn: 0.6301002 test: 0.6315856 best: 0.6315856 (3) total: 828ms remaining: 3.52s
4: learn: 0.6171512 test: 0.6190421 best: 0.6190421 (4) total: 1.06s remaining: 3.4s
5: learn: 0.6003249 test: 0.6025597 best: 0.6025597 (5) total: 1.29s remaining: 3.22s
6: learn: 0.5856578 test: 0.5880150 best: 0.5880150 (6) total: 1.54s remaining: 3.07s
7: learn: 0.5717806 test: 0.5740335 best: 0.5740335 (7) total: 1.72s remaining: 2.79s
8: learn: 0.5585630 test: 0.5611824 best: 0.5611824 (8) total: 1.93s remaining: 2.57s
9: learn: 0.5475056 test: 0.5501424 best: 0.5501424 (9) total: 2.14s remaining: 2.35s
10: learn: 0.5342650 test: 0.5370251 best: 0.5370251 (10) total: 2.39s remaining: 2.17s
11: learn: 0.5200379 test: 0.5231368 best: 0.5231368 (11) total: 2.63s remaining: 1.98s
12: learn: 0.5071831 test: 0.5101551 best: 0.5101551 (12) total: 2.88s remaining: 1.77s
13: learn: 0.5009780 test: 0.5040582 best: 0.5040582 (13) total: 3.06s remaining: 1.53s
14: learn: 0.4892587 test: 0.4927897 best: 0.4927897 (14) total: 3.28s remaining: 1.31s
15: learn: 0.4781244 test: 0.4818967 best: 0.4818967 (15) total: 3.52s remaining: 1.1s
16: learn: 0.4711806 test: 0.4752788 best: 0.4752788 (16) total: 3.7s remaining: 870ms
17: learn: 0.4623234 test: 0.4667260 best: 0.4667260 (17) total: 3.87s remaining: 645ms
18: learn: 0.4573321 test: 0.4618792 best: 0.4618792 (18) total: 4s remaining: 421ms
19: learn: 0.4503683 test: 0.4551646 best: 0.4551646 (19) total: 4.23s remaining: 211ms
20: learn: 0.4426500 test: 0.4475367 best: 0.4475367 (20) total: 4.38s remaining: 0us
bestTest = 0.4475367208
bestIteration = 20
Optimization Progress: 1%|1 | 1/100 [00:33<54:45, 33.18s/it]
Trial 0, Fold 5: Log loss = 0.44753672082994483, Average precision = 0.9531065401172362, ROC-AUC = 0.9474237407799639, Elapsed Time = 4.487887499999488 seconds
Trial 1, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 1, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[per-iteration CatBoost training log elided]
bestTest = 0.3957976797 bestIteration = 56
Trial 1, Fold 1: Log loss = 0.3959331328819855, Average precision = 0.9563936087724884, ROC-AUC = 0.9505523114093576, Elapsed Time = 1.4943979000017862 seconds
Trial 1, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 1, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[per-iteration CatBoost training log elided]
bestTest = 0.4170788521 bestIteration = 56
Trial 1, Fold 2: Log loss = 0.4171490212653478, Average precision = 0.9530273823259603, ROC-AUC = 0.9503425099265812, Elapsed Time = 1.588232499998412 seconds
Trial 1, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 1, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[per-iteration CatBoost training log elided]
bestTest = 0.3994721782 bestIteration = 56
Trial 1, Fold 3: Log loss = 0.39974561984306517, Average precision = 0.9582582800143149, ROC-AUC = 0.9555932537007252, Elapsed Time = 1.690450599999167 seconds
Trial 1, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 1, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[per-iteration CatBoost training log elided]
bestTest = 0.4226421495 bestIteration = 56
Trial 1, Fold 4: Log loss = 0.4226799843478286, Average precision = 0.9541335907356011, ROC-AUC = 0.9498205433592994, Elapsed Time = 1.8859120999986771 seconds
Trial 1, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 1, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[per-iteration CatBoost training log elided]
bestTest = 0.4035928781 bestIteration = 56
Trial 1, Fold 5: Log loss = 0.4036063243832434, Average precision = 0.9527932311333421, ROC-AUC = 0.9487819705330434, Elapsed Time = 1.6435883999984071 seconds
Optimization Progress: 2%|2 | 2/100 [00:49<37:38, 23.05s/it]
Trial 2, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 2, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[per-iteration CatBoost training log elided]
best: 0.3039803 (66) total: 11.4s remaining: 1.71s 67: learn: 0.2883662 test: 0.3020124 best: 0.3020124 (67) total: 11.6s remaining: 1.54s 68: learn: 0.2863271 test: 0.3001502 best: 0.3001502 (68) total: 11.8s remaining: 1.37s 69: learn: 0.2842113 test: 0.2982080 best: 0.2982080 (69) total: 11.9s remaining: 1.19s 70: learn: 0.2822013 test: 0.2963492 best: 0.2963492 (70) total: 12.1s remaining: 1.02s 71: learn: 0.2802272 test: 0.2946159 best: 0.2946159 (71) total: 12.3s remaining: 851ms 72: learn: 0.2780386 test: 0.2927888 best: 0.2927888 (72) total: 12.5s remaining: 683ms 73: learn: 0.2759209 test: 0.2908612 best: 0.2908612 (73) total: 12.6s remaining: 512ms 74: learn: 0.2740745 test: 0.2892156 best: 0.2892156 (74) total: 12.8s remaining: 342ms 75: learn: 0.2722139 test: 0.2875322 best: 0.2875322 (75) total: 13s remaining: 171ms 76: learn: 0.2704869 test: 0.2859927 best: 0.2859927 (76) total: 13.1s remaining: 0us bestTest = 0.2859926523 bestIteration = 76 Trial 2, Fold 1: Log loss = 0.2859926522625346, Average precision = 0.9734384342518939, ROC-AUC = 0.9685485104410128, Elapsed Time = 13.314232200002152 seconds Trial 2, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 2, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986 0: learn: 0.6802992 test: 0.6804238 best: 0.6804238 (0) total: 145ms remaining: 11s 1: learn: 0.6677669 test: 0.6680604 best: 0.6680604 (1) total: 291ms remaining: 10.9s 2: learn: 0.6553625 test: 0.6558837 best: 0.6558837 (2) total: 442ms remaining: 10.9s 3: learn: 0.6436023 test: 0.6442595 best: 0.6442595 (3) total: 583ms remaining: 10.6s 4: learn: 0.6316734 test: 0.6325418 best: 0.6325418 (4) total: 750ms remaining: 10.8s 5: learn: 0.6202066 test: 0.6213160 best: 0.6213160 (5) total: 914ms remaining: 10.8s 6: learn: 0.6089649 test: 0.6103025 best: 0.6103025 (6) total: 1.05s remaining: 10.5s 7: learn: 0.5983487 test: 0.5998462 best: 0.5998462 (7) total: 1.19s remaining: 10.3s 8: 
learn: 0.5879823 test: 0.5896032 best: 0.5896032 (8) total: 1.35s remaining: 10.2s 9: learn: 0.5778730 test: 0.5796045 best: 0.5796045 (9) total: 1.5s remaining: 10.1s 10: learn: 0.5680909 test: 0.5699317 best: 0.5699317 (10) total: 1.64s remaining: 9.87s 11: learn: 0.5583613 test: 0.5603965 best: 0.5603965 (11) total: 1.81s remaining: 9.8s 12: learn: 0.5491401 test: 0.5513133 best: 0.5513133 (12) total: 1.96s remaining: 9.65s 13: learn: 0.5401695 test: 0.5424714 best: 0.5424714 (13) total: 2.1s remaining: 9.43s 14: learn: 0.5312167 test: 0.5337572 best: 0.5337572 (14) total: 2.26s remaining: 9.35s 15: learn: 0.5226484 test: 0.5254532 best: 0.5254532 (15) total: 2.44s remaining: 9.31s 16: learn: 0.5146160 test: 0.5175101 best: 0.5175101 (16) total: 2.58s remaining: 9.09s 17: learn: 0.5064706 test: 0.5095674 best: 0.5095674 (17) total: 2.73s remaining: 8.93s 18: learn: 0.4985761 test: 0.5018810 best: 0.5018810 (18) total: 2.88s remaining: 8.78s 19: learn: 0.4910576 test: 0.4945997 best: 0.4945997 (19) total: 3.02s remaining: 8.62s 20: learn: 0.4835454 test: 0.4872538 best: 0.4872538 (20) total: 3.18s remaining: 8.47s 21: learn: 0.4766546 test: 0.4807054 best: 0.4807054 (21) total: 3.34s remaining: 8.34s 22: learn: 0.4696147 test: 0.4738380 best: 0.4738380 (22) total: 3.46s remaining: 8.12s 23: learn: 0.4626850 test: 0.4670515 best: 0.4670515 (23) total: 3.61s remaining: 7.98s 24: learn: 0.4559021 test: 0.4604131 best: 0.4604131 (24) total: 3.78s remaining: 7.87s 25: learn: 0.4493520 test: 0.4540406 best: 0.4540406 (25) total: 3.94s remaining: 7.72s 26: learn: 0.4430703 test: 0.4478685 best: 0.4478685 (26) total: 4.06s remaining: 7.53s 27: learn: 0.4369602 test: 0.4419693 best: 0.4419693 (27) total: 4.21s remaining: 7.36s 28: learn: 0.4310376 test: 0.4361409 best: 0.4361409 (28) total: 4.37s remaining: 7.23s 29: learn: 0.4250703 test: 0.4304681 best: 0.4304681 (29) total: 4.53s remaining: 7.1s 30: learn: 0.4194986 test: 0.4250504 best: 0.4250504 (30) total: 4.69s 
remaining: 6.96s 31: learn: 0.4139779 test: 0.4196276 best: 0.4196276 (31) total: 4.84s remaining: 6.8s 32: learn: 0.4087263 test: 0.4144611 best: 0.4144611 (32) total: 4.98s remaining: 6.64s 33: learn: 0.4036313 test: 0.4094046 best: 0.4094046 (33) total: 5.12s remaining: 6.48s 34: learn: 0.3984828 test: 0.4044550 best: 0.4044550 (34) total: 5.28s remaining: 6.34s 35: learn: 0.3937909 test: 0.3999379 best: 0.3999379 (35) total: 5.44s remaining: 6.2s 36: learn: 0.3891533 test: 0.3954107 best: 0.3954107 (36) total: 5.58s remaining: 6.04s 37: learn: 0.3845237 test: 0.3908775 best: 0.3908775 (37) total: 5.75s remaining: 5.9s 38: learn: 0.3802567 test: 0.3866371 best: 0.3866371 (38) total: 5.87s remaining: 5.72s 39: learn: 0.3758981 test: 0.3824065 best: 0.3824065 (39) total: 6.01s remaining: 5.56s 40: learn: 0.3716689 test: 0.3782613 best: 0.3782613 (40) total: 6.19s remaining: 5.43s 41: learn: 0.3673435 test: 0.3741154 best: 0.3741154 (41) total: 6.36s remaining: 5.3s 42: learn: 0.3635849 test: 0.3706369 best: 0.3706369 (42) total: 6.52s remaining: 5.16s 43: learn: 0.3595143 test: 0.3666548 best: 0.3666548 (43) total: 6.66s remaining: 4.99s 44: learn: 0.3556274 test: 0.3628430 best: 0.3628430 (44) total: 6.81s remaining: 4.85s 45: learn: 0.3524491 test: 0.3599522 best: 0.3599522 (45) total: 6.97s remaining: 4.7s 46: learn: 0.3487101 test: 0.3563514 best: 0.3563514 (46) total: 7.12s remaining: 4.54s 47: learn: 0.3451290 test: 0.3529257 best: 0.3529257 (47) total: 7.27s remaining: 4.39s 48: learn: 0.3416074 test: 0.3495597 best: 0.3495597 (48) total: 7.46s remaining: 4.26s 49: learn: 0.3382532 test: 0.3463617 best: 0.3463617 (49) total: 7.62s remaining: 4.12s 50: learn: 0.3349759 test: 0.3431778 best: 0.3431778 (50) total: 7.79s remaining: 3.97s 51: learn: 0.3318035 test: 0.3400893 best: 0.3400893 (51) total: 7.92s remaining: 3.81s 52: learn: 0.3287209 test: 0.3371564 best: 0.3371564 (52) total: 8.06s remaining: 3.65s 53: learn: 0.3256028 test: 0.3342041 best: 
0.3342041 (53) total: 8.21s remaining: 3.5s 54: learn: 0.3224571 test: 0.3311494 best: 0.3311494 (54) total: 8.37s remaining: 3.35s 55: learn: 0.3195305 test: 0.3283685 best: 0.3283685 (55) total: 8.51s remaining: 3.19s 56: learn: 0.3167853 test: 0.3257602 best: 0.3257602 (56) total: 8.66s remaining: 3.04s 57: learn: 0.3139636 test: 0.3230734 best: 0.3230734 (57) total: 8.82s remaining: 2.89s 58: learn: 0.3112017 test: 0.3204650 best: 0.3204650 (58) total: 8.97s remaining: 2.74s 59: learn: 0.3084012 test: 0.3177469 best: 0.3177469 (59) total: 9.13s remaining: 2.59s 60: learn: 0.3057680 test: 0.3152748 best: 0.3152748 (60) total: 9.28s remaining: 2.43s 61: learn: 0.3032990 test: 0.3129882 best: 0.3129882 (61) total: 9.43s remaining: 2.28s 62: learn: 0.3007196 test: 0.3105666 best: 0.3105666 (62) total: 9.58s remaining: 2.13s 63: learn: 0.2982947 test: 0.3082945 best: 0.3082945 (63) total: 9.74s remaining: 1.98s 64: learn: 0.2961092 test: 0.3061363 best: 0.3061363 (64) total: 9.86s remaining: 1.82s 65: learn: 0.2938981 test: 0.3041990 best: 0.3041990 (65) total: 10s remaining: 1.67s 66: learn: 0.2917545 test: 0.3022080 best: 0.3022080 (66) total: 10.2s remaining: 1.52s 67: learn: 0.2895628 test: 0.3001347 best: 0.3001347 (67) total: 10.4s remaining: 1.37s 68: learn: 0.2872912 test: 0.2980296 best: 0.2980296 (68) total: 10.5s remaining: 1.22s 69: learn: 0.2853264 test: 0.2961620 best: 0.2961620 (69) total: 10.7s remaining: 1.06s 70: learn: 0.2832790 test: 0.2942043 best: 0.2942043 (70) total: 10.8s remaining: 914ms 71: learn: 0.2811378 test: 0.2921425 best: 0.2921425 (71) total: 11s remaining: 763ms 72: learn: 0.2791569 test: 0.2902568 best: 0.2902568 (72) total: 11.1s remaining: 609ms 73: learn: 0.2772488 test: 0.2885083 best: 0.2885083 (73) total: 11.3s remaining: 457ms 74: learn: 0.2752813 test: 0.2866618 best: 0.2866618 (74) total: 11.4s remaining: 304ms 75: learn: 0.2735958 test: 0.2850857 best: 0.2850857 (75) total: 11.6s remaining: 152ms 76: learn: 0.2717767 
test: 0.2834597 best: 0.2834597 (76) total: 11.7s remaining: 0us bestTest = 0.2834596634 bestIteration = 76 Trial 2, Fold 2: Log loss = 0.28345966337626055, Average precision = 0.9743472051969249, ROC-AUC = 0.9711451585931281, Elapsed Time = 11.851431200000661 seconds Trial 2, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 2, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 0: learn: 0.6802592 test: 0.6803151 best: 0.6803151 (0) total: 170ms remaining: 13s 1: learn: 0.6676447 test: 0.6677462 best: 0.6677462 (1) total: 316ms remaining: 11.8s 2: learn: 0.6552994 test: 0.6555048 best: 0.6555048 (2) total: 471ms remaining: 11.6s 3: learn: 0.6433840 test: 0.6436023 best: 0.6436023 (3) total: 624ms remaining: 11.4s 4: learn: 0.6316158 test: 0.6319003 best: 0.6319003 (4) total: 785ms remaining: 11.3s 5: learn: 0.6203085 test: 0.6206607 best: 0.6206607 (5) total: 954ms remaining: 11.3s 6: learn: 0.6095564 test: 0.6100167 best: 0.6100167 (6) total: 1.1s remaining: 11s 7: learn: 0.5989344 test: 0.5994262 best: 0.5994262 (7) total: 1.26s remaining: 10.9s 8: learn: 0.5887212 test: 0.5893018 best: 0.5893018 (8) total: 1.41s remaining: 10.6s 9: learn: 0.5784499 test: 0.5791406 best: 0.5791406 (9) total: 1.56s remaining: 10.4s 10: learn: 0.5688528 test: 0.5695555 best: 0.5695555 (10) total: 1.7s remaining: 10.2s 11: learn: 0.5591708 test: 0.5600841 best: 0.5600841 (11) total: 1.84s remaining: 10s 12: learn: 0.5497081 test: 0.5506992 best: 0.5506992 (12) total: 2s remaining: 9.83s 13: learn: 0.5406820 test: 0.5418553 best: 0.5418553 (13) total: 2.17s remaining: 9.75s 14: learn: 0.5318934 test: 0.5330939 best: 0.5330939 (14) total: 2.31s remaining: 9.56s 15: learn: 0.5232106 test: 0.5245028 best: 0.5245028 (15) total: 2.46s remaining: 9.38s 16: learn: 0.5149234 test: 0.5162816 best: 0.5162816 (16) total: 2.6s remaining: 9.2s 17: learn: 0.5069621 test: 0.5083154 best: 0.5083154 (17) total: 2.74s remaining: 8.98s 
18: learn: 0.4996404 test: 0.5013704 best: 0.5013704 (18) total: 2.9s remaining: 8.84s 19: learn: 0.4922115 test: 0.4940686 best: 0.4940686 (19) total: 3.05s remaining: 8.69s 20: learn: 0.4848134 test: 0.4868204 best: 0.4868204 (20) total: 3.19s remaining: 8.51s 21: learn: 0.4777469 test: 0.4798695 best: 0.4798695 (21) total: 3.34s remaining: 8.35s 22: learn: 0.4706629 test: 0.4728099 best: 0.4728099 (22) total: 3.49s remaining: 8.19s 23: learn: 0.4638614 test: 0.4661681 best: 0.4661681 (23) total: 3.64s remaining: 8.04s 24: learn: 0.4572204 test: 0.4596572 best: 0.4596572 (24) total: 3.82s remaining: 7.94s 25: learn: 0.4509509 test: 0.4535170 best: 0.4535170 (25) total: 3.96s remaining: 7.78s 26: learn: 0.4448008 test: 0.4475041 best: 0.4475041 (26) total: 4.1s remaining: 7.6s 27: learn: 0.4385012 test: 0.4413063 best: 0.4413063 (27) total: 4.27s remaining: 7.47s 28: learn: 0.4323006 test: 0.4351846 best: 0.4351846 (28) total: 4.42s remaining: 7.31s 29: learn: 0.4264249 test: 0.4294386 best: 0.4294386 (29) total: 4.57s remaining: 7.16s 30: learn: 0.4207285 test: 0.4239230 best: 0.4239230 (30) total: 4.72s remaining: 7.01s 31: learn: 0.4152590 test: 0.4185687 best: 0.4185687 (31) total: 4.85s remaining: 6.82s 32: learn: 0.4098208 test: 0.4132540 best: 0.4132540 (32) total: 5s remaining: 6.67s 33: learn: 0.4047431 test: 0.4082519 best: 0.4082519 (33) total: 5.14s remaining: 6.5s 34: learn: 0.3998356 test: 0.4034873 best: 0.4034873 (34) total: 5.26s remaining: 6.32s 35: learn: 0.3949415 test: 0.3987209 best: 0.3987209 (35) total: 5.42s remaining: 6.17s 36: learn: 0.3902077 test: 0.3941174 best: 0.3941174 (36) total: 5.56s remaining: 6.01s 37: learn: 0.3853102 test: 0.3893486 best: 0.3893486 (37) total: 5.73s remaining: 5.88s 38: learn: 0.3807700 test: 0.3850624 best: 0.3850624 (38) total: 5.91s remaining: 5.76s 39: learn: 0.3763671 test: 0.3808267 best: 0.3808267 (39) total: 6.06s remaining: 5.61s 40: learn: 0.3722142 test: 0.3769506 best: 0.3769506 (40) total: 6.2s 
remaining: 5.45s 41: learn: 0.3680259 test: 0.3728057 best: 0.3728057 (41) total: 6.35s remaining: 5.29s 42: learn: 0.3639972 test: 0.3688404 best: 0.3688404 (42) total: 6.51s remaining: 5.15s 43: learn: 0.3599685 test: 0.3648993 best: 0.3648993 (43) total: 6.66s remaining: 5s 44: learn: 0.3559737 test: 0.3610199 best: 0.3610199 (44) total: 6.82s remaining: 4.85s 45: learn: 0.3521165 test: 0.3572541 best: 0.3572541 (45) total: 6.96s remaining: 4.69s 46: learn: 0.3483896 test: 0.3536760 best: 0.3536760 (46) total: 7.12s remaining: 4.55s 47: learn: 0.3448431 test: 0.3502077 best: 0.3502077 (47) total: 7.26s remaining: 4.39s 48: learn: 0.3414761 test: 0.3469481 best: 0.3469481 (48) total: 7.44s remaining: 4.25s 49: learn: 0.3381541 test: 0.3437167 best: 0.3437167 (49) total: 7.59s remaining: 4.1s 50: learn: 0.3346782 test: 0.3404359 best: 0.3404359 (50) total: 7.76s remaining: 3.96s 51: learn: 0.3314799 test: 0.3373915 best: 0.3373915 (51) total: 7.92s remaining: 3.81s 52: learn: 0.3282989 test: 0.3343366 best: 0.3343366 (52) total: 8.09s remaining: 3.67s 53: learn: 0.3252758 test: 0.3314321 best: 0.3314321 (53) total: 8.25s remaining: 3.51s 54: learn: 0.3222029 test: 0.3284547 best: 0.3284547 (54) total: 8.4s remaining: 3.36s 55: learn: 0.3192638 test: 0.3255929 best: 0.3255929 (55) total: 8.54s remaining: 3.2s 56: learn: 0.3163622 test: 0.3228086 best: 0.3228086 (56) total: 8.71s remaining: 3.06s 57: learn: 0.3135205 test: 0.3200679 best: 0.3200679 (57) total: 8.88s remaining: 2.91s 58: learn: 0.3109216 test: 0.3175781 best: 0.3175781 (58) total: 9.03s remaining: 2.75s 59: learn: 0.3084055 test: 0.3151618 best: 0.3151618 (59) total: 9.16s remaining: 2.59s 60: learn: 0.3058635 test: 0.3129240 best: 0.3129240 (60) total: 9.31s remaining: 2.44s 61: learn: 0.3033608 test: 0.3105240 best: 0.3105240 (61) total: 9.45s remaining: 2.29s 62: learn: 0.3008927 test: 0.3081982 best: 0.3081982 (62) total: 9.59s remaining: 2.13s 63: learn: 0.2984109 test: 0.3057562 best: 0.3057562 
(63) total: 9.74s remaining: 1.98s 64: learn: 0.2961105 test: 0.3035791 best: 0.3035791 (64) total: 9.89s remaining: 1.82s 65: learn: 0.2938764 test: 0.3015200 best: 0.3015200 (65) total: 10s remaining: 1.67s 66: learn: 0.2914835 test: 0.2993054 best: 0.2993054 (66) total: 10.2s remaining: 1.52s 67: learn: 0.2892736 test: 0.2972764 best: 0.2972764 (67) total: 10.4s remaining: 1.37s 68: learn: 0.2869875 test: 0.2951774 best: 0.2951774 (68) total: 10.5s remaining: 1.22s 69: learn: 0.2848428 test: 0.2931857 best: 0.2931857 (69) total: 10.7s remaining: 1.07s 70: learn: 0.2830059 test: 0.2915703 best: 0.2915703 (70) total: 10.8s remaining: 915ms 71: learn: 0.2809394 test: 0.2896555 best: 0.2896555 (71) total: 11s remaining: 763ms 72: learn: 0.2789101 test: 0.2878104 best: 0.2878104 (72) total: 11.2s remaining: 611ms 73: learn: 0.2768541 test: 0.2858728 best: 0.2858728 (73) total: 11.3s remaining: 459ms 74: learn: 0.2749603 test: 0.2840883 best: 0.2840883 (74) total: 11.5s remaining: 307ms 75: learn: 0.2730775 test: 0.2823078 best: 0.2823078 (75) total: 11.6s remaining: 153ms 76: learn: 0.2712717 test: 0.2806472 best: 0.2806472 (76) total: 11.8s remaining: 0us bestTest = 0.2806471975 bestIteration = 76 Trial 2, Fold 3: Log loss = 0.2806471974698048, Average precision = 0.9745374116291409, ROC-AUC = 0.9724172423216573, Elapsed Time = 11.957781299999624 seconds Trial 2, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 2, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 0: learn: 0.6802108 test: 0.6803498 best: 0.6803498 (0) total: 137ms remaining: 10.4s 1: learn: 0.6674825 test: 0.6677026 best: 0.6677026 (1) total: 286ms remaining: 10.7s 2: learn: 0.6552091 test: 0.6555439 best: 0.6555439 (2) total: 445ms remaining: 11s 3: learn: 0.6432673 test: 0.6438825 best: 0.6438825 (3) total: 593ms remaining: 10.8s 4: learn: 0.6318724 test: 0.6325847 best: 0.6325847 (4) total: 764ms remaining: 11s 5: learn: 
0.6209630 test: 0.6219154 best: 0.6219154 (5) total: 922ms remaining: 10.9s 6: learn: 0.6100152 test: 0.6112349 best: 0.6112349 (6) total: 1.07s remaining: 10.7s 7: learn: 0.5992946 test: 0.6006284 best: 0.6006284 (7) total: 1.22s remaining: 10.5s 8: learn: 0.5888442 test: 0.5903062 best: 0.5903062 (8) total: 1.38s remaining: 10.4s 9: learn: 0.5787386 test: 0.5804820 best: 0.5804820 (9) total: 1.54s remaining: 10.3s 10: learn: 0.5690144 test: 0.5708871 best: 0.5708871 (10) total: 1.7s remaining: 10.2s 11: learn: 0.5593012 test: 0.5612561 best: 0.5612561 (11) total: 1.86s remaining: 10.1s 12: learn: 0.5500676 test: 0.5520955 best: 0.5520955 (12) total: 2.02s remaining: 9.92s 13: learn: 0.5410969 test: 0.5433221 best: 0.5433221 (13) total: 2.17s remaining: 9.79s 14: learn: 0.5323978 test: 0.5348182 best: 0.5348182 (14) total: 2.33s remaining: 9.65s 15: learn: 0.5245510 test: 0.5275393 best: 0.5275393 (15) total: 2.5s remaining: 9.53s 16: learn: 0.5161299 test: 0.5192025 best: 0.5192025 (16) total: 2.65s remaining: 9.37s 17: learn: 0.5078122 test: 0.5109686 best: 0.5109686 (17) total: 2.8s remaining: 9.19s 18: learn: 0.4999594 test: 0.5031858 best: 0.5031858 (18) total: 2.94s remaining: 8.99s 19: learn: 0.4925579 test: 0.4961119 best: 0.4961119 (19) total: 3.1s remaining: 8.84s 20: learn: 0.4849844 test: 0.4886892 best: 0.4886892 (20) total: 3.25s remaining: 8.66s 21: learn: 0.4778841 test: 0.4816818 best: 0.4816818 (21) total: 3.42s remaining: 8.54s 22: learn: 0.4707061 test: 0.4746524 best: 0.4746524 (22) total: 3.58s remaining: 8.4s 23: learn: 0.4640693 test: 0.4681968 best: 0.4681968 (23) total: 3.73s remaining: 8.24s 24: learn: 0.4576530 test: 0.4619103 best: 0.4619103 (24) total: 3.9s remaining: 8.1s 25: learn: 0.4513672 test: 0.4558173 best: 0.4558173 (25) total: 4.05s remaining: 7.94s 26: learn: 0.4448422 test: 0.4499094 best: 0.4499094 (26) total: 4.2s remaining: 7.77s 27: learn: 0.4388392 test: 0.4440241 best: 0.4440241 (27) total: 4.31s remaining: 7.55s 28: 
learn: 0.4331725 test: 0.4385995 best: 0.4385995 (28) total: 4.48s remaining: 7.41s 29: learn: 0.4272205 test: 0.4327213 best: 0.4327213 (29) total: 4.62s remaining: 7.24s 30: learn: 0.4216683 test: 0.4273119 best: 0.4273119 (30) total: 4.76s remaining: 7.07s 31: learn: 0.4160271 test: 0.4218114 best: 0.4218114 (31) total: 4.93s remaining: 6.94s 32: learn: 0.4107170 test: 0.4165692 best: 0.4165692 (32) total: 5.07s remaining: 6.76s 33: learn: 0.4054435 test: 0.4114832 best: 0.4114832 (33) total: 5.22s remaining: 6.61s 34: learn: 0.4004316 test: 0.4067886 best: 0.4067886 (34) total: 5.39s remaining: 6.47s 35: learn: 0.3955445 test: 0.4020209 best: 0.4020209 (35) total: 5.55s remaining: 6.33s 36: learn: 0.3906860 test: 0.3972117 best: 0.3972117 (36) total: 5.7s remaining: 6.16s 37: learn: 0.3859348 test: 0.3925977 best: 0.3925977 (37) total: 5.85s remaining: 6.01s 38: learn: 0.3812946 test: 0.3881640 best: 0.3881640 (38) total: 5.99s remaining: 5.83s 39: learn: 0.3771165 test: 0.3843692 best: 0.3843692 (39) total: 6.16s remaining: 5.7s 40: learn: 0.3726915 test: 0.3800709 best: 0.3800709 (40) total: 6.32s remaining: 5.55s 41: learn: 0.3684658 test: 0.3759792 best: 0.3759792 (41) total: 6.49s remaining: 5.41s 42: learn: 0.3643192 test: 0.3718806 best: 0.3718806 (42) total: 6.65s remaining: 5.26s 43: learn: 0.3603153 test: 0.3680057 best: 0.3680057 (43) total: 6.8s remaining: 5.1s 44: learn: 0.3563378 test: 0.3642030 best: 0.3642030 (44) total: 6.96s remaining: 4.95s 45: learn: 0.3524138 test: 0.3603764 best: 0.3603764 (45) total: 7.11s remaining: 4.79s 46: learn: 0.3486799 test: 0.3567123 best: 0.3567123 (46) total: 7.25s remaining: 4.63s 47: learn: 0.3451824 test: 0.3533103 best: 0.3533103 (47) total: 7.4s remaining: 4.47s 48: learn: 0.3417079 test: 0.3500657 best: 0.3500657 (48) total: 7.57s remaining: 4.33s 49: learn: 0.3383435 test: 0.3468736 best: 0.3468736 (49) total: 7.74s remaining: 4.18s 50: learn: 0.3351789 test: 0.3437156 best: 0.3437156 (50) total: 7.87s 
remaining: 4.01s 51: learn: 0.3319630 test: 0.3406113 best: 0.3406113 (51) total: 8.02s remaining: 3.85s 52: learn: 0.3288206 test: 0.3375329 best: 0.3375329 (52) total: 8.18s remaining: 3.71s 53: learn: 0.3260164 test: 0.3347862 best: 0.3347862 (53) total: 8.32s remaining: 3.54s 54: learn: 0.3229502 test: 0.3319257 best: 0.3319257 (54) total: 8.49s remaining: 3.39s 55: learn: 0.3198729 test: 0.3288825 best: 0.3288825 (55) total: 8.64s remaining: 3.24s 56: learn: 0.3169847 test: 0.3261869 best: 0.3261869 (56) total: 8.81s remaining: 3.09s 57: learn: 0.3140098 test: 0.3233229 best: 0.3233229 (57) total: 8.97s remaining: 2.94s 58: learn: 0.3114008 test: 0.3208800 best: 0.3208800 (58) total: 9.15s remaining: 2.79s 59: learn: 0.3087965 test: 0.3183873 best: 0.3183873 (59) total: 9.3s remaining: 2.64s 60: learn: 0.3061956 test: 0.3158594 best: 0.3158594 (60) total: 9.45s remaining: 2.48s 61: learn: 0.3035445 test: 0.3133134 best: 0.3133134 (61) total: 9.61s remaining: 2.32s 62: learn: 0.3010145 test: 0.3109414 best: 0.3109414 (62) total: 9.79s remaining: 2.18s 63: learn: 0.2987159 test: 0.3088220 best: 0.3088220 (63) total: 9.96s remaining: 2.02s 64: learn: 0.2963793 test: 0.3065724 best: 0.3065724 (64) total: 10.1s remaining: 1.87s 65: learn: 0.2940492 test: 0.3044140 best: 0.3044140 (65) total: 10.3s remaining: 1.71s 66: learn: 0.2919439 test: 0.3024226 best: 0.3024226 (66) total: 10.4s remaining: 1.55s 67: learn: 0.2897408 test: 0.3003406 best: 0.3003406 (67) total: 10.6s remaining: 1.4s 68: learn: 0.2876030 test: 0.2983477 best: 0.2983477 (68) total: 10.7s remaining: 1.24s 69: learn: 0.2853172 test: 0.2962224 best: 0.2962224 (69) total: 10.9s remaining: 1.09s 70: learn: 0.2833135 test: 0.2943562 best: 0.2943562 (70) total: 11s remaining: 932ms 71: learn: 0.2812629 test: 0.2924578 best: 0.2924578 (71) total: 11.2s remaining: 776ms 72: learn: 0.2792819 test: 0.2906007 best: 0.2906007 (72) total: 11.3s remaining: 621ms 73: learn: 0.2773128 test: 0.2887388 best: 
0.2887388 (73) total: 11.5s remaining: 466ms 74: learn: 0.2756504 test: 0.2871270 best: 0.2871270 (74) total: 11.7s remaining: 311ms 75: learn: 0.2738701 test: 0.2854474 best: 0.2854474 (75) total: 11.8s remaining: 155ms 76: learn: 0.2721302 test: 0.2837640 best: 0.2837640 (76) total: 11.9s remaining: 0us bestTest = 0.283763995 bestIteration = 76 Trial 2, Fold 4: Log loss = 0.2837639949721159, Average precision = 0.9759807966923569, ROC-AUC = 0.9718746944974974, Elapsed Time = 12.100321099998837 seconds Trial 2, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 2, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0: learn: 0.6802851 test: 0.6806926 best: 0.6806926 (0) total: 154ms remaining: 11.7s 1: learn: 0.6678631 test: 0.6685958 best: 0.6685958 (1) total: 290ms remaining: 10.9s 2: learn: 0.6555685 test: 0.6565415 best: 0.6565415 (2) total: 427ms remaining: 10.5s 3: learn: 0.6436728 test: 0.6450300 best: 0.6450300 (3) total: 588ms remaining: 10.7s 4: learn: 0.6323178 test: 0.6340072 best: 0.6340072 (4) total: 757ms remaining: 10.9s 5: learn: 0.6209765 test: 0.6228752 best: 0.6228752 (5) total: 906ms remaining: 10.7s 6: learn: 0.6096871 test: 0.6118557 best: 0.6118557 (6) total: 1.03s remaining: 10.3s 7: learn: 0.5989379 test: 0.6012702 best: 0.6012702 (7) total: 1.17s remaining: 10.1s 8: learn: 0.5886612 test: 0.5911957 best: 0.5911957 (8) total: 1.32s remaining: 10s 9: learn: 0.5784893 test: 0.5813602 best: 0.5813602 (9) total: 1.48s remaining: 9.94s 10: learn: 0.5685676 test: 0.5716071 best: 0.5716071 (10) total: 1.62s remaining: 9.74s 11: learn: 0.5590937 test: 0.5623610 best: 0.5623610 (11) total: 1.77s remaining: 9.57s 12: learn: 0.5496067 test: 0.5531793 best: 0.5531793 (12) total: 1.93s remaining: 9.51s 13: learn: 0.5405243 test: 0.5444556 best: 0.5444556 (13) total: 2.08s remaining: 9.37s 14: learn: 0.5314541 test: 0.5356386 best: 0.5356386 (14) total: 2.22s remaining: 9.19s 15: learn: 
0.5228059 test: 0.5272587 best: 0.5272587 (15) total: 2.38s remaining: 9.09s 16: learn: 0.5145487 test: 0.5193253 best: 0.5193253 (16) total: 2.55s remaining: 8.99s 17: learn: 0.5065579 test: 0.5117419 best: 0.5117419 (17) total: 2.72s remaining: 8.9s 18: learn: 0.4987689 test: 0.5043345 best: 0.5043345 (18) total: 2.88s remaining: 8.79s 19: learn: 0.4913931 test: 0.4971042 best: 0.4971042 (19) total: 3.01s remaining: 8.59s 20: learn: 0.4837981 test: 0.4898333 best: 0.4898333 (20) total: 3.17s remaining: 8.44s 21: learn: 0.4764412 test: 0.4827818 best: 0.4827818 (21) total: 3.31s remaining: 8.29s 22: learn: 0.4698010 test: 0.4768502 best: 0.4768502 (22) total: 3.47s remaining: 8.15s 23: learn: 0.4628140 test: 0.4700607 best: 0.4700607 (23) total: 3.63s remaining: 8.02s 24: learn: 0.4563432 test: 0.4637302 best: 0.4637302 (24) total: 3.79s remaining: 7.89s 25: learn: 0.4498793 test: 0.4574651 best: 0.4574651 (25) total: 3.94s remaining: 7.73s 26: learn: 0.4433780 test: 0.4511529 best: 0.4511529 (26) total: 4.09s remaining: 7.58s 27: learn: 0.4373264 test: 0.4453816 best: 0.4453816 (27) total: 4.25s remaining: 7.43s 28: learn: 0.4318355 test: 0.4402842 best: 0.4402842 (28) total: 4.39s remaining: 7.26s 29: learn: 0.4262569 test: 0.4348162 best: 0.4348162 (29) total: 4.55s remaining: 7.13s 30: learn: 0.4204091 test: 0.4292129 best: 0.4292129 (30) total: 4.71s remaining: 7s 31: learn: 0.4148913 test: 0.4237822 best: 0.4237822 (31) total: 4.85s remaining: 6.82s 32: learn: 0.4094317 test: 0.4185356 best: 0.4185356 (32) total: 5s remaining: 6.66s 33: learn: 0.4040179 test: 0.4133264 best: 0.4133264 (33) total: 5.16s remaining: 6.53s 34: learn: 0.3988231 test: 0.4084583 best: 0.4084583 (34) total: 5.32s remaining: 6.39s 35: learn: 0.3938651 test: 0.4036821 best: 0.4036821 (35) total: 5.46s remaining: 6.22s 36: learn: 0.3890310 test: 0.3990680 best: 0.3990680 (36) total: 5.62s remaining: 6.07s 37: learn: 0.3841971 test: 0.3944279 best: 0.3944279 (37) total: 5.76s remaining: 
5.92s 38: learn: 0.3797178 test: 0.3901897 best: 0.3901897 (38) total: 5.92s remaining: 5.77s 39: learn: 0.3753360 test: 0.3859223 best: 0.3859223 (39) total: 6.08s remaining: 5.63s 40: learn: 0.3707904 test: 0.3815990 best: 0.3815990 (40) total: 6.24s remaining: 5.48s 41: learn: 0.3667023 test: 0.3777396 best: 0.3777396 (41) total: 6.39s remaining: 5.33s 42: learn: 0.3625807 test: 0.3737876 best: 0.3737876 (42) total: 6.54s remaining: 5.17s 43: learn: 0.3585498 test: 0.3699239 best: 0.3699239 (43) total: 6.7s remaining: 5.02s 44: learn: 0.3546502 test: 0.3661582 best: 0.3661582 (44) total: 6.86s remaining: 4.88s 45: learn: 0.3509557 test: 0.3625891 best: 0.3625891 (45) total: 7.02s remaining: 4.73s 46: learn: 0.3472913 test: 0.3590831 best: 0.3590831 (46) total: 7.18s remaining: 4.58s 47: learn: 0.3436614 test: 0.3555976 best: 0.3555976 (47) total: 7.32s remaining: 4.42s 48: learn: 0.3402069 test: 0.3523591 best: 0.3523591 (48) total: 7.49s remaining: 4.28s 49: learn: 0.3367366 test: 0.3491228 best: 0.3491228 (49) total: 7.64s remaining: 4.13s 50: learn: 0.3332472 test: 0.3458411 best: 0.3458411 (50) total: 7.82s remaining: 3.99s 51: learn: 0.3299455 test: 0.3427807 best: 0.3427807 (51) total: 7.98s remaining: 3.84s 52: learn: 0.3267934 test: 0.3397846 best: 0.3397846 (52) total: 8.14s remaining: 3.69s 53: learn: 0.3237402 test: 0.3368887 best: 0.3368887 (53) total: 8.29s remaining: 3.53s 54: learn: 0.3208356 test: 0.3342514 best: 0.3342514 (54) total: 8.45s remaining: 3.38s 55: learn: 0.3178816 test: 0.3314509 best: 0.3314509 (55) total: 8.61s remaining: 3.23s 56: learn: 0.3149890 test: 0.3287341 best: 0.3287341 (56) total: 8.75s remaining: 3.07s 57: learn: 0.3122058 test: 0.3262048 best: 0.3262048 (57) total: 8.89s remaining: 2.91s 58: learn: 0.3095566 test: 0.3237949 best: 0.3237949 (58) total: 9.06s remaining: 2.77s 59: learn: 0.3070610 test: 0.3214446 best: 0.3214446 (59) total: 9.2s remaining: 2.61s 60: learn: 0.3046325 test: 0.3192233 best: 0.3192233 (60) 
total: 9.36s remaining: 2.46s 61: learn: 0.3021963 test: 0.3170370 best: 0.3170370 (61) total: 9.53s remaining: 2.31s 62: learn: 0.2997992 test: 0.3150498 best: 0.3150498 (62) total: 9.7s remaining: 2.16s 63: learn: 0.2975069 test: 0.3129359 best: 0.3129359 (63) total: 9.87s remaining: 2s 64: learn: 0.2951367 test: 0.3107652 best: 0.3107652 (64) total: 10s remaining: 1.85s 65: learn: 0.2927553 test: 0.3085635 best: 0.3085635 (65) total: 10.2s remaining: 1.7s 66: learn: 0.2903381 test: 0.3063619 best: 0.3063619 (66) total: 10.3s remaining: 1.54s 67: learn: 0.2880257 test: 0.3042008 best: 0.3042008 (67) total: 10.5s remaining: 1.39s 68: learn: 0.2857154 test: 0.3020620 best: 0.3020620 (68) total: 10.7s remaining: 1.24s 69: learn: 0.2835880 test: 0.3000606 best: 0.3000606 (69) total: 10.8s remaining: 1.08s 70: learn: 0.2814944 test: 0.2981734 best: 0.2981734 (70) total: 11s remaining: 926ms 71: learn: 0.2792715 test: 0.2961647 best: 0.2961647 (71) total: 11.1s remaining: 773ms 72: learn: 0.2772610 test: 0.2943931 best: 0.2943931 (72) total: 11.3s remaining: 619ms 73: learn: 0.2752017 test: 0.2925324 best: 0.2925324 (73) total: 11.5s remaining: 466ms 74: learn: 0.2733737 test: 0.2908720 best: 0.2908720 (74) total: 11.6s remaining: 310ms 75: learn: 0.2714866 test: 0.2891508 best: 0.2891508 (75) total: 11.8s remaining: 155ms 76: learn: 0.2697065 test: 0.2875601 best: 0.2875601 (76) total: 11.9s remaining: 0us bestTest = 0.2875601376 bestIteration = 76 Trial 2, Fold 5: Log loss = 0.28756013755104853, Average precision = 0.9731313191971385, ROC-AUC = 0.9708124779369416, Elapsed Time = 12.088010999999824 seconds
Optimization Progress: 3% | 3/100 [02:00<1:12:45, 45.00s/it]
Trial 3, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 3, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[53 CatBoost boosting iterations omitted; validation log loss fell from 0.6256852 to 0.2140891]
bestTest = 0.2140891159, bestIteration = 52
Trial 3, Fold 1: Log loss = 0.21320658424277797, Average precision = 0.9727751291264074, ROC-AUC = 0.9670637758516034, Elapsed Time = 116.65948170000047 seconds

Trial 3, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 3, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[per-iteration CatBoost training log, iterations 0–8 shown; output truncated]
18.2s remaining: 1m 28s 9: learn: 0.2954630 test: 0.3309186 best: 0.3309186 (9) total: 20.3s remaining: 1m 27s 10: learn: 0.2798895 test: 0.3169250 best: 0.3169250 (10) total: 22.3s remaining: 1m 25s 11: learn: 0.2651640 test: 0.3050290 best: 0.3050290 (11) total: 24.4s remaining: 1m 23s 12: learn: 0.2533588 test: 0.2938634 best: 0.2938634 (12) total: 26.4s remaining: 1m 21s 13: learn: 0.2426372 test: 0.2841766 best: 0.2841766 (13) total: 28.4s remaining: 1m 19s 14: learn: 0.2332385 test: 0.2763103 best: 0.2763103 (14) total: 30.5s remaining: 1m 17s 15: learn: 0.2246486 test: 0.2690792 best: 0.2690792 (15) total: 32.5s remaining: 1m 15s 16: learn: 0.2170229 test: 0.2625510 best: 0.2625510 (16) total: 34.5s remaining: 1m 13s 17: learn: 0.2098571 test: 0.2567302 best: 0.2567302 (17) total: 36.5s remaining: 1m 10s 18: learn: 0.2023808 test: 0.2521214 best: 0.2521214 (18) total: 38.6s remaining: 1m 9s 19: learn: 0.1971783 test: 0.2479722 best: 0.2479722 (19) total: 40.6s remaining: 1m 6s 20: learn: 0.1917194 test: 0.2443147 best: 0.2443147 (20) total: 42.6s remaining: 1m 4s 21: learn: 0.1867419 test: 0.2412234 best: 0.2412234 (21) total: 44.7s remaining: 1m 2s 22: learn: 0.1823548 test: 0.2382896 best: 0.2382896 (22) total: 46.7s remaining: 1m 23: learn: 0.1779926 test: 0.2356438 best: 0.2356438 (23) total: 48.7s remaining: 58.8s 24: learn: 0.1737699 test: 0.2329027 best: 0.2329027 (24) total: 50.7s remaining: 56.8s 25: learn: 0.1699136 test: 0.2308214 best: 0.2308214 (25) total: 52.8s remaining: 54.8s 26: learn: 0.1670988 test: 0.2286237 best: 0.2286237 (26) total: 54.7s remaining: 52.7s 27: learn: 0.1633889 test: 0.2262386 best: 0.2262386 (27) total: 56.8s remaining: 50.7s 28: learn: 0.1601269 test: 0.2246749 best: 0.2246749 (28) total: 58.9s remaining: 48.7s 29: learn: 0.1573050 test: 0.2230812 best: 0.2230812 (29) total: 1m remaining: 46.6s 30: learn: 0.1540864 test: 0.2215801 best: 0.2215801 (30) total: 1m 2s remaining: 44.6s 31: learn: 0.1495095 test: 0.2206311 
best: 0.2206311 (31) total: 1m 4s remaining: 42.6s 32: learn: 0.1473677 test: 0.2194351 best: 0.2194351 (32) total: 1m 6s remaining: 40.6s 33: learn: 0.1451343 test: 0.2181613 best: 0.2181613 (33) total: 1m 9s remaining: 38.6s 34: learn: 0.1428885 test: 0.2173318 best: 0.2173318 (34) total: 1m 11s remaining: 36.6s 35: learn: 0.1402786 test: 0.2162324 best: 0.2162324 (35) total: 1m 13s remaining: 34.5s 36: learn: 0.1374835 test: 0.2158405 best: 0.2158405 (36) total: 1m 15s remaining: 32.5s 37: learn: 0.1341283 test: 0.2150776 best: 0.2150776 (37) total: 1m 17s remaining: 30.5s 38: learn: 0.1324217 test: 0.2142433 best: 0.2142433 (38) total: 1m 19s remaining: 28.4s 39: learn: 0.1291543 test: 0.2137623 best: 0.2137623 (39) total: 1m 21s remaining: 26.4s 40: learn: 0.1274758 test: 0.2131473 best: 0.2131473 (40) total: 1m 23s remaining: 24.4s 41: learn: 0.1248718 test: 0.2126773 best: 0.2126773 (41) total: 1m 25s remaining: 22.3s 42: learn: 0.1212438 test: 0.2122816 best: 0.2122816 (42) total: 1m 27s remaining: 20.3s 43: learn: 0.1181624 test: 0.2116469 best: 0.2116469 (43) total: 1m 29s remaining: 18.3s 44: learn: 0.1159663 test: 0.2110871 best: 0.2110871 (44) total: 1m 31s remaining: 16.2s 45: learn: 0.1143132 test: 0.2105465 best: 0.2105465 (45) total: 1m 33s remaining: 14.2s 46: learn: 0.1119715 test: 0.2100675 best: 0.2100675 (46) total: 1m 35s remaining: 12.2s 47: learn: 0.1102047 test: 0.2096369 best: 0.2096369 (47) total: 1m 37s remaining: 10.1s 48: learn: 0.1093285 test: 0.2092641 best: 0.2092641 (48) total: 1m 39s remaining: 8.11s 49: learn: 0.1080433 test: 0.2088416 best: 0.2088416 (49) total: 1m 41s remaining: 6.08s 50: learn: 0.1064957 test: 0.2084397 best: 0.2084397 (50) total: 1m 43s remaining: 4.05s 51: learn: 0.1049134 test: 0.2081341 best: 0.2081341 (51) total: 1m 45s remaining: 2.03s 52: learn: 0.1034676 test: 0.2079332 best: 0.2079332 (52) total: 1m 47s remaining: 0us bestTest = 0.2079332213 bestIteration = 52 Trial 3, Fold 2: Log loss = 
0.20740025849944024, Average precision = 0.972057189393349, ROC-AUC = 0.9680150570222652, Elapsed Time = 107.9001986000003 seconds Trial 3, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 3, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 0: learn: 0.6217426 test: 0.6264151 best: 0.6264151 (0) total: 1.79s remaining: 1m 33s 1: learn: 0.5598746 test: 0.5689335 best: 0.5689335 (1) total: 3.78s remaining: 1m 36s 2: learn: 0.5060771 test: 0.5210192 best: 0.5210192 (2) total: 5.8s remaining: 1m 36s 3: learn: 0.4612454 test: 0.4789047 best: 0.4789047 (3) total: 7.81s remaining: 1m 35s 4: learn: 0.4218588 test: 0.4441229 best: 0.4441229 (4) total: 9.78s remaining: 1m 33s 5: learn: 0.3892971 test: 0.4143956 best: 0.4143956 (5) total: 11.8s remaining: 1m 32s 6: learn: 0.3611752 test: 0.3884644 best: 0.3884644 (6) total: 13.8s remaining: 1m 30s 7: learn: 0.3368914 test: 0.3666740 best: 0.3666740 (7) total: 15.8s remaining: 1m 28s 8: learn: 0.3166318 test: 0.3476228 best: 0.3476228 (8) total: 17.8s remaining: 1m 27s 9: learn: 0.2986420 test: 0.3310179 best: 0.3310179 (9) total: 19.8s remaining: 1m 25s 10: learn: 0.2819197 test: 0.3168273 best: 0.3168273 (10) total: 21.8s remaining: 1m 23s 11: learn: 0.2688561 test: 0.3044549 best: 0.3044549 (11) total: 23.9s remaining: 1m 21s 12: learn: 0.2562674 test: 0.2939511 best: 0.2939511 (12) total: 25.9s remaining: 1m 19s 13: learn: 0.2456844 test: 0.2846300 best: 0.2846300 (13) total: 27.9s remaining: 1m 17s 14: learn: 0.2356542 test: 0.2771974 best: 0.2771974 (14) total: 30s remaining: 1m 16s 15: learn: 0.2274335 test: 0.2704975 best: 0.2704975 (15) total: 32.1s remaining: 1m 14s 16: learn: 0.2191565 test: 0.2643236 best: 0.2643236 (16) total: 34.3s remaining: 1m 12s 17: learn: 0.2107476 test: 0.2595670 best: 0.2595670 (17) total: 36.5s remaining: 1m 10s 18: learn: 0.2026448 test: 0.2545861 best: 0.2545861 (18) total: 38.5s remaining: 1m 8s 19: learn: 0.1964491 
test: 0.2501775 best: 0.2501775 (19) total: 40.5s remaining: 1m 6s 20: learn: 0.1904338 test: 0.2463869 best: 0.2463869 (20) total: 42.5s remaining: 1m 4s 21: learn: 0.1852973 test: 0.2431218 best: 0.2431218 (21) total: 44.5s remaining: 1m 2s 22: learn: 0.1812940 test: 0.2400816 best: 0.2400816 (22) total: 46.6s remaining: 1m 23: learn: 0.1763397 test: 0.2373538 best: 0.2373538 (23) total: 48.6s remaining: 58.8s 24: learn: 0.1726828 test: 0.2348632 best: 0.2348632 (24) total: 50.6s remaining: 56.7s 25: learn: 0.1697343 test: 0.2322870 best: 0.2322870 (25) total: 52.7s remaining: 54.7s 26: learn: 0.1649073 test: 0.2304158 best: 0.2304158 (26) total: 54.7s remaining: 52.7s 27: learn: 0.1611601 test: 0.2285812 best: 0.2285812 (27) total: 56.7s remaining: 50.6s 28: learn: 0.1584040 test: 0.2265680 best: 0.2265680 (28) total: 58.8s remaining: 48.6s 29: learn: 0.1551773 test: 0.2250087 best: 0.2250087 (29) total: 1m remaining: 46.6s 30: learn: 0.1517237 test: 0.2232001 best: 0.2232001 (30) total: 1m 2s remaining: 44.6s 31: learn: 0.1489765 test: 0.2218629 best: 0.2218629 (31) total: 1m 4s remaining: 42.5s 32: learn: 0.1441983 test: 0.2212315 best: 0.2212315 (32) total: 1m 6s remaining: 40.5s 33: learn: 0.1419812 test: 0.2202291 best: 0.2202291 (33) total: 1m 8s remaining: 38.4s 34: learn: 0.1400566 test: 0.2190058 best: 0.2190058 (34) total: 1m 10s remaining: 36.3s 35: learn: 0.1376234 test: 0.2182455 best: 0.2182455 (35) total: 1m 12s remaining: 34.3s 36: learn: 0.1339168 test: 0.2173285 best: 0.2173285 (36) total: 1m 14s remaining: 32.3s 37: learn: 0.1307742 test: 0.2167338 best: 0.2167338 (37) total: 1m 16s remaining: 30.3s 38: learn: 0.1285404 test: 0.2160435 best: 0.2160435 (38) total: 1m 18s remaining: 28.2s 39: learn: 0.1265895 test: 0.2155904 best: 0.2155904 (39) total: 1m 20s remaining: 26.2s 40: learn: 0.1246069 test: 0.2147512 best: 0.2147512 (40) total: 1m 22s remaining: 24.2s 41: learn: 0.1229714 test: 0.2141363 best: 0.2141363 (41) total: 1m 24s remaining: 
22.2s 42: learn: 0.1211167 test: 0.2135765 best: 0.2135765 (42) total: 1m 26s remaining: 20.2s 43: learn: 0.1196289 test: 0.2127974 best: 0.2127974 (43) total: 1m 28s remaining: 18.2s 44: learn: 0.1183339 test: 0.2124573 best: 0.2124573 (44) total: 1m 30s remaining: 16.1s 45: learn: 0.1150490 test: 0.2121544 best: 0.2121544 (45) total: 1m 32s remaining: 14.1s 46: learn: 0.1133476 test: 0.2117959 best: 0.2117959 (46) total: 1m 34s remaining: 12.1s 47: learn: 0.1119658 test: 0.2111476 best: 0.2111476 (47) total: 1m 36s remaining: 10.1s 48: learn: 0.1102662 test: 0.2107405 best: 0.2107405 (48) total: 1m 38s remaining: 8.06s 49: learn: 0.1085176 test: 0.2107458 best: 0.2107405 (48) total: 1m 40s remaining: 6.05s 50: learn: 0.1075040 test: 0.2105479 best: 0.2105479 (50) total: 1m 42s remaining: 4.03s 51: learn: 0.1064721 test: 0.2101094 best: 0.2101094 (51) total: 1m 44s remaining: 2.01s 52: learn: 0.1051549 test: 0.2099617 best: 0.2099617 (52) total: 1m 46s remaining: 0us bestTest = 0.2099616769 bestIteration = 52 Trial 3, Fold 3: Log loss = 0.20948046517183116, Average precision = 0.9721914561978554, ROC-AUC = 0.9677968285001588, Elapsed Time = 106.99108250000063 seconds Trial 3, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 3, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 0: learn: 0.6209726 test: 0.6265059 best: 0.6265059 (0) total: 1.82s remaining: 1m 34s 1: learn: 0.5596561 test: 0.5690189 best: 0.5690189 (1) total: 3.83s remaining: 1m 37s 2: learn: 0.5068209 test: 0.5210777 best: 0.5210777 (2) total: 5.9s remaining: 1m 38s 3: learn: 0.4615579 test: 0.4799427 best: 0.4799427 (3) total: 7.95s remaining: 1m 37s 4: learn: 0.4231729 test: 0.4449089 best: 0.4449089 (4) total: 9.96s remaining: 1m 35s 5: learn: 0.3909425 test: 0.4145341 best: 0.4145341 (5) total: 12s remaining: 1m 33s 6: learn: 0.3632540 test: 0.3884701 best: 0.3884701 (6) total: 14s remaining: 1m 31s 7: learn: 0.3386476 test: 
0.3662129 best: 0.3662129 (7) total: 16.1s remaining: 1m 30s 8: learn: 0.3178142 test: 0.3477776 best: 0.3477776 (8) total: 18.1s remaining: 1m 28s 9: learn: 0.2985394 test: 0.3315617 best: 0.3315617 (9) total: 20.4s remaining: 1m 27s 10: learn: 0.2826670 test: 0.3175812 best: 0.3175812 (10) total: 22.5s remaining: 1m 25s 11: learn: 0.2688249 test: 0.3049420 best: 0.3049420 (11) total: 24.5s remaining: 1m 23s 12: learn: 0.2558733 test: 0.2947825 best: 0.2947825 (12) total: 26.5s remaining: 1m 21s 13: learn: 0.2446947 test: 0.2861706 best: 0.2861706 (13) total: 28.6s remaining: 1m 19s 14: learn: 0.2342622 test: 0.2785555 best: 0.2785555 (14) total: 30.6s remaining: 1m 17s 15: learn: 0.2259162 test: 0.2715384 best: 0.2715384 (15) total: 32.8s remaining: 1m 15s 16: learn: 0.2168523 test: 0.2651371 best: 0.2651371 (16) total: 34.9s remaining: 1m 13s 17: learn: 0.2096884 test: 0.2596065 best: 0.2596065 (17) total: 36.9s remaining: 1m 11s 18: learn: 0.2029761 test: 0.2552432 best: 0.2552432 (18) total: 38.9s remaining: 1m 9s 19: learn: 0.1969338 test: 0.2507196 best: 0.2507196 (19) total: 41s remaining: 1m 7s 20: learn: 0.1909401 test: 0.2467846 best: 0.2467846 (20) total: 43.1s remaining: 1m 5s 21: learn: 0.1857961 test: 0.2428483 best: 0.2428483 (21) total: 45.2s remaining: 1m 3s 22: learn: 0.1815035 test: 0.2397539 best: 0.2397539 (22) total: 47.3s remaining: 1m 1s 23: learn: 0.1771143 test: 0.2371527 best: 0.2371527 (23) total: 49.3s remaining: 59.6s 24: learn: 0.1727362 test: 0.2343010 best: 0.2343010 (24) total: 51.3s remaining: 57.5s 25: learn: 0.1692961 test: 0.2320241 best: 0.2320241 (25) total: 53.3s remaining: 55.4s 26: learn: 0.1654971 test: 0.2299045 best: 0.2299045 (26) total: 55.4s remaining: 53.3s 27: learn: 0.1609491 test: 0.2281140 best: 0.2281140 (27) total: 57.4s remaining: 51.2s 28: learn: 0.1570182 test: 0.2264648 best: 0.2264648 (28) total: 59.5s remaining: 49.2s 29: learn: 0.1539085 test: 0.2253557 best: 0.2253557 (29) total: 1m 1s remaining: 
47.3s 30: learn: 0.1510537 test: 0.2238512 best: 0.2238512 (30) total: 1m 3s remaining: 45.3s 31: learn: 0.1474197 test: 0.2228205 best: 0.2228205 (31) total: 1m 5s remaining: 43.2s 32: learn: 0.1445275 test: 0.2217600 best: 0.2217600 (32) total: 1m 7s remaining: 41.1s 33: learn: 0.1419418 test: 0.2206052 best: 0.2206052 (33) total: 1m 9s remaining: 39s 34: learn: 0.1394820 test: 0.2199991 best: 0.2199991 (34) total: 1m 11s remaining: 36.9s 35: learn: 0.1366103 test: 0.2192096 best: 0.2192096 (35) total: 1m 13s remaining: 34.8s 36: learn: 0.1352099 test: 0.2181132 best: 0.2181132 (36) total: 1m 15s remaining: 32.7s 37: learn: 0.1327343 test: 0.2175096 best: 0.2175096 (37) total: 1m 17s remaining: 30.6s 38: learn: 0.1305917 test: 0.2170761 best: 0.2170761 (38) total: 1m 19s remaining: 28.6s 39: learn: 0.1279707 test: 0.2166135 best: 0.2166135 (39) total: 1m 21s remaining: 26.5s 40: learn: 0.1258383 test: 0.2161659 best: 0.2161659 (40) total: 1m 23s remaining: 24.4s 41: learn: 0.1238137 test: 0.2155271 best: 0.2155271 (41) total: 1m 25s remaining: 22.4s 42: learn: 0.1224795 test: 0.2149981 best: 0.2149981 (42) total: 1m 27s remaining: 20.4s 43: learn: 0.1205790 test: 0.2143704 best: 0.2143704 (43) total: 1m 29s remaining: 18.3s 44: learn: 0.1178470 test: 0.2140639 best: 0.2140639 (44) total: 1m 31s remaining: 16.3s 45: learn: 0.1170938 test: 0.2135718 best: 0.2135718 (45) total: 1m 33s remaining: 14.2s 46: learn: 0.1150506 test: 0.2130614 best: 0.2130614 (46) total: 1m 35s remaining: 12.2s 47: learn: 0.1128442 test: 0.2127740 best: 0.2127740 (47) total: 1m 37s remaining: 10.2s 48: learn: 0.1113120 test: 0.2124517 best: 0.2124517 (48) total: 1m 39s remaining: 8.13s 49: learn: 0.1102172 test: 0.2120023 best: 0.2120023 (49) total: 1m 41s remaining: 6.09s 50: learn: 0.1081928 test: 0.2116876 best: 0.2116876 (50) total: 1m 43s remaining: 4.06s 51: learn: 0.1069263 test: 0.2112646 best: 0.2112646 (51) total: 1m 45s remaining: 2.03s 52: learn: 0.1050433 test: 0.2110767 
best: 0.2110767 (52) total: 1m 47s remaining: 0us bestTest = 0.2110767146 bestIteration = 52 Trial 3, Fold 4: Log loss = 0.21053058723983187, Average precision = 0.973023916473308, ROC-AUC = 0.9685567882954094, Elapsed Time = 107.8088351999977 seconds Trial 3, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 3, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0: learn: 0.6207121 test: 0.6262120 best: 0.6262120 (0) total: 1.79s remaining: 1m 33s 1: learn: 0.5581805 test: 0.5706434 best: 0.5706434 (1) total: 3.78s remaining: 1m 36s 2: learn: 0.5045762 test: 0.5235112 best: 0.5235112 (2) total: 5.82s remaining: 1m 36s 3: learn: 0.4583798 test: 0.4827181 best: 0.4827181 (3) total: 7.84s remaining: 1m 36s 4: learn: 0.4199114 test: 0.4480349 best: 0.4480349 (4) total: 9.86s remaining: 1m 34s 5: learn: 0.3866340 test: 0.4181617 best: 0.4181617 (5) total: 11.9s remaining: 1m 33s 6: learn: 0.3590075 test: 0.3920679 best: 0.3920679 (6) total: 13.9s remaining: 1m 31s 7: learn: 0.3346293 test: 0.3705247 best: 0.3705247 (7) total: 15.9s remaining: 1m 29s 8: learn: 0.3141363 test: 0.3514561 best: 0.3514561 (8) total: 17.9s remaining: 1m 27s 9: learn: 0.2960314 test: 0.3356264 best: 0.3356264 (9) total: 19.9s remaining: 1m 25s 10: learn: 0.2788938 test: 0.3219420 best: 0.3219420 (10) total: 22s remaining: 1m 23s 11: learn: 0.2642754 test: 0.3102246 best: 0.3102246 (11) total: 24s remaining: 1m 21s 12: learn: 0.2510122 test: 0.2999487 best: 0.2999487 (12) total: 25.9s remaining: 1m 19s 13: learn: 0.2391309 test: 0.2906602 best: 0.2906602 (13) total: 28s remaining: 1m 18s 14: learn: 0.2292357 test: 0.2831058 best: 0.2831058 (14) total: 30.1s remaining: 1m 16s 15: learn: 0.2210567 test: 0.2759211 best: 0.2759211 (15) total: 32.2s remaining: 1m 14s 16: learn: 0.2125771 test: 0.2701167 best: 0.2701167 (16) total: 34.4s remaining: 1m 12s 17: learn: 0.2058954 test: 0.2645847 best: 0.2645847 (17) total: 36.4s 
remaining: 1m 10s 18: learn: 0.1988404 test: 0.2597416 best: 0.2597416 (18) total: 38.4s remaining: 1m 8s 19: learn: 0.1925195 test: 0.2556886 best: 0.2556886 (19) total: 40.4s remaining: 1m 6s 20: learn: 0.1871795 test: 0.2521063 best: 0.2521063 (20) total: 42.4s remaining: 1m 4s 21: learn: 0.1825667 test: 0.2489834 best: 0.2489834 (21) total: 44.4s remaining: 1m 2s 22: learn: 0.1783318 test: 0.2455383 best: 0.2455383 (22) total: 46.5s remaining: 1m 23: learn: 0.1745569 test: 0.2427797 best: 0.2427797 (23) total: 48.5s remaining: 58.6s 24: learn: 0.1705484 test: 0.2402489 best: 0.2402489 (24) total: 50.5s remaining: 56.6s 25: learn: 0.1668371 test: 0.2383719 best: 0.2383719 (25) total: 52.5s remaining: 54.6s 26: learn: 0.1630038 test: 0.2366746 best: 0.2366746 (26) total: 54.6s remaining: 52.6s 27: learn: 0.1584776 test: 0.2350093 best: 0.2350093 (27) total: 56.6s remaining: 50.5s 28: learn: 0.1539984 test: 0.2339040 best: 0.2339040 (28) total: 58.7s remaining: 48.6s 29: learn: 0.1509773 test: 0.2324048 best: 0.2324048 (29) total: 1m remaining: 46.5s 30: learn: 0.1479592 test: 0.2309666 best: 0.2309666 (30) total: 1m 2s remaining: 44.4s 31: learn: 0.1454236 test: 0.2297933 best: 0.2297933 (31) total: 1m 4s remaining: 42.4s 32: learn: 0.1429622 test: 0.2290292 best: 0.2290292 (32) total: 1m 6s remaining: 40.4s 33: learn: 0.1410981 test: 0.2278277 best: 0.2278277 (33) total: 1m 8s remaining: 38.4s 34: learn: 0.1380347 test: 0.2268028 best: 0.2268028 (34) total: 1m 10s remaining: 36.3s 35: learn: 0.1349456 test: 0.2262697 best: 0.2262697 (35) total: 1m 12s remaining: 34.4s 36: learn: 0.1330767 test: 0.2255436 best: 0.2255436 (36) total: 1m 14s remaining: 32.3s 37: learn: 0.1308531 test: 0.2249357 best: 0.2249357 (37) total: 1m 16s remaining: 30.3s 38: learn: 0.1288442 test: 0.2240713 best: 0.2240713 (38) total: 1m 18s remaining: 28.3s 39: learn: 0.1268568 test: 0.2235616 best: 0.2235616 (39) total: 1m 20s remaining: 26.2s 40: learn: 0.1240416 test: 0.2230700 best: 
0.2230700 (40) total: 1m 22s remaining: 24.2s 41: learn: 0.1223937 test: 0.2221765 best: 0.2221765 (41) total: 1m 24s remaining: 22.2s 42: learn: 0.1198807 test: 0.2219886 best: 0.2219886 (42) total: 1m 26s remaining: 20.2s 43: learn: 0.1181476 test: 0.2216369 best: 0.2216369 (43) total: 1m 28s remaining: 18.2s 44: learn: 0.1156621 test: 0.2214997 best: 0.2214997 (44) total: 1m 30s remaining: 16.1s 45: learn: 0.1143192 test: 0.2210419 best: 0.2210419 (45) total: 1m 32s remaining: 14.1s 46: learn: 0.1127190 test: 0.2204376 best: 0.2204376 (46) total: 1m 34s remaining: 12.1s 47: learn: 0.1103712 test: 0.2201195 best: 0.2201195 (47) total: 1m 36s remaining: 10.1s 48: learn: 0.1084872 test: 0.2200150 best: 0.2200150 (48) total: 1m 38s remaining: 8.07s 49: learn: 0.1069574 test: 0.2195649 best: 0.2195649 (49) total: 1m 40s remaining: 6.05s 50: learn: 0.1049957 test: 0.2192373 best: 0.2192373 (50) total: 1m 42s remaining: 4.03s 51: learn: 0.1026196 test: 0.2190961 best: 0.2190961 (51) total: 1m 44s remaining: 2.02s 52: learn: 0.1006730 test: 0.2186315 best: 0.2186315 (52) total: 1m 46s remaining: 0us bestTest = 0.2186315184 bestIteration = 52 Trial 3, Fold 5: Log loss = 0.217743816459266, Average precision = 0.9712940630805605, ROC-AUC = 0.966626758077402, Elapsed Time = 107.1136712999978 seconds
Optimization Progress:   4% | 4/100 [11:14<6:33:50, 246.15s/it]

Trial 4 cross-validation summary (same fold splits as Trial 3; verbose per-iteration CatBoost output elided; every fold reached its best test score at iteration 20):

Fold  Train size (0 / 1)      Validation size (0 / 1)  Log loss  Avg precision  ROC-AUC  Elapsed (s)
1     20663 (10533 / 10130)   5175 (2592 / 2583)       0.4368    0.9725         0.9692   9.13
2     20701 (10471 / 10230)   5137 (2654 / 2483)       0.4372    0.9744         0.9719   9.85
3     20682 (10517 / 10165)   5156 (2608 / 2548)       0.4348    0.9742         0.9718   8.93
4     20656 (10479 / 10177)   5182 (2646 / 2536)       0.4361    0.9760         0.9720   9.04
5     20650 (10500 / 10150)   5188 (2625 / 2563)       0.4387    0.9718         0.9691   8.56
Optimization Progress: 5% | 5/100 [12:07<4:39:15, 176.38s/it]
Trial 5, Fold 1: Train size = 20663 (0 = 10533, 1 = 10130, 0/1 = 1.0398); Validation size = 5175 (0 = 2592, 1 = 2583, 0/1 = 1.0035)
[Trial 5, Fold 1 — per-iteration CatBoost output omitted; bestIteration = 67]
Trial 5, Fold 1: Log loss = 0.2770, Average precision = 0.9738, ROC-AUC = 0.9688, Elapsed Time = 7.77 s
Trial 5, Fold 2: Train size = 20701 (0 = 10471, 1 = 10230, 0/1 = 1.0236); Validation size = 5137 (0 = 2654, 1 = 2483, 0/1 = 1.0689)
[Trial 5, Fold 2 — per-iteration CatBoost output omitted; bestIteration = 67]
Trial 5, Fold 2: Log loss = 0.2733, Average precision = 0.9753, ROC-AUC = 0.9721, Elapsed Time = 7.87 s
Trial 5, Fold 3: Train size = 20682 (0 = 10517, 1 = 10165, 0/1 = 1.0346); Validation size = 5156 (0 = 2608, 1 = 2548, 0/1 = 1.0235)
[Trial 5, Fold 3 — per-iteration CatBoost output omitted; bestIteration = 67]
Trial 5, Fold 3: Log loss = 0.2735, Average precision = 0.9738, ROC-AUC = 0.9718, Elapsed Time = 7.91 s
Trial 5, Fold 4: Train size = 20656 (0 = 10479, 1 = 10177, 0/1 = 1.0297); Validation size = 5182 (0 = 2646, 1 = 2536, 0/1 = 1.0434)
[Trial 5, Fold 4 — per-iteration CatBoost output omitted; bestIteration = 67]
Trial 5, Fold 4: Log loss = 0.2738, Average precision = 0.9764, ROC-AUC = 0.9720, Elapsed Time = 7.94 s
Trial 5, Fold 5: Train size = 20650 (0 = 10500, 1 = 10150, 0/1 = 1.0345); Validation size = 5188 (0 = 2625, 1 = 2563, 0/1 = 1.0242)
[Trial 5, Fold 5 — per-iteration CatBoost output omitted; bestIteration = 67]
Trial 5, Fold 5: Log loss = 0.2798, Average precision = 0.9726, ROC-AUC = 0.9699, Elapsed Time = 7.81 s
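The per-fold summaries above (log loss, average precision, ROC-AUC on each validation fold) follow the usual cross-validated evaluation pattern. As a minimal, self-contained sketch of that pattern — using synthetic data, `StratifiedKFold`, and scikit-learn's `GradientBoostingClassifier` as a stand-in for the notebook's actual CatBoost/grouped-fold setup — the metrics can be computed like this:

```python
# Illustrative sketch only: synthetic data and a stand-in model,
# not the notebook's actual Optuna/CatBoost training loop.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import StratifiedKFold
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
skf = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)

fold_metrics = []
for fold, (tr_idx, va_idx) in enumerate(skf.split(X, y), start=1):
    model = GradientBoostingClassifier(n_estimators=50, random_state=0)
    model.fit(X[tr_idx], y[tr_idx])
    # Predicted probability of the positive class on the held-out fold
    proba = model.predict_proba(X[va_idx])[:, 1]
    fold_metrics.append({
        "fold": fold,
        "log_loss": log_loss(y[va_idx], proba),
        "avg_precision": average_precision_score(y[va_idx], proba),
        "roc_auc": roc_auc_score(y[va_idx], proba),
    })

for m in fold_metrics:
    print(m)
```

In the notebook itself, the split would instead come from `StratifiedGroupKFold` (to keep loans from the same group out of both sides of a fold), and the per-fold log loss would feed back to Optuna as the objective to minimize.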
Optimization Progress: 6/100 trials (6%) [elapsed 12:54, ETA 3:27:22, 132.37 s/it]
Trial 6: 13 boosting iterations per fold; best iteration = 12 (the last) in every fold. Per-iteration CatBoost log condensed; fold summaries:
  Fold 1: Train 20663 (0 = 10533, 1 = 10130, 0/1 = 1.0398), Validation 5175 (0 = 2592, 1 = 2583); Log loss = 0.5562, Average precision = 0.9460, ROC-AUC = 0.9485, 0.36 s
  Fold 2: Train 20701 (0 = 10471, 1 = 10230, 0/1 = 1.0236), Validation 5137 (0 = 2654, 1 = 2483); Log loss = 0.5570, Average precision = 0.9434, ROC-AUC = 0.9468, 0.37 s
  Fold 3: Train 20682 (0 = 10517, 1 = 10165, 0/1 = 1.0346), Validation 5156 (0 = 2608, 1 = 2548); Log loss = 0.5544, Average precision = 0.9479, ROC-AUC = 0.9501, 0.37 s
  Fold 4: Train 20656 (0 = 10479, 1 = 10177, 0/1 = 1.0297), Validation 5182 (0 = 2646, 1 = 2536); Log loss = 0.5630, Average precision = 0.9458, ROC-AUC = 0.9472, 0.36 s
  Fold 5: Train 20650 (0 = 10500, 1 = 10150, 0/1 = 1.0345), Validation 5188 (0 = 2625, 1 = 2563); Log loss = 0.5596, Average precision = 0.9440, ROC-AUC = 0.9445, 0.36 s
Optimization Progress: 7/100 trials (7%) [elapsed 13:03, ETA 2:22:45, 92.10 s/it]
Trial 7: 15 boosting iterations per fold; best iteration = 14 (the last) in every fold. Fold sizes and class balance identical to Trial 6, since the CV splits are fixed across trials. Per-iteration CatBoost log condensed; fold summaries:
  Fold 1: Log loss = 0.2547, Average precision = 0.9697, ROC-AUC = 0.9642, 0.80 s
  Fold 2: Log loss = 0.2586, Average precision = 0.9692, ROC-AUC = 0.9664, 0.83 s
  Fold 3: Log loss = 0.2532, Average precision = 0.9713, ROC-AUC = 0.9674, 0.85 s
  Fold 4: Log loss = 0.2515, Average precision = 0.9709, ROC-AUC = 0.9653, 0.83 s
  Fold 5: Log loss = 0.2608, Average precision = 0.9651, ROC-AUC = 0.9621, 0.82 s
Optimization Progress: 8/100 trials (8%) [elapsed 13:15, ETA 1:41:53, 66.45 s/it]
Trial 8: 89 boosting iterations per fold. Fold sizes and class balance identical to Trial 6. Per-iteration CatBoost log condensed; fold summaries:
  Fold 1: best iteration = 88; Log loss = 0.2063, Average precision = 0.9734, ROC-AUC = 0.9690, 26.38 s
  Fold 2: best iteration = 88; Log loss = 0.2001, Average precision = 0.9744, ROC-AUC = 0.9713, 22.64 s
  Fold 3: per-iteration log truncated in this excerpt at iteration 39 (test log loss 0.2109 at that point); fold summary not shown.
remaining: 13s 40: learn: 0.1941357 test: 0.2106155 best: 0.2106155 (40) total: 10.9s remaining: 12.8s 41: learn: 0.1928204 test: 0.2093525 best: 0.2093525 (41) total: 11.2s remaining: 12.5s 42: learn: 0.1916599 test: 0.2086132 best: 0.2086132 (42) total: 11.5s remaining: 12.3s 43: learn: 0.1911036 test: 0.2081434 best: 0.2081434 (43) total: 11.7s remaining: 12s 44: learn: 0.1904832 test: 0.2078998 best: 0.2078998 (44) total: 12s remaining: 11.7s 45: learn: 0.1892059 test: 0.2077595 best: 0.2077595 (45) total: 12.3s remaining: 11.5s 46: learn: 0.1886239 test: 0.2074620 best: 0.2074620 (46) total: 12.5s remaining: 11.2s 47: learn: 0.1878153 test: 0.2067137 best: 0.2067137 (47) total: 12.8s remaining: 11s 48: learn: 0.1871383 test: 0.2060678 best: 0.2060678 (48) total: 13.1s remaining: 10.7s 49: learn: 0.1862574 test: 0.2060721 best: 0.2060678 (48) total: 13.4s remaining: 10.4s 50: learn: 0.1856268 test: 0.2058447 best: 0.2058447 (50) total: 13.7s remaining: 10.2s 51: learn: 0.1849283 test: 0.2053195 best: 0.2053195 (51) total: 13.9s remaining: 9.92s 52: learn: 0.1843850 test: 0.2050928 best: 0.2050928 (52) total: 14.2s remaining: 9.66s 53: learn: 0.1835517 test: 0.2044067 best: 0.2044067 (53) total: 14.5s remaining: 9.39s 54: learn: 0.1830710 test: 0.2041934 best: 0.2041934 (54) total: 14.8s remaining: 9.14s 55: learn: 0.1821161 test: 0.2038004 best: 0.2038004 (55) total: 15.1s remaining: 8.88s 56: learn: 0.1816141 test: 0.2035306 best: 0.2035306 (56) total: 15.3s remaining: 8.62s 57: learn: 0.1800654 test: 0.2032770 best: 0.2032770 (57) total: 15.6s remaining: 8.36s 58: learn: 0.1797919 test: 0.2029714 best: 0.2029714 (58) total: 15.9s remaining: 8.09s 59: learn: 0.1793736 test: 0.2028015 best: 0.2028015 (59) total: 16.2s remaining: 7.82s 60: learn: 0.1793704 test: 0.2028057 best: 0.2028015 (59) total: 16.2s remaining: 7.46s 61: learn: 0.1786599 test: 0.2024916 best: 0.2024916 (61) total: 16.5s remaining: 7.19s 62: learn: 0.1786557 test: 0.2024829 best: 0.2024829 
(62) total: 16.6s remaining: 6.85s 63: learn: 0.1780133 test: 0.2022466 best: 0.2022466 (63) total: 16.9s remaining: 6.59s 64: learn: 0.1771629 test: 0.2020288 best: 0.2020288 (64) total: 17.1s remaining: 6.33s 65: learn: 0.1771630 test: 0.2020290 best: 0.2020288 (64) total: 17.2s remaining: 5.99s 66: learn: 0.1771380 test: 0.2020398 best: 0.2020288 (64) total: 17.3s remaining: 5.67s 67: learn: 0.1766023 test: 0.2018887 best: 0.2018887 (67) total: 17.5s remaining: 5.42s 68: learn: 0.1762951 test: 0.2016884 best: 0.2016884 (68) total: 17.8s remaining: 5.17s 69: learn: 0.1759202 test: 0.2014436 best: 0.2014436 (69) total: 18.1s remaining: 4.92s 70: learn: 0.1751914 test: 0.2012608 best: 0.2012608 (70) total: 18.4s remaining: 4.67s 71: learn: 0.1748696 test: 0.2010284 best: 0.2010284 (71) total: 18.7s remaining: 4.41s 72: learn: 0.1745631 test: 0.2009772 best: 0.2009772 (72) total: 19s remaining: 4.15s 73: learn: 0.1742545 test: 0.2008394 best: 0.2008394 (73) total: 19.2s remaining: 3.9s 74: learn: 0.1737531 test: 0.2009333 best: 0.2008394 (73) total: 19.5s remaining: 3.64s 75: learn: 0.1732316 test: 0.2007506 best: 0.2007506 (75) total: 19.8s remaining: 3.38s 76: learn: 0.1729679 test: 0.2005925 best: 0.2005925 (76) total: 20s remaining: 3.12s 77: learn: 0.1728594 test: 0.2006025 best: 0.2005925 (76) total: 20.3s remaining: 2.86s 78: learn: 0.1724475 test: 0.2004770 best: 0.2004770 (78) total: 20.6s remaining: 2.6s 79: learn: 0.1720450 test: 0.2002882 best: 0.2002882 (79) total: 20.9s remaining: 2.35s 80: learn: 0.1716791 test: 0.2002173 best: 0.2002173 (80) total: 21.1s remaining: 2.09s 81: learn: 0.1711790 test: 0.2003529 best: 0.2002173 (80) total: 21.4s remaining: 1.83s 82: learn: 0.1709064 test: 0.2001949 best: 0.2001949 (82) total: 21.7s remaining: 1.57s 83: learn: 0.1708936 test: 0.2001952 best: 0.2001949 (82) total: 21.7s remaining: 1.29s 84: learn: 0.1706641 test: 0.2000664 best: 0.2000664 (84) total: 22s remaining: 1.03s 85: learn: 0.1700734 test: 0.2000005 
best: 0.2000005 (85) total: 22.3s remaining: 777ms 86: learn: 0.1693848 test: 0.1997752 best: 0.1997752 (86) total: 22.6s remaining: 519ms 87: learn: 0.1689204 test: 0.1997132 best: 0.1997132 (87) total: 22.8s remaining: 260ms 88: learn: 0.1683673 test: 0.1993576 best: 0.1993576 (88) total: 23.1s remaining: 0us bestTest = 0.1993575998 bestIteration = 88 Trial 8, Fold 3: Log loss = 0.19917795720290504, Average precision = 0.9747259111593369, ROC-AUC = 0.9710141961456596, Elapsed Time = 23.275047400002222 seconds Trial 8, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 8, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 0: learn: 0.6227717 test: 0.6245712 best: 0.6245712 (0) total: 247ms remaining: 21.8s 1: learn: 0.5639292 test: 0.5666445 best: 0.5666445 (1) total: 491ms remaining: 21.4s 2: learn: 0.5138476 test: 0.5170890 best: 0.5170890 (2) total: 731ms remaining: 20.9s 3: learn: 0.4706998 test: 0.4748507 best: 0.4748507 (3) total: 979ms remaining: 20.8s 4: learn: 0.4333941 test: 0.4385119 best: 0.4385119 (4) total: 1.24s remaining: 20.8s 5: learn: 0.4027880 test: 0.4082310 best: 0.4082310 (5) total: 1.49s remaining: 20.6s 6: learn: 0.3766222 test: 0.3830123 best: 0.3830123 (6) total: 1.76s remaining: 20.6s 7: learn: 0.3542199 test: 0.3606660 best: 0.3606660 (7) total: 2.04s remaining: 20.7s 8: learn: 0.3354312 test: 0.3423730 best: 0.3423730 (8) total: 2.32s remaining: 20.7s 9: learn: 0.3185389 test: 0.3262176 best: 0.3262176 (9) total: 2.6s remaining: 20.5s 10: learn: 0.3039794 test: 0.3123920 best: 0.3123920 (10) total: 2.87s remaining: 20.4s 11: learn: 0.2917100 test: 0.3003188 best: 0.3003188 (11) total: 3.15s remaining: 20.2s 12: learn: 0.2811283 test: 0.2902094 best: 0.2902094 (12) total: 3.42s remaining: 20s 13: learn: 0.2722422 test: 0.2814257 best: 0.2814257 (13) total: 3.7s remaining: 19.8s 14: learn: 0.2638598 test: 0.2735030 best: 0.2735030 (14) total: 3.97s remaining: 19.6s 15: 
learn: 0.2574305 test: 0.2674723 best: 0.2674723 (15) total: 4.24s remaining: 19.3s 16: learn: 0.2503999 test: 0.2608771 best: 0.2608771 (16) total: 4.52s remaining: 19.1s 17: learn: 0.2443425 test: 0.2552836 best: 0.2552836 (17) total: 4.79s remaining: 18.9s 18: learn: 0.2388075 test: 0.2501532 best: 0.2501532 (18) total: 5.07s remaining: 18.7s 19: learn: 0.2344000 test: 0.2465480 best: 0.2465480 (19) total: 5.34s remaining: 18.4s 20: learn: 0.2299983 test: 0.2425043 best: 0.2425043 (20) total: 5.62s remaining: 18.2s 21: learn: 0.2260802 test: 0.2392063 best: 0.2392063 (21) total: 5.9s remaining: 18s 22: learn: 0.2228328 test: 0.2364492 best: 0.2364492 (22) total: 6.19s remaining: 17.8s 23: learn: 0.2195809 test: 0.2339345 best: 0.2339345 (23) total: 6.46s remaining: 17.5s 24: learn: 0.2173118 test: 0.2319506 best: 0.2319506 (24) total: 6.73s remaining: 17.2s 25: learn: 0.2149149 test: 0.2298759 best: 0.2298759 (25) total: 7.01s remaining: 17s 26: learn: 0.2123322 test: 0.2285671 best: 0.2285671 (26) total: 7.3s remaining: 16.8s 27: learn: 0.2102615 test: 0.2277561 best: 0.2277561 (27) total: 7.58s remaining: 16.5s 28: learn: 0.2079729 test: 0.2259088 best: 0.2259088 (28) total: 7.85s remaining: 16.3s 29: learn: 0.2054598 test: 0.2238785 best: 0.2238785 (29) total: 8.13s remaining: 16s 30: learn: 0.2036787 test: 0.2227161 best: 0.2227161 (30) total: 8.41s remaining: 15.7s 31: learn: 0.2022987 test: 0.2216760 best: 0.2216760 (31) total: 8.68s remaining: 15.5s 32: learn: 0.2006740 test: 0.2204557 best: 0.2204557 (32) total: 8.96s remaining: 15.2s 33: learn: 0.1992261 test: 0.2199306 best: 0.2199306 (33) total: 9.24s remaining: 14.9s 34: learn: 0.1978512 test: 0.2188566 best: 0.2188566 (34) total: 9.51s remaining: 14.7s 35: learn: 0.1970803 test: 0.2182831 best: 0.2182831 (35) total: 9.78s remaining: 14.4s 36: learn: 0.1961399 test: 0.2174813 best: 0.2174813 (36) total: 10.1s remaining: 14.1s 37: learn: 0.1948793 test: 0.2171206 best: 0.2171206 (37) total: 10.3s 
remaining: 13.9s 38: learn: 0.1936618 test: 0.2163379 best: 0.2163379 (38) total: 10.6s remaining: 13.6s 39: learn: 0.1929236 test: 0.2157252 best: 0.2157252 (39) total: 10.9s remaining: 13.3s 40: learn: 0.1918630 test: 0.2149195 best: 0.2149195 (40) total: 11.2s remaining: 13.1s 41: learn: 0.1912611 test: 0.2146291 best: 0.2146291 (41) total: 11.4s remaining: 12.8s 42: learn: 0.1908631 test: 0.2144894 best: 0.2144894 (42) total: 11.7s remaining: 12.5s 43: learn: 0.1898948 test: 0.2136291 best: 0.2136291 (43) total: 12s remaining: 12.2s 44: learn: 0.1885945 test: 0.2126461 best: 0.2126461 (44) total: 12.2s remaining: 12s 45: learn: 0.1876887 test: 0.2124849 best: 0.2124849 (45) total: 12.5s remaining: 11.7s 46: learn: 0.1872569 test: 0.2123079 best: 0.2123079 (46) total: 12.8s remaining: 11.4s 47: learn: 0.1866985 test: 0.2120714 best: 0.2120714 (47) total: 13.1s remaining: 11.2s 48: learn: 0.1861687 test: 0.2116582 best: 0.2116582 (48) total: 13.3s remaining: 10.9s 49: learn: 0.1857244 test: 0.2114657 best: 0.2114657 (49) total: 13.6s remaining: 10.6s 50: learn: 0.1848855 test: 0.2109529 best: 0.2109529 (50) total: 13.9s remaining: 10.3s 51: learn: 0.1842384 test: 0.2109089 best: 0.2109089 (51) total: 14.1s remaining: 10.1s 52: learn: 0.1837334 test: 0.2105955 best: 0.2105955 (52) total: 14.4s remaining: 9.79s 53: learn: 0.1833529 test: 0.2104860 best: 0.2104860 (53) total: 14.7s remaining: 9.52s 54: learn: 0.1822339 test: 0.2101375 best: 0.2101375 (54) total: 15s remaining: 9.25s 55: learn: 0.1816129 test: 0.2100254 best: 0.2100254 (55) total: 15.2s remaining: 8.98s 56: learn: 0.1812535 test: 0.2098416 best: 0.2098416 (56) total: 15.5s remaining: 8.71s 57: learn: 0.1804552 test: 0.2098219 best: 0.2098219 (57) total: 15.8s remaining: 8.44s 58: learn: 0.1800007 test: 0.2094342 best: 0.2094342 (58) total: 16.1s remaining: 8.16s 59: learn: 0.1799973 test: 0.2094336 best: 0.2094336 (59) total: 16.1s remaining: 7.79s 60: learn: 0.1792663 test: 0.2093759 best: 0.2093759 
(60) total: 16.4s remaining: 7.52s 61: learn: 0.1787498 test: 0.2090252 best: 0.2090252 (61) total: 16.7s remaining: 7.26s 62: learn: 0.1780562 test: 0.2090954 best: 0.2090252 (61) total: 16.9s remaining: 6.99s 63: learn: 0.1771966 test: 0.2088522 best: 0.2088522 (63) total: 17.2s remaining: 6.73s 64: learn: 0.1771916 test: 0.2088405 best: 0.2088405 (64) total: 17.3s remaining: 6.38s 65: learn: 0.1765025 test: 0.2088974 best: 0.2088405 (64) total: 17.6s remaining: 6.12s 66: learn: 0.1765021 test: 0.2088993 best: 0.2088405 (64) total: 17.6s remaining: 5.79s 67: learn: 0.1758950 test: 0.2085218 best: 0.2085218 (67) total: 17.9s remaining: 5.52s 68: learn: 0.1752196 test: 0.2083459 best: 0.2083459 (68) total: 18.2s remaining: 5.26s 69: learn: 0.1744581 test: 0.2080957 best: 0.2080957 (69) total: 18.4s remaining: 5s 70: learn: 0.1737812 test: 0.2079315 best: 0.2079315 (70) total: 18.7s remaining: 4.74s 71: learn: 0.1733775 test: 0.2078236 best: 0.2078236 (71) total: 19s remaining: 4.48s 72: learn: 0.1728244 test: 0.2073892 best: 0.2073892 (72) total: 19.2s remaining: 4.22s 73: learn: 0.1723892 test: 0.2070741 best: 0.2070741 (73) total: 19.5s remaining: 3.96s 74: learn: 0.1717491 test: 0.2068225 best: 0.2068225 (74) total: 19.8s remaining: 3.69s 75: learn: 0.1713647 test: 0.2067552 best: 0.2067552 (75) total: 20.1s remaining: 3.43s 76: learn: 0.1710303 test: 0.2066583 best: 0.2066583 (76) total: 20.3s remaining: 3.17s 77: learn: 0.1704815 test: 0.2062739 best: 0.2062739 (77) total: 20.6s remaining: 2.91s 78: learn: 0.1700095 test: 0.2059748 best: 0.2059748 (78) total: 20.9s remaining: 2.64s 79: learn: 0.1695289 test: 0.2057721 best: 0.2057721 (79) total: 21.2s remaining: 2.38s 80: learn: 0.1690867 test: 0.2056164 best: 0.2056164 (80) total: 21.4s remaining: 2.12s 81: learn: 0.1687680 test: 0.2055020 best: 0.2055020 (81) total: 21.7s remaining: 1.85s 82: learn: 0.1682811 test: 0.2051763 best: 0.2051763 (82) total: 22s remaining: 1.59s 83: learn: 0.1681190 test: 
0.2050649 best: 0.2050649 (83) total: 22.2s remaining: 1.32s 84: learn: 0.1678429 test: 0.2049384 best: 0.2049384 (84) total: 22.5s remaining: 1.06s 85: learn: 0.1675309 test: 0.2047369 best: 0.2047369 (85) total: 22.8s remaining: 795ms 86: learn: 0.1672668 test: 0.2044898 best: 0.2044898 (86) total: 22.9s remaining: 526ms 87: learn: 0.1665549 test: 0.2043614 best: 0.2043614 (87) total: 23.2s remaining: 263ms 88: learn: 0.1662487 test: 0.2041590 best: 0.2041590 (88) total: 23.4s remaining: 0us bestTest = 0.2041590354 bestIteration = 88 Trial 8, Fold 4: Log loss = 0.20391093724402545, Average precision = 0.9735056098800242, ROC-AUC = 0.9687904604533717, Elapsed Time = 23.611795300002996 seconds Trial 8, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 8, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0: learn: 0.6219988 test: 0.6238712 best: 0.6238712 (0) total: 244ms remaining: 21.5s 1: learn: 0.5622986 test: 0.5650788 best: 0.5650788 (1) total: 486ms remaining: 21.1s 2: learn: 0.5119016 test: 0.5159305 best: 0.5159305 (2) total: 733ms remaining: 21s 3: learn: 0.4695141 test: 0.4748573 best: 0.4748573 (3) total: 985ms remaining: 20.9s 4: learn: 0.4323376 test: 0.4393864 best: 0.4393864 (4) total: 1.25s remaining: 21s 5: learn: 0.4012471 test: 0.4095777 best: 0.4095777 (5) total: 1.52s remaining: 21.1s 6: learn: 0.3752346 test: 0.3849137 best: 0.3849137 (6) total: 1.8s remaining: 21.1s 7: learn: 0.3525532 test: 0.3631765 best: 0.3631765 (7) total: 2.08s remaining: 21.1s 8: learn: 0.3335970 test: 0.3449627 best: 0.3449627 (8) total: 2.36s remaining: 21s 9: learn: 0.3169024 test: 0.3291789 best: 0.3291789 (9) total: 2.64s remaining: 20.8s 10: learn: 0.3028637 test: 0.3156033 best: 0.3156033 (10) total: 2.91s remaining: 20.6s 11: learn: 0.2898446 test: 0.3035715 best: 0.3035715 (11) total: 3.19s remaining: 20.5s 12: learn: 0.2790835 test: 0.2935300 best: 0.2935300 (12) total: 3.47s remaining: 20.3s 
13: learn: 0.2697500 test: 0.2847895 best: 0.2847895 (13) total: 3.74s remaining: 20s 14: learn: 0.2614608 test: 0.2773061 best: 0.2773061 (14) total: 4.01s remaining: 19.8s 15: learn: 0.2539063 test: 0.2701794 best: 0.2701794 (15) total: 4.29s remaining: 19.6s 16: learn: 0.2476639 test: 0.2646022 best: 0.2646022 (16) total: 4.57s remaining: 19.4s 17: learn: 0.2424000 test: 0.2596029 best: 0.2596029 (17) total: 4.84s remaining: 19.1s 18: learn: 0.2375593 test: 0.2553382 best: 0.2553382 (18) total: 5.12s remaining: 18.9s 19: learn: 0.2337121 test: 0.2518487 best: 0.2518487 (19) total: 5.39s remaining: 18.6s 20: learn: 0.2296362 test: 0.2483662 best: 0.2483662 (20) total: 5.66s remaining: 18.3s 21: learn: 0.2252051 test: 0.2446834 best: 0.2446834 (21) total: 5.94s remaining: 18.1s 22: learn: 0.2223631 test: 0.2418040 best: 0.2418040 (22) total: 6.21s remaining: 17.8s 23: learn: 0.2197163 test: 0.2394860 best: 0.2394860 (23) total: 6.48s remaining: 17.6s 24: learn: 0.2169082 test: 0.2376311 best: 0.2376311 (24) total: 6.75s remaining: 17.3s 25: learn: 0.2147066 test: 0.2359089 best: 0.2359089 (25) total: 7.03s remaining: 17s 26: learn: 0.2125080 test: 0.2344677 best: 0.2344677 (26) total: 7.31s remaining: 16.8s 27: learn: 0.2105452 test: 0.2329066 best: 0.2329066 (27) total: 7.58s remaining: 16.5s 28: learn: 0.2083518 test: 0.2315031 best: 0.2315031 (28) total: 7.86s remaining: 16.3s 29: learn: 0.2062740 test: 0.2298818 best: 0.2298818 (29) total: 8.14s remaining: 16s 30: learn: 0.2040951 test: 0.2281017 best: 0.2281017 (30) total: 8.41s remaining: 15.7s 31: learn: 0.2021522 test: 0.2264090 best: 0.2264090 (31) total: 8.69s remaining: 15.5s 32: learn: 0.2004753 test: 0.2251399 best: 0.2251399 (32) total: 8.95s remaining: 15.2s 33: learn: 0.1988950 test: 0.2243852 best: 0.2243852 (33) total: 9.22s remaining: 14.9s 34: learn: 0.1975516 test: 0.2233585 best: 0.2233585 (34) total: 9.5s remaining: 14.7s 35: learn: 0.1962386 test: 0.2230410 best: 0.2230410 (35) total: 9.78s 
remaining: 14.4s 36: learn: 0.1949636 test: 0.2223239 best: 0.2223239 (36) total: 10.1s remaining: 14.1s 37: learn: 0.1938546 test: 0.2213775 best: 0.2213775 (37) total: 10.3s remaining: 13.9s 38: learn: 0.1926732 test: 0.2209995 best: 0.2209995 (38) total: 10.6s remaining: 13.6s 39: learn: 0.1918311 test: 0.2205785 best: 0.2205785 (39) total: 10.9s remaining: 13.3s 40: learn: 0.1905992 test: 0.2199909 best: 0.2199909 (40) total: 11.2s remaining: 13.1s 41: learn: 0.1890657 test: 0.2196014 best: 0.2196014 (41) total: 11.4s remaining: 12.8s 42: learn: 0.1882669 test: 0.2190364 best: 0.2190364 (42) total: 11.7s remaining: 12.5s 43: learn: 0.1877088 test: 0.2186284 best: 0.2186284 (43) total: 12s remaining: 12.2s 44: learn: 0.1870781 test: 0.2183018 best: 0.2183018 (44) total: 12.2s remaining: 12s 45: learn: 0.1863768 test: 0.2180477 best: 0.2180477 (45) total: 12.5s remaining: 11.7s 46: learn: 0.1857085 test: 0.2177978 best: 0.2177978 (46) total: 12.8s remaining: 11.4s 47: learn: 0.1847600 test: 0.2176082 best: 0.2176082 (47) total: 13.1s remaining: 11.2s 48: learn: 0.1842636 test: 0.2172480 best: 0.2172480 (48) total: 13.3s remaining: 10.9s 49: learn: 0.1834735 test: 0.2168142 best: 0.2168142 (49) total: 13.6s remaining: 10.6s 50: learn: 0.1828405 test: 0.2165550 best: 0.2165550 (50) total: 13.9s remaining: 10.3s 51: learn: 0.1824936 test: 0.2162514 best: 0.2162514 (51) total: 14.2s remaining: 10.1s 52: learn: 0.1817900 test: 0.2161578 best: 0.2161578 (52) total: 14.4s remaining: 9.8s 53: learn: 0.1814714 test: 0.2160305 best: 0.2160305 (53) total: 14.7s remaining: 9.53s 54: learn: 0.1807003 test: 0.2157302 best: 0.2157302 (54) total: 15s remaining: 9.26s 55: learn: 0.1799289 test: 0.2152504 best: 0.2152504 (55) total: 15.3s remaining: 8.99s 56: learn: 0.1793130 test: 0.2149931 best: 0.2149931 (56) total: 15.5s remaining: 8.72s 57: learn: 0.1786100 test: 0.2145730 best: 0.2145730 (57) total: 15.8s remaining: 8.44s 58: learn: 0.1785456 test: 0.2145653 best: 0.2145653 
(58) total: 15.9s remaining: 8.11s 59: learn: 0.1773267 test: 0.2143315 best: 0.2143315 (59) total: 16.2s remaining: 7.84s 60: learn: 0.1768003 test: 0.2141859 best: 0.2141859 (60) total: 16.5s remaining: 7.57s 61: learn: 0.1767988 test: 0.2141890 best: 0.2141859 (60) total: 16.6s remaining: 7.21s 62: learn: 0.1762460 test: 0.2140142 best: 0.2140142 (62) total: 16.8s remaining: 6.95s 63: learn: 0.1760248 test: 0.2138506 best: 0.2138506 (63) total: 17.1s remaining: 6.68s 64: learn: 0.1755827 test: 0.2137476 best: 0.2137476 (64) total: 17.4s remaining: 6.42s 65: learn: 0.1754384 test: 0.2136342 best: 0.2136342 (65) total: 17.5s remaining: 6.09s 66: learn: 0.1751353 test: 0.2134834 best: 0.2134834 (66) total: 17.7s remaining: 5.82s 67: learn: 0.1751354 test: 0.2134831 best: 0.2134831 (67) total: 17.8s remaining: 5.5s 68: learn: 0.1746736 test: 0.2132437 best: 0.2132437 (68) total: 18.1s remaining: 5.24s 69: learn: 0.1746735 test: 0.2132432 best: 0.2132432 (69) total: 18.1s remaining: 4.92s 70: learn: 0.1746734 test: 0.2132423 best: 0.2132423 (70) total: 18.2s remaining: 4.61s 71: learn: 0.1740951 test: 0.2129935 best: 0.2129935 (71) total: 18.4s remaining: 4.36s 72: learn: 0.1738359 test: 0.2127998 best: 0.2127998 (72) total: 18.7s remaining: 4.1s 73: learn: 0.1734772 test: 0.2127292 best: 0.2127292 (73) total: 19s remaining: 3.85s 74: learn: 0.1731770 test: 0.2124373 best: 0.2124373 (74) total: 19.2s remaining: 3.59s 75: learn: 0.1728334 test: 0.2122947 best: 0.2122947 (75) total: 19.5s remaining: 3.34s 76: learn: 0.1720189 test: 0.2123715 best: 0.2122947 (75) total: 19.8s remaining: 3.09s 77: learn: 0.1714198 test: 0.2122262 best: 0.2122262 (77) total: 20.1s remaining: 2.83s 78: learn: 0.1710891 test: 0.2120853 best: 0.2120853 (78) total: 20.4s remaining: 2.58s 79: learn: 0.1705578 test: 0.2120323 best: 0.2120323 (79) total: 20.6s remaining: 2.32s 80: learn: 0.1701023 test: 0.2118774 best: 0.2118774 (80) total: 20.9s remaining: 2.06s 81: learn: 0.1695416 test: 
0.2115818 best: 0.2115818 (81) total: 21.2s remaining: 1.81s 82: learn: 0.1695408 test: 0.2115760 best: 0.2115760 (82) total: 21.2s remaining: 1.53s 83: learn: 0.1694502 test: 0.2115496 best: 0.2115496 (83) total: 21.5s remaining: 1.28s 84: learn: 0.1691877 test: 0.2114318 best: 0.2114318 (84) total: 21.8s remaining: 1.02s 85: learn: 0.1686102 test: 0.2113363 best: 0.2113363 (85) total: 22s remaining: 769ms 86: learn: 0.1681279 test: 0.2110884 best: 0.2110884 (86) total: 22.3s remaining: 513ms 87: learn: 0.1676486 test: 0.2110622 best: 0.2110622 (87) total: 22.6s remaining: 257ms 88: learn: 0.1676000 test: 0.2110129 best: 0.2110129 (88) total: 22.7s remaining: 0us bestTest = 0.2110128776 bestIteration = 88 Trial 8, Fold 5: Log loss = 0.21064118275161223, Average precision = 0.9721801808196247, ROC-AUC = 0.9683430503688013, Elapsed Time = 22.82268710000062 seconds
Optimization Progress: 9%|9 | 9/100 [15:21<2:09:17, 85.25s/it]
[CatBoost per-iteration training output trimmed; per-fold summaries retained below.]

Trial 9, Fold 1: Train size = 20663 (0 = 10533, 1 = 10130, 0/1 = 1.0398); Validation size = 5175 (0 = 2592, 1 = 2583, 0/1 = 1.0035)
Trial 9, Fold 1: bestTest = 0.2130620, bestIteration = 32
Trial 9, Fold 1: Log loss = 0.2125, Average precision = 0.9725, ROC-AUC = 0.9684, Elapsed Time = 12.9 s
Trial 9, Fold 2: Train size = 20701 (0 = 10471, 1 = 10230, 0/1 = 1.0236); Validation size = 5137 (0 = 2654, 1 = 2483, 0/1 = 1.0689)
Trial 9, Fold 2: bestTest = 0.2083785, bestIteration = 32
Trial 9, Fold 2: Log loss = 0.2080, Average precision = 0.9740, ROC-AUC = 0.9715, Elapsed Time = 13.4 s
Trial 9, Fold 3: Train size = 20682 (0 = 10517, 1 = 10165, 0/1 = 1.0346); Validation size = 5156 (0 = 2608, 1 = 2548, 0/1 = 1.0235)
Trial 9, Fold 3: bestTest = 0.2057500, bestIteration = 32
Trial 9, Fold 3: Log loss = 0.2055, Average precision = 0.9731, ROC-AUC = 0.9710, Elapsed Time = 12.7 s
Trial 9, Fold 4: Train size = 20656 (0 = 10479, 1 = 10177, 0/1 = 1.0297); Validation size = 5182 (0 = 2646, 1 = 2536, 0/1 = 1.0434)
Trial 9, Fold 4: [per-iteration output truncated mid-training at this point in the export]
0.2187079 best: 0.2187079 (23) total: 9.73s remaining: 3.65s 24: learn: 0.2111809 test: 0.2173300 best: 0.2173300 (24) total: 10.1s remaining: 3.23s 25: learn: 0.2095688 test: 0.2159504 best: 0.2159504 (25) total: 10.4s remaining: 2.8s 26: learn: 0.2080576 test: 0.2145398 best: 0.2145398 (26) total: 10.8s remaining: 2.41s 27: learn: 0.2067230 test: 0.2134166 best: 0.2134166 (27) total: 11.3s remaining: 2.01s 28: learn: 0.2052727 test: 0.2122740 best: 0.2122740 (28) total: 11.6s remaining: 1.6s 29: learn: 0.2040002 test: 0.2115598 best: 0.2115598 (29) total: 12.1s remaining: 1.21s 30: learn: 0.2029557 test: 0.2107107 best: 0.2107107 (30) total: 12.5s remaining: 806ms 31: learn: 0.2019009 test: 0.2098873 best: 0.2098873 (31) total: 12.8s remaining: 401ms 32: learn: 0.2009859 test: 0.2091389 best: 0.2091389 (32) total: 13.2s remaining: 0us bestTest = 0.2091388992 bestIteration = 32 Trial 9, Fold 4: Log loss = 0.20869698135130227, Average precision = 0.9742626799703638, ROC-AUC = 0.9696986970392782, Elapsed Time = 13.32215240000005 seconds Trial 9, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 9, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0: learn: 0.5927489 test: 0.5943014 best: 0.5943014 (0) total: 370ms remaining: 11.8s 1: learn: 0.5149027 test: 0.5185038 best: 0.5185038 (1) total: 788ms remaining: 12.2s 2: learn: 0.4548540 test: 0.4593431 best: 0.4593431 (2) total: 1.12s remaining: 11.2s 3: learn: 0.4069713 test: 0.4124602 best: 0.4124602 (3) total: 1.49s remaining: 10.8s 4: learn: 0.3710401 test: 0.3772466 best: 0.3772466 (4) total: 1.88s remaining: 10.5s 5: learn: 0.3438637 test: 0.3506857 best: 0.3506857 (5) total: 2.31s remaining: 10.4s 6: learn: 0.3213823 test: 0.3291563 best: 0.3291563 (6) total: 2.72s remaining: 10.1s 7: learn: 0.3029190 test: 0.3111729 best: 0.3111729 (7) total: 3.05s remaining: 9.53s 8: learn: 0.2877737 test: 0.2966636 best: 0.2966636 (8) total: 3.44s remaining: 
9.18s 9: learn: 0.2755021 test: 0.2847018 best: 0.2847018 (9) total: 3.85s remaining: 8.86s 10: learn: 0.2642726 test: 0.2740277 best: 0.2740277 (10) total: 4.2s remaining: 8.4s 11: learn: 0.2556348 test: 0.2657291 best: 0.2657291 (11) total: 4.57s remaining: 7.99s 12: learn: 0.2478406 test: 0.2583892 best: 0.2583892 (12) total: 4.98s remaining: 7.67s 13: learn: 0.2418344 test: 0.2528262 best: 0.2528262 (13) total: 5.34s remaining: 7.25s 14: learn: 0.2371820 test: 0.2481243 best: 0.2481243 (14) total: 5.75s remaining: 6.9s 15: learn: 0.2327670 test: 0.2441895 best: 0.2441895 (15) total: 6.15s remaining: 6.54s 16: learn: 0.2291655 test: 0.2410815 best: 0.2410815 (16) total: 6.49s remaining: 6.11s 17: learn: 0.2249996 test: 0.2374609 best: 0.2374609 (17) total: 6.93s remaining: 5.78s 18: learn: 0.2219219 test: 0.2348467 best: 0.2348467 (18) total: 7.28s remaining: 5.37s 19: learn: 0.2189307 test: 0.2321812 best: 0.2321812 (19) total: 7.68s remaining: 4.99s 20: learn: 0.2160560 test: 0.2298557 best: 0.2298557 (20) total: 8.06s remaining: 4.6s 21: learn: 0.2137852 test: 0.2280290 best: 0.2280290 (21) total: 8.44s remaining: 4.22s 22: learn: 0.2122558 test: 0.2268850 best: 0.2268850 (22) total: 8.8s remaining: 3.83s 23: learn: 0.2111656 test: 0.2259762 best: 0.2259762 (23) total: 9.17s remaining: 3.44s 24: learn: 0.2092945 test: 0.2241585 best: 0.2241585 (24) total: 9.57s remaining: 3.06s 25: learn: 0.2083738 test: 0.2233689 best: 0.2233689 (25) total: 9.87s remaining: 2.66s 26: learn: 0.2068820 test: 0.2220799 best: 0.2220799 (26) total: 10.3s remaining: 2.28s 27: learn: 0.2056296 test: 0.2212331 best: 0.2212331 (27) total: 10.7s remaining: 1.92s 28: learn: 0.2042925 test: 0.2202110 best: 0.2202110 (28) total: 11.1s remaining: 1.53s 29: learn: 0.2029629 test: 0.2191241 best: 0.2191241 (29) total: 11.5s remaining: 1.15s 30: learn: 0.2018787 test: 0.2182633 best: 0.2182633 (30) total: 11.8s remaining: 760ms 31: learn: 0.2010627 test: 0.2177216 best: 0.2177216 (31) total: 
12.1s remaining: 378ms 32: learn: 0.2004656 test: 0.2173738 best: 0.2173738 (32) total: 12.4s remaining: 0us bestTest = 0.2173738237 bestIteration = 32 Trial 9, Fold 5: Log loss = 0.21671369139124652, Average precision = 0.9710542606095034, ROC-AUC = 0.9675792133474537, Elapsed Time = 12.527992699997412 seconds
Optimization Progress: 10%|# | 10/100 [16:33<2:01:52, 81.25s/it]
Trial 10, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 10, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[… CatBoost per-iteration learn/test log omitted; fold summaries retained below …]
bestTest = 0.3410934927 bestIteration = 14
Trial 10, Fold 1: Log loss = 0.34114290750540777, Average precision = 0.9737290436163282, ROC-AUC = 0.9693178002657452, Elapsed Time = 4.637054099999659 seconds
Trial 10, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 10, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
bestTest = 0.3383883706 bestIteration = 14
Trial 10, Fold 2: Log loss = 0.33840529229879684, Average precision = 0.9743811706503799, ROC-AUC = 0.9717702835953663, Elapsed Time = 4.909788400000252 seconds
Trial 10, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 10, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
bestTest = 0.3366030155 bestIteration = 14
Trial 10, Fold 3: Log loss = 0.3367046290656582, Average precision = 0.9750636133066876, ROC-AUC = 0.9722684127331914, Elapsed Time = 4.930267300002015 seconds
Trial 10, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 10, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
bestTest = 0.3401326333 bestIteration = 14
Trial 10, Fold 4: Log loss = 0.34016948226569266, Average precision = 0.9761613962132126, ROC-AUC = 0.9718627724486218, Elapsed Time = 4.686669899998378 seconds
Trial 10, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 10, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
bestTest = 0.343001456 bestIteration = 14
Trial 10, Fold 5: Log loss = 0.34299012822241326, Average precision = 0.9727276665629832, ROC-AUC = 0.9681648365940212, Elapsed Time = 4.384535199998936 seconds
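The `bestTest = … bestIteration = …` lines in the logs mark the boosting round with the lowest validation log loss for each fold. The same selection can be sketched with sklearn's `GradientBoostingClassifier` and `staged_predict_proba` on synthetic data; this is an illustrative stand-in for CatBoost's built-in eval-set tracking, not the notebook's code, and all names here are hypothetical.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import log_loss
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1500, n_features=10, random_state=0)
X_tr, X_va, y_tr, y_va = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0
)

# 33 rounds, mirroring the trial above that stopped at iteration 32.
clf = GradientBoostingClassifier(n_estimators=33, learning_rate=0.3, random_state=0)
clf.fit(X_tr, y_tr)

# Validation log loss after each boosting round: the analogue of the
# "test:" column in the CatBoost log above.
val_losses = [log_loss(y_va, p[:, 1]) for p in clf.staged_predict_proba(X_va)]
best_iter = int(np.argmin(val_losses))
print(f"bestTest = {val_losses[best_iter]:.7f} bestIteration = {best_iter}")
```

When, as in the trials above, `bestIteration` equals the final round, the validation loss was still falling at the budget limit, which suggests the iteration count (or trial budget) could be raised.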
Optimization Progress: 11%|#1 | 11/100 [17:05<1:37:52, 65.99s/it]
Trial 11, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 11, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[… CatBoost per-iteration learn/test log omitted; fold summaries retained below …]
bestTest = 0.2795543458 bestIteration = 66
Trial 11, Fold 1: Log loss = 0.2794368553787128, Average precision = 0.9710326929799092, ROC-AUC = 0.9648835363463862, Elapsed Time = 14.426432200001727 seconds
Trial 11, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 11, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
bestTest = 0.2778421475 bestIteration = 66
Trial 11, Fold 2: Log loss = 0.27777743832662527, Average precision = 0.9701344665845565, ROC-AUC = 0.9662593351443926, Elapsed Time = 14.970460399999865 seconds
Trial 11, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 11, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
bestTest = 0.2744277971 bestIteration = 66
Trial 11, Fold 3: Log loss = 0.27449709591265703, Average precision = 0.9707993364138457, ROC-AUC = 0.9673232524486907, Elapsed Time = 16.35089109999899 seconds
Trial 11, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 11, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[… CatBoost per-iteration learn/test log omitted …] 12: learn: 0.5132575 test: 0.5133898
best: 0.5133898 (12) total: 2.97s remaining: 12.3s 13: learn: 0.5034212 test: 0.5036301 best: 0.5036301 (13) total: 3.23s remaining: 12.2s 14: learn: 0.4944624 test: 0.4947159 best: 0.4947159 (14) total: 3.43s remaining: 11.9s 15: learn: 0.4861198 test: 0.4864163 best: 0.4864163 (15) total: 3.7s remaining: 11.8s 16: learn: 0.4768987 test: 0.4772338 best: 0.4772338 (16) total: 3.94s remaining: 11.6s 17: learn: 0.4680293 test: 0.4683420 best: 0.4683420 (17) total: 4.16s remaining: 11.3s 18: learn: 0.4604450 test: 0.4608012 best: 0.4608012 (18) total: 4.41s remaining: 11.1s 19: learn: 0.4535440 test: 0.4539408 best: 0.4539408 (19) total: 4.7s remaining: 11s 20: learn: 0.4470502 test: 0.4475409 best: 0.4475409 (20) total: 4.9s remaining: 10.7s 21: learn: 0.4390813 test: 0.4394970 best: 0.4394970 (21) total: 5.12s remaining: 10.5s 22: learn: 0.4320481 test: 0.4325157 best: 0.4325157 (22) total: 5.34s remaining: 10.2s 23: learn: 0.4248334 test: 0.4252803 best: 0.4252803 (23) total: 5.56s remaining: 9.97s 24: learn: 0.4183409 test: 0.4187858 best: 0.4187858 (24) total: 5.76s remaining: 9.68s 25: learn: 0.4116909 test: 0.4122146 best: 0.4122146 (25) total: 5.96s remaining: 9.39s 26: learn: 0.4048585 test: 0.4053654 best: 0.4053654 (26) total: 6.17s remaining: 9.13s 27: learn: 0.3990501 test: 0.3996079 best: 0.3996079 (27) total: 6.41s remaining: 8.93s 28: learn: 0.3935038 test: 0.3941391 best: 0.3941391 (28) total: 6.62s remaining: 8.68s 29: learn: 0.3890964 test: 0.3897911 best: 0.3897911 (29) total: 6.84s remaining: 8.43s 30: learn: 0.3842292 test: 0.3849965 best: 0.3849965 (30) total: 7.05s remaining: 8.19s 31: learn: 0.3794898 test: 0.3803093 best: 0.3803093 (31) total: 7.3s remaining: 7.99s 32: learn: 0.3746090 test: 0.3754313 best: 0.3754313 (32) total: 7.5s remaining: 7.72s 33: learn: 0.3701510 test: 0.3710158 best: 0.3710158 (33) total: 7.73s remaining: 7.51s 34: learn: 0.3654079 test: 0.3662508 best: 0.3662508 (34) total: 7.96s remaining: 7.28s 35: learn: 
0.3607161 test: 0.3616315 best: 0.3616315 (35) total: 8.2s remaining: 7.06s 36: learn: 0.3571017 test: 0.3580852 best: 0.3580852 (36) total: 8.43s remaining: 6.84s 37: learn: 0.3528004 test: 0.3538447 best: 0.3538447 (37) total: 8.68s remaining: 6.63s 38: learn: 0.3483745 test: 0.3494049 best: 0.3494049 (38) total: 8.89s remaining: 6.38s 39: learn: 0.3440062 test: 0.3450627 best: 0.3450627 (39) total: 9.13s remaining: 6.16s 40: learn: 0.3408131 test: 0.3419446 best: 0.3419446 (40) total: 9.45s remaining: 5.99s 41: learn: 0.3375819 test: 0.3387926 best: 0.3387926 (41) total: 9.7s remaining: 5.77s 42: learn: 0.3335197 test: 0.3348440 best: 0.3348440 (42) total: 9.94s remaining: 5.55s 43: learn: 0.3306985 test: 0.3320617 best: 0.3320617 (43) total: 10.2s remaining: 5.3s 44: learn: 0.3275306 test: 0.3288955 best: 0.3288955 (44) total: 10.4s remaining: 5.06s 45: learn: 0.3242911 test: 0.3257486 best: 0.3257486 (45) total: 10.6s remaining: 4.82s 46: learn: 0.3211383 test: 0.3225475 best: 0.3225475 (46) total: 10.8s remaining: 4.58s 47: learn: 0.3183170 test: 0.3197039 best: 0.3197039 (47) total: 11s remaining: 4.35s 48: learn: 0.3150319 test: 0.3165236 best: 0.3165236 (48) total: 11.2s remaining: 4.11s 49: learn: 0.3118535 test: 0.3133920 best: 0.3133920 (49) total: 11.4s remaining: 3.88s 50: learn: 0.3091888 test: 0.3107787 best: 0.3107787 (50) total: 11.6s remaining: 3.65s 51: learn: 0.3066345 test: 0.3083023 best: 0.3083023 (51) total: 11.9s remaining: 3.42s 52: learn: 0.3043216 test: 0.3059976 best: 0.3059976 (52) total: 12.1s remaining: 3.19s 53: learn: 0.3020885 test: 0.3038430 best: 0.3038430 (53) total: 12.3s remaining: 2.95s 54: learn: 0.2999550 test: 0.3017456 best: 0.3017456 (54) total: 12.5s remaining: 2.72s 55: learn: 0.2975051 test: 0.2993211 best: 0.2993211 (55) total: 12.7s remaining: 2.5s 56: learn: 0.2953217 test: 0.2971362 best: 0.2971362 (56) total: 12.9s remaining: 2.27s 57: learn: 0.2936812 test: 0.2954753 best: 0.2954753 (57) total: 13.1s 
remaining: 2.04s 58: learn: 0.2913188 test: 0.2931846 best: 0.2931846 (58) total: 13.4s remaining: 1.81s 59: learn: 0.2891267 test: 0.2909626 best: 0.2909626 (59) total: 13.6s remaining: 1.58s 60: learn: 0.2868925 test: 0.2887959 best: 0.2887959 (60) total: 13.8s remaining: 1.36s 61: learn: 0.2845077 test: 0.2864239 best: 0.2864239 (61) total: 14s remaining: 1.13s 62: learn: 0.2823554 test: 0.2842884 best: 0.2842884 (62) total: 14.2s remaining: 905ms 63: learn: 0.2806743 test: 0.2826149 best: 0.2826149 (63) total: 14.5s remaining: 678ms 64: learn: 0.2788672 test: 0.2808704 best: 0.2808704 (64) total: 14.7s remaining: 452ms 65: learn: 0.2772859 test: 0.2793294 best: 0.2793294 (65) total: 14.9s remaining: 226ms 66: learn: 0.2757826 test: 0.2778709 best: 0.2778709 (66) total: 15.1s remaining: 0us bestTest = 0.2778709347 bestIteration = 66 Trial 11, Fold 4: Log loss = 0.2777909289006165, Average precision = 0.9711449705618475, ROC-AUC = 0.9652710716252852, Elapsed Time = 15.248579500002961 seconds Trial 11, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 11, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0: learn: 0.6753409 test: 0.6755724 best: 0.6755724 (0) total: 224ms remaining: 14.8s 1: learn: 0.6601324 test: 0.6605737 best: 0.6605737 (1) total: 451ms remaining: 14.7s 2: learn: 0.6434900 test: 0.6442327 best: 0.6442327 (2) total: 692ms remaining: 14.8s 3: learn: 0.6287593 test: 0.6298026 best: 0.6298026 (3) total: 913ms remaining: 14.4s 4: learn: 0.6147572 test: 0.6158683 best: 0.6158683 (4) total: 1.16s remaining: 14.4s 5: learn: 0.6003083 test: 0.6015889 best: 0.6015889 (5) total: 1.39s remaining: 14.1s 6: learn: 0.5859469 test: 0.5874423 best: 0.5874423 (6) total: 1.61s remaining: 13.8s 7: learn: 0.5731605 test: 0.5748562 best: 0.5748562 (7) total: 1.85s remaining: 13.7s 8: learn: 0.5629117 test: 0.5648638 best: 0.5648638 (8) total: 2.12s remaining: 13.7s 9: learn: 0.5514924 test: 
0.5536351 best: 0.5536351 (9) total: 2.36s remaining: 13.4s 10: learn: 0.5392151 test: 0.5415519 best: 0.5415519 (10) total: 2.6s remaining: 13.2s 11: learn: 0.5296436 test: 0.5321464 best: 0.5321464 (11) total: 2.82s remaining: 12.9s 12: learn: 0.5189684 test: 0.5216151 best: 0.5216151 (12) total: 3.06s remaining: 12.7s 13: learn: 0.5085028 test: 0.5112776 best: 0.5112776 (13) total: 3.29s remaining: 12.5s 14: learn: 0.4982562 test: 0.5012383 best: 0.5012383 (14) total: 3.5s remaining: 12.1s 15: learn: 0.4883012 test: 0.4914796 best: 0.4914796 (15) total: 3.71s remaining: 11.8s 16: learn: 0.4798366 test: 0.4830555 best: 0.4830555 (16) total: 3.94s remaining: 11.6s 17: learn: 0.4717240 test: 0.4751497 best: 0.4751497 (17) total: 4.17s remaining: 11.3s 18: learn: 0.4636268 test: 0.4671435 best: 0.4671435 (18) total: 4.37s remaining: 11s 19: learn: 0.4561798 test: 0.4599502 best: 0.4599502 (19) total: 4.6s remaining: 10.8s 20: learn: 0.4487982 test: 0.4527293 best: 0.4527293 (20) total: 4.78s remaining: 10.5s 21: learn: 0.4409260 test: 0.4450200 best: 0.4450200 (21) total: 5.02s remaining: 10.3s 22: learn: 0.4343339 test: 0.4384846 best: 0.4384846 (22) total: 5.22s remaining: 9.98s 23: learn: 0.4273664 test: 0.4316937 best: 0.4316937 (23) total: 5.42s remaining: 9.72s 24: learn: 0.4206245 test: 0.4250087 best: 0.4250087 (24) total: 5.62s remaining: 9.44s 25: learn: 0.4145943 test: 0.4191314 best: 0.4191314 (25) total: 5.84s remaining: 9.22s 26: learn: 0.4080814 test: 0.4127882 best: 0.4127882 (26) total: 6.08s remaining: 9s 27: learn: 0.4011640 test: 0.4059879 best: 0.4059879 (27) total: 6.29s remaining: 8.76s 28: learn: 0.3958573 test: 0.4008154 best: 0.4008154 (28) total: 6.5s remaining: 8.52s 29: learn: 0.3899300 test: 0.3949721 best: 0.3949721 (29) total: 6.72s remaining: 8.29s 30: learn: 0.3848788 test: 0.3900121 best: 0.3900121 (30) total: 6.99s remaining: 8.11s 31: learn: 0.3799213 test: 0.3850029 best: 0.3850029 (31) total: 7.22s remaining: 7.9s 32: learn: 
0.3750653 test: 0.3802996 best: 0.3802996 (32) total: 7.43s remaining: 7.66s 33: learn: 0.3698968 test: 0.3752238 best: 0.3752238 (33) total: 7.66s remaining: 7.43s 34: learn: 0.3650126 test: 0.3704073 best: 0.3704073 (34) total: 7.92s remaining: 7.24s 35: learn: 0.3598255 test: 0.3653251 best: 0.3653251 (35) total: 8.16s remaining: 7.03s 36: learn: 0.3546094 test: 0.3602273 best: 0.3602273 (36) total: 8.45s remaining: 6.85s 37: learn: 0.3502890 test: 0.3560263 best: 0.3560263 (37) total: 8.7s remaining: 6.64s 38: learn: 0.3464804 test: 0.3522631 best: 0.3522631 (38) total: 8.91s remaining: 6.4s 39: learn: 0.3422838 test: 0.3482084 best: 0.3482084 (39) total: 9.16s remaining: 6.18s 40: learn: 0.3388215 test: 0.3447807 best: 0.3447807 (40) total: 9.43s remaining: 5.98s 41: learn: 0.3351738 test: 0.3412294 best: 0.3412294 (41) total: 9.7s remaining: 5.77s 42: learn: 0.3314143 test: 0.3375643 best: 0.3375643 (42) total: 9.9s remaining: 5.52s 43: learn: 0.3277363 test: 0.3340494 best: 0.3340494 (43) total: 10.2s remaining: 5.31s 44: learn: 0.3239840 test: 0.3304497 best: 0.3304497 (44) total: 10.4s remaining: 5.07s 45: learn: 0.3208045 test: 0.3274623 best: 0.3274623 (45) total: 10.6s remaining: 4.84s 46: learn: 0.3177725 test: 0.3244789 best: 0.3244789 (46) total: 10.8s remaining: 4.61s 47: learn: 0.3150695 test: 0.3218795 best: 0.3218795 (47) total: 11s remaining: 4.37s 48: learn: 0.3125447 test: 0.3195485 best: 0.3195485 (48) total: 11.3s remaining: 4.14s 49: learn: 0.3092978 test: 0.3164171 best: 0.3164171 (49) total: 11.5s remaining: 3.9s 50: learn: 0.3066592 test: 0.3138145 best: 0.3138145 (50) total: 11.7s remaining: 3.67s 51: learn: 0.3038494 test: 0.3110927 best: 0.3110927 (51) total: 11.9s remaining: 3.44s 52: learn: 0.3009743 test: 0.3082691 best: 0.3082691 (52) total: 12.2s remaining: 3.21s 53: learn: 0.2985862 test: 0.3060117 best: 0.3060117 (53) total: 12.4s remaining: 2.97s 54: learn: 0.2961186 test: 0.3036417 best: 0.3036417 (54) total: 12.6s remaining: 
2.74s 55: learn: 0.2937497 test: 0.3013589 best: 0.3013589 (55) total: 12.8s remaining: 2.51s 56: learn: 0.2918273 test: 0.2995652 best: 0.2995652 (56) total: 13s remaining: 2.28s 57: learn: 0.2898604 test: 0.2976273 best: 0.2976273 (57) total: 13.2s remaining: 2.05s 58: learn: 0.2877724 test: 0.2955616 best: 0.2955616 (58) total: 13.4s remaining: 1.81s 59: learn: 0.2859400 test: 0.2938289 best: 0.2938289 (59) total: 13.6s remaining: 1.58s 60: learn: 0.2839536 test: 0.2918789 best: 0.2918789 (60) total: 13.8s remaining: 1.36s 61: learn: 0.2818728 test: 0.2899406 best: 0.2899406 (61) total: 14s remaining: 1.13s 62: learn: 0.2799409 test: 0.2880880 best: 0.2880880 (62) total: 14.2s remaining: 902ms 63: learn: 0.2780779 test: 0.2862848 best: 0.2862848 (63) total: 14.4s remaining: 677ms 64: learn: 0.2763721 test: 0.2846905 best: 0.2846905 (64) total: 14.7s remaining: 452ms 65: learn: 0.2746583 test: 0.2830798 best: 0.2830798 (65) total: 14.9s remaining: 226ms 66: learn: 0.2729542 test: 0.2814548 best: 0.2814548 (66) total: 15.1s remaining: 0us bestTest = 0.2814547793 bestIteration = 66 Trial 11, Fold 5: Log loss = 0.2812293273669448, Average precision = 0.9692282383735199, ROC-AUC = 0.963279936086803, Elapsed Time = 15.249750499999209 seconds
Optimization Progress: 12% | 12/100 [18:29<1:44:46, 71.44s/it]
Trial 12, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 12, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[... per-iteration CatBoost training log truncated: learn loss diverged after iteration 7 (e.g. 379934.82 at iteration 12) ...]
Training has stopped (degenerate solution on iteration 14, probably too small l2-regularization, try to increase it)
bestTest = 0.2325594147, bestIteration = 7; model shrunk to first 8 iterations
Trial 12, Fold 1: Log loss = 0.23255941469882188, Average precision = 0.9719359413186033, ROC-AUC = 0.9677732461297275, Elapsed Time = 9.332107500002166 seconds
Trial 12, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 12, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[... per-iteration CatBoost training log truncated: learn loss diverged after iteration 14 ...]
Training has stopped (degenerate solution on iteration 21, probably too small l2-regularization, try to increase it)
bestTest = 0.1976059236, bestIteration = 14; model shrunk to first 15 iterations
Trial 12, Fold 2: Log loss = 0.19760592357393936, Average precision = 0.9738911728355482, ROC-AUC = 0.970783315998678, Elapsed Time = 13.0962698000003 seconds
Trial 12, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 12, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[... per-iteration CatBoost training log truncated: learn loss diverged after iteration 8 ...]
Training has stopped (degenerate solution on iteration 15, probably too small l2-regularization, try to increase it)
bestTest = 0.2209059064, bestIteration = 8; model shrunk to first 9 iterations
Trial 12, Fold 3: Log loss = 0.22090590640476881, Average precision = 0.972677075566525, ROC-AUC = 0.9699477847415512, Elapsed Time = 9.988151199999265 seconds
Trial 12, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 12, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[... per-iteration CatBoost training log truncated: learn loss diverged after iteration 6 while the test loss plateaued; training ran all 100 iterations without the degenerate-solution stop ...]
bestTest = 0.1899794542, bestIteration = 25; model shrunk to first 26 iterations
Trial 12, Fold 4: Log loss = 0.1899794542385773, Average precision = 0.9755532823387798, ROC-AUC = 0.9715319355923231, Elapsed Time = 38.34923730000082 seconds
Trial 12, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 12, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[... per-iteration CatBoost training log truncated: learn loss diverged after iteration 11 ...]
Training has stopped (degenerate solution on iteration 16, probably too small l2-regularization, try to increase it)
bestTest = 0.2189336578 bestIteration = 11
Shrink model to first 12 iterations.
Trial 12, Fold 5: Log loss = 0.21893365778022436, Average precision = 0.972602290968769, ROC-AUC = 0.9685415528677331, Elapsed Time = 10.713243800000782 seconds
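The degenerate-solution abort above indicates the sampled `l2_leaf_reg` was too small for this fold. A minimal, hypothetical guard (the floor value and helper name are assumptions, not from this notebook) that keeps any sampled regularization strength above a safe minimum before passing it to CatBoost:

```python
# Hypothetical guard, assuming an Optuna-style search space samples
# l2_leaf_reg: clamp the sampled value to a floor so CatBoost does not hit
# the "degenerate solution ... too small l2-regularization" abort above.
L2_FLOOR = 3.0  # assumed floor; tune per dataset


def safe_l2_leaf_reg(sampled: float, floor: float = L2_FLOOR) -> float:
    """Return the sampled l2_leaf_reg, raised to at least `floor`."""
    return max(sampled, floor)
```

Raising the lower bound of the search range itself (e.g. in `trial.suggest_float`) achieves the same effect without post-hoc clamping, as the warning message suggests.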
Optimization Progress: 13%|#3 | 13/100 [19:58<1:51:13, 76.70s/it]
Trial 13, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 13, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[per-iteration CatBoost log omitted: 36 iterations] bestTest = 0.2225528386 bestIteration = 35
Trial 13, Fold 1: Log loss = 0.22255283857359395, Average precision = 0.9739927683394082, ROC-AUC = 0.9694795594891575, Elapsed Time = 6.948956299998827 seconds
Trial 13, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 13, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[per-iteration CatBoost log omitted: 36 iterations] bestTest = 0.2214298734 bestIteration = 35
Trial 13, Fold 2: Log loss = 0.22142987336946682, Average precision = 0.9734090091011551, ROC-AUC = 0.9698588684896027, Elapsed Time = 7.118063699999766 seconds
Trial 13, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 13, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[per-iteration CatBoost log omitted: 36 iterations] bestTest = 0.2192736263 bestIteration = 35
Trial 13, Fold 3: Log loss = 0.21927362628077832, Average precision = 0.9712415754523683, ROC-AUC = 0.9694828615731331, Elapsed Time = 6.961125500001799 seconds
Trial 13, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 13, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[per-iteration CatBoost log omitted: 36 iterations; test loss diverged after iteration 31] bestTest = 0.2279287088 bestIteration = 31
Shrink model to first 32 iterations.
Trial 13, Fold 4: Log loss = 0.2279287088411435, Average precision = 0.9726621824986958, ROC-AUC = 0.9689829270299077, Elapsed Time = 6.896853700000065 seconds
Trial 13, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 13, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[per-iteration CatBoost log omitted: 36 iterations] bestTest = 0.2290186192 bestIteration = 35
Trial 13, Fold 5: Log loss = 0.22901861922288552, Average precision = 0.9708279634013677, ROC-AUC = 0.9669413447782546, Elapsed Time = 6.705726799998956 seconds
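A trial's objective is typically the mean of its per-fold scores. Using the Trial 13 fold log losses reported above (values copied from the log; the aggregation itself is a plain-Python sketch, not code from this notebook), the cross-validation mean works out as:

```python
# Fold-level log losses for Trial 13, copied from the training log above.
fold_logloss = [
    0.22255283857359395,  # Fold 1
    0.22142987336946682,  # Fold 2
    0.21927362628077832,  # Fold 3
    0.2279287088411435,   # Fold 4
    0.22901861922288552,  # Fold 5
]

# Mean CV log loss -- the trial-level quantity a study would typically minimize.
mean_logloss = sum(fold_logloss) / len(fold_logloss)
print(round(mean_logloss, 6))  # → 0.224041
```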
Optimization Progress: 14%|#4 | 14/100 [20:40<1:35:04, 66.33s/it]
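The recurring "Shrink model to first N iterations" lines come from CatBoost training with an eval set and `use_best_model`: training runs to the iteration budget, then the model is truncated at the iteration whose eval-set loss was lowest. A minimal pure-Python stand-in for that selection step (illustrative only, not CatBoost's implementation):

```python
# Illustrative stand-in for CatBoost's bestIteration selection under
# use_best_model: pick the index of the lowest eval-set loss; the model is
# then truncated to the trees up to and including that index.
def best_iteration(test_losses):
    """Index of the minimum eval loss, as 'bestIteration' in the log."""
    return min(range(len(test_losses)), key=test_losses.__getitem__)
```

For an eval-loss curve that bottoms out and then rises, as in Trial 14, Fold 1 below, this keeps iteration 34 of 45.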
Trial 14, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 14, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[per-iteration CatBoost log omitted: 45 iterations] bestTest = 0.2179111493 bestIteration = 34
Shrink model to first 35 iterations.
Trial 14, Fold 1: Log loss = 0.2169168552325152, Average precision = 0.9712305782721078, ROC-AUC = 0.9652470838531134, Elapsed Time = 133.0830840000017 seconds
Trial 14, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 14, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[per-iteration CatBoost log omitted: 45 iterations] bestTest = 0.2121356942 bestIteration = 38
Shrink model to first 39 iterations.
Trial 14, Fold 2: Log loss = 0.21156600761180294, Average precision = 0.9714203502763437, ROC-AUC = 0.9671939497550942, Elapsed Time = 130.8136955000009 seconds
Trial 14, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 14, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[per-iteration CatBoost log omitted: 45 iterations] bestTest = 0.2124158617 bestIteration = 40
Shrink model to first 41 iterations.
Trial 14, Fold 3: Log loss = 0.21176093735375257, Average precision = 0.9704697095516545, ROC-AUC = 0.9665134930801014, Elapsed Time = 141.51804309999716 seconds
Trial 14, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 14, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[per-iteration CatBoost log omitted: 45 iterations] bestTest = 0.212290099 bestIteration = 40
Shrink model to first 41 iterations.
Trial 14, Fold 4: Log loss = 0.2116095543792527, Average precision = 0.9721683375773902, ROC-AUC = 0.9668279421828319, Elapsed Time = 137.0592063999975 seconds
Trial 14, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 14, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[per-iteration CatBoost log omitted: iterations 0-13]
14: learn: 0.1665754 test: 0.2459458 best:
0.2459458 (14) total: 47.1s remaining: 1m 34s 15: learn: 0.1577527 test: 0.2426342 best: 0.2426342 (15) total: 50.1s remaining: 1m 30s 16: learn: 0.1517693 test: 0.2404095 best: 0.2404095 (16) total: 53.4s remaining: 1m 27s 17: learn: 0.1461579 test: 0.2375820 best: 0.2375820 (17) total: 56.5s remaining: 1m 24s 18: learn: 0.1409462 test: 0.2358171 best: 0.2358171 (18) total: 1m remaining: 1m 22s 19: learn: 0.1367763 test: 0.2334854 best: 0.2334854 (19) total: 1m 3s remaining: 1m 19s 20: learn: 0.1317757 test: 0.2322473 best: 0.2322473 (20) total: 1m 7s remaining: 1m 16s 21: learn: 0.1286373 test: 0.2311968 best: 0.2311968 (21) total: 1m 10s remaining: 1m 14s 22: learn: 0.1226746 test: 0.2296791 best: 0.2296791 (22) total: 1m 14s remaining: 1m 11s 23: learn: 0.1182230 test: 0.2284024 best: 0.2284024 (23) total: 1m 17s remaining: 1m 7s 24: learn: 0.1143845 test: 0.2273456 best: 0.2273456 (24) total: 1m 20s remaining: 1m 4s 25: learn: 0.1077534 test: 0.2268715 best: 0.2268715 (25) total: 1m 23s remaining: 1m 1s 26: learn: 0.1041038 test: 0.2266512 best: 0.2266512 (26) total: 1m 26s remaining: 57.8s 27: learn: 0.1013187 test: 0.2255078 best: 0.2255078 (27) total: 1m 29s remaining: 54.5s 28: learn: 0.0986119 test: 0.2250147 best: 0.2250147 (28) total: 1m 32s remaining: 50.9s 29: learn: 0.0956607 test: 0.2242976 best: 0.2242976 (29) total: 1m 35s remaining: 47.8s 30: learn: 0.0918244 test: 0.2235352 best: 0.2235352 (30) total: 1m 38s remaining: 44.5s 31: learn: 0.0893024 test: 0.2229120 best: 0.2229120 (31) total: 1m 41s remaining: 41.2s 32: learn: 0.0848291 test: 0.2233149 best: 0.2229120 (31) total: 1m 44s remaining: 38.1s 33: learn: 0.0831844 test: 0.2229017 best: 0.2229017 (33) total: 1m 47s remaining: 34.7s 34: learn: 0.0815398 test: 0.2226362 best: 0.2226362 (34) total: 1m 50s remaining: 31.5s 35: learn: 0.0790131 test: 0.2228296 best: 0.2226362 (34) total: 1m 53s remaining: 28.3s 36: learn: 0.0753729 test: 0.2231285 best: 0.2226362 (34) total: 1m 56s remaining: 
25.2s 37: learn: 0.0734253 test: 0.2230945 best: 0.2226362 (34) total: 1m 59s remaining: 22s 38: learn: 0.0702069 test: 0.2231708 best: 0.2226362 (34) total: 2m 2s remaining: 18.8s 39: learn: 0.0689018 test: 0.2229141 best: 0.2226362 (34) total: 2m 5s remaining: 15.7s 40: learn: 0.0677969 test: 0.2225275 best: 0.2225275 (40) total: 2m 8s remaining: 12.6s 41: learn: 0.0663907 test: 0.2220228 best: 0.2220228 (41) total: 2m 12s remaining: 9.44s 42: learn: 0.0649653 test: 0.2216970 best: 0.2216970 (42) total: 2m 15s remaining: 6.28s 43: learn: 0.0634324 test: 0.2212573 best: 0.2212573 (43) total: 2m 18s remaining: 3.15s 44: learn: 0.0621476 test: 0.2211322 best: 0.2211322 (44) total: 2m 21s remaining: 0us bestTest = 0.2211322192 bestIteration = 44 Trial 14, Fold 5: Log loss = 0.22007899087043226, Average precision = 0.9700105951317912, ROC-AUC = 0.9646967281645393, Elapsed Time = 141.9878531999966 seconds
Optimization Progress: 15%|#5 | 15/100 [32:12<6:01:13, 254.98s/it]
Trial 15, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0398; Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0035
[Trial 15, Fold 1: per-iteration CatBoost output omitted; bestTest = 0.2943792, bestIteration = 50]
Trial 15, Fold 1: Log loss = 0.2943792, Average precision = 0.9710517, ROC-AUC = 0.9666721, Elapsed Time = 7.15 seconds
Trial 15, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0236; Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0689
[Trial 15, Fold 2: per-iteration CatBoost output omitted; bestTest = 0.2945330, bestIteration = 50]
Trial 15, Fold 2: Log loss = 0.2945330, Average precision = 0.9723129, ROC-AUC = 0.9698259, Elapsed Time = 7.13 seconds
Trial 15, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.0346; Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235
[Trial 15, Fold 3: per-iteration CatBoost output omitted; bestTest = 0.2901031, bestIteration = 50]
Trial 15, Fold 3: Log loss = 0.2901031, Average precision = 0.9707903, ROC-AUC = 0.9698042, Elapsed Time = 7.13 seconds
Trial 15, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0297; Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0434
[Trial 15, Fold 4: per-iteration CatBoost output omitted; bestTest = 0.2951406, bestIteration = 50]
Trial 15, Fold 4: Log loss = 0.2951406, Average precision = 0.9696658, ROC-AUC = 0.9677570, Elapsed Time = 7.66 seconds
Trial 15, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0345; Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0242
[Trial 15, Fold 5: per-iteration CatBoost output omitted; bestTest = 0.2971599, bestIteration = 50]
Trial 15, Fold 5: Log loss = 0.2971599, Average precision = 0.9710148, ROC-AUC = 0.9673433, Elapsed Time = 7.28 seconds
Optimization Progress: 16%|#6 | 16/100 [32:56<4:27:57, 191.40s/it]
Trial 16 (CatBoost, 21 boosting iterations per fold; per-iteration learn/test log elided, bestIteration = 20 in every fold):
Fold 1: Train size = 20663 (0 = 10533, 1 = 10130, ratio 1.0398); Validation size = 5175 (0 = 2592, 1 = 2583, ratio 1.0035); Log loss = 0.2435, Average precision = 0.9688, ROC-AUC = 0.9642, Elapsed Time = 21.1 s
Fold 2: Train size = 20701 (0 = 10471, 1 = 10230, ratio 1.0236); Validation size = 5137 (0 = 2654, 1 = 2483, ratio 1.0689); Log loss = 0.2405, Average precision = 0.9686, ROC-AUC = 0.9646, Elapsed Time = 21.5 s
Fold 3: Train size = 20682 (0 = 10517, 1 = 10165, ratio 1.0346); Validation size = 5156 (0 = 2608, 1 = 2548, ratio 1.0235); Log loss = 0.2438, Average precision = 0.9666, ROC-AUC = 0.9653, Elapsed Time = 21.3 s
Fold 4: Train size = 20656 (0 = 10479, 1 = 10177, ratio 1.0297); Validation size = 5182 (0 = 2646, 1 = 2536, ratio 1.0434); Log loss = 0.2390, Average precision = 0.9697, ROC-AUC = 0.9643, Elapsed Time = 21.4 s
Fold 5: Train size = 20650 (0 = 10500, 1 = 10150, ratio 1.0345); Validation size = 5188 (0 = 2625, 1 = 2563, ratio 1.0242); Log loss = 0.2465, Average precision = 0.9651, ROC-AUC = 0.9622, Elapsed Time = 21.8 s
Optimization Progress: 17%|#7 | 17/100 [34:51<3:52:58, 168.42s/it]
Trial 17 (CatBoost, 56 boosting iterations per fold; per-iteration learn/test log elided, bestIteration = 55 in every fold):
Fold 1: Train size = 20663 (0 = 10533, 1 = 10130, ratio 1.0398); Validation size = 5175 (0 = 2592, 1 = 2583, ratio 1.0035); Log loss = 0.2359, Average precision = 0.9682, ROC-AUC = 0.9649, Elapsed Time = 3.0 s
Fold 2: Train size = 20701 (0 = 10471, 1 = 10230, ratio 1.0236); Validation size = 5137 (0 = 2654, 1 = 2483, ratio 1.0689); Log loss = 0.2328, Average precision = 0.9713, ROC-AUC = 0.9679, Elapsed Time = 3.3 s
Fold 3: Train size = 20682 (0 = 10517, 1 = 10165, ratio 1.0346); Validation size = 5156 (0 = 2608, 1 = 2548, ratio 1.0235); Log loss = 0.2281, Average precision = 0.9709, ROC-AUC = 0.9686, Elapsed Time = 3.4 s
Fold 4: Train size = 20656 (0 = 10479, 1 = 10177, ratio 1.0297); Validation size = 5182 (0 = 2646, 1 = 2536, ratio 1.0434); Log loss = 0.2324, Average precision = 0.9703, ROC-AUC = 0.9657, Elapsed Time = 3.2 s
Fold 5: Train size = 20650 (0 = 10500, 1 = 10150, ratio 1.0345); Validation size = 5188 (0 = 2625, 1 = 2563, ratio 1.0242); Log loss = 0.2405, Average precision = 0.9673, ROC-AUC = 0.9634, Elapsed Time = 3.5 s
Optimization Progress: 18%|#8 | 18/100 [35:15<2:50:59, 125.12s/it]
Trial 18, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 18, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[verbose per-iteration CatBoost output elided]
bestTest = 0.232491013, bestIteration = 67
Trial 18, Fold 1: Log loss = 0.23222625511648723, Average precision = 0.9752353269523638, ROC-AUC = 0.9710207529764892, Elapsed Time = 13.157521600001928 seconds
Trial 18, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 18, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[verbose per-iteration CatBoost output elided]
bestTest = 0.2269768382, bestIteration = 67
Trial 18, Fold 2: Log loss = 0.22675746868625035, Average precision = 0.9758337479347416, ROC-AUC = 0.973149063974135, Elapsed Time = 13.523272300000826 seconds
Trial 18, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 18, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[verbose per-iteration CatBoost output elided]
bestTest = 0.2243322775, bestIteration = 67
Trial 18, Fold 3: Log loss = 0.2242580182581941, Average precision = 0.9748102542312739, ROC-AUC = 0.9730504828760196, Elapsed Time = 14.702690000001894 seconds
Trial 18, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 18, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[verbose per-iteration CatBoost output elided]
bestTest = 0.2267794595, bestIteration = 67
Trial 18, Fold 4: Log loss = 0.22656632827949785, Average precision = 0.9754526281814258, ROC-AUC = 0.9728596792730412, Elapsed Time = 13.785531000001356 seconds
Trial 18, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 18, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[verbose per-iteration CatBoost output elided]
bestTest = 0.2340903653, bestIteration = 67
Trial 18, Fold 5: Log loss = 0.2336974523166397, Average precision = 0.9744352207003804, ROC-AUC = 0.9710939189565799, Elapsed Time = 13.6142170000021 seconds
Optimization Progress: 19% | 19/100 [36:32<2:29:23, 110.66s/it]
Trial 19, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 19, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[per-iteration CatBoost training log omitted] bestTest = 0.5291573093 bestIteration = 4
Trial 19, Fold 1: Log loss = 0.5294335510119113, Average precision = 0.9694654657745588, ROC-AUC = 0.9645724896402402, Elapsed Time = 0.4126544999999169 seconds

Trial 19, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 19, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[per-iteration CatBoost training log omitted] bestTest = 0.5349131433 bestIteration = 4
Trial 19, Fold 2: Log loss = 0.5350452831126273, Average precision = 0.9684881298278613, ROC-AUC = 0.9657373227623803, Elapsed Time = 0.4333470999990823 seconds

Trial 19, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 19, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[per-iteration CatBoost training log omitted] bestTest = 0.5355706597 bestIteration = 4
Trial 19, Fold 3: Log loss = 0.5358263510296953, Average precision = 0.9717330150833315, ROC-AUC = 0.968760759671967, Elapsed Time = 0.4338562000011734 seconds

Trial 19, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 19, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[per-iteration CatBoost training log omitted] bestTest = 0.5421241734 bestIteration = 4
Trial 19, Fold 4: Log loss = 0.5423984690305971, Average precision = 0.9679680371166938, ROC-AUC = 0.962191829939126, Elapsed Time = 0.45545209999909275 seconds

Trial 19, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 19, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[per-iteration CatBoost training log omitted] bestTest = 0.5328503903 bestIteration = 4
Trial 19, Fold 5: Log loss = 0.5331276533217285, Average precision = 0.9677386078861334, ROC-AUC = 0.9630175947085818, Elapsed Time = 0.42010640000080457 seconds
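Each fold's summary line above (Log loss, Average precision, ROC-AUC) is computed from the validation labels and the model's predicted probability of class 1, using the `sklearn.metrics` functions imported at the top of the notebook. A minimal sketch on toy data (the arrays below are illustrative, not taken from the notebook):

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

# Illustrative validation labels and predicted P(class = 1)
y_val = np.array([0, 0, 1, 1, 1, 0])
proba = np.array([0.10, 0.30, 0.80, 0.90, 0.70, 0.20])

ll = log_loss(y_val, proba)                 # lower is better
ap = average_precision_score(y_val, proba)  # area under the PR curve
auc = roc_auc_score(y_val, proba)           # area under the ROC curve
print(f"Log loss = {ll}, Average precision = {ap}, ROC-AUC = {auc}")
```

Since every positive here scores above every negative, AP and ROC-AUC both come out to 1.0, while log loss still penalizes the residual uncertainty in the probabilities.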
Optimization Progress: 20% | 20/100 [36:43<1:47:31, 80.65s/it]
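The "Train size = … where 0 = …, 1 = …, 0/1 = …" diagnostics in the log are simply per-fold class counts and the negative-to-positive ratio. Assuming the fold's labels sit in a NumPy array, a sketch with `collections.Counter` (already imported), seeded with the class counts reported for Trial 19/20, Fold 1:

```python
import numpy as np
from collections import Counter

# Class counts taken from the Fold 1 log line; real code would use the CV split
y_train = np.array([0] * 10533 + [1] * 10130)

counts = Counter(y_train)
ratio = counts[0] / counts[1]  # negative-to-positive class ratio
print(f"Train size = {len(y_train)} where 0 = {counts[0]}, "
      f"1 = {counts[1]}, 0/1 = {ratio}")
```

A ratio near 1.0 across every fold confirms that the stratified grouping keeps the classes balanced within each split.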
Trial 20, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 20, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[per-iteration CatBoost training log omitted] bestTest = 0.2638000709 bestIteration = 60
Trial 20, Fold 1: Log loss = 0.2636127219803609, Average precision = 0.9632467500154142, ROC-AUC = 0.9569972887780024, Elapsed Time = 1.4266287999998895 seconds

Trial 20, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 20, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[per-iteration CatBoost training log omitted] bestTest = 0.2587698676 bestIteration = 60
Trial 20, Fold 2: Log loss = 0.2586720483515504, Average precision = 0.9648877163322858, ROC-AUC = 0.96090825298541, Elapsed Time = 1.4445759000009275 seconds

Trial 20, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 20, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[per-iteration CatBoost training log omitted] bestTest = 0.2502922561 bestIteration = 60
Trial 20, Fold 3: Log loss = 0.2502433561134991, Average precision = 0.9665725313821034, ROC-AUC = 0.9627439962535275, Elapsed Time = 1.4568521000001056 seconds

Trial 20, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 20, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[per-iteration CatBoost training log omitted] bestTest = 0.2604125491 bestIteration = 60
Trial 20, Fold 4: Log loss = 0.2602940149502716, Average precision = 0.965263836969806, ROC-AUC = 0.9590665393391846, Elapsed Time = 1.481476400000247 seconds

Trial 20, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 20, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[per-iteration CatBoost training log omitted] bestTest = 0.2643442648 bestIteration = 60
Optimization Progress: 21%|##1 | 21/100 [36:58<1:20:15, 60.95s/it]
Trial 20, Fold 5: Log loss = 0.2641, Average precision = 0.9619, ROC-AUC = 0.9570, Elapsed Time = 1.49 seconds
Trial 21, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0398; Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0035
[15 CatBoost iterations omitted] bestTest = 0.2558925518, bestIteration = 14
Trial 21, Fold 1: Log loss = 0.2557, Average precision = 0.9717, ROC-AUC = 0.9676, Elapsed Time = 5.21 seconds
Trial 21, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0236; Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0689
[15 CatBoost iterations omitted] bestTest = 0.2512642087, bestIteration = 14
Trial 21, Fold 2: Log loss = 0.2511, Average precision = 0.9731, ROC-AUC = 0.9695, Elapsed Time = 5.09 seconds
Trial 21, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.0346; Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235
[15 CatBoost iterations omitted] bestTest = 0.250025528, bestIteration = 14
Trial 21, Fold 3: Log loss = 0.2500, Average precision = 0.9722, ROC-AUC = 0.9688, Elapsed Time = 5.18 seconds
Trial 21, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0297; Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0434
[15 CatBoost iterations omitted] bestTest = 0.254369291, bestIteration = 14
Trial 21, Fold 4: Log loss = 0.2542, Average precision = 0.9729, ROC-AUC = 0.9680, Elapsed Time = 5.19 seconds
Trial 21, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0345; Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0242
[15 CatBoost iterations omitted] bestTest = 0.2582642295, bestIteration = 14
Trial 21, Fold 5: Log loss = 0.2579, Average precision = 0.9714, ROC-AUC = 0.9671, Elapsed Time = 4.95 seconds
Optimization Progress: 22%|##2 | 22/100 [37:31<1:08:23, 52.61s/it]
Trial 22, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0398; Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0035
[8 CatBoost iterations omitted] bestTest = 0.4376143381, bestIteration = 7
Trial 22, Fold 1: Log loss = 0.4382, Average precision = 0.9715, ROC-AUC = 0.9671, Elapsed Time = 1.22 seconds
Trial 22, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0236; Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0689
[8 CatBoost iterations omitted] bestTest = 0.4352767145, bestIteration = 7
Trial 22, Fold 2: Log loss = 0.4356, Average precision = 0.9716, ROC-AUC = 0.9680, Elapsed Time = 1.35 seconds
Trial 22, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.0346; Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235
[8 CatBoost iterations omitted] bestTest = 0.4382335323, bestIteration = 7
Trial 22, Fold 3: Log loss = 0.4387, Average precision = 0.9714, ROC-AUC = 0.9689, Elapsed Time = 1.19 seconds
Trial 22, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0297; Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0434
[8 CatBoost iterations omitted] bestTest = 0.437344016, bestIteration = 7
Trial 22, Fold 4: Log loss = 0.4378, Average precision = 0.9737, ROC-AUC = 0.9694, Elapsed Time = 1.08 seconds
Trial 22, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0345; Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0242
[8 CatBoost iterations omitted] bestTest = 0.4388874333, bestIteration = 7
Trial 22, Fold 5: Log loss = 0.4393, Average precision = 0.9685, ROC-AUC = 0.9655, Elapsed Time = 1.04 seconds
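The "Log loss" reported per fold (and CatBoost's own learn/test "Logloss") is the mean negative log-likelihood of the predicted probabilities, the quantity Optuna is minimizing here. A minimal pure-Python sketch of the metric, matching `sklearn.metrics.log_loss` for binary labels (the helper name and the clipping `eps` are illustrative assumptions):

```python
import math

def binary_log_loss(y_true, p_pred, eps=1e-15):
    """Mean negative log-likelihood of predicted positive-class
    probabilities for binary labels."""
    total = 0.0
    for y, p in zip(y_true, p_pred):
        p = min(max(p, eps), 1 - eps)  # clip to avoid log(0)
        total += -(y * math.log(p) + (1 - y) * math.log(1 - p))
    return total / len(y_true)

# An uninformative 0.5 everywhere scores ln(2) ~= 0.693;
# confident, correct probabilities score much lower:
binary_log_loss([1, 0, 1, 0], [0.9, 0.1, 0.8, 0.2])  # ~= 0.1643
```

This is why a trial like Trial 22 (stopping at iteration 7 with log loss ~0.44) compares unfavorably with Trial 23 (~0.19–0.20): both beat the ~0.693 chance baseline, but the latter's probabilities are far better calibrated to the labels.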
Optimization Progress: 23%|##3 | 23/100 [37:46<52:52, 41.20s/it]
Trial 23, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0398; Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0035
[62 CatBoost iterations omitted] bestTest = 0.2021097075, bestIteration = 60. Shrink model to first 61 iterations.
Trial 23, Fold 1: Log loss = 0.2015, Average precision = 0.9746, ROC-AUC = 0.9700, Elapsed Time = 7.13 seconds
Trial 23, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0236; Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0689
[62 CatBoost iterations omitted] bestTest = 0.1932486127, bestIteration = 61
Trial 23, Fold 2: Log loss = 0.1929, Average precision = 0.9760, ROC-AUC = 0.9732, Elapsed Time = 9.26 seconds
Trial 23, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.0346; Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235
[62 CatBoost iterations omitted] bestTest = 0.1945666354, bestIteration = 61
Trial 23, Fold 3: Log loss = 0.1943, Average precision = 0.9757, ROC-AUC = 0.9724, Elapsed Time = 7.70 seconds
Trial 23, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0297; Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0434
[CatBoost iterations 0–45 omitted] 46:
learn: 0.1788771 test: 0.2032649 best: 0.2032649 (46) total: 5.11s remaining: 1.63s 47: learn: 0.1780431 test: 0.2030655 best: 0.2030655 (47) total: 5.22s remaining: 1.52s 48: learn: 0.1768880 test: 0.2018474 best: 0.2018474 (48) total: 5.32s remaining: 1.41s 49: learn: 0.1760783 test: 0.2014100 best: 0.2014100 (49) total: 5.43s remaining: 1.3s 50: learn: 0.1754404 test: 0.2010217 best: 0.2010217 (50) total: 5.54s remaining: 1.19s 51: learn: 0.1746786 test: 0.2006403 best: 0.2006403 (51) total: 5.65s remaining: 1.08s 52: learn: 0.1743344 test: 0.2008042 best: 0.2006403 (51) total: 5.76s remaining: 978ms 53: learn: 0.1737666 test: 0.2009606 best: 0.2006403 (51) total: 5.87s remaining: 869ms 54: learn: 0.1730287 test: 0.2005985 best: 0.2005985 (54) total: 5.97s remaining: 760ms 55: learn: 0.1726760 test: 0.2006021 best: 0.2005985 (54) total: 6.08s remaining: 652ms 56: learn: 0.1715976 test: 0.2008670 best: 0.2005985 (54) total: 6.19s remaining: 543ms 57: learn: 0.1712090 test: 0.2010684 best: 0.2005985 (54) total: 6.3s remaining: 434ms 58: learn: 0.1707234 test: 0.2011379 best: 0.2005985 (54) total: 6.41s remaining: 326ms 59: learn: 0.1702758 test: 0.2011670 best: 0.2005985 (54) total: 6.51s remaining: 217ms 60: learn: 0.1699079 test: 0.2012349 best: 0.2005985 (54) total: 6.62s remaining: 108ms 61: learn: 0.1694034 test: 0.2011856 best: 0.2005985 (54) total: 6.72s remaining: 0us bestTest = 0.2005985087 bestIteration = 54 Shrink model to first 55 iterations. 
Trial 23, Fold 4: Log loss = 0.20012387602368556, Average precision = 0.9744471579175493, ROC-AUC = 0.9697106936009595, Elapsed Time = 6.847606200000882 seconds Trial 23, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 23, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0: learn: 0.5937950 test: 0.5956931 best: 0.5956931 (0) total: 99.6ms remaining: 6.07s 1: learn: 0.5160028 test: 0.5200099 best: 0.5200099 (1) total: 197ms remaining: 5.91s 2: learn: 0.4553333 test: 0.4605296 best: 0.4605296 (2) total: 306ms remaining: 6.01s 3: learn: 0.4073562 test: 0.4139328 best: 0.4139328 (3) total: 402ms remaining: 5.83s 4: learn: 0.3702173 test: 0.3777423 best: 0.3777423 (4) total: 499ms remaining: 5.69s 5: learn: 0.3401950 test: 0.3490149 best: 0.3490149 (5) total: 597ms remaining: 5.57s 6: learn: 0.3160475 test: 0.3260091 best: 0.3260091 (6) total: 696ms remaining: 5.47s 7: learn: 0.2976596 test: 0.3082631 best: 0.3082631 (7) total: 796ms remaining: 5.37s 8: learn: 0.2824998 test: 0.2937702 best: 0.2937702 (8) total: 895ms remaining: 5.27s 9: learn: 0.2692860 test: 0.2813462 best: 0.2813462 (9) total: 994ms remaining: 5.17s 10: learn: 0.2580412 test: 0.2711022 best: 0.2711022 (10) total: 1.09s remaining: 5.07s 11: learn: 0.2489195 test: 0.2630277 best: 0.2630277 (11) total: 1.19s remaining: 4.97s 12: learn: 0.2415216 test: 0.2562765 best: 0.2562765 (12) total: 1.29s remaining: 4.87s 13: learn: 0.2349126 test: 0.2502402 best: 0.2502402 (13) total: 1.39s remaining: 4.77s 14: learn: 0.2294433 test: 0.2454388 best: 0.2454388 (14) total: 1.49s remaining: 4.67s 15: learn: 0.2249345 test: 0.2418664 best: 0.2418664 (15) total: 1.59s remaining: 4.58s 16: learn: 0.2207250 test: 0.2380607 best: 0.2380607 (16) total: 1.69s remaining: 4.48s 17: learn: 0.2171940 test: 0.2348029 best: 0.2348029 (17) total: 1.79s remaining: 4.38s 18: learn: 0.2134974 test: 0.2319286 best: 0.2319286 (18) total: 1.89s remaining: 4.28s 19: 
learn: 0.2107484 test: 0.2298489 best: 0.2298489 (19) total: 2s remaining: 4.19s 20: learn: 0.2083395 test: 0.2278001 best: 0.2278001 (20) total: 2.1s remaining: 4.09s 21: learn: 0.2065967 test: 0.2265194 best: 0.2265194 (21) total: 2.2s remaining: 3.99s 22: learn: 0.2040116 test: 0.2246948 best: 0.2246948 (22) total: 2.3s remaining: 3.9s 23: learn: 0.2020218 test: 0.2232490 best: 0.2232490 (23) total: 2.4s remaining: 3.81s 24: learn: 0.2000501 test: 0.2221530 best: 0.2221530 (24) total: 2.51s remaining: 3.71s 25: learn: 0.1984011 test: 0.2209460 best: 0.2209460 (25) total: 2.61s remaining: 3.62s 26: learn: 0.1969833 test: 0.2196209 best: 0.2196209 (26) total: 2.72s remaining: 3.52s 27: learn: 0.1950760 test: 0.2179907 best: 0.2179907 (27) total: 2.82s remaining: 3.43s 28: learn: 0.1936585 test: 0.2168804 best: 0.2168804 (28) total: 2.93s remaining: 3.33s 29: learn: 0.1921661 test: 0.2153103 best: 0.2153103 (29) total: 3.04s remaining: 3.24s 30: learn: 0.1911180 test: 0.2144782 best: 0.2144782 (30) total: 3.15s remaining: 3.15s 31: learn: 0.1896719 test: 0.2135546 best: 0.2135546 (31) total: 3.26s remaining: 3.06s 32: learn: 0.1884247 test: 0.2127827 best: 0.2127827 (32) total: 3.38s remaining: 2.98s 33: learn: 0.1870294 test: 0.2117825 best: 0.2117825 (33) total: 3.5s remaining: 2.88s 34: learn: 0.1860682 test: 0.2112304 best: 0.2112304 (34) total: 3.61s remaining: 2.79s 35: learn: 0.1852090 test: 0.2105865 best: 0.2105865 (35) total: 3.72s remaining: 2.69s 36: learn: 0.1841420 test: 0.2101854 best: 0.2101854 (36) total: 3.83s remaining: 2.59s 37: learn: 0.1832371 test: 0.2098653 best: 0.2098653 (37) total: 3.94s remaining: 2.49s 38: learn: 0.1825193 test: 0.2095850 best: 0.2095850 (38) total: 4.05s remaining: 2.39s 39: learn: 0.1818177 test: 0.2092675 best: 0.2092675 (39) total: 4.16s remaining: 2.29s 40: learn: 0.1810604 test: 0.2088934 best: 0.2088934 (40) total: 4.26s remaining: 2.18s 41: learn: 0.1803432 test: 0.2086562 best: 0.2086562 (41) total: 4.37s 
remaining: 2.08s 42: learn: 0.1796515 test: 0.2081000 best: 0.2081000 (42) total: 4.48s remaining: 1.98s 43: learn: 0.1790503 test: 0.2075384 best: 0.2075384 (43) total: 4.58s remaining: 1.88s 44: learn: 0.1784890 test: 0.2074930 best: 0.2074930 (44) total: 4.69s remaining: 1.77s 45: learn: 0.1779763 test: 0.2073651 best: 0.2073651 (45) total: 4.8s remaining: 1.67s 46: learn: 0.1770775 test: 0.2068049 best: 0.2068049 (46) total: 4.91s remaining: 1.57s 47: learn: 0.1765538 test: 0.2065318 best: 0.2065318 (47) total: 5.02s remaining: 1.46s 48: learn: 0.1761515 test: 0.2064799 best: 0.2064799 (48) total: 5.12s remaining: 1.36s 49: learn: 0.1752056 test: 0.2057692 best: 0.2057692 (49) total: 5.23s remaining: 1.25s 50: learn: 0.1746627 test: 0.2053639 best: 0.2053639 (50) total: 5.33s remaining: 1.15s 51: learn: 0.1737627 test: 0.2052029 best: 0.2052029 (51) total: 5.44s remaining: 1.05s 52: learn: 0.1732008 test: 0.2048461 best: 0.2048461 (52) total: 5.55s remaining: 942ms 53: learn: 0.1726052 test: 0.2047167 best: 0.2047167 (53) total: 5.65s remaining: 838ms 54: learn: 0.1723983 test: 0.2045807 best: 0.2045807 (54) total: 5.76s remaining: 733ms 55: learn: 0.1716393 test: 0.2045029 best: 0.2045029 (55) total: 5.87s remaining: 629ms 56: learn: 0.1708887 test: 0.2041617 best: 0.2041617 (56) total: 5.97s remaining: 524ms 57: learn: 0.1705130 test: 0.2041243 best: 0.2041243 (57) total: 6.08s remaining: 419ms 58: learn: 0.1699873 test: 0.2039336 best: 0.2039336 (58) total: 6.19s remaining: 315ms 59: learn: 0.1695635 test: 0.2038187 best: 0.2038187 (59) total: 6.29s remaining: 210ms 60: learn: 0.1687722 test: 0.2039111 best: 0.2038187 (59) total: 6.39s remaining: 105ms 61: learn: 0.1681154 test: 0.2036850 best: 0.2036850 (61) total: 6.49s remaining: 0us bestTest = 0.2036849841 bestIteration = 61 Trial 23, Fold 5: Log loss = 0.20301804103302334, Average precision = 0.974129227151172, ROC-AUC = 0.9709877190048865, Elapsed Time = 6.611383499999647 seconds
Optimization Progress: 24%|##4 | 24/100 [38:32<54:20, 42.90s/it]
Trial 24, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 24, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Trial 24, Fold 1: bestTest = 0.2004435921, bestIteration = 93
Trial 24, Fold 1: Log loss = 0.20008381016671992, Average precision = 0.9751319074063409, ROC-AUC = 0.9714906463438533, Elapsed Time = 23.274636800000735 seconds
Trial 24, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 24, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Trial 24, Fold 2: bestTest = 0.1889959826, bestIteration = 93
Trial 24, Fold 2: Log loss = 0.1887453295075406, Average precision = 0.9769374588088688, ROC-AUC = 0.9742078082733499, Elapsed Time = 20.600181599998905 seconds
Trial 24, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 24, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Trial 24, Fold 3: bestTest = 0.1898085713, bestIteration = 93
Trial 24, Fold 3: Log loss = 0.1896281499984433, Average precision = 0.9762000415598325, ROC-AUC = 0.9738050895204708, Elapsed Time = 20.38147620000018 seconds
Trial 24, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 24, Fold 4: Validation size = 5182 where 0 = 
2646, 1 = 2536, 0/1 = 1.0433753943217665 0: learn: 0.6444883 test: 0.6459490 best: 0.6459490 (0) total: 213ms remaining: 19.8s 1: learn: 0.6008349 test: 0.6035999 best: 0.6035999 (1) total: 407ms remaining: 18.7s 2: learn: 0.5620654 test: 0.5657130 best: 0.5657130 (2) total: 595ms remaining: 18.1s 3: learn: 0.5267365 test: 0.5314069 best: 0.5314069 (3) total: 803ms remaining: 18.1s 4: learn: 0.4947830 test: 0.5009283 best: 0.5009283 (4) total: 1.02s remaining: 18.2s 5: learn: 0.4663073 test: 0.4733542 best: 0.4733542 (5) total: 1.22s remaining: 17.9s 6: learn: 0.4409050 test: 0.4490563 best: 0.4490563 (6) total: 1.42s remaining: 17.7s 7: learn: 0.4180646 test: 0.4267910 best: 0.4267910 (7) total: 1.64s remaining: 17.6s 8: learn: 0.3975348 test: 0.4067182 best: 0.4067182 (8) total: 1.86s remaining: 17.6s 9: learn: 0.3792393 test: 0.3888220 best: 0.3888220 (9) total: 2.08s remaining: 17.5s 10: learn: 0.3625248 test: 0.3729268 best: 0.3729268 (10) total: 2.31s remaining: 17.4s 11: learn: 0.3478056 test: 0.3589688 best: 0.3589688 (11) total: 2.52s remaining: 17.2s 12: learn: 0.3344647 test: 0.3458606 best: 0.3458606 (12) total: 2.73s remaining: 17s 13: learn: 0.3227083 test: 0.3344449 best: 0.3344449 (13) total: 2.96s remaining: 16.9s 14: learn: 0.3117173 test: 0.3238205 best: 0.3238205 (14) total: 3.15s remaining: 16.6s 15: learn: 0.3024318 test: 0.3149243 best: 0.3149243 (15) total: 3.35s remaining: 16.3s 16: learn: 0.2931689 test: 0.3062343 best: 0.3062343 (16) total: 3.58s remaining: 16.2s 17: learn: 0.2843133 test: 0.2977964 best: 0.2977964 (17) total: 3.81s remaining: 16.1s 18: learn: 0.2766565 test: 0.2905060 best: 0.2905060 (18) total: 4.02s remaining: 15.9s 19: learn: 0.2697464 test: 0.2840969 best: 0.2840969 (19) total: 4.24s remaining: 15.7s 20: learn: 0.2629809 test: 0.2778068 best: 0.2778068 (20) total: 4.46s remaining: 15.5s 21: learn: 0.2568378 test: 0.2718085 best: 0.2718085 (21) total: 4.66s remaining: 15.3s 22: learn: 0.2516489 test: 0.2671587 best: 
0.2671587 (22) total: 4.86s remaining: 15s 23: learn: 0.2473410 test: 0.2631466 best: 0.2631466 (23) total: 5.08s remaining: 14.8s 24: learn: 0.2426459 test: 0.2586514 best: 0.2586514 (24) total: 5.29s remaining: 14.6s 25: learn: 0.2378774 test: 0.2543290 best: 0.2543290 (25) total: 5.51s remaining: 14.4s 26: learn: 0.2334635 test: 0.2505275 best: 0.2505275 (26) total: 5.76s remaining: 14.3s 27: learn: 0.2295402 test: 0.2470876 best: 0.2470876 (27) total: 5.99s remaining: 14.1s 28: learn: 0.2256417 test: 0.2436966 best: 0.2436966 (28) total: 6.21s remaining: 13.9s 29: learn: 0.2223366 test: 0.2408155 best: 0.2408155 (29) total: 6.42s remaining: 13.7s 30: learn: 0.2192363 test: 0.2381167 best: 0.2381167 (30) total: 6.64s remaining: 13.5s 31: learn: 0.2162490 test: 0.2356081 best: 0.2356081 (31) total: 6.87s remaining: 13.3s 32: learn: 0.2133655 test: 0.2328888 best: 0.2328888 (32) total: 7.06s remaining: 13.1s 33: learn: 0.2105033 test: 0.2304765 best: 0.2304765 (33) total: 7.31s remaining: 12.9s 34: learn: 0.2078395 test: 0.2282011 best: 0.2282011 (34) total: 7.52s remaining: 12.7s 35: learn: 0.2052115 test: 0.2269914 best: 0.2269914 (35) total: 7.78s remaining: 12.5s 36: learn: 0.2028308 test: 0.2248130 best: 0.2248130 (36) total: 7.99s remaining: 12.3s 37: learn: 0.2004210 test: 0.2229280 best: 0.2229280 (37) total: 8.21s remaining: 12.1s 38: learn: 0.1982951 test: 0.2211858 best: 0.2211858 (38) total: 8.42s remaining: 11.9s 39: learn: 0.1958637 test: 0.2193891 best: 0.2193891 (39) total: 8.7s remaining: 11.7s 40: learn: 0.1937740 test: 0.2180471 best: 0.2180471 (40) total: 8.95s remaining: 11.6s 41: learn: 0.1920922 test: 0.2167262 best: 0.2167262 (41) total: 9.17s remaining: 11.4s 42: learn: 0.1903887 test: 0.2156461 best: 0.2156461 (42) total: 9.4s remaining: 11.1s 43: learn: 0.1885815 test: 0.2144720 best: 0.2144720 (43) total: 9.66s remaining: 11s 44: learn: 0.1870701 test: 0.2132487 best: 0.2132487 (44) total: 9.86s remaining: 10.7s 45: learn: 0.1857377 
test: 0.2123816 best: 0.2123816 (45) total: 10.1s remaining: 10.5s 46: learn: 0.1844250 test: 0.2113349 best: 0.2113349 (46) total: 10.3s remaining: 10.3s 47: learn: 0.1830928 test: 0.2103653 best: 0.2103653 (47) total: 10.5s remaining: 10s 48: learn: 0.1815295 test: 0.2093172 best: 0.2093172 (48) total: 10.7s remaining: 9.86s 49: learn: 0.1802313 test: 0.2085915 best: 0.2085915 (49) total: 10.9s remaining: 9.63s 50: learn: 0.1788705 test: 0.2077048 best: 0.2077048 (50) total: 11.1s remaining: 9.39s 51: learn: 0.1777052 test: 0.2069794 best: 0.2069794 (51) total: 11.3s remaining: 9.16s 52: learn: 0.1763601 test: 0.2062102 best: 0.2062102 (52) total: 11.6s remaining: 8.97s 53: learn: 0.1752878 test: 0.2055255 best: 0.2055255 (53) total: 11.8s remaining: 8.76s 54: learn: 0.1739323 test: 0.2048447 best: 0.2048447 (54) total: 12.1s remaining: 8.55s 55: learn: 0.1730498 test: 0.2042723 best: 0.2042723 (55) total: 12.2s remaining: 8.3s 56: learn: 0.1717930 test: 0.2034116 best: 0.2034116 (56) total: 12.5s remaining: 8.09s 57: learn: 0.1706907 test: 0.2025084 best: 0.2025084 (57) total: 12.7s remaining: 7.86s 58: learn: 0.1694868 test: 0.2020775 best: 0.2020775 (58) total: 12.9s remaining: 7.65s 59: learn: 0.1687470 test: 0.2016958 best: 0.2016958 (59) total: 13.1s remaining: 7.42s 60: learn: 0.1678653 test: 0.2013927 best: 0.2013927 (60) total: 13.3s remaining: 7.19s 61: learn: 0.1668516 test: 0.2008194 best: 0.2008194 (61) total: 13.5s remaining: 6.96s 62: learn: 0.1661307 test: 0.2004451 best: 0.2004451 (62) total: 13.7s remaining: 6.74s 63: learn: 0.1652618 test: 0.2000638 best: 0.2000638 (63) total: 13.9s remaining: 6.52s 64: learn: 0.1643732 test: 0.1998359 best: 0.1998359 (64) total: 14.2s remaining: 6.32s 65: learn: 0.1635703 test: 0.1994115 best: 0.1994115 (65) total: 14.4s remaining: 6.09s 66: learn: 0.1628409 test: 0.1990823 best: 0.1990823 (66) total: 14.6s remaining: 5.87s 67: learn: 0.1621904 test: 0.1988275 best: 0.1988275 (67) total: 14.8s remaining: 5.64s 
68: learn: 0.1614131 test: 0.1985348 best: 0.1985348 (68) total: 15s remaining: 5.42s 69: learn: 0.1607301 test: 0.1983120 best: 0.1983120 (69) total: 15.2s remaining: 5.2s 70: learn: 0.1601001 test: 0.1979902 best: 0.1979902 (70) total: 15.4s remaining: 4.98s 71: learn: 0.1593919 test: 0.1976695 best: 0.1976695 (71) total: 15.6s remaining: 4.77s 72: learn: 0.1588070 test: 0.1975588 best: 0.1975588 (72) total: 15.8s remaining: 4.55s 73: learn: 0.1581466 test: 0.1973295 best: 0.1973295 (73) total: 16s remaining: 4.33s 74: learn: 0.1574944 test: 0.1971128 best: 0.1971128 (74) total: 16.2s remaining: 4.11s 75: learn: 0.1568175 test: 0.1968728 best: 0.1968728 (75) total: 16.4s remaining: 3.89s 76: learn: 0.1561895 test: 0.1966351 best: 0.1966351 (76) total: 16.7s remaining: 3.68s 77: learn: 0.1555398 test: 0.1964681 best: 0.1964681 (77) total: 16.9s remaining: 3.47s 78: learn: 0.1548485 test: 0.1962305 best: 0.1962305 (78) total: 17.2s remaining: 3.26s 79: learn: 0.1540238 test: 0.1961104 best: 0.1961104 (79) total: 17.5s remaining: 3.06s 80: learn: 0.1534461 test: 0.1960905 best: 0.1960905 (80) total: 17.7s remaining: 2.84s 81: learn: 0.1530246 test: 0.1959334 best: 0.1959334 (81) total: 17.9s remaining: 2.62s 82: learn: 0.1524286 test: 0.1955880 best: 0.1955880 (82) total: 18.2s remaining: 2.41s 83: learn: 0.1519283 test: 0.1953844 best: 0.1953844 (83) total: 18.4s remaining: 2.19s 84: learn: 0.1515167 test: 0.1953449 best: 0.1953449 (84) total: 18.6s remaining: 1.97s 85: learn: 0.1510059 test: 0.1952312 best: 0.1952312 (85) total: 18.8s remaining: 1.75s 86: learn: 0.1505625 test: 0.1952200 best: 0.1952200 (86) total: 19s remaining: 1.53s 87: learn: 0.1499320 test: 0.1951423 best: 0.1951423 (87) total: 19.2s remaining: 1.31s 88: learn: 0.1492386 test: 0.1949788 best: 0.1949788 (88) total: 19.4s remaining: 1.09s 89: learn: 0.1487463 test: 0.1948841 best: 0.1948841 (89) total: 19.6s remaining: 871ms 90: learn: 0.1483929 test: 0.1947950 best: 0.1947950 (90) total: 19.8s 
remaining: 652ms 91: learn: 0.1479021 test: 0.1946522 best: 0.1946522 (91) total: 20s remaining: 434ms 92: learn: 0.1474308 test: 0.1946732 best: 0.1946522 (91) total: 20.2s remaining: 217ms 93: learn: 0.1468555 test: 0.1944609 best: 0.1944609 (93) total: 20.4s remaining: 0us bestTest = 0.1944608541 bestIteration = 93 Trial 24, Fold 4: Log loss = 0.19421419745927387, Average precision = 0.9765479303642637, ROC-AUC = 0.9726961236650286, Elapsed Time = 20.537694999999076 seconds Trial 24, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 24, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0: learn: 0.6437416 test: 0.6468447 best: 0.6468447 (0) total: 224ms remaining: 20.8s 1: learn: 0.5997445 test: 0.6046467 best: 0.6046467 (1) total: 426ms remaining: 19.6s 2: learn: 0.5603284 test: 0.5677981 best: 0.5677981 (2) total: 639ms remaining: 19.4s 3: learn: 0.5249177 test: 0.5334347 best: 0.5334347 (3) total: 850ms remaining: 19.1s 4: learn: 0.4934863 test: 0.5032852 best: 0.5032852 (4) total: 1.04s remaining: 18.5s 5: learn: 0.4646420 test: 0.4756384 best: 0.4756384 (5) total: 1.25s remaining: 18.3s 6: learn: 0.4388649 test: 0.4513126 best: 0.4513126 (6) total: 1.47s remaining: 18.2s 7: learn: 0.4161149 test: 0.4291803 best: 0.4291803 (7) total: 1.69s remaining: 18.1s 8: learn: 0.3958074 test: 0.4102794 best: 0.4102794 (8) total: 1.88s remaining: 17.8s 9: learn: 0.3772509 test: 0.3925931 best: 0.3925931 (9) total: 2.08s remaining: 17.5s 10: learn: 0.3606941 test: 0.3767262 best: 0.3767262 (10) total: 2.27s remaining: 17.2s 11: learn: 0.3457354 test: 0.3624748 best: 0.3624748 (11) total: 2.49s remaining: 17s 12: learn: 0.3319077 test: 0.3496591 best: 0.3496591 (12) total: 2.71s remaining: 16.9s 13: learn: 0.3197048 test: 0.3383881 best: 0.3383881 (13) total: 2.95s remaining: 16.9s 14: learn: 0.3088756 test: 0.3282380 best: 0.3282380 (14) total: 3.22s remaining: 17s 15: learn: 0.2989796 test: 0.3192724 
best: 0.3192724 (15) total: 3.48s remaining: 17s 16: learn: 0.2896859 test: 0.3106178 best: 0.3106178 (16) total: 3.78s remaining: 17.1s 17: learn: 0.2816761 test: 0.3032699 best: 0.3032699 (17) total: 4.11s remaining: 17.3s 18: learn: 0.2742581 test: 0.2961948 best: 0.2961948 (18) total: 4.42s remaining: 17.4s 19: learn: 0.2675262 test: 0.2899978 best: 0.2899978 (19) total: 4.76s remaining: 17.6s 20: learn: 0.2617253 test: 0.2845937 best: 0.2845937 (20) total: 5.06s remaining: 17.6s 21: learn: 0.2553681 test: 0.2788911 best: 0.2788911 (21) total: 5.39s remaining: 17.6s 22: learn: 0.2498689 test: 0.2740228 best: 0.2740228 (22) total: 5.76s remaining: 17.8s 23: learn: 0.2443065 test: 0.2694650 best: 0.2694650 (23) total: 6.08s remaining: 17.7s 24: learn: 0.2394431 test: 0.2648971 best: 0.2648971 (24) total: 6.36s remaining: 17.6s 25: learn: 0.2349290 test: 0.2610177 best: 0.2610177 (25) total: 6.7s remaining: 17.5s 26: learn: 0.2307488 test: 0.2572330 best: 0.2572330 (26) total: 7.03s remaining: 17.4s 27: learn: 0.2268803 test: 0.2541663 best: 0.2541663 (27) total: 7.3s remaining: 17.2s 28: learn: 0.2233874 test: 0.2512056 best: 0.2512056 (28) total: 7.54s remaining: 16.9s 29: learn: 0.2196864 test: 0.2479765 best: 0.2479765 (29) total: 7.78s remaining: 16.6s 30: learn: 0.2164731 test: 0.2454922 best: 0.2454922 (30) total: 8.06s remaining: 16.4s 31: learn: 0.2133641 test: 0.2429660 best: 0.2429660 (31) total: 8.31s remaining: 16.1s 32: learn: 0.2104393 test: 0.2406481 best: 0.2406481 (32) total: 8.54s remaining: 15.8s 33: learn: 0.2074476 test: 0.2384644 best: 0.2384644 (33) total: 8.81s remaining: 15.6s 34: learn: 0.2046239 test: 0.2363359 best: 0.2363359 (34) total: 9.08s remaining: 15.3s 35: learn: 0.2020601 test: 0.2342660 best: 0.2342660 (35) total: 9.35s remaining: 15.1s 36: learn: 0.1998343 test: 0.2325548 best: 0.2325548 (36) total: 9.6s remaining: 14.8s 37: learn: 0.1973682 test: 0.2308488 best: 0.2308488 (37) total: 9.86s remaining: 14.5s 38: learn: 
0.1953105 test: 0.2292948 best: 0.2292948 (38) total: 10.1s remaining: 14.2s 39: learn: 0.1933018 test: 0.2278874 best: 0.2278874 (39) total: 10.3s remaining: 13.9s 40: learn: 0.1916624 test: 0.2268446 best: 0.2268446 (40) total: 10.6s remaining: 13.7s 41: learn: 0.1900179 test: 0.2255127 best: 0.2255127 (41) total: 10.8s remaining: 13.4s 42: learn: 0.1882851 test: 0.2241726 best: 0.2241726 (42) total: 11.1s remaining: 13.2s 43: learn: 0.1863840 test: 0.2228648 best: 0.2228648 (43) total: 11.4s remaining: 13s 44: learn: 0.1845719 test: 0.2216802 best: 0.2216802 (44) total: 11.7s remaining: 12.7s 45: learn: 0.1830262 test: 0.2205685 best: 0.2205685 (45) total: 11.9s remaining: 12.4s 46: learn: 0.1816808 test: 0.2196323 best: 0.2196323 (46) total: 12.1s remaining: 12.1s 47: learn: 0.1802808 test: 0.2187292 best: 0.2187292 (47) total: 12.4s remaining: 11.8s 48: learn: 0.1790991 test: 0.2178910 best: 0.2178910 (48) total: 12.6s remaining: 11.6s 49: learn: 0.1780235 test: 0.2172491 best: 0.2172491 (49) total: 12.9s remaining: 11.3s 50: learn: 0.1768668 test: 0.2164300 best: 0.2164300 (50) total: 13.1s remaining: 11s 51: learn: 0.1757262 test: 0.2157135 best: 0.2157135 (51) total: 13.3s remaining: 10.7s 52: learn: 0.1744502 test: 0.2150297 best: 0.2150297 (52) total: 13.5s remaining: 10.5s 53: learn: 0.1730970 test: 0.2142723 best: 0.2142723 (53) total: 13.8s remaining: 10.2s 54: learn: 0.1720720 test: 0.2137435 best: 0.2137435 (54) total: 14s remaining: 9.91s 55: learn: 0.1710779 test: 0.2131123 best: 0.2131123 (55) total: 14.2s remaining: 9.61s 56: learn: 0.1701368 test: 0.2125458 best: 0.2125458 (56) total: 14.4s remaining: 9.33s 57: learn: 0.1692581 test: 0.2121132 best: 0.2121132 (57) total: 14.6s remaining: 9.05s 58: learn: 0.1682691 test: 0.2116861 best: 0.2116861 (58) total: 14.8s remaining: 8.78s 59: learn: 0.1672600 test: 0.2111860 best: 0.2111860 (59) total: 15s remaining: 8.52s 60: learn: 0.1662209 test: 0.2107086 best: 0.2107086 (60) total: 15.3s remaining: 
8.25s 61: learn: 0.1652036 test: 0.2102810 best: 0.2102810 (61) total: 15.5s remaining: 7.99s 62: learn: 0.1644007 test: 0.2100910 best: 0.2100910 (62) total: 15.7s remaining: 7.72s 63: learn: 0.1634596 test: 0.2096354 best: 0.2096354 (63) total: 15.9s remaining: 7.46s 64: learn: 0.1628000 test: 0.2093566 best: 0.2093566 (64) total: 16.1s remaining: 7.19s 65: learn: 0.1621026 test: 0.2089882 best: 0.2089882 (65) total: 16.3s remaining: 6.92s 66: learn: 0.1614946 test: 0.2086291 best: 0.2086291 (66) total: 16.5s remaining: 6.65s 67: learn: 0.1606863 test: 0.2083496 best: 0.2083496 (67) total: 16.7s remaining: 6.4s 68: learn: 0.1598757 test: 0.2080057 best: 0.2080057 (68) total: 17s remaining: 6.15s 69: learn: 0.1591895 test: 0.2077311 best: 0.2077311 (69) total: 17.2s remaining: 5.89s 70: learn: 0.1586255 test: 0.2074342 best: 0.2074342 (70) total: 17.4s remaining: 5.62s 71: learn: 0.1578988 test: 0.2069805 best: 0.2069805 (71) total: 17.6s remaining: 5.37s 72: learn: 0.1574156 test: 0.2067121 best: 0.2067121 (72) total: 17.8s remaining: 5.11s 73: learn: 0.1567707 test: 0.2065615 best: 0.2065615 (73) total: 18s remaining: 4.85s 74: learn: 0.1561836 test: 0.2064704 best: 0.2064704 (74) total: 18.2s remaining: 4.6s 75: learn: 0.1555046 test: 0.2062449 best: 0.2062449 (75) total: 18.4s remaining: 4.35s 76: learn: 0.1549047 test: 0.2059463 best: 0.2059463 (76) total: 18.6s remaining: 4.1s 77: learn: 0.1543768 test: 0.2058200 best: 0.2058200 (77) total: 18.8s remaining: 3.85s 78: learn: 0.1535994 test: 0.2053684 best: 0.2053684 (78) total: 19s remaining: 3.6s 79: learn: 0.1528733 test: 0.2050305 best: 0.2050305 (79) total: 19.2s remaining: 3.36s 80: learn: 0.1522303 test: 0.2049808 best: 0.2049808 (80) total: 19.4s remaining: 3.12s 81: learn: 0.1517290 test: 0.2047404 best: 0.2047404 (81) total: 19.6s remaining: 2.87s 82: learn: 0.1509246 test: 0.2044828 best: 0.2044828 (82) total: 19.9s remaining: 2.63s 83: learn: 0.1503767 test: 0.2041618 best: 0.2041618 (83) total: 
20.1s remaining: 2.39s 84: learn: 0.1498510 test: 0.2041023 best: 0.2041023 (84) total: 20.3s remaining: 2.14s 85: learn: 0.1493126 test: 0.2038755 best: 0.2038755 (85) total: 20.5s remaining: 1.9s 86: learn: 0.1487781 test: 0.2037527 best: 0.2037527 (86) total: 20.7s remaining: 1.66s 87: learn: 0.1482875 test: 0.2035227 best: 0.2035227 (87) total: 20.9s remaining: 1.42s 88: learn: 0.1476303 test: 0.2033193 best: 0.2033193 (88) total: 21.1s remaining: 1.19s 89: learn: 0.1471401 test: 0.2033107 best: 0.2033107 (89) total: 21.3s remaining: 947ms 90: learn: 0.1466505 test: 0.2031674 best: 0.2031674 (90) total: 21.5s remaining: 710ms 91: learn: 0.1460600 test: 0.2029208 best: 0.2029208 (91) total: 21.7s remaining: 473ms 92: learn: 0.1454903 test: 0.2027782 best: 0.2027782 (92) total: 22s remaining: 236ms 93: learn: 0.1449389 test: 0.2027605 best: 0.2027605 (93) total: 22.2s remaining: 0us bestTest = 0.2027605014 bestIteration = 93 Trial 24, Fold 5: Log loss = 0.20235849248054216, Average precision = 0.9742823084834791, ROC-AUC = 0.9712950968916635, Elapsed Time = 22.32423199999903 seconds
Optimization Progress: 25%|##5 | 25/100 [40:28<1:20:43, 64.58s/it]
Trial 25, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371 Trial 25, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913 0: learn: 0.6551025 test: 0.6551884 best: 0.6551884 (0) total: 768ms remaining: 28.4s 1: learn: 0.6220195 test: 0.6217118 best: 0.6217118 (1) total: 1.54s remaining: 27.7s 2: learn: 0.5914368 test: 0.5913060 best: 0.5913060 (2) total: 2.34s remaining: 27.4s 3: learn: 0.5635964 test: 0.5635380 best: 0.5635380 (3) total: 3.13s remaining: 26.6s 4: learn: 0.5393063 test: 0.5396114 best: 0.5396114 (4) total: 3.9s remaining: 25.7s 5: learn: 0.5124410 test: 0.5129131 best: 0.5129131 (5) total: 4.72s remaining: 25.2s 6: learn: 0.4853965 test: 0.4863230 best: 0.4863230 (6) total: 5.51s remaining: 24.4s 7: learn: 0.4644841 test: 0.4658430 best: 0.4658430 (7) total: 6.31s remaining: 23.7s 8: learn: 0.4482538 test: 0.4497906 best: 0.4497906 (8) total: 7.11s remaining: 22.9s 9: learn: 0.4333099 test: 0.4351463 best: 0.4351463 (9) total: 7.88s remaining: 22.1s 10: learn: 0.4178357 test: 0.4201084 best: 0.4201084 (10) total: 8.63s remaining: 21.2s 11: learn: 0.4016854 test: 0.4041646 best: 0.4041646 (11) total: 9.39s remaining: 20.4s 12: learn: 0.3862410 test: 0.3891622 best: 0.3891622 (12) total: 10.2s remaining: 19.6s 13: learn: 0.3728924 test: 0.3763416 best: 0.3763416 (13) total: 11s remaining: 18.8s 14: learn: 0.3615115 test: 0.3651829 best: 0.3651829 (14) total: 11.8s remaining: 18s 15: learn: 0.3518735 test: 0.3557320 best: 0.3557320 (15) total: 12.6s remaining: 17.3s 16: learn: 0.3437423 test: 0.3479117 best: 0.3479117 (16) total: 13.3s remaining: 16.5s 17: learn: 0.3354622 test: 0.3400628 best: 0.3400628 (17) total: 14.1s remaining: 15.6s 18: learn: 0.3265461 test: 0.3313955 best: 0.3313955 (18) total: 14.8s remaining: 14.8s 19: learn: 0.3183624 test: 0.3237778 best: 0.3237778 (19) total: 15.6s remaining: 14s 20: learn: 0.3104106 test: 0.3163329 best: 0.3163329 (20) total: 16.4s remaining: 
13.2s 21: learn: 0.3018004 test: 0.3082762 best: 0.3082762 (21) total: 17.2s remaining: 12.5s 22: learn: 0.2945121 test: 0.3013383 best: 0.3013383 (22) total: 17.9s remaining: 11.7s 23: learn: 0.2885858 test: 0.2958771 best: 0.2958771 (23) total: 18.7s remaining: 10.9s 24: learn: 0.2832247 test: 0.2908840 best: 0.2908840 (24) total: 19.5s remaining: 10.1s 25: learn: 0.2773509 test: 0.2855438 best: 0.2855438 (25) total: 20.3s remaining: 9.37s 26: learn: 0.2726196 test: 0.2812248 best: 0.2812248 (26) total: 21.1s remaining: 8.59s 27: learn: 0.2666013 test: 0.2755650 best: 0.2755650 (27) total: 21.9s remaining: 7.81s 28: learn: 0.2621540 test: 0.2714702 best: 0.2714702 (28) total: 22.6s remaining: 7.01s 29: learn: 0.2584063 test: 0.2681081 best: 0.2681081 (29) total: 23.4s remaining: 6.23s 30: learn: 0.2539318 test: 0.2640294 best: 0.2640294 (30) total: 24.1s remaining: 5.45s 31: learn: 0.2506823 test: 0.2612625 best: 0.2612625 (31) total: 24.9s remaining: 4.68s 32: learn: 0.2472938 test: 0.2580703 best: 0.2580703 (32) total: 25.7s remaining: 3.89s 33: learn: 0.2434965 test: 0.2547128 best: 0.2547128 (33) total: 26.5s remaining: 3.12s 34: learn: 0.2409347 test: 0.2526586 best: 0.2526586 (34) total: 27.3s remaining: 2.33s 35: learn: 0.2377868 test: 0.2499298 best: 0.2499298 (35) total: 28s remaining: 1.56s 36: learn: 0.2348970 test: 0.2474071 best: 0.2474071 (36) total: 28.8s remaining: 778ms 37: learn: 0.2317185 test: 0.2447452 best: 0.2447452 (37) total: 29.6s remaining: 0us bestTest = 0.2447451662 bestIteration = 37 Trial 25, Fold 1: Log loss = 0.24474516617115755, Average precision = 0.9702032705447189, ROC-AUC = 0.9670949925438407, Elapsed Time = 29.706429499998194 seconds Trial 25, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 25, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986 0: learn: 0.6537777 test: 0.6537922 best: 0.6537922 (0) total: 801ms remaining: 29.6s 1: learn: 0.6161418 test: 
0.6165931 best: 0.6165931 (1) total: 1.59s remaining: 28.6s 2: learn: 0.5854789 test: 0.5863223 best: 0.5863223 (2) total: 2.37s remaining: 27.6s 3: learn: 0.5520918 test: 0.5533529 best: 0.5533529 (3) total: 3.16s remaining: 26.8s 4: learn: 0.5257061 test: 0.5274536 best: 0.5274536 (4) total: 3.92s remaining: 25.9s 5: learn: 0.5048142 test: 0.5069478 best: 0.5069478 (5) total: 4.74s remaining: 25.3s 6: learn: 0.4800607 test: 0.4824727 best: 0.4824727 (6) total: 5.52s remaining: 24.4s 7: learn: 0.4608285 test: 0.4636582 best: 0.4636582 (7) total: 6.3s remaining: 23.6s 8: learn: 0.4440095 test: 0.4475027 best: 0.4475027 (8) total: 7.05s remaining: 22.7s 9: learn: 0.4269611 test: 0.4304843 best: 0.4304843 (9) total: 7.82s remaining: 21.9s 10: learn: 0.4120385 test: 0.4160926 best: 0.4160926 (10) total: 8.6s remaining: 21.1s 11: learn: 0.4000495 test: 0.4043968 best: 0.4043968 (11) total: 9.37s remaining: 20.3s 12: learn: 0.3877826 test: 0.3924568 best: 0.3924568 (12) total: 10.1s remaining: 19.5s 13: learn: 0.3758297 test: 0.3804665 best: 0.3804665 (13) total: 10.9s remaining: 18.8s 14: learn: 0.3647699 test: 0.3699216 best: 0.3699216 (14) total: 11.7s remaining: 18s 15: learn: 0.3538097 test: 0.3591396 best: 0.3591396 (15) total: 12.5s remaining: 17.2s 16: learn: 0.3423760 test: 0.3480009 best: 0.3480009 (16) total: 13.3s remaining: 16.5s 17: learn: 0.3330178 test: 0.3390448 best: 0.3390448 (17) total: 14.1s remaining: 15.7s 18: learn: 0.3240229 test: 0.3305777 best: 0.3305777 (18) total: 14.9s remaining: 14.9s 19: learn: 0.3165488 test: 0.3235112 best: 0.3235112 (19) total: 15.7s remaining: 14.1s 20: learn: 0.3091091 test: 0.3162875 best: 0.3162875 (20) total: 16.4s remaining: 13.3s 21: learn: 0.3014940 test: 0.3090335 best: 0.3090335 (21) total: 17.2s remaining: 12.5s 22: learn: 0.2945739 test: 0.3024152 best: 0.3024152 (22) total: 18s remaining: 11.8s 23: learn: 0.2886078 test: 0.2967389 best: 0.2967389 (23) total: 18.9s remaining: 11s 24: learn: 0.2829079 test: 
0.2914403 best: 0.2914403 (24) total: 19.9s remaining: 10.4s 25: learn: 0.2771710 test: 0.2860768 best: 0.2860768 (25) total: 20.7s remaining: 9.57s 26: learn: 0.2721222 test: 0.2812859 best: 0.2812859 (26) total: 21.6s remaining: 8.8s 27: learn: 0.2671184 test: 0.2763223 best: 0.2763223 (27) total: 22.4s remaining: 8.01s 28: learn: 0.2629854 test: 0.2723947 best: 0.2723947 (28) total: 23.2s remaining: 7.21s 29: learn: 0.2590122 test: 0.2685282 best: 0.2685282 (29) total: 24s remaining: 6.41s 30: learn: 0.2550511 test: 0.2647788 best: 0.2647788 (30) total: 24.8s remaining: 5.6s 31: learn: 0.2514290 test: 0.2613604 best: 0.2613604 (31) total: 25.6s remaining: 4.8s 32: learn: 0.2483118 test: 0.2584439 best: 0.2584439 (32) total: 26.5s remaining: 4.01s 33: learn: 0.2453928 test: 0.2558953 best: 0.2558953 (33) total: 27.3s remaining: 3.21s 34: learn: 0.2425323 test: 0.2532212 best: 0.2532212 (34) total: 28.1s remaining: 2.41s 35: learn: 0.2397526 test: 0.2506240 best: 0.2506240 (35) total: 28.9s remaining: 1.61s 36: learn: 0.2372036 test: 0.2483513 best: 0.2483513 (36) total: 29.7s remaining: 803ms 37: learn: 0.2343317 test: 0.2459917 best: 0.2459917 (37) total: 30.5s remaining: 0us bestTest = 0.2459916857 bestIteration = 37 Trial 25, Fold 2: Log loss = 0.24599168572126281, Average precision = 0.9728615503006427, ROC-AUC = 0.9695893644226103, Elapsed Time = 30.65187850000075 seconds Trial 25, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 25, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 0: learn: 0.6534904 test: 0.6534476 best: 0.6534476 (0) total: 878ms remaining: 32.5s 1: learn: 0.6202238 test: 0.6200255 best: 0.6200255 (1) total: 1.68s remaining: 30.2s 2: learn: 0.5909163 test: 0.5910462 best: 0.5910462 (2) total: 2.5s remaining: 29.2s 3: learn: 0.5641244 test: 0.5644968 best: 0.5644968 (3) total: 3.31s remaining: 28.2s 4: learn: 0.5362643 test: 0.5366240 best: 0.5366240 (4) total: 4.13s 
[CatBoost per-iteration progress output (learn/test loss, timing) truncated; per-fold summaries retained below.]

Trial 25, Fold 3: best test loss = 0.2439 at iteration 37; Log loss = 0.2439, Average precision = 0.9719, ROC-AUC = 0.9699, Elapsed Time = 31.3 s
Trial 25, Fold 4: Train size = 20656 (0 = 10479, 1 = 10177), Validation size = 5182 (0 = 2646, 1 = 2536); best test loss = 0.2469 at iteration 37; Log loss = 0.2469, Average precision = 0.9717, ROC-AUC = 0.9689, Elapsed Time = 32.6 s
Trial 25, Fold 5: Train size = 20650 (0 = 10500, 1 = 10150), Validation size = 5188 (0 = 2625, 1 = 2563); best test loss = 0.2498 at iteration 37; Log loss = 0.2498, Average precision = 0.9710, ROC-AUC = 0.9674, Elapsed Time = 31.2 s
Optimization Progress: 26%|##6 | 26/100 [43:10<1:55:58, 94.04s/it]
[CatBoost per-iteration progress output truncated; per-fold summaries retained below.]

Trial 26, Fold 1: Train size = 20663 (0 = 10533, 1 = 10130), Validation size = 5175 (0 = 2592, 1 = 2583); best test loss = 0.1984 at iteration 83 (model shrunk to 84 iterations); Log loss = 0.1984, Average precision = 0.9746, ROC-AUC = 0.9702, Elapsed Time = 7.6 s
Trial 26, Fold 2: Train size = 20701 (0 = 10471, 1 = 10230), Validation size = 5137 (0 = 2654, 1 = 2483); best test loss = 0.1906 at iteration 84 (model shrunk to 85 iterations); Log loss = 0.1906, Average precision = 0.9754, ROC-AUC = 0.9728, Elapsed Time = 8.1 s
Trial 26, Fold 3: Train size = 20682 (0 = 10517, 1 = 10165), Validation size = 5156 (0 = 2608, 1 = 2548); best test loss = 0.1896 at iteration 79 (model shrunk to 80 iterations); Log loss = 0.1896, Average precision = 0.9766, ROC-AUC = 0.9732, Elapsed Time = 7.6 s
Trial 26, Fold 4: Train size = 20656 (0 = 10479, 1 = 10177), Validation size = 5182 (0 = 2646, 1 = 2536); [output truncated mid-training]
remaining: 3.72s 42: learn: 0.1841064 test: 0.2020477 best: 0.2020477 (42) total: 3.63s remaining: 3.63s 43: learn: 0.1832347 test: 0.2016772 best: 0.2016772 (43) total: 3.71s remaining: 3.54s 44: learn: 0.1821963 test: 0.2015291 best: 0.2015291 (44) total: 3.8s remaining: 3.46s 45: learn: 0.1816425 test: 0.2015838 best: 0.2015291 (44) total: 3.88s remaining: 3.38s 46: learn: 0.1811813 test: 0.2015712 best: 0.2015291 (44) total: 3.97s remaining: 3.29s 47: learn: 0.1804871 test: 0.2010985 best: 0.2010985 (47) total: 4.05s remaining: 3.21s 48: learn: 0.1795562 test: 0.2003352 best: 0.2003352 (48) total: 4.13s remaining: 3.12s 49: learn: 0.1788104 test: 0.1997281 best: 0.1997281 (49) total: 4.22s remaining: 3.04s 50: learn: 0.1781117 test: 0.1992996 best: 0.1992996 (50) total: 4.3s remaining: 2.95s 51: learn: 0.1771449 test: 0.1993571 best: 0.1992996 (50) total: 4.39s remaining: 2.87s 52: learn: 0.1765444 test: 0.1990900 best: 0.1990900 (52) total: 4.48s remaining: 2.79s 53: learn: 0.1757476 test: 0.1989051 best: 0.1989051 (53) total: 4.56s remaining: 2.7s 54: learn: 0.1748708 test: 0.1985608 best: 0.1985608 (54) total: 4.65s remaining: 2.62s 55: learn: 0.1742051 test: 0.1985016 best: 0.1985016 (55) total: 4.74s remaining: 2.54s 56: learn: 0.1735391 test: 0.1979485 best: 0.1979485 (56) total: 4.82s remaining: 2.45s 57: learn: 0.1727233 test: 0.1978712 best: 0.1978712 (57) total: 4.91s remaining: 2.37s 58: learn: 0.1719082 test: 0.1974000 best: 0.1974000 (58) total: 5s remaining: 2.29s 59: learn: 0.1712730 test: 0.1974977 best: 0.1974000 (58) total: 5.08s remaining: 2.2s 60: learn: 0.1706521 test: 0.1972582 best: 0.1972582 (60) total: 5.17s remaining: 2.12s 61: learn: 0.1702977 test: 0.1969222 best: 0.1969222 (61) total: 5.26s remaining: 2.04s 62: learn: 0.1698904 test: 0.1965616 best: 0.1965616 (62) total: 5.35s remaining: 1.95s 63: learn: 0.1690836 test: 0.1964319 best: 0.1964319 (63) total: 5.44s remaining: 1.87s 64: learn: 0.1684844 test: 0.1960314 best: 0.1960314 
(64) total: 5.53s remaining: 1.79s 65: learn: 0.1681979 test: 0.1959260 best: 0.1959260 (65) total: 5.62s remaining: 1.7s 66: learn: 0.1678774 test: 0.1959813 best: 0.1959260 (65) total: 5.71s remaining: 1.62s 67: learn: 0.1676153 test: 0.1958245 best: 0.1958245 (67) total: 5.79s remaining: 1.53s 68: learn: 0.1668939 test: 0.1959402 best: 0.1958245 (67) total: 5.88s remaining: 1.45s 69: learn: 0.1662083 test: 0.1959955 best: 0.1958245 (67) total: 5.97s remaining: 1.36s 70: learn: 0.1656827 test: 0.1960697 best: 0.1958245 (67) total: 6.06s remaining: 1.28s 71: learn: 0.1652947 test: 0.1959302 best: 0.1958245 (67) total: 6.14s remaining: 1.19s 72: learn: 0.1646717 test: 0.1957405 best: 0.1957405 (72) total: 6.23s remaining: 1.11s 73: learn: 0.1641673 test: 0.1955291 best: 0.1955291 (73) total: 6.32s remaining: 1.02s 74: learn: 0.1636090 test: 0.1954837 best: 0.1954837 (74) total: 6.41s remaining: 940ms 75: learn: 0.1630839 test: 0.1954306 best: 0.1954306 (75) total: 6.5s remaining: 855ms 76: learn: 0.1624488 test: 0.1954790 best: 0.1954306 (75) total: 6.58s remaining: 770ms 77: learn: 0.1619481 test: 0.1955782 best: 0.1954306 (75) total: 6.67s remaining: 684ms 78: learn: 0.1616110 test: 0.1955532 best: 0.1954306 (75) total: 6.75s remaining: 599ms 79: learn: 0.1610401 test: 0.1957590 best: 0.1954306 (75) total: 6.84s remaining: 513ms 80: learn: 0.1606725 test: 0.1958271 best: 0.1954306 (75) total: 6.93s remaining: 428ms 81: learn: 0.1601051 test: 0.1958182 best: 0.1954306 (75) total: 7.02s remaining: 342ms 82: learn: 0.1595881 test: 0.1955278 best: 0.1954306 (75) total: 7.1s remaining: 257ms 83: learn: 0.1591889 test: 0.1955023 best: 0.1954306 (75) total: 7.19s remaining: 171ms 84: learn: 0.1588083 test: 0.1953440 best: 0.1953440 (84) total: 7.28s remaining: 85.6ms 85: learn: 0.1583426 test: 0.1952499 best: 0.1952499 (85) total: 7.37s remaining: 0us bestTest = 0.1952498939 bestIteration = 85 Trial 26, Fold 4: Log loss = 0.19524989392204334, Average precision = 
0.975597634166847, ROC-AUC = 0.9713602580885141, Elapsed Time = 7.4855626999997185 seconds Trial 26, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 26, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0: learn: 0.5959191 test: 0.5987927 best: 0.5987927 (0) total: 83.5ms remaining: 7.1s 1: learn: 0.5173370 test: 0.5218257 best: 0.5218257 (1) total: 166ms remaining: 6.98s 2: learn: 0.4563750 test: 0.4626003 best: 0.4626003 (2) total: 251ms remaining: 6.93s 3: learn: 0.4092082 test: 0.4167310 best: 0.4167310 (3) total: 335ms remaining: 6.86s 4: learn: 0.3712981 test: 0.3796877 best: 0.3796877 (4) total: 418ms remaining: 6.77s 5: learn: 0.3429889 test: 0.3521676 best: 0.3521676 (5) total: 502ms remaining: 6.69s 6: learn: 0.3180858 test: 0.3281903 best: 0.3281903 (6) total: 587ms remaining: 6.62s 7: learn: 0.2993317 test: 0.3104279 best: 0.3104279 (7) total: 671ms remaining: 6.54s 8: learn: 0.2829861 test: 0.2947924 best: 0.2947924 (8) total: 756ms remaining: 6.46s 9: learn: 0.2706918 test: 0.2832228 best: 0.2832228 (9) total: 840ms remaining: 6.38s 10: learn: 0.2599673 test: 0.2729933 best: 0.2729933 (10) total: 923ms remaining: 6.29s 11: learn: 0.2519182 test: 0.2653510 best: 0.2653510 (11) total: 1.01s remaining: 6.21s 12: learn: 0.2441090 test: 0.2580433 best: 0.2580433 (12) total: 1.09s remaining: 6.13s 13: learn: 0.2373410 test: 0.2520995 best: 0.2520995 (13) total: 1.18s remaining: 6.07s 14: learn: 0.2322269 test: 0.2476190 best: 0.2476190 (14) total: 1.26s remaining: 5.99s 15: learn: 0.2271630 test: 0.2431555 best: 0.2431555 (15) total: 1.35s remaining: 5.91s 16: learn: 0.2230767 test: 0.2394496 best: 0.2394496 (16) total: 1.43s remaining: 5.82s 17: learn: 0.2195510 test: 0.2367280 best: 0.2367280 (17) total: 1.52s remaining: 5.74s 18: learn: 0.2161249 test: 0.2338041 best: 0.2338041 (18) total: 1.6s remaining: 5.66s 19: learn: 0.2126325 test: 0.2306754 best: 0.2306754 (19) total: 1.69s 
remaining: 5.57s 20: learn: 0.2103295 test: 0.2286582 best: 0.2286582 (20) total: 1.77s remaining: 5.48s 21: learn: 0.2074162 test: 0.2263978 best: 0.2263978 (21) total: 1.86s remaining: 5.4s 22: learn: 0.2050539 test: 0.2241903 best: 0.2241903 (22) total: 1.94s remaining: 5.31s 23: learn: 0.2028967 test: 0.2226596 best: 0.2226596 (23) total: 2.02s remaining: 5.23s 24: learn: 0.2012948 test: 0.2213131 best: 0.2213131 (24) total: 2.11s remaining: 5.14s 25: learn: 0.1995719 test: 0.2198625 best: 0.2198625 (25) total: 2.19s remaining: 5.06s 26: learn: 0.1979127 test: 0.2189413 best: 0.2189413 (26) total: 2.27s remaining: 4.97s 27: learn: 0.1963061 test: 0.2178254 best: 0.2178254 (27) total: 2.36s remaining: 4.89s 28: learn: 0.1947993 test: 0.2168379 best: 0.2168379 (28) total: 2.45s remaining: 4.81s 29: learn: 0.1935057 test: 0.2157394 best: 0.2157394 (29) total: 2.53s remaining: 4.72s 30: learn: 0.1923204 test: 0.2149038 best: 0.2149038 (30) total: 2.61s remaining: 4.63s 31: learn: 0.1912220 test: 0.2142413 best: 0.2142413 (31) total: 2.7s remaining: 4.55s 32: learn: 0.1903394 test: 0.2137523 best: 0.2137523 (32) total: 2.78s remaining: 4.47s 33: learn: 0.1890634 test: 0.2129072 best: 0.2129072 (33) total: 2.87s remaining: 4.38s 34: learn: 0.1882058 test: 0.2126917 best: 0.2126917 (34) total: 2.95s remaining: 4.3s 35: learn: 0.1872985 test: 0.2120758 best: 0.2120758 (35) total: 3.04s remaining: 4.22s 36: learn: 0.1865332 test: 0.2118833 best: 0.2118833 (36) total: 3.12s remaining: 4.13s 37: learn: 0.1854233 test: 0.2106394 best: 0.2106394 (37) total: 3.21s remaining: 4.05s 38: learn: 0.1845250 test: 0.2101603 best: 0.2101603 (38) total: 3.29s remaining: 3.97s 39: learn: 0.1836715 test: 0.2098813 best: 0.2098813 (39) total: 3.38s remaining: 3.89s 40: learn: 0.1828609 test: 0.2095350 best: 0.2095350 (40) total: 3.47s remaining: 3.8s 41: learn: 0.1819025 test: 0.2086451 best: 0.2086451 (41) total: 3.55s remaining: 3.72s 42: learn: 0.1812402 test: 0.2083157 best: 
0.2083157 (42) total: 3.64s remaining: 3.64s 43: learn: 0.1803412 test: 0.2079377 best: 0.2079377 (43) total: 3.72s remaining: 3.55s 44: learn: 0.1796953 test: 0.2074513 best: 0.2074513 (44) total: 3.81s remaining: 3.47s 45: learn: 0.1792173 test: 0.2071699 best: 0.2071699 (45) total: 3.89s remaining: 3.38s 46: learn: 0.1788030 test: 0.2070081 best: 0.2070081 (46) total: 3.98s remaining: 3.3s 47: learn: 0.1780970 test: 0.2065403 best: 0.2065403 (47) total: 4.07s remaining: 3.22s 48: learn: 0.1773356 test: 0.2065653 best: 0.2065403 (47) total: 4.16s remaining: 3.14s 49: learn: 0.1766429 test: 0.2063037 best: 0.2063037 (49) total: 4.25s remaining: 3.06s 50: learn: 0.1759642 test: 0.2061793 best: 0.2061793 (50) total: 4.35s remaining: 2.98s 51: learn: 0.1750389 test: 0.2056195 best: 0.2056195 (51) total: 4.44s remaining: 2.9s 52: learn: 0.1743761 test: 0.2052862 best: 0.2052862 (52) total: 4.53s remaining: 2.82s 53: learn: 0.1739174 test: 0.2049269 best: 0.2049269 (53) total: 4.61s remaining: 2.73s 54: learn: 0.1732614 test: 0.2049608 best: 0.2049269 (53) total: 4.7s remaining: 2.65s 55: learn: 0.1726396 test: 0.2050532 best: 0.2049269 (53) total: 4.79s remaining: 2.56s 56: learn: 0.1720381 test: 0.2047974 best: 0.2047974 (56) total: 4.88s remaining: 2.48s 57: learn: 0.1714078 test: 0.2049161 best: 0.2047974 (56) total: 4.96s remaining: 2.4s 58: learn: 0.1709271 test: 0.2047073 best: 0.2047073 (58) total: 5.05s remaining: 2.31s 59: learn: 0.1703606 test: 0.2045936 best: 0.2045936 (59) total: 5.14s remaining: 2.23s 60: learn: 0.1696206 test: 0.2038220 best: 0.2038220 (60) total: 5.23s remaining: 2.14s 61: learn: 0.1689159 test: 0.2037441 best: 0.2037441 (61) total: 5.32s remaining: 2.06s 62: learn: 0.1684071 test: 0.2035978 best: 0.2035978 (62) total: 5.41s remaining: 1.97s 63: learn: 0.1678970 test: 0.2032055 best: 0.2032055 (63) total: 5.49s remaining: 1.89s 64: learn: 0.1675344 test: 0.2031801 best: 0.2031801 (64) total: 5.58s remaining: 1.8s 65: learn: 0.1668871 
test: 0.2029743 best: 0.2029743 (65) total: 5.67s remaining: 1.72s 66: learn: 0.1663401 test: 0.2028553 best: 0.2028553 (66) total: 5.75s remaining: 1.63s 67: learn: 0.1655628 test: 0.2026412 best: 0.2026412 (67) total: 5.84s remaining: 1.55s 68: learn: 0.1652340 test: 0.2026855 best: 0.2026412 (67) total: 5.93s remaining: 1.46s 69: learn: 0.1646510 test: 0.2028049 best: 0.2026412 (67) total: 6.01s remaining: 1.37s 70: learn: 0.1642445 test: 0.2025320 best: 0.2025320 (70) total: 6.1s remaining: 1.29s 71: learn: 0.1633498 test: 0.2023738 best: 0.2023738 (71) total: 6.19s remaining: 1.2s 72: learn: 0.1629263 test: 0.2021049 best: 0.2021049 (72) total: 6.28s remaining: 1.12s 73: learn: 0.1626800 test: 0.2020527 best: 0.2020527 (73) total: 6.36s remaining: 1.03s 74: learn: 0.1621898 test: 0.2019839 best: 0.2019839 (74) total: 6.45s remaining: 945ms 75: learn: 0.1615454 test: 0.2020660 best: 0.2019839 (74) total: 6.53s remaining: 860ms 76: learn: 0.1610596 test: 0.2021031 best: 0.2019839 (74) total: 6.62s remaining: 774ms 77: learn: 0.1604573 test: 0.2021084 best: 0.2019839 (74) total: 6.7s remaining: 688ms 78: learn: 0.1598144 test: 0.2021050 best: 0.2019839 (74) total: 6.79s remaining: 602ms 79: learn: 0.1589914 test: 0.2023604 best: 0.2019839 (74) total: 6.88s remaining: 516ms 80: learn: 0.1584758 test: 0.2024848 best: 0.2019839 (74) total: 6.96s remaining: 430ms 81: learn: 0.1581161 test: 0.2023747 best: 0.2019839 (74) total: 7.05s remaining: 344ms 82: learn: 0.1577797 test: 0.2022069 best: 0.2019839 (74) total: 7.14s remaining: 258ms 83: learn: 0.1573349 test: 0.2019322 best: 0.2019322 (83) total: 7.23s remaining: 172ms 84: learn: 0.1566461 test: 0.2018213 best: 0.2018213 (84) total: 7.32s remaining: 86.2ms 85: learn: 0.1562054 test: 0.2016892 best: 0.2016892 (85) total: 7.41s remaining: 0us bestTest = 0.2016892099 bestIteration = 85 Trial 26, Fold 5: Log loss = 0.2016892098712569, Average precision = 0.9737888484863096, ROC-AUC = 0.9706393177637813, Elapsed Time = 
7.5332304000003205 seconds
Optimization Progress: 27%|##7 | 27/100 [43:57<1:36:54, 79.65s/it]
Trial 27, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371 Trial 27, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913 0: learn: 0.6436326 test: 0.6448571 best: 0.6448571 (0) total: 1.9s remaining: 2m 28s 1: learn: 0.5991320 test: 0.6016533 best: 0.6016533 (1) total: 4.12s remaining: 2m 38s 2: learn: 0.5658325 test: 0.5688523 best: 0.5688523 (2) total: 6.11s remaining: 2m 34s 3: learn: 0.5302039 test: 0.5342642 best: 0.5342642 (3) total: 8.07s remaining: 2m 31s 4: learn: 0.4925786 test: 0.4991456 best: 0.4991456 (4) total: 10.2s remaining: 2m 31s 5: learn: 0.4613896 test: 0.4698042 best: 0.4698042 (5) total: 12.3s remaining: 2m 29s 6: learn: 0.4333218 test: 0.4428999 best: 0.4428999 (6) total: 14.4s remaining: 2m 28s 7: learn: 0.4080162 test: 0.4184690 best: 0.4184690 (7) total: 17s remaining: 2m 31s 8: learn: 0.3860175 test: 0.3967641 best: 0.3967641 (8) total: 19.6s remaining: 2m 32s 9: learn: 0.3645788 test: 0.3768134 best: 0.3768134 (9) total: 22s remaining: 2m 31s 10: learn: 0.3451679 test: 0.3597070 best: 0.3597070 (10) total: 24.4s remaining: 2m 30s 11: learn: 0.3306120 test: 0.3459940 best: 0.3459940 (11) total: 26.5s remaining: 2m 27s 12: learn: 0.3165812 test: 0.3328416 best: 0.3328416 (12) total: 28.5s remaining: 2m 24s 13: learn: 0.3031992 test: 0.3199409 best: 0.3199409 (13) total: 30.6s remaining: 2m 22s 14: learn: 0.2915633 test: 0.3091228 best: 0.3091228 (14) total: 32.7s remaining: 2m 19s 15: learn: 0.2806891 test: 0.2988138 best: 0.2988138 (15) total: 34.8s remaining: 2m 17s 16: learn: 0.2691209 test: 0.2889407 best: 0.2889407 (16) total: 36.9s remaining: 2m 14s 17: learn: 0.2583319 test: 0.2802289 best: 0.2802289 (17) total: 39s remaining: 2m 12s 18: learn: 0.2495698 test: 0.2731262 best: 0.2731262 (18) total: 41s remaining: 2m 9s 19: learn: 0.2417977 test: 0.2668647 best: 0.2668647 (19) total: 43.1s remaining: 2m 7s 20: learn: 0.2344450 test: 0.2606681 best: 0.2606681 (20) total: 
45.2s remaining: 2m 4s 21: learn: 0.2286183 test: 0.2554908 best: 0.2554908 (21) total: 47.3s remaining: 2m 2s 22: learn: 0.2222937 test: 0.2503125 best: 0.2503125 (22) total: 49.4s remaining: 2m 23: learn: 0.2166078 test: 0.2459673 best: 0.2459673 (23) total: 51.4s remaining: 1m 57s 24: learn: 0.2112841 test: 0.2419525 best: 0.2419525 (24) total: 53.3s remaining: 1m 55s 25: learn: 0.2059181 test: 0.2377367 best: 0.2377367 (25) total: 55.2s remaining: 1m 52s 26: learn: 0.2014845 test: 0.2343045 best: 0.2343045 (26) total: 57.3s remaining: 1m 50s 27: learn: 0.1969696 test: 0.2317300 best: 0.2317300 (27) total: 59.4s remaining: 1m 48s 28: learn: 0.1943754 test: 0.2298688 best: 0.2298688 (28) total: 1m 1s remaining: 1m 46s 29: learn: 0.1920957 test: 0.2278500 best: 0.2278500 (29) total: 1m 3s remaining: 1m 43s 30: learn: 0.1893873 test: 0.2264015 best: 0.2264015 (30) total: 1m 5s remaining: 1m 41s 31: learn: 0.1864758 test: 0.2247073 best: 0.2247073 (31) total: 1m 7s remaining: 1m 39s 32: learn: 0.1833911 test: 0.2229929 best: 0.2229929 (32) total: 1m 9s remaining: 1m 37s 33: learn: 0.1819979 test: 0.2217622 best: 0.2217622 (33) total: 1m 10s remaining: 1m 32s 34: learn: 0.1796203 test: 0.2202608 best: 0.2202608 (34) total: 1m 12s remaining: 1m 30s 35: learn: 0.1770911 test: 0.2192711 best: 0.2192711 (35) total: 1m 13s remaining: 1m 28s 36: learn: 0.1752612 test: 0.2180154 best: 0.2180154 (36) total: 1m 15s remaining: 1m 26s 37: learn: 0.1724878 test: 0.2169776 best: 0.2169776 (37) total: 1m 17s remaining: 1m 23s 38: learn: 0.1703701 test: 0.2162568 best: 0.2162568 (38) total: 1m 19s remaining: 1m 21s 39: learn: 0.1678336 test: 0.2151633 best: 0.2151633 (39) total: 1m 21s remaining: 1m 19s 40: learn: 0.1662598 test: 0.2144260 best: 0.2144260 (40) total: 1m 23s remaining: 1m 17s 41: learn: 0.1646502 test: 0.2133513 best: 0.2133513 (41) total: 1m 26s remaining: 1m 15s 42: learn: 0.1628431 test: 0.2122959 best: 0.2122959 (42) total: 1m 28s remaining: 1m 13s 43: learn: 
0.1596308 test: 0.2116278 best: 0.2116278 (43) total: 1m 30s remaining: 1m 11s 44: learn: 0.1578219 test: 0.2109091 best: 0.2109091 (44) total: 1m 32s remaining: 1m 9s 45: learn: 0.1545579 test: 0.2102483 best: 0.2102483 (45) total: 1m 34s remaining: 1m 7s 46: learn: 0.1530970 test: 0.2097192 best: 0.2097192 (46) total: 1m 36s remaining: 1m 5s 47: learn: 0.1518949 test: 0.2091065 best: 0.2091065 (47) total: 1m 38s remaining: 1m 3s 48: learn: 0.1484690 test: 0.2091385 best: 0.2091065 (47) total: 1m 40s remaining: 1m 1s 49: learn: 0.1474987 test: 0.2085558 best: 0.2085558 (49) total: 1m 42s remaining: 59.4s 50: learn: 0.1462527 test: 0.2080865 best: 0.2080865 (50) total: 1m 44s remaining: 57.3s 51: learn: 0.1438706 test: 0.2078379 best: 0.2078379 (51) total: 1m 46s remaining: 55.3s 52: learn: 0.1411219 test: 0.2076138 best: 0.2076138 (52) total: 1m 48s remaining: 53.1s 53: learn: 0.1395017 test: 0.2069905 best: 0.2069905 (53) total: 1m 50s remaining: 51s 54: learn: 0.1376258 test: 0.2070950 best: 0.2069905 (53) total: 1m 52s remaining: 49s 55: learn: 0.1358900 test: 0.2066915 best: 0.2066915 (55) total: 1m 54s remaining: 46.9s 56: learn: 0.1346850 test: 0.2061882 best: 0.2061882 (56) total: 1m 56s remaining: 44.8s 57: learn: 0.1334264 test: 0.2057378 best: 0.2057378 (57) total: 1m 57s remaining: 42.7s 58: learn: 0.1319706 test: 0.2054899 best: 0.2054899 (58) total: 1m 59s remaining: 40.6s 59: learn: 0.1306673 test: 0.2055825 best: 0.2054899 (58) total: 2m 1s remaining: 38.6s 60: learn: 0.1280079 test: 0.2055247 best: 0.2054899 (58) total: 2m 3s remaining: 36.5s 61: learn: 0.1274031 test: 0.2054452 best: 0.2054452 (61) total: 2m 5s remaining: 34.4s 62: learn: 0.1256065 test: 0.2058106 best: 0.2054452 (61) total: 2m 7s remaining: 32.4s 63: learn: 0.1241854 test: 0.2059006 best: 0.2054452 (61) total: 2m 9s remaining: 30.4s 64: learn: 0.1235934 test: 0.2058258 best: 0.2054452 (61) total: 2m 11s remaining: 28.4s 65: learn: 0.1229661 test: 0.2054751 best: 0.2054452 (61) 
total: 2m 13s remaining: 26.3s 66: learn: 0.1212798 test: 0.2055770 best: 0.2054452 (61) total: 2m 15s remaining: 24.3s 67: learn: 0.1206954 test: 0.2055262 best: 0.2054452 (61) total: 2m 17s remaining: 22.3s 68: learn: 0.1194914 test: 0.2053417 best: 0.2053417 (68) total: 2m 19s remaining: 20.3s 69: learn: 0.1186984 test: 0.2052543 best: 0.2052543 (69) total: 2m 21s remaining: 18.2s 70: learn: 0.1182860 test: 0.2049277 best: 0.2049277 (70) total: 2m 24s remaining: 16.2s 71: learn: 0.1175249 test: 0.2047415 best: 0.2047415 (71) total: 2m 25s remaining: 14.2s 72: learn: 0.1173021 test: 0.2045654 best: 0.2045654 (72) total: 2m 27s remaining: 12.1s 73: learn: 0.1165851 test: 0.2045561 best: 0.2045561 (73) total: 2m 29s remaining: 10.1s 74: learn: 0.1168726 test: 0.2046658 best: 0.2045561 (73) total: 2m 31s remaining: 8.09s 75: learn: 0.1163420 test: 0.2044014 best: 0.2044014 (75) total: 2m 33s remaining: 6.06s 76: learn: 0.1154349 test: 0.2042930 best: 0.2042930 (76) total: 2m 35s remaining: 4.04s 77: learn: 0.1148492 test: 0.2044064 best: 0.2042930 (76) total: 2m 37s remaining: 2.02s 78: learn: 0.1140254 test: 0.2042638 best: 0.2042638 (78) total: 2m 39s remaining: 0us bestTest = 0.2042637568 bestIteration = 78 Trial 27, Fold 1: Log loss = 0.20426375682127698, Average precision = 0.9733148873983092, ROC-AUC = 0.9677766814594952, Elapsed Time = 160.0499552000001 seconds Trial 27, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 27, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986 0: learn: 0.6432379 test: 0.6440639 best: 0.6440639 (0) total: 1.81s remaining: 2m 21s 1: learn: 0.6016265 test: 0.6040135 best: 0.6040135 (1) total: 3.8s remaining: 2m 26s 2: learn: 0.5577720 test: 0.5614766 best: 0.5614766 (2) total: 5.8s remaining: 2m 26s 3: learn: 0.5219290 test: 0.5269526 best: 0.5269526 (3) total: 7.7s remaining: 2m 24s 4: learn: 0.4821436 test: 0.4887893 best: 0.4887893 (4) total: 9.65s remaining: 2m 
22s 5: learn: 0.4531871 test: 0.4610609 best: 0.4610609 (5) total: 11.6s remaining: 2m 20s 6: learn: 0.4279906 test: 0.4374160 best: 0.4374160 (6) total: 13.7s remaining: 2m 20s 7: learn: 0.4047416 test: 0.4149460 best: 0.4149460 (7) total: 15.6s remaining: 2m 18s 8: learn: 0.3837783 test: 0.3948665 best: 0.3948665 (8) total: 17.6s remaining: 2m 16s 9: learn: 0.3629509 test: 0.3751645 best: 0.3751645 (9) total: 19.6s remaining: 2m 15s 10: learn: 0.3468384 test: 0.3594659 best: 0.3594659 (10) total: 21.5s remaining: 2m 12s 11: learn: 0.3292370 test: 0.3429188 best: 0.3429188 (11) total: 23.5s remaining: 2m 11s 12: learn: 0.3137364 test: 0.3284696 best: 0.3284696 (12) total: 25.5s remaining: 2m 9s 13: learn: 0.2999966 test: 0.3154711 best: 0.3154711 (13) total: 27.5s remaining: 2m 7s 14: learn: 0.2886965 test: 0.3052741 best: 0.3052741 (14) total: 29.4s remaining: 2m 5s 15: learn: 0.2800148 test: 0.2970694 best: 0.2970694 (15) total: 31.3s remaining: 2m 3s 16: learn: 0.2696012 test: 0.2882576 best: 0.2882576 (16) total: 33.2s remaining: 2m 1s 17: learn: 0.2598890 test: 0.2797794 best: 0.2797794 (17) total: 35.2s remaining: 1m 59s 18: learn: 0.2515387 test: 0.2721579 best: 0.2721579 (18) total: 37.2s remaining: 1m 57s 19: learn: 0.2426808 test: 0.2659608 best: 0.2659608 (19) total: 39.2s remaining: 1m 55s 20: learn: 0.2368314 test: 0.2603380 best: 0.2603380 (20) total: 41.1s remaining: 1m 53s 21: learn: 0.2298083 test: 0.2546630 best: 0.2546630 (21) total: 43s remaining: 1m 51s 22: learn: 0.2228256 test: 0.2497758 best: 0.2497758 (22) total: 44.9s remaining: 1m 49s 23: learn: 0.2168213 test: 0.2447906 best: 0.2447906 (23) total: 46.8s remaining: 1m 47s 24: learn: 0.2126739 test: 0.2419864 best: 0.2419864 (24) total: 48.8s remaining: 1m 45s 25: learn: 0.2067884 test: 0.2383061 best: 0.2383061 (25) total: 50.7s remaining: 1m 43s 26: learn: 0.2003288 test: 0.2342560 best: 0.2342560 (26) total: 52.6s remaining: 1m 41s 27: learn: 0.1953645 test: 0.2307161 best: 0.2307161 
(27) total: 54.6s remaining: 1m 39s 28: learn: 0.1919075 test: 0.2283788 best: 0.2283788 (28) total: 56.5s remaining: 1m 37s 29: learn: 0.1889659 test: 0.2266947 best: 0.2266947 (29) total: 58.5s remaining: 1m 35s 30: learn: 0.1836933 test: 0.2246791 best: 0.2246791 (30) total: 1m remaining: 1m 33s 31: learn: 0.1819390 test: 0.2233292 best: 0.2233292 (31) total: 1m 2s remaining: 1m 31s 32: learn: 0.1792837 test: 0.2219191 best: 0.2219191 (32) total: 1m 4s remaining: 1m 29s 33: learn: 0.1770933 test: 0.2197665 best: 0.2197665 (33) total: 1m 6s remaining: 1m 28s 34: learn: 0.1742200 test: 0.2178142 best: 0.2178142 (34) total: 1m 8s remaining: 1m 26s 35: learn: 0.1711255 test: 0.2165175 best: 0.2165175 (35) total: 1m 10s remaining: 1m 24s 36: learn: 0.1687116 test: 0.2146731 best: 0.2146731 (36) total: 1m 12s remaining: 1m 22s 37: learn: 0.1668131 test: 0.2131964 best: 0.2131964 (37) total: 1m 14s remaining: 1m 20s 38: learn: 0.1649548 test: 0.2122170 best: 0.2122170 (38) total: 1m 16s remaining: 1m 18s 39: learn: 0.1644955 test: 0.2117476 best: 0.2117476 (39) total: 1m 16s remaining: 1m 14s 40: learn: 0.1625176 test: 0.2105199 best: 0.2105199 (40) total: 1m 18s remaining: 1m 12s 41: learn: 0.1602972 test: 0.2092578 best: 0.2092578 (41) total: 1m 20s remaining: 1m 10s 42: learn: 0.1582096 test: 0.2087356 best: 0.2087356 (42) total: 1m 22s remaining: 1m 9s 43: learn: 0.1561778 test: 0.2081725 best: 0.2081725 (43) total: 1m 24s remaining: 1m 7s 44: learn: 0.1542048 test: 0.2069202 best: 0.2069202 (44) total: 1m 26s remaining: 1m 5s 45: learn: 0.1528012 test: 0.2065937 best: 0.2065937 (45) total: 1m 28s remaining: 1m 3s 46: learn: 0.1495169 test: 0.2062814 best: 0.2062814 (46) total: 1m 30s remaining: 1m 1s 47: learn: 0.1469609 test: 0.2060055 best: 0.2060055 (47) total: 1m 32s remaining: 1m 48: learn: 0.1457169 test: 0.2054417 best: 0.2054417 (48) total: 1m 34s remaining: 58.1s 49: learn: 0.1435442 test: 0.2047395 best: 0.2047395 (49) total: 1m 36s remaining: 56.1s 50: 
learn: 0.1423755 test: 0.2043610 best: 0.2043610 (50) total: 1m 38s remaining: 54.2s 51: learn: 0.1406497 test: 0.2040927 best: 0.2040927 (51) total: 1m 40s remaining: 52.4s 52: learn: 0.1401018 test: 0.2036223 best: 0.2036223 (52) total: 1m 43s remaining: 50.5s 53: learn: 0.1376314 test: 0.2034459 best: 0.2034459 (53) total: 1m 45s remaining: 48.7s 54: learn: 0.1368492 test: 0.2030750 best: 0.2030750 (54) total: 1m 47s remaining: 46.8s 55: learn: 0.1341870 test: 0.2026353 best: 0.2026353 (55) total: 1m 49s remaining: 45s 56: learn: 0.1333453 test: 0.2023671 best: 0.2023671 (56) total: 1m 51s remaining: 43.1s 57: learn: 0.1325367 test: 0.2015306 best: 0.2015306 (57) total: 1m 53s remaining: 41.1s 58: learn: 0.1318226 test: 0.2012569 best: 0.2012569 (58) total: 1m 55s remaining: 39.2s 59: learn: 0.1296858 test: 0.2009105 best: 0.2009105 (59) total: 1m 57s remaining: 37.3s 60: learn: 0.1286201 test: 0.2004406 best: 0.2004406 (60) total: 1m 59s remaining: 35.3s 61: learn: 0.1279722 test: 0.2003606 best: 0.2003606 (61) total: 2m 1s remaining: 33.3s 62: learn: 0.1263785 test: 0.2002525 best: 0.2002525 (62) total: 2m 3s remaining: 31.4s 63: learn: 0.1258432 test: 0.1996862 best: 0.1996862 (63) total: 2m 5s remaining: 29.4s 64: learn: 0.1237852 test: 0.1993712 best: 0.1993712 (64) total: 2m 7s remaining: 27.5s 65: learn: 0.1234259 test: 0.1991724 best: 0.1991724 (65) total: 2m 9s remaining: 25.5s 66: learn: 0.1219486 test: 0.1990799 best: 0.1990799 (66) total: 2m 11s remaining: 23.5s 67: learn: 0.1207009 test: 0.1990793 best: 0.1990793 (67) total: 2m 13s remaining: 21.5s 68: learn: 0.1202319 test: 0.1986088 best: 0.1986088 (68) total: 2m 15s remaining: 19.6s 69: learn: 0.1187411 test: 0.1981984 best: 0.1981984 (69) total: 2m 17s remaining: 17.6s 70: learn: 0.1174243 test: 0.1981397 best: 0.1981397 (70) total: 2m 19s remaining: 15.7s 71: learn: 0.1166418 test: 0.1977732 best: 0.1977732 (71) total: 2m 21s remaining: 13.7s 72: learn: 0.1149122 test: 0.1975647 best: 0.1975647 
(72) total: 2m 23s remaining: 11.8s
[… verbose per-iteration CatBoost output truncated …]
bestTest = 0.1966735767 bestIteration = 78
Trial 27, Fold 2: Log loss = 0.19667357670757726, Average precision = 0.9729162475124771, ROC-AUC = 0.9692006624701323, Elapsed Time = 155.33394789999875 seconds
Trial 27, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 27, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[… verbose per-iteration CatBoost output truncated (iterations 0–78) …]
bestTest = 0.2020695124 bestIteration = 78
Trial 27, Fold 3: Log loss = 0.20206951237268705, Average precision = 0.9720168293297896, ROC-AUC = 0.9679361775385, Elapsed Time = 158.47888109999985 seconds
Trial 27, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 27, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[… verbose per-iteration CatBoost output truncated (iterations 0–78) …]
bestTest = 0.2022718574 bestIteration = 72 Shrink model to first 73 iterations.
Trial 27, Fold 4: Log loss = 0.20227185735420683, Average precision = 0.9719868224035556, ROC-AUC = 0.9668532765366926, Elapsed Time = 156.1056112999977 seconds
Trial 27, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 27, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[… verbose per-iteration CatBoost output truncated (iterations 0–78) …]
bestTest = 0.2094846199 bestIteration = 71 Shrink model to first 72 iterations.
Trial 27, Fold 5: Log loss = 0.2094846198788979, Average precision = 0.9713292401149299, ROC-AUC = 0.9662394143767534, Elapsed Time = 153.3656974000005 seconds
Optimization Progress: 28%|##8 | 28/100 [57:08<5:51:48, 293.17s/it]
Trial 28, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 28, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[… verbose per-iteration CatBoost output truncated (iterations 0–88) …]
bestTest = 0.3405191699 bestIteration = 88
Trial 28, Fold 1: Log loss = 0.34053714359201404, Average precision = 0.9315523225541416, ROC-AUC = 0.9398508110962945, Elapsed Time = 1.086109600000782 seconds
Trial 28, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 28, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[… verbose per-iteration CatBoost output truncated (iterations 0–88) …]
bestTest = 0.3440422231 bestIteration = 88
Trial 28, Fold 2: Log loss = 0.3440507802857193, Average precision = 0.9361610234283044, ROC-AUC = 0.9425062542849783, Elapsed Time = 1.146935000000667 seconds
Trial 28, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 28, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[… verbose per-iteration CatBoost output truncated …]
15: learn: 0.5467181 test: 0.5429675 best: 0.5429675
(15) total: 169ms remaining: 773ms 16: learn: 0.5414381 test: 0.5378741 best: 0.5378741 (16) total: 180ms remaining: 764ms 17: learn: 0.5337711 test: 0.5305664 best: 0.5305664 (17) total: 192ms remaining: 756ms 18: learn: 0.5291224 test: 0.5260856 best: 0.5260856 (18) total: 199ms remaining: 732ms 19: learn: 0.5213882 test: 0.5184809 best: 0.5184809 (19) total: 210ms remaining: 725ms 20: learn: 0.5165477 test: 0.5133281 best: 0.5133281 (20) total: 221ms remaining: 716ms 21: learn: 0.5121200 test: 0.5086085 best: 0.5086085 (21) total: 228ms remaining: 695ms 22: learn: 0.5080817 test: 0.5046061 best: 0.5046061 (22) total: 240ms remaining: 689ms 23: learn: 0.5038466 test: 0.5000216 best: 0.5000216 (23) total: 251ms remaining: 679ms 24: learn: 0.4970285 test: 0.4935299 best: 0.4935299 (24) total: 262ms remaining: 671ms 25: learn: 0.4908475 test: 0.4876563 best: 0.4876563 (25) total: 274ms remaining: 663ms 26: learn: 0.4852617 test: 0.4823608 best: 0.4823608 (26) total: 285ms remaining: 654ms 27: learn: 0.4789622 test: 0.4761449 best: 0.4761449 (27) total: 296ms remaining: 644ms 28: learn: 0.4749539 test: 0.4718659 best: 0.4718659 (28) total: 307ms remaining: 634ms 29: learn: 0.4700992 test: 0.4671834 best: 0.4671834 (29) total: 318ms remaining: 626ms 30: learn: 0.4663384 test: 0.4631626 best: 0.4631626 (30) total: 329ms remaining: 616ms 31: learn: 0.4629102 test: 0.4595010 best: 0.4595010 (31) total: 341ms remaining: 607ms 32: learn: 0.4597426 test: 0.4561066 best: 0.4561066 (32) total: 351ms remaining: 596ms 33: learn: 0.4568499 test: 0.4529927 best: 0.4529927 (33) total: 362ms remaining: 586ms 34: learn: 0.4540675 test: 0.4499976 best: 0.4499976 (34) total: 373ms remaining: 576ms 35: learn: 0.4517097 test: 0.4476596 best: 0.4476596 (35) total: 384ms remaining: 566ms 36: learn: 0.4492134 test: 0.4449272 best: 0.4449272 (36) total: 395ms remaining: 556ms 37: learn: 0.4462978 test: 0.4421282 best: 0.4421282 (37) total: 407ms remaining: 546ms 38: learn: 0.4440265 test: 
0.4396717 best: 0.4396717 (38) total: 418ms remaining: 536ms 39: learn: 0.4387771 test: 0.4345021 best: 0.4345021 (39) total: 430ms remaining: 526ms 40: learn: 0.4369858 test: 0.4327567 best: 0.4327567 (40) total: 441ms remaining: 517ms 41: learn: 0.4349464 test: 0.4305185 best: 0.4305185 (41) total: 454ms remaining: 507ms 42: learn: 0.4302383 test: 0.4260650 best: 0.4260650 (42) total: 465ms remaining: 497ms 43: learn: 0.4259814 test: 0.4220506 best: 0.4220506 (43) total: 476ms remaining: 487ms 44: learn: 0.4237941 test: 0.4197058 best: 0.4197058 (44) total: 484ms remaining: 473ms 45: learn: 0.4199013 test: 0.4160401 best: 0.4160401 (45) total: 495ms remaining: 463ms 46: learn: 0.4176093 test: 0.4137820 best: 0.4137820 (46) total: 506ms remaining: 452ms 47: learn: 0.4133547 test: 0.4095672 best: 0.4095672 (47) total: 518ms remaining: 442ms 48: learn: 0.4113420 test: 0.4073763 best: 0.4073763 (48) total: 529ms remaining: 432ms 49: learn: 0.4094806 test: 0.4053846 best: 0.4053846 (49) total: 541ms remaining: 422ms 50: learn: 0.4076087 test: 0.4033484 best: 0.4033484 (50) total: 552ms remaining: 412ms 51: learn: 0.4060700 test: 0.4016497 best: 0.4016497 (51) total: 564ms remaining: 401ms 52: learn: 0.4040652 test: 0.3996764 best: 0.3996764 (52) total: 575ms remaining: 391ms 53: learn: 0.4021515 test: 0.3977906 best: 0.3977906 (53) total: 587ms remaining: 380ms 54: learn: 0.4006080 test: 0.3963361 best: 0.3963361 (54) total: 598ms remaining: 370ms 55: learn: 0.3989642 test: 0.3945430 best: 0.3945430 (55) total: 611ms remaining: 360ms 56: learn: 0.3954916 test: 0.3912815 best: 0.3912815 (56) total: 622ms remaining: 349ms 57: learn: 0.3940727 test: 0.3897375 best: 0.3897375 (57) total: 634ms remaining: 339ms 58: learn: 0.3927109 test: 0.3883734 best: 0.3883734 (58) total: 645ms remaining: 328ms 59: learn: 0.3893465 test: 0.3850383 best: 0.3850383 (59) total: 653ms remaining: 316ms 60: learn: 0.3880275 test: 0.3835912 best: 0.3835912 (60) total: 665ms remaining: 305ms 
61: learn: 0.3866916 test: 0.3822771 best: 0.3822771 (61) total: 677ms remaining: 295ms 62: learn: 0.3854457 test: 0.3808994 best: 0.3808994 (62) total: 689ms remaining: 285ms 63: learn: 0.3842795 test: 0.3797328 best: 0.3797328 (63) total: 701ms remaining: 274ms 64: learn: 0.3813413 test: 0.3768192 best: 0.3768192 (64) total: 714ms remaining: 264ms 65: learn: 0.3788549 test: 0.3743612 best: 0.3743612 (65) total: 726ms remaining: 253ms 66: learn: 0.3763938 test: 0.3719128 best: 0.3719128 (66) total: 738ms remaining: 242ms 67: learn: 0.3751886 test: 0.3705780 best: 0.3705780 (67) total: 751ms remaining: 232ms 68: learn: 0.3732002 test: 0.3685806 best: 0.3685806 (68) total: 763ms remaining: 221ms 69: learn: 0.3720382 test: 0.3673597 best: 0.3673597 (69) total: 775ms remaining: 210ms 70: learn: 0.3709991 test: 0.3662191 best: 0.3662191 (70) total: 787ms remaining: 199ms 71: learn: 0.3680020 test: 0.3634041 best: 0.3634041 (71) total: 799ms remaining: 189ms 72: learn: 0.3668329 test: 0.3622633 best: 0.3622633 (72) total: 807ms remaining: 177ms 73: learn: 0.3658176 test: 0.3611507 best: 0.3611507 (73) total: 819ms remaining: 166ms 74: learn: 0.3647106 test: 0.3600826 best: 0.3600826 (74) total: 831ms remaining: 155ms 75: learn: 0.3637200 test: 0.3589749 best: 0.3589749 (75) total: 844ms remaining: 144ms 76: learn: 0.3626866 test: 0.3579864 best: 0.3579864 (76) total: 856ms remaining: 133ms 77: learn: 0.3617873 test: 0.3569872 best: 0.3569872 (77) total: 868ms remaining: 122ms 78: learn: 0.3608072 test: 0.3560343 best: 0.3560343 (78) total: 881ms remaining: 111ms 79: learn: 0.3599636 test: 0.3551028 best: 0.3551028 (79) total: 893ms remaining: 100ms 80: learn: 0.3581826 test: 0.3533383 best: 0.3533383 (80) total: 905ms remaining: 89.4ms 81: learn: 0.3556114 test: 0.3509312 best: 0.3509312 (81) total: 918ms remaining: 78.3ms 82: learn: 0.3533361 test: 0.3487522 best: 0.3487522 (82) total: 930ms remaining: 67.3ms 83: learn: 0.3524477 test: 0.3477648 best: 0.3477648 (83) 
total: 943ms remaining: 56.1ms 84: learn: 0.3509731 test: 0.3462381 best: 0.3462381 (84) total: 956ms remaining: 45ms 85: learn: 0.3488512 test: 0.3442613 best: 0.3442613 (85) total: 969ms remaining: 33.8ms 86: learn: 0.3480069 test: 0.3433302 best: 0.3433302 (86) total: 981ms remaining: 22.5ms 87: learn: 0.3461170 test: 0.3415257 best: 0.3415257 (87) total: 994ms remaining: 11.3ms 88: learn: 0.3443524 test: 0.3398934 best: 0.3398934 (88) total: 1s remaining: 0us bestTest = 0.3398934333 bestIteration = 88 Trial 28, Fold 3: Log loss = 0.3400008697202025, Average precision = 0.9431815108972704, ROC-AUC = 0.9461049536024887, Elapsed Time = 1.1075904999997874 seconds Trial 28, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 28, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 0: learn: 0.6803051 test: 0.6800551 best: 0.6800551 (0) total: 9.04ms remaining: 795ms 1: learn: 0.6686877 test: 0.6682011 best: 0.6682011 (1) total: 21.3ms remaining: 926ms 2: learn: 0.6579685 test: 0.6572624 best: 0.6572624 (2) total: 29.2ms remaining: 838ms 3: learn: 0.6433200 test: 0.6427584 best: 0.6427584 (3) total: 41.2ms remaining: 875ms 4: learn: 0.6340020 test: 0.6331306 best: 0.6331306 (4) total: 52.4ms remaining: 880ms 5: learn: 0.6260801 test: 0.6252934 best: 0.6252934 (5) total: 63.5ms remaining: 879ms 6: learn: 0.6176020 test: 0.6165201 best: 0.6165201 (6) total: 74.4ms remaining: 872ms 7: learn: 0.6103433 test: 0.6093452 best: 0.6093452 (7) total: 85.6ms remaining: 867ms 8: learn: 0.5983564 test: 0.5975101 best: 0.5975101 (8) total: 97.3ms remaining: 865ms 9: learn: 0.5874702 test: 0.5867757 best: 0.5867757 (9) total: 105ms remaining: 826ms 10: learn: 0.5799783 test: 0.5790133 best: 0.5790133 (10) total: 116ms remaining: 824ms 11: learn: 0.5728046 test: 0.5716382 best: 0.5716382 (11) total: 127ms remaining: 817ms 12: learn: 0.5663492 test: 0.5650118 best: 0.5650118 (12) total: 139ms remaining: 810ms 13: learn: 
0.5605501 test: 0.5589796 best: 0.5589796 (13) total: 150ms remaining: 802ms 14: learn: 0.5552855 test: 0.5534850 best: 0.5534850 (14) total: 161ms remaining: 796ms 15: learn: 0.5457895 test: 0.5441383 best: 0.5441383 (15) total: 173ms remaining: 789ms 16: learn: 0.5409938 test: 0.5391292 best: 0.5391292 (16) total: 184ms remaining: 780ms 17: learn: 0.5366662 test: 0.5346628 best: 0.5346628 (17) total: 195ms remaining: 770ms 18: learn: 0.5280838 test: 0.5262268 best: 0.5262268 (18) total: 207ms remaining: 763ms 19: learn: 0.5231771 test: 0.5213466 best: 0.5213466 (19) total: 219ms remaining: 755ms 20: learn: 0.5186632 test: 0.5168585 best: 0.5168585 (20) total: 230ms remaining: 744ms 21: learn: 0.5145665 test: 0.5125944 best: 0.5125944 (21) total: 241ms remaining: 734ms 22: learn: 0.5074099 test: 0.5055849 best: 0.5055849 (22) total: 253ms remaining: 726ms 23: learn: 0.5009115 test: 0.4992316 best: 0.4992316 (23) total: 265ms remaining: 717ms 24: learn: 0.4936970 test: 0.4920529 best: 0.4920529 (24) total: 277ms remaining: 709ms 25: learn: 0.4872077 test: 0.4855982 best: 0.4855982 (25) total: 288ms remaining: 698ms 26: learn: 0.4833593 test: 0.4815834 best: 0.4815834 (26) total: 300ms remaining: 688ms 27: learn: 0.4797435 test: 0.4778094 best: 0.4778094 (27) total: 311ms remaining: 677ms 28: learn: 0.4741017 test: 0.4723132 best: 0.4723132 (28) total: 322ms remaining: 666ms 29: learn: 0.4709569 test: 0.4689906 best: 0.4689906 (29) total: 333ms remaining: 656ms 30: learn: 0.4677611 test: 0.4659106 best: 0.4659106 (30) total: 345ms remaining: 645ms 31: learn: 0.4648409 test: 0.4628159 best: 0.4628159 (31) total: 352ms remaining: 627ms 32: learn: 0.4588877 test: 0.4568845 best: 0.4568845 (32) total: 364ms remaining: 617ms 33: learn: 0.4560938 test: 0.4539419 best: 0.4539419 (33) total: 375ms remaining: 607ms 34: learn: 0.4534422 test: 0.4511489 best: 0.4511489 (34) total: 386ms remaining: 596ms 35: learn: 0.4510889 test: 0.4486584 best: 0.4486584 (35) total: 398ms 
remaining: 586ms 36: learn: 0.4462303 test: 0.4439429 best: 0.4439429 (36) total: 410ms remaining: 577ms 37: learn: 0.4407177 test: 0.4384382 best: 0.4384382 (37) total: 422ms remaining: 566ms 38: learn: 0.4368653 test: 0.4347958 best: 0.4347958 (38) total: 433ms remaining: 556ms 39: learn: 0.4344366 test: 0.4324912 best: 0.4324912 (39) total: 445ms remaining: 545ms 40: learn: 0.4310505 test: 0.4293037 best: 0.4293037 (40) total: 456ms remaining: 534ms 41: learn: 0.4263615 test: 0.4246085 best: 0.4246085 (41) total: 468ms remaining: 524ms 42: learn: 0.4240374 test: 0.4221307 best: 0.4221307 (42) total: 479ms remaining: 513ms 43: learn: 0.4218208 test: 0.4197803 best: 0.4197803 (43) total: 491ms remaining: 502ms 44: learn: 0.4182675 test: 0.4163619 best: 0.4163619 (44) total: 502ms remaining: 491ms 45: learn: 0.4140100 test: 0.4120996 best: 0.4120996 (45) total: 514ms remaining: 481ms 46: learn: 0.4119836 test: 0.4099266 best: 0.4099266 (46) total: 525ms remaining: 470ms 47: learn: 0.4080442 test: 0.4059694 best: 0.4059694 (47) total: 537ms remaining: 459ms 48: learn: 0.4044968 test: 0.4024021 best: 0.4024021 (48) total: 548ms remaining: 448ms 49: learn: 0.4014057 test: 0.3994459 best: 0.3994459 (49) total: 560ms remaining: 437ms 50: learn: 0.3982541 test: 0.3962705 best: 0.3962705 (50) total: 572ms remaining: 426ms 51: learn: 0.3961505 test: 0.3940322 best: 0.3940322 (51) total: 583ms remaining: 415ms 52: learn: 0.3942806 test: 0.3920077 best: 0.3920077 (52) total: 595ms remaining: 404ms 53: learn: 0.3924445 test: 0.3900696 best: 0.3900696 (53) total: 606ms remaining: 393ms 54: learn: 0.3896218 test: 0.3873796 best: 0.3873796 (54) total: 614ms remaining: 379ms 55: learn: 0.3880263 test: 0.3856468 best: 0.3856468 (55) total: 625ms remaining: 368ms 56: learn: 0.3865241 test: 0.3840352 best: 0.3840352 (56) total: 637ms remaining: 357ms 57: learn: 0.3842144 test: 0.3819071 best: 0.3819071 (57) total: 648ms remaining: 346ms 58: learn: 0.3811616 test: 0.3788255 best: 
0.3788255 (58) total: 660ms remaining: 336ms 59: learn: 0.3796131 test: 0.3771161 best: 0.3771161 (59) total: 673ms remaining: 325ms 60: learn: 0.3775242 test: 0.3752025 best: 0.3752025 (60) total: 686ms remaining: 315ms 61: learn: 0.3748713 test: 0.3725219 best: 0.3725219 (61) total: 697ms remaining: 304ms 62: learn: 0.3734345 test: 0.3709703 best: 0.3709703 (62) total: 710ms remaining: 293ms 63: learn: 0.3712139 test: 0.3688758 best: 0.3688758 (63) total: 722ms remaining: 282ms 64: learn: 0.3698774 test: 0.3674331 best: 0.3674331 (64) total: 733ms remaining: 271ms 65: learn: 0.3686617 test: 0.3661366 best: 0.3661366 (65) total: 745ms remaining: 260ms 66: learn: 0.3675300 test: 0.3649150 best: 0.3649150 (66) total: 756ms remaining: 248ms 67: learn: 0.3649264 test: 0.3622709 best: 0.3622709 (67) total: 768ms remaining: 237ms 68: learn: 0.3625482 test: 0.3598498 best: 0.3598498 (68) total: 780ms remaining: 226ms 69: learn: 0.3607965 test: 0.3582660 best: 0.3582660 (69) total: 792ms remaining: 215ms 70: learn: 0.3586953 test: 0.3561242 best: 0.3561242 (70) total: 804ms remaining: 204ms 71: learn: 0.3568107 test: 0.3541982 best: 0.3541982 (71) total: 816ms remaining: 193ms 72: learn: 0.3555935 test: 0.3528792 best: 0.3528792 (72) total: 827ms remaining: 181ms 73: learn: 0.3538213 test: 0.3510602 best: 0.3510602 (73) total: 840ms remaining: 170ms 74: learn: 0.3526657 test: 0.3498001 best: 0.3498001 (74) total: 851ms remaining: 159ms 75: learn: 0.3517401 test: 0.3489122 best: 0.3489122 (75) total: 863ms remaining: 148ms 76: learn: 0.3501438 test: 0.3472616 best: 0.3472616 (76) total: 874ms remaining: 136ms 77: learn: 0.3487391 test: 0.3458141 best: 0.3458141 (77) total: 886ms remaining: 125ms 78: learn: 0.3474627 test: 0.3444932 best: 0.3444932 (78) total: 897ms remaining: 114ms 79: learn: 0.3462914 test: 0.3432163 best: 0.3432163 (79) total: 908ms remaining: 102ms 80: learn: 0.3452476 test: 0.3420633 best: 0.3420633 (80) total: 920ms remaining: 90.8ms 81: learn: 
0.3437427 test: 0.3407222 best: 0.3407222 (81) total: 931ms remaining: 79.5ms 82: learn: 0.3420681 test: 0.3391728 best: 0.3391728 (82) total: 942ms remaining: 68.1ms 83: learn: 0.3410191 test: 0.3380509 best: 0.3380509 (83) total: 953ms remaining: 56.7ms 84: learn: 0.3396036 test: 0.3367785 best: 0.3367785 (84) total: 965ms remaining: 45.4ms 85: learn: 0.3386211 test: 0.3357254 best: 0.3357254 (85) total: 976ms remaining: 34ms 86: learn: 0.3373937 test: 0.3344496 best: 0.3344496 (86) total: 988ms remaining: 22.7ms 87: learn: 0.3364635 test: 0.3334509 best: 0.3334509 (87) total: 999ms remaining: 11.4ms 88: learn: 0.3354550 test: 0.3324139 best: 0.3324139 (88) total: 1.01s remaining: 0us bestTest = 0.3324139198 bestIteration = 88 Trial 28, Fold 4: Log loss = 0.3324219486263143, Average precision = 0.9391763331989472, ROC-AUC = 0.9459501247046311, Elapsed Time = 1.1102492999998503 seconds Trial 28, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 28, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0: learn: 0.6799925 test: 0.6806376 best: 0.6806376 (0) total: 9.48ms remaining: 834ms 1: learn: 0.6678828 test: 0.6690854 best: 0.6690854 (1) total: 20.8ms remaining: 905ms 2: learn: 0.6570897 test: 0.6589960 best: 0.6589960 (2) total: 32.9ms remaining: 942ms 3: learn: 0.6471150 test: 0.6497309 best: 0.6497309 (3) total: 40.8ms remaining: 868ms 4: learn: 0.6379168 test: 0.6410614 best: 0.6410614 (4) total: 53ms remaining: 891ms 5: learn: 0.6296306 test: 0.6333911 best: 0.6333911 (5) total: 65.1ms remaining: 901ms 6: learn: 0.6213078 test: 0.6247079 best: 0.6247079 (6) total: 77.2ms remaining: 905ms 7: learn: 0.6077787 test: 0.6109813 best: 0.6109813 (7) total: 89.3ms remaining: 904ms 8: learn: 0.6002064 test: 0.6038732 best: 0.6038732 (8) total: 101ms remaining: 900ms 9: learn: 0.5880243 test: 0.5915019 best: 0.5915019 (9) total: 114ms remaining: 898ms 10: learn: 0.5769601 test: 0.5802676 best: 
0.5802676 (10) total: 121ms remaining: 860ms 11: learn: 0.5669140 test: 0.5700688 best: 0.5700688 (11) total: 133ms remaining: 856ms 12: learn: 0.5603518 test: 0.5639038 best: 0.5639038 (12) total: 145ms remaining: 847ms 13: learn: 0.5547690 test: 0.5580424 best: 0.5580424 (13) total: 159ms remaining: 852ms 14: learn: 0.5461008 test: 0.5492379 best: 0.5492379 (14) total: 170ms remaining: 840ms 15: learn: 0.5388769 test: 0.5415054 best: 0.5415054 (15) total: 183ms remaining: 833ms 16: learn: 0.5329373 test: 0.5360409 best: 0.5360409 (16) total: 194ms remaining: 821ms 17: learn: 0.5263030 test: 0.5289992 best: 0.5289992 (17) total: 206ms remaining: 812ms 18: learn: 0.5207088 test: 0.5237273 best: 0.5237273 (18) total: 218ms remaining: 802ms 19: learn: 0.5146137 test: 0.5172518 best: 0.5172518 (19) total: 229ms remaining: 790ms 20: learn: 0.5094380 test: 0.5125000 best: 0.5125000 (20) total: 241ms remaining: 779ms 21: learn: 0.5047479 test: 0.5081967 best: 0.5081967 (21) total: 252ms remaining: 768ms 22: learn: 0.5004315 test: 0.5042624 best: 0.5042624 (22) total: 263ms remaining: 755ms 23: learn: 0.4931515 test: 0.4968419 best: 0.4968419 (23) total: 274ms remaining: 743ms 24: learn: 0.4874761 test: 0.4907880 best: 0.4907880 (24) total: 286ms remaining: 731ms 25: learn: 0.4834342 test: 0.4869769 best: 0.4869769 (25) total: 297ms remaining: 720ms 26: learn: 0.4797803 test: 0.4836323 best: 0.4836323 (26) total: 309ms remaining: 710ms 27: learn: 0.4764041 test: 0.4801474 best: 0.4801474 (27) total: 321ms remaining: 699ms 28: learn: 0.4702123 test: 0.4738336 best: 0.4738336 (28) total: 332ms remaining: 687ms 29: learn: 0.4651114 test: 0.4684071 best: 0.4684071 (29) total: 344ms remaining: 676ms 30: learn: 0.4604986 test: 0.4634940 best: 0.4634940 (30) total: 356ms remaining: 665ms 31: learn: 0.4570067 test: 0.4603156 best: 0.4603156 (31) total: 368ms remaining: 655ms 32: learn: 0.4537209 test: 0.4572752 best: 0.4572752 (32) total: 379ms remaining: 643ms 33: learn: 
0.4507306 test: 0.4545876 best: 0.4545876 (33) total: 391ms remaining: 633ms 34: learn: 0.4462715 test: 0.4497591 best: 0.4497591 (34) total: 403ms remaining: 621ms 35: learn: 0.4434524 test: 0.4472329 best: 0.4472329 (35) total: 415ms remaining: 611ms 36: learn: 0.4407617 test: 0.4446772 best: 0.4446772 (36) total: 426ms remaining: 599ms 37: learn: 0.4384984 test: 0.4423125 best: 0.4423125 (37) total: 437ms remaining: 586ms 38: learn: 0.4361034 test: 0.4401893 best: 0.4401893 (38) total: 449ms remaining: 575ms 39: learn: 0.4339284 test: 0.4381809 best: 0.4381809 (39) total: 460ms remaining: 564ms 40: learn: 0.4287197 test: 0.4328489 best: 0.4328489 (40) total: 472ms remaining: 552ms 41: learn: 0.4269677 test: 0.4310390 best: 0.4310390 (41) total: 483ms remaining: 540ms 42: learn: 0.4249584 test: 0.4292442 best: 0.4292442 (42) total: 495ms remaining: 529ms 43: learn: 0.4209125 test: 0.4250223 best: 0.4250223 (43) total: 506ms remaining: 518ms 44: learn: 0.4192263 test: 0.4232878 best: 0.4232878 (44) total: 518ms remaining: 507ms 45: learn: 0.4153900 test: 0.4191894 best: 0.4191894 (45) total: 530ms remaining: 495ms 46: learn: 0.4119547 test: 0.4155307 best: 0.4155307 (46) total: 542ms remaining: 484ms 47: learn: 0.4078800 test: 0.4113685 best: 0.4113685 (47) total: 554ms remaining: 473ms 48: learn: 0.4058302 test: 0.4095511 best: 0.4095511 (48) total: 565ms remaining: 461ms 49: learn: 0.4027063 test: 0.4062083 best: 0.4062083 (49) total: 577ms remaining: 450ms 50: learn: 0.3998113 test: 0.4030418 best: 0.4030418 (50) total: 588ms remaining: 438ms 51: learn: 0.3978519 test: 0.4013073 best: 0.4013073 (51) total: 596ms remaining: 424ms 52: learn: 0.3960035 test: 0.3994311 best: 0.3994311 (52) total: 608ms remaining: 413ms 53: learn: 0.3941972 test: 0.3978412 best: 0.3978412 (53) total: 620ms remaining: 402ms 54: learn: 0.3925032 test: 0.3963403 best: 0.3963403 (54) total: 632ms remaining: 391ms 55: learn: 0.3908617 test: 0.3946364 best: 0.3946364 (55) total: 644ms 
remaining: 379ms 56: learn: 0.3892885 test: 0.3932649 best: 0.3932649 (56) total: 655ms remaining: 368ms 57: learn: 0.3877547 test: 0.3915735 best: 0.3915735 (57) total: 667ms remaining: 356ms 58: learn: 0.3851024 test: 0.3886709 best: 0.3886709 (58) total: 679ms remaining: 345ms 59: learn: 0.3836243 test: 0.3873574 best: 0.3873574 (59) total: 690ms remaining: 334ms 60: learn: 0.3803349 test: 0.3839907 best: 0.3839907 (60) total: 702ms remaining: 322ms 61: learn: 0.3789155 test: 0.3825036 best: 0.3825036 (61) total: 714ms remaining: 311ms 62: learn: 0.3760417 test: 0.3795669 best: 0.3795669 (62) total: 726ms remaining: 300ms 63: learn: 0.3737891 test: 0.3772037 best: 0.3772037 (63) total: 737ms remaining: 288ms 64: learn: 0.3722666 test: 0.3755400 best: 0.3755400 (64) total: 749ms remaining: 276ms 65: learn: 0.3700164 test: 0.3730660 best: 0.3730660 (65) total: 761ms remaining: 265ms 66: learn: 0.3685211 test: 0.3717506 best: 0.3717506 (66) total: 772ms remaining: 254ms 67: learn: 0.3664502 test: 0.3694681 best: 0.3694681 (67) total: 784ms remaining: 242ms 68: learn: 0.3650347 test: 0.3682277 best: 0.3682277 (68) total: 796ms remaining: 231ms 69: learn: 0.3631599 test: 0.3662155 best: 0.3662155 (69) total: 808ms remaining: 219ms 70: learn: 0.3612292 test: 0.3641875 best: 0.3641875 (70) total: 821ms remaining: 208ms 71: learn: 0.3598832 test: 0.3629563 best: 0.3629563 (71) total: 833ms remaining: 197ms 72: learn: 0.3586244 test: 0.3618821 best: 0.3618821 (72) total: 846ms remaining: 185ms 73: learn: 0.3573808 test: 0.3606519 best: 0.3606519 (73) total: 857ms remaining: 174ms 74: learn: 0.3563519 test: 0.3597646 best: 0.3597646 (74) total: 869ms remaining: 162ms 75: learn: 0.3541320 test: 0.3574927 best: 0.3574927 (75) total: 881ms remaining: 151ms 76: learn: 0.3521077 test: 0.3554229 best: 0.3554229 (76) total: 893ms remaining: 139ms 77: learn: 0.3503764 test: 0.3535625 best: 0.3535625 (77) total: 905ms remaining: 128ms 78: learn: 0.3487951 test: 0.3518504 best: 
0.3518504 (78) total: 917ms remaining: 116ms 79: learn: 0.3477533 test: 0.3509445 best: 0.3509445 (79) total: 925ms remaining: 104ms 80: learn: 0.3466714 test: 0.3497493 best: 0.3497493 (80) total: 936ms remaining: 92.5ms 81: learn: 0.3451689 test: 0.3481766 best: 0.3481766 (81) total: 948ms remaining: 81ms 82: learn: 0.3437607 test: 0.3466452 best: 0.3466452 (82) total: 960ms remaining: 69.4ms 83: learn: 0.3429326 test: 0.3456937 best: 0.3456937 (83) total: 972ms remaining: 57.9ms 84: learn: 0.3418527 test: 0.3447544 best: 0.3447544 (84) total: 984ms remaining: 46.3ms 85: learn: 0.3408911 test: 0.3439361 best: 0.3439361 (85) total: 997ms remaining: 34.8ms 86: learn: 0.3400288 test: 0.3432063 best: 0.3432063 (86) total: 1.01s remaining: 23.2ms 87: learn: 0.3394091 test: 0.3426958 best: 0.3426958 (87) total: 1.02s remaining: 11.6ms 88: learn: 0.3386074 test: 0.3419796 best: 0.3419796 (88) total: 1.03s remaining: 0us bestTest = 0.3419796246 bestIteration = 88 Trial 28, Fold 5: Log loss = 0.34193552435926494, Average precision = 0.9387729437504191, ROC-AUC = 0.94084387715289, Elapsed Time = 1.1335534999998345 seconds
Optimization Progress: 29% | 29/100 [57:21<4:07:29, 209.14s/it]
Trial 29, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0398 | Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0035
Trial 29, Fold 1: bestTest = 0.2425723814, bestIteration = 7
Trial 29, Fold 1: Log loss = 0.24257238140409293, Average precision = 0.9737298682606091, ROC-AUC = 0.9691606712694111, Elapsed Time = 0.9674114 seconds
Trial 29, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0236 | Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0689
Trial 29, Fold 2: bestTest = 0.2340581928, bestIteration = 7
Trial 29, Fold 2: Log loss = 0.23405819275102638, Average precision = 0.9738304205821529, ROC-AUC = 0.9710940954633178, Elapsed Time = 0.9764157 seconds
Trial 29, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.0346 | Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235
Trial 29, Fold 3: bestTest = 0.2257839883, bestIteration = 7
Trial 29, Fold 3: Log loss = 0.22578398830624874, Average precision = 0.9739366786524106, ROC-AUC = 0.9710549775596883, Elapsed Time = 1.025728 seconds
Trial 29, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0297 | Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0434
Trial 29, Fold 4: bestTest = 0.235760258, bestIteration = 7
Trial 29, Fold 4: Log loss = 0.23576025802358747, Average precision = 0.9738185492047103, ROC-AUC = 0.9691119832089863, Elapsed Time = 1.0255821 seconds
Trial 29, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0345 | Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0242
Trial 29, Fold 5: bestTest = 0.2336163004, bestIteration = 7
Trial 29, Fold 5: Log loss = 0.2336163004102549, Average precision = 0.9728322258126286, ROC-AUC = 0.9705157274771009, Elapsed Time = 0.9670232 seconds
Optimization Progress: 30%|### | 30/100 [57:34<2:55:18, 150.26s/it]
Trial 30, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 30, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[per-iteration learn/test log omitted] bestTest = 0.3295053929 bestIteration = 10
Trial 30, Fold 1: Log loss = 0.329641455503343, Average precision = 0.9564282766209009, ROC-AUC = 0.9532976626613711, Elapsed Time = 0.30890739999813377 seconds
Trial 30, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 30, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[per-iteration learn/test log omitted] bestTest = 0.3368271875 bestIteration = 10
Trial 30, Fold 2: Log loss = 0.33690868529929513, Average precision = 0.9503121000238468, ROC-AUC = 0.951458766636489, Elapsed Time = 0.31127980000019306 seconds
Trial 30, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 30, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[per-iteration learn/test log omitted] bestTest = 0.3418740508 bestIteration = 10
Trial 30, Fold 3: Log loss = 0.34216509956925406, Average precision = 0.9556468415148268, ROC-AUC = 0.955056774951604, Elapsed Time = 0.3175353000005998 seconds
Trial 30, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 30, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[per-iteration learn/test log omitted] bestTest = 0.351070582 bestIteration = 10
Trial 30, Fold 4: Log loss = 0.35116767590433345, Average precision = 0.9541383033016537, ROC-AUC = 0.9520921407469403, Elapsed Time = 0.318028099998628 seconds
Trial 30, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 30, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[per-iteration learn/test log omitted] bestTest = 0.3496953141 bestIteration = 10
Trial 30, Fold 5: Log loss = 0.34985093232434983, Average precision = 0.9551948787778671, ROC-AUC = 0.9522911766345243, Elapsed Time = 0.3170896999981778 seconds
Optimization Progress: 31%|###1 | 31/100 [57:43<2:04:04, 107.89s/it]
Trial 31, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 31, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[per-iteration learn/test log omitted] bestTest = 0.4816899151 bestIteration = 5
Trial 31, Fold 1: Log loss = 0.48227684860666004, Average precision = 0.9616832003120429, ROC-AUC = 0.9561859535041558, Elapsed Time = 0.5688093000026129 seconds
Trial 31, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 31, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[per-iteration learn/test log omitted] bestTest = 0.4821002784 bestIteration = 5
Trial 31, Fold 2: Log loss = 0.4822819670587704, Average precision = 0.9604624298469059, ROC-AUC = 0.9572299625395416, Elapsed Time = 0.5691735000000335 seconds
Trial 31, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 31, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[per-iteration learn/test log omitted] bestTest = 0.4780800369 bestIteration = 5
Trial 31, Fold 3: Log loss = 0.47858746848210454, Average precision = 0.9611131940341558, ROC-AUC = 0.9581101140314551, Elapsed Time = 0.5772212000010768 seconds
Trial 31, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 31, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[per-iteration learn/test log omitted] bestTest = 0.4915648485 bestIteration = 5
Trial 31, Fold 4: Log loss = 0.49187389418534044, Average precision = 0.9610626872962896, ROC-AUC = 0.9565157424694378, Elapsed Time = 0.5466653000003134 seconds
Trial 31, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 31, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[per-iteration learn/test log omitted] bestTest = 0.4822374051 bestIteration = 5
Trial 31, Fold 5: Log loss = 0.4826240433436574, Average precision = 0.9592506933182632, ROC-AUC = 0.9545434479683407, Elapsed Time = 0.5817145000000892 seconds
Optimization Progress: 32%|###2 | 32/100 [57:53<1:29:08, 78.65s/it]
Trial 32, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 32, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[per-iteration learn/test log omitted] bestTest = 0.4325028339 bestIteration = 57
Trial 32, Fold 1: Log loss = 0.43268292980258755, Average precision = 0.9349689977155016, ROC-AUC = 0.941189469489492, Elapsed Time = 1.4362436999981583 seconds
Trial 32, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 32, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[per-iteration learn/test log omitted] bestTest = 0.4332260341 bestIteration = 57
Trial 32, Fold 2: Log loss = 0.4333338618730923, Average precision = 0.9279569767242893, ROC-AUC = 0.9423153555708585, Elapsed Time = 1.5652832999985549 seconds
Trial 32, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 32, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[per-iteration learn/test log omitted] bestTest = 0.4291574479 bestIteration = 57
Trial 32, Fold 3: Log loss = 0.4294681622549487, Average precision = 0.9345971557209325, ROC-AUC = 0.9459223401488958, Elapsed Time = 1.5482555000016873 seconds
Trial 32, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 32, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[per-iteration learn/test log omitted] bestTest = 0.4304200261 bestIteration = 57
Trial 32, Fold 4: Log loss = 0.4305655394735344, Average precision = 0.9357193696992347, ROC-AUC = 0.9453403119046426, Elapsed Time = 1.5390923000013572 seconds
Trial 32, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 
1.0344827586206897 Trial 32, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0: learn: 0.6843527 test: 0.6847811 best: 0.6847811 (0) total: 19.5ms remaining: 1.11s 1: learn: 0.6760458 test: 0.6768900 best: 0.6768900 (1) total: 42.6ms remaining: 1.19s 2: learn: 0.6682270 test: 0.6694869 best: 0.6694869 (2) total: 62.4ms remaining: 1.14s 3: learn: 0.6585019 test: 0.6596625 best: 0.6596625 (3) total: 85.2ms remaining: 1.15s 4: learn: 0.6512208 test: 0.6527618 best: 0.6527618 (4) total: 108ms remaining: 1.15s 5: learn: 0.6421696 test: 0.6436132 best: 0.6436132 (5) total: 131ms remaining: 1.13s 6: learn: 0.6336522 test: 0.6350059 best: 0.6350059 (6) total: 151ms remaining: 1.1s 7: learn: 0.6270220 test: 0.6288220 best: 0.6288220 (7) total: 174ms remaining: 1.08s 8: learn: 0.6206965 test: 0.6228467 best: 0.6228467 (8) total: 196ms remaining: 1.06s 9: learn: 0.6128274 test: 0.6148822 best: 0.6148822 (9) total: 218ms remaining: 1.05s 10: learn: 0.6068749 test: 0.6092234 best: 0.6092234 (10) total: 240ms remaining: 1.03s 11: learn: 0.6012647 test: 0.6038979 best: 0.6038979 (11) total: 263ms remaining: 1.01s 12: learn: 0.5939859 test: 0.5965231 best: 0.5965231 (12) total: 286ms remaining: 991ms 13: learn: 0.5871517 test: 0.5896000 best: 0.5896000 (13) total: 308ms remaining: 969ms 14: learn: 0.5820571 test: 0.5848775 best: 0.5848775 (14) total: 332ms remaining: 952ms 15: learn: 0.5772328 test: 0.5803376 best: 0.5803376 (15) total: 355ms remaining: 931ms 16: learn: 0.5726959 test: 0.5761518 best: 0.5761518 (16) total: 377ms remaining: 910ms 17: learn: 0.5683988 test: 0.5721225 best: 0.5721225 (17) total: 401ms remaining: 890ms 18: learn: 0.5643666 test: 0.5684151 best: 0.5684151 (18) total: 423ms remaining: 868ms 19: learn: 0.5602549 test: 0.5641078 best: 0.5641078 (19) total: 446ms remaining: 847ms 20: learn: 0.5563670 test: 0.5600299 best: 0.5600299 (20) total: 469ms remaining: 826ms 21: learn: 0.5525367 test: 0.5564495 best: 0.5564495 (21) 
total: 493ms remaining: 806ms 22: learn: 0.5489057 test: 0.5526162 best: 0.5526162 (22) total: 517ms remaining: 786ms 23: learn: 0.5431979 test: 0.5468095 best: 0.5468095 (23) total: 540ms remaining: 765ms 24: learn: 0.5378421 test: 0.5413611 best: 0.5413611 (24) total: 563ms remaining: 743ms 25: learn: 0.5328148 test: 0.5362472 best: 0.5362472 (25) total: 586ms remaining: 722ms 26: learn: 0.5293101 test: 0.5330328 best: 0.5330328 (26) total: 610ms remaining: 700ms 27: learn: 0.5262727 test: 0.5298301 best: 0.5298301 (27) total: 634ms remaining: 679ms 28: learn: 0.5216112 test: 0.5248206 best: 0.5248206 (28) total: 659ms remaining: 659ms 29: learn: 0.5171844 test: 0.5201424 best: 0.5201424 (29) total: 682ms remaining: 637ms 30: learn: 0.5127212 test: 0.5156063 best: 0.5156063 (30) total: 706ms remaining: 615ms 31: learn: 0.5085341 test: 0.5113515 best: 0.5113515 (31) total: 733ms remaining: 595ms 32: learn: 0.5045987 test: 0.5073533 best: 0.5073533 (32) total: 758ms remaining: 574ms 33: learn: 0.5012114 test: 0.5041531 best: 0.5041531 (33) total: 783ms remaining: 553ms 34: learn: 0.4975191 test: 0.5004001 best: 0.5004001 (34) total: 808ms remaining: 531ms 35: learn: 0.4935526 test: 0.4961316 best: 0.4961316 (35) total: 833ms remaining: 509ms 36: learn: 0.4898129 test: 0.4921742 best: 0.4921742 (36) total: 857ms remaining: 486ms 37: learn: 0.4865858 test: 0.4892081 best: 0.4892081 (37) total: 881ms remaining: 464ms 38: learn: 0.4835404 test: 0.4864028 best: 0.4864028 (38) total: 907ms remaining: 442ms 39: learn: 0.4799450 test: 0.4825999 best: 0.4825999 (39) total: 932ms remaining: 419ms 40: learn: 0.4765879 test: 0.4790431 best: 0.4790431 (40) total: 957ms remaining: 397ms 41: learn: 0.4736306 test: 0.4763206 best: 0.4763206 (41) total: 982ms remaining: 374ms 42: learn: 0.4708024 test: 0.4736571 best: 0.4736571 (42) total: 1.01s remaining: 351ms 43: learn: 0.4675844 test: 0.4702455 best: 0.4702455 (43) total: 1.03s remaining: 328ms 44: learn: 0.4645573 test: 
0.4670378 best: 0.4670378 (44) total: 1.06s remaining: 305ms 45: learn: 0.4618394 test: 0.4645430 best: 0.4645430 (45) total: 1.08s remaining: 282ms 46: learn: 0.4592731 test: 0.4621938 best: 0.4621938 (46) total: 1.11s remaining: 259ms 47: learn: 0.4560861 test: 0.4589501 best: 0.4589501 (47) total: 1.13s remaining: 236ms 48: learn: 0.4530837 test: 0.4558950 best: 0.4558950 (48) total: 1.16s remaining: 213ms 49: learn: 0.4502697 test: 0.4528497 best: 0.4528497 (49) total: 1.18s remaining: 189ms 50: learn: 0.4478148 test: 0.4506010 best: 0.4506010 (50) total: 1.21s remaining: 166ms 51: learn: 0.4450332 test: 0.4477705 best: 0.4477705 (51) total: 1.24s remaining: 143ms 52: learn: 0.4427244 test: 0.4456590 best: 0.4456590 (52) total: 1.26s remaining: 119ms 53: learn: 0.4405074 test: 0.4435787 best: 0.4435787 (53) total: 1.28s remaining: 95.2ms 54: learn: 0.4381651 test: 0.4411449 best: 0.4411449 (54) total: 1.31s remaining: 71.5ms 55: learn: 0.4354886 test: 0.4382512 best: 0.4382512 (55) total: 1.34s remaining: 47.8ms 56: learn: 0.4333855 test: 0.4363343 best: 0.4363343 (56) total: 1.36s remaining: 23.9ms 57: learn: 0.4308619 test: 0.4335984 best: 0.4335984 (57) total: 1.39s remaining: 0us bestTest = 0.4335983829 bestIteration = 57 Trial 32, Fold 5: Log loss = 0.43367785706258694, Average precision = 0.9328699446903309, ROC-AUC = 0.9420889954108838, Elapsed Time = 1.5051772000006167 seconds
Optimization Progress: 33%|###3 | 33/100 [58:09<1:06:41, 59.73s/it]
Trial 33, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 33, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[per-iteration CatBoost training log elided (iterations 0–46)]
bestTest = 0.3561272356, bestIteration = 46
Trial 33, Fold 1: Log loss = 0.35612723564515086, Average precision = 0.972196983245255, ROC-AUC = 0.9678050602706202, Elapsed Time = 22.13490599999932 seconds
Trial 33, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 33, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[per-iteration CatBoost training log elided (iterations 0–46)]
bestTest = 0.355037595, bestIteration = 46
Trial 33, Fold 2: Log loss = 0.3550375950367974, Average precision = 0.9729696541336975, ROC-AUC = 0.9706665460777597, Elapsed Time = 22.946749499998987 seconds
Trial 33, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 33, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[per-iteration CatBoost training log elided (iterations 0–46)]
bestTest = 0.3526760709, bestIteration = 46
Trial 33, Fold 3: Log loss = 0.3526760709219909, Average precision = 0.972012422554811, ROC-AUC = 0.9703599629445927, Elapsed Time = 22.334626600000774 seconds
Trial 33, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 33, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[per-iteration CatBoost training log elided (iterations 0–46)]
bestTest = 0.3558125306, bestIteration = 46
Trial 33, Fold 4: Log loss = 0.35581253057496137, Average precision = 0.9744075377306871, ROC-AUC = 0.9702797478963544, Elapsed Time = 20.999208799999906 seconds
Trial 33, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 33, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[per-iteration CatBoost training log elided (iterations 0–46)]
bestTest = 0.3575087443, bestIteration = 46
Trial 33, Fold 5: Log loss = 0.35750874426837875, Average precision = 0.96991020543768, ROC-AUC = 0.9683131003474352, Elapsed Time = 21.928632199997082 seconds
Optimization Progress: 34%|###4 | 34/100 [1:00:07<1:24:54, 77.19s/it]
Trial 34, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 34, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[per-iteration CatBoost training log elided (iterations 0–15)]
Training has stopped (degenerate solution on iteration 16, probably too small l2-regularization, try to increase it)
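CatBoost's message above recommends increasing `l2_leaf_reg` when a fit collapses to a degenerate solution. One defensive pattern is to retry the fold with a stronger L2 penalty before recording the trial; `bump_l2` below is a hypothetical helper sketched for illustration, not code from this notebook (CatBoost's default `l2_leaf_reg` of 3.0 is used when the key is absent):

```python
def bump_l2(params, factor=10.0, ceiling=30.0):
    """Return a copy of CatBoost-style params with l2_leaf_reg raised.

    Hypothetical retry helper: if training reports a degenerate solution,
    multiply l2_leaf_reg by `factor`, capped at `ceiling`, and refit.
    """
    out = dict(params)  # leave the caller's params untouched
    out["l2_leaf_reg"] = min(out.get("l2_leaf_reg", 3.0) * factor, ceiling)
    return out

print(bump_l2({"l2_leaf_reg": 0.1}))  # weak penalty gets multiplied up
print(bump_l2({"l2_leaf_reg": 5.0}))  # already strong: capped at the ceiling
```

An equivalent fix inside an Optuna search space is simply to raise the lower bound of the `l2_leaf_reg` suggestion range so degenerate values are never sampled.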
bestTest = 0.2293057997, bestIteration = 15 (model shrunk to first 16 iterations)
Trial 34, Fold 1: Log loss = 0.22953166751640147, Average precision = 0.973807404451311, ROC-AUC = 0.9697118923349727, Elapsed Time = 3.7016322999988915 seconds
Trial 34, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 34, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[per-iteration CatBoost training log elided (iterations 0–23)]
bestTest = 0.2005385641, bestIteration = 23
Trial 34, Fold 2: Log loss = 0.2004136860628991, Average precision = 0.9754332516152835, ROC-AUC = 0.9728828224845302, Elapsed Time = 5.483135800001037 seconds
Trial 34, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 34, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[per-iteration CatBoost training log elided (iterations 0–23)]
bestTest = 0.2031111368, bestIteration = 23
Trial 34, Fold 3: Log loss = 0.2032233239940844, Average precision = 0.9755286697322263, ROC-AUC = 0.9732541642187785, Elapsed Time = 6.327928200000315 seconds
Trial 34, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 34, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[per-iteration CatBoost training log truncated]
0.3237156 best: 0.3237156 (7) total: 1.92s remaining: 3.84s 8: learn: 0.2898432 test: 0.3086806 best: 0.3086806 (8) total: 2.18s remaining: 3.63s 9: learn: 0.2716349 test: 0.2934197 best: 0.2934197 (9) total: 2.43s remaining: 3.4s 10: learn: 0.2590062 test: 0.2822746 best: 0.2822746 (10) total: 2.66s remaining: 3.14s 11: learn: 0.2468990 test: 0.2719219 best: 0.2719219 (11) total: 2.94s remaining: 2.94s 12: learn: 0.2343369 test: 0.2614777 best: 0.2614777 (12) total: 3.18s remaining: 2.69s 13: learn: 0.2254935 test: 0.2550927 best: 0.2550927 (13) total: 3.42s remaining: 2.44s 14: learn: 0.2161107 test: 0.2475053 best: 0.2475053 (14) total: 3.65s remaining: 2.19s 15: learn: 0.2084657 test: 0.2414135 best: 0.2414135 (15) total: 3.88s remaining: 1.94s 16: learn: 0.2007506 test: 0.2353869 best: 0.2353869 (16) total: 4.13s remaining: 1.7s 17: learn: 0.1950521 test: 0.2311037 best: 0.2311037 (17) total: 4.38s remaining: 1.46s 18: learn: 0.1881732 test: 0.2260424 best: 0.2260424 (18) total: 4.61s remaining: 1.21s 19: learn: 0.1828753 test: 0.2217793 best: 0.2217793 (19) total: 4.86s remaining: 971ms 20: learn: 0.1881780 test: 0.2176608 best: 0.2176608 (20) total: 5.07s remaining: 725ms 21: learn: 0.4569488 test: 0.7577585 best: 0.2176608 (20) total: 5.27s remaining: 479ms 22: learn: 0.7789546 test: 1.5201378 best: 0.2176608 (20) total: 5.47s remaining: 238ms
Training has stopped (degenerate solution on iteration 23, probably too small l2-regularization, try to increase it)
bestTest = 0.217660783 bestIteration = 20 Shrink model to first 21 iterations. Trial 34, Fold 4: Log loss = 0.2176669441894009, Average precision = 0.9709851008514145, ROC-AUC = 0.9698864693090696, Elapsed Time = 5.810770200001571 seconds Trial 34, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 34, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0: learn: 0.5944791 test: 0.5979486 best: 0.5979486 (0) total: 208ms remaining: 4.79s 1: learn: 0.5221860 test: 0.5329186 best: 0.5329186 (1) total: 382ms remaining: 4.2s 2: learn: 0.4600844 test: 0.4730636 best: 0.4730636 (2) total: 568ms remaining: 3.98s 3: learn: 0.4157845 test: 0.4308862 best: 0.4308862 (3) total: 757ms remaining: 3.79s 4: learn: 0.3803584 test: 0.3983018 best: 0.3983018 (4) total: 990ms remaining: 3.76s 5: learn: 0.3412906 test: 0.3637792 best: 0.3637792 (5) total: 1.2s remaining: 3.6s 6: learn: 0.3103205 test: 0.3349416 best: 0.3349416 (6) total: 1.43s remaining: 3.48s 7: learn: 0.2858679 test: 0.3131531 best: 0.3131531 (7) total: 1.64s remaining: 3.28s 8: learn: 0.2637704 test: 0.2933788 best: 0.2933788 (8) total: 1.86s remaining: 3.1s 9: learn: 0.2473231 test: 0.2809982 best: 0.2809982 (9) total: 2.08s remaining: 2.92s 10: learn: 0.2318633 test: 0.2682475 best: 0.2682475 (10) total: 2.29s remaining: 2.71s 11: learn: 0.2197833 test: 0.2579069 best: 0.2579069 (11) total: 2.53s remaining: 2.53s 12: learn: 0.2097791 test: 0.2491652 best: 0.2491652 (12) total: 2.74s remaining: 2.31s 13: learn: 0.2012860 test: 0.2434989 best: 0.2434989 (13) total: 2.95s remaining: 2.11s 14: learn: 0.1936671 test: 0.2376607 best: 0.2376607 (14) total: 3.19s remaining: 1.91s 15: learn: 0.2042173 test: 0.4375058 best: 0.2376607 (14) total: 3.42s remaining: 1.71s 16: learn: 0.2148160 test: 0.4328437 best: 0.2376607 (14) total: 3.63s remaining: 1.49s 17: learn: 0.2098738 test: 0.4291165 best: 0.2376607 (14) total: 3.84s remaining: 1.28s 18: learn: 
0.1980945 test: 0.4054428 best: 0.2376607 (14) total: 4.07s remaining: 1.07s 19: learn: 0.1917563 test: 0.4027966 best: 0.2376607 (14) total: 4.26s remaining: 853ms 20: learn: 0.1873851 test: 0.4008291 best: 0.2376607 (14) total: 4.5s remaining: 643ms 21: learn: 0.1836871 test: 0.3986377 best: 0.2376607 (14) total: 4.73s remaining: 430ms 22: learn: 0.2022435 test: 0.8507112 best: 0.2376607 (14) total: 4.95s remaining: 215ms 23: learn: 0.8600333 test: 1.4181642 best: 0.2376607 (14) total: 5.17s remaining: 0us bestTest = 0.237660695 bestIteration = 14 Shrink model to first 15 iterations. Trial 34, Fold 5: Log loss = 0.23766411327831838, Average precision = 0.9719393609186869, ROC-AUC = 0.9676541998773759, Elapsed Time = 5.284863400000177 seconds
Optimization Progress: 35%|###5 | 35/100 [1:00:41<1:09:45, 64.39s/it]
[Per-iteration learn/test loss output for Trial 35 omitted; fold-level summaries follow.]

Trial 35, Fold 1: Train size = 20663 (0 = 10533, 1 = 10130, 0/1 = 1.0398); Validation size = 5175 (0 = 2592, 1 = 2583, 0/1 = 1.0035)
Trial 35, Fold 1: bestTest = 0.2105 at iteration 33; model shrunk to first 34 iterations
Trial 35, Fold 1: Log loss = 0.2103, Average precision = 0.9689, ROC-AUC = 0.9657, Elapsed Time = 3.46 s

Trial 35, Fold 2: Train size = 20701 (0 = 10471, 1 = 10230, 0/1 = 1.0236); Validation size = 5137 (0 = 2654, 1 = 2483, 0/1 = 1.0689)
Trial 35, Fold 2: bestTest = 0.2002 at iteration 35; model shrunk to first 36 iterations
Trial 35, Fold 2: Log loss = 0.1999, Average precision = 0.9715, ROC-AUC = 0.9678, Elapsed Time = 3.47 s

Trial 35, Fold 3: Train size = 20682 (0 = 10517, 1 = 10165, 0/1 = 1.0346); Validation size = 5156 (0 = 2608, 1 = 2548, 0/1 = 1.0235)
Trial 35, Fold 3: bestTest = 0.2041 at iteration 33; model shrunk to first 34 iterations
Trial 35, Fold 3: Log loss = 0.2039, Average precision = 0.9718, ROC-AUC = 0.9682, Elapsed Time = 3.35 s

Trial 35, Fold 4: Train size = 20656 (0 = 10479, 1 = 10177, 0/1 = 1.0297); Validation size = 5182 (0 = 2646, 1 = 2536, 0/1 = 1.0434)
Trial 35, Fold 4: bestTest = 0.2033 at iteration 33; model shrunk to first 34 iterations
Trial 35, Fold 4: Log loss = 0.2031, Average precision = 0.9728, ROC-AUC = 0.9681, Elapsed Time = 3.42 s

Trial 35, Fold 5: Train size = 20650 (0 = 10500, 1 = 10150, 0/1 = 1.0345); Validation size = 5188 (0 = 2625, 1 = 2563, 0/1 = 1.0242)
Trial 35, Fold 5: bestTest = 0.2136 at iteration 24; model shrunk to first 25 iterations
Trial 35, Fold 5: Log loss = 0.2133, Average precision = 0.9702, ROC-AUC = 0.9649, Elapsed Time = 3.32 s
Optimization Progress: 36%|###6 | 36/100 [1:01:06<55:57, 52.46s/it]
[Per-iteration learn/test loss output for Trial 36 omitted; fold-level summaries follow.]

Trial 36, Fold 1: Train size = 20663 (0 = 10533, 1 = 10130, 0/1 = 1.0398); Validation size = 5175 (0 = 2592, 1 = 2583, 0/1 = 1.0035)
Trial 36, Fold 1: bestTest = 0.2472 at iteration 30 (final iteration)
Trial 36, Fold 1: Log loss = 0.2472, Average precision = 0.9652, ROC-AUC = 0.9605, Elapsed Time = 1.16 s

Trial 36, Fold 2: Train size = 20701 (0 = 10471, 1 = 10230, 0/1 = 1.0236); Validation size = 5137 (0 = 2654, 1 = 2483, 0/1 = 1.0689)
Trial 36, Fold 2: bestTest = 0.2455 at iteration 30 (final iteration)
Trial 36, Fold 2: Log loss = 0.2455, Average precision = 0.9657, ROC-AUC = 0.9631, Elapsed Time = 1.17 s

Trial 36, Fold 3: Train size = 20682 (0 = 10517, 1 = 10165, 0/1 = 1.0346); Validation size = 5156 (0 = 2608, 1 = 2548, 0/1 = 1.0235)
Trial 36, Fold 3: bestTest = 0.2429 at iteration 30 (final iteration)
Trial 36, Fold 3: Log loss = 0.2429, Average precision = 0.9659, ROC-AUC = 0.9632, Elapsed Time = 1.20 s

Trial 36, Fold 4: Train size = 20656 (0 = 10479, 1 = 10177, 0/1 = 1.0297); Validation size = 5182 (0 = 2646, 1 = 2536, 0/1 = 1.0434) [remaining Fold 4 training output truncated in the source]
803ms 6: learn: 0.4123956 test: 0.4117736 best: 0.4117736 (6) total: 226ms remaining: 776ms 7: learn: 0.3912863 test: 0.3906664 best: 0.3906664 (7) total: 262ms remaining: 754ms 8: learn: 0.3737151 test: 0.3733579 best: 0.3733579 (8) total: 298ms remaining: 728ms 9: learn: 0.3576726 test: 0.3574136 best: 0.3574136 (9) total: 332ms remaining: 697ms 10: learn: 0.3450129 test: 0.3448055 best: 0.3448055 (10) total: 368ms remaining: 670ms 11: learn: 0.3323277 test: 0.3321471 best: 0.3321471 (11) total: 403ms remaining: 639ms 12: learn: 0.3225684 test: 0.3223502 best: 0.3223502 (12) total: 439ms remaining: 608ms 13: learn: 0.3138580 test: 0.3135423 best: 0.3135423 (13) total: 474ms remaining: 575ms 14: learn: 0.3053655 test: 0.3049220 best: 0.3049220 (14) total: 510ms remaining: 544ms 15: learn: 0.2976588 test: 0.2973549 best: 0.2973549 (15) total: 544ms remaining: 510ms 16: learn: 0.2914504 test: 0.2912477 best: 0.2912477 (16) total: 579ms remaining: 477ms 17: learn: 0.2858844 test: 0.2857063 best: 0.2857063 (17) total: 613ms remaining: 443ms 18: learn: 0.2814922 test: 0.2815347 best: 0.2815347 (18) total: 650ms remaining: 411ms 19: learn: 0.2772586 test: 0.2774085 best: 0.2774085 (19) total: 685ms remaining: 377ms 20: learn: 0.2731711 test: 0.2736776 best: 0.2736776 (20) total: 723ms remaining: 344ms 21: learn: 0.2685351 test: 0.2691846 best: 0.2691846 (21) total: 761ms remaining: 311ms 22: learn: 0.2652421 test: 0.2660100 best: 0.2660100 (22) total: 796ms remaining: 277ms 23: learn: 0.2617761 test: 0.2625281 best: 0.2625281 (23) total: 833ms remaining: 243ms 24: learn: 0.2582636 test: 0.2589763 best: 0.2589763 (24) total: 870ms remaining: 209ms 25: learn: 0.2554489 test: 0.2561395 best: 0.2561395 (25) total: 906ms remaining: 174ms 26: learn: 0.2531507 test: 0.2536980 best: 0.2536980 (26) total: 942ms remaining: 139ms 27: learn: 0.2507715 test: 0.2512219 best: 0.2512219 (27) total: 978ms remaining: 105ms 28: learn: 0.2487733 test: 0.2492508 best: 0.2492508 (28) total: 
1.01s remaining: 69.8ms 29: learn: 0.2467194 test: 0.2472228 best: 0.2472228 (29) total: 1.05s remaining: 35.1ms 30: learn: 0.2448059 test: 0.2452990 best: 0.2452990 (30) total: 1.09s remaining: 0us bestTest = 0.2452989512 bestIteration = 30 Trial 36, Fold 4: Log loss = 0.24529895123633147, Average precision = 0.966793696239375, ROC-AUC = 0.9628482132425351, Elapsed Time = 1.1857075000007171 seconds Trial 36, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 36, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0: learn: 0.6321553 test: 0.6334853 best: 0.6334853 (0) total: 32.4ms remaining: 973ms 1: learn: 0.5795485 test: 0.5816992 best: 0.5816992 (1) total: 64.3ms remaining: 932ms 2: learn: 0.5354520 test: 0.5381245 best: 0.5381245 (2) total: 97.8ms remaining: 912ms 3: learn: 0.4965649 test: 0.4999415 best: 0.4999415 (3) total: 131ms remaining: 881ms 4: learn: 0.4629835 test: 0.4668330 best: 0.4668330 (4) total: 164ms remaining: 855ms 5: learn: 0.4341808 test: 0.4385440 best: 0.4385440 (5) total: 199ms remaining: 827ms 6: learn: 0.4098538 test: 0.4146087 best: 0.4146087 (6) total: 231ms remaining: 794ms 7: learn: 0.3885036 test: 0.3937268 best: 0.3937268 (7) total: 267ms remaining: 769ms 8: learn: 0.3702614 test: 0.3758098 best: 0.3758098 (8) total: 302ms remaining: 738ms 9: learn: 0.3552084 test: 0.3610865 best: 0.3610865 (9) total: 339ms remaining: 713ms 10: learn: 0.3412463 test: 0.3474442 best: 0.3474442 (10) total: 376ms remaining: 683ms 11: learn: 0.3296944 test: 0.3362474 best: 0.3362474 (11) total: 411ms remaining: 650ms 12: learn: 0.3197591 test: 0.3266751 best: 0.3266751 (12) total: 448ms remaining: 620ms 13: learn: 0.3103504 test: 0.3176120 best: 0.3176120 (13) total: 485ms remaining: 588ms 14: learn: 0.3019616 test: 0.3094649 best: 0.3094649 (14) total: 520ms remaining: 555ms 15: learn: 0.2950881 test: 0.3030200 best: 0.3030200 (15) total: 557ms remaining: 522ms 16: learn: 0.2889452 
test: 0.2971056 best: 0.2971056 (16) total: 593ms remaining: 488ms 17: learn: 0.2834440 test: 0.2918088 best: 0.2918088 (17) total: 632ms remaining: 457ms 18: learn: 0.2776284 test: 0.2863769 best: 0.2863769 (18) total: 668ms remaining: 422ms 19: learn: 0.2735780 test: 0.2825726 best: 0.2825726 (19) total: 706ms remaining: 388ms 20: learn: 0.2697440 test: 0.2787300 best: 0.2787300 (20) total: 742ms remaining: 353ms 21: learn: 0.2661791 test: 0.2754238 best: 0.2754238 (21) total: 779ms remaining: 319ms 22: learn: 0.2627995 test: 0.2721724 best: 0.2721724 (22) total: 817ms remaining: 284ms 23: learn: 0.2585694 test: 0.2684057 best: 0.2684057 (23) total: 853ms remaining: 249ms 24: learn: 0.2550688 test: 0.2652280 best: 0.2652280 (24) total: 890ms remaining: 214ms 25: learn: 0.2517389 test: 0.2623029 best: 0.2623029 (25) total: 925ms remaining: 178ms 26: learn: 0.2491205 test: 0.2598389 best: 0.2598389 (26) total: 963ms remaining: 143ms 27: learn: 0.2469576 test: 0.2578386 best: 0.2578386 (27) total: 998ms remaining: 107ms 28: learn: 0.2448775 test: 0.2558177 best: 0.2558177 (28) total: 1.03s remaining: 71.2ms 29: learn: 0.2430578 test: 0.2539699 best: 0.2539699 (29) total: 1.07s remaining: 35.7ms 30: learn: 0.2414040 test: 0.2523786 best: 0.2523786 (30) total: 1.1s remaining: 0us bestTest = 0.252378634 bestIteration = 30 Trial 36, Fold 5: Log loss = 0.2523786340410989, Average precision = 0.9619196767010048, ROC-AUC = 0.9600014863534178, Elapsed Time = 1.2047831999989285 seconds
Optimization Progress: 37%|###7 | 37/100 [1:01:19<42:49, 40.79s/it]
Trial 37, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371 Trial 37, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913 0: learn: 0.6695437 test: 0.6695171 best: 0.6695171 (0) total: 588ms remaining: 53.5s 1: learn: 0.6472349 test: 0.6472905 best: 0.6472905 (1) total: 1.17s remaining: 52.6s 2: learn: 0.6259377 test: 0.6260614 best: 0.6260614 (2) total: 1.68s remaining: 49.8s 3: learn: 0.6075096 test: 0.6077917 best: 0.6077917 (3) total: 2.25s remaining: 49.5s 4: learn: 0.5904155 test: 0.5908382 best: 0.5908382 (4) total: 2.74s remaining: 47.7s 5: learn: 0.5738991 test: 0.5744462 best: 0.5744462 (5) total: 3.17s remaining: 45.4s 6: learn: 0.5569791 test: 0.5577504 best: 0.5577504 (6) total: 3.69s remaining: 44.8s 7: learn: 0.5423353 test: 0.5433830 best: 0.5433830 (7) total: 4.25s remaining: 44.7s 8: learn: 0.5262154 test: 0.5275270 best: 0.5275270 (8) total: 4.81s remaining: 44.3s 9: learn: 0.5117036 test: 0.5131838 best: 0.5131838 (9) total: 5.25s remaining: 43s 10: learn: 0.4989859 test: 0.5005169 best: 0.5005169 (10) total: 5.69s remaining: 41.9s 11: learn: 0.4854402 test: 0.4871607 best: 0.4871607 (11) total: 6.23s remaining: 41.6s 12: learn: 0.4740400 test: 0.4759165 best: 0.4759165 (12) total: 6.71s remaining: 40.8s 13: learn: 0.4632541 test: 0.4653485 best: 0.4653485 (13) total: 7.12s remaining: 39.6s 14: learn: 0.4524653 test: 0.4547578 best: 0.4547578 (14) total: 7.65s remaining: 39.3s 15: learn: 0.4417422 test: 0.4442784 best: 0.4442784 (15) total: 8.23s remaining: 39.1s 16: learn: 0.4315316 test: 0.4342770 best: 0.4342770 (16) total: 8.66s remaining: 38.2s 17: learn: 0.4214574 test: 0.4244362 best: 0.4244362 (17) total: 9.12s remaining: 37.5s 18: learn: 0.4125044 test: 0.4157806 best: 0.4157806 (18) total: 9.69s remaining: 37.2s 19: learn: 0.4044765 test: 0.4079692 best: 0.4079692 (19) total: 10.1s remaining: 36.4s 20: learn: 0.3967246 test: 0.4004808 best: 0.4004808 (20) total: 10.5s 
remaining: 35.5s 21: learn: 0.3877912 test: 0.3917711 best: 0.3917711 (21) total: 11s remaining: 35s 22: learn: 0.3802419 test: 0.3845747 best: 0.3845747 (22) total: 11.5s remaining: 34.4s 23: learn: 0.3733902 test: 0.3780494 best: 0.3780494 (23) total: 12s remaining: 33.9s 24: learn: 0.3666593 test: 0.3716204 best: 0.3716204 (24) total: 12.5s remaining: 33.5s 25: learn: 0.3599880 test: 0.3651059 best: 0.3651059 (25) total: 13s remaining: 33s 26: learn: 0.3537896 test: 0.3591613 best: 0.3591613 (26) total: 13.5s remaining: 32.6s 27: learn: 0.3476016 test: 0.3532154 best: 0.3532154 (27) total: 14s remaining: 32.1s 28: learn: 0.3422871 test: 0.3482156 best: 0.3482156 (28) total: 14.5s remaining: 31.6s 29: learn: 0.3360097 test: 0.3421813 best: 0.3421813 (29) total: 14.9s remaining: 30.9s 30: learn: 0.3297487 test: 0.3361878 best: 0.3361878 (30) total: 15.5s remaining: 30.4s 31: learn: 0.3243660 test: 0.3310715 best: 0.3310715 (31) total: 15.9s remaining: 29.9s 32: learn: 0.3195116 test: 0.3264996 best: 0.3264996 (32) total: 16.4s remaining: 29.4s 33: learn: 0.3147630 test: 0.3219540 best: 0.3219540 (33) total: 16.9s remaining: 28.8s 34: learn: 0.3106732 test: 0.3179714 best: 0.3179714 (34) total: 17.3s remaining: 28.1s 35: learn: 0.3057769 test: 0.3133218 best: 0.3133218 (35) total: 17.8s remaining: 27.6s 36: learn: 0.3013094 test: 0.3091937 best: 0.3091937 (36) total: 18.3s remaining: 27.2s 37: learn: 0.2975878 test: 0.3056309 best: 0.3056309 (37) total: 18.8s remaining: 26.7s 38: learn: 0.2940463 test: 0.3022229 best: 0.3022229 (38) total: 19.2s remaining: 26.1s 39: learn: 0.2894918 test: 0.2980791 best: 0.2980791 (39) total: 19.7s remaining: 25.6s 40: learn: 0.2856422 test: 0.2944623 best: 0.2944623 (40) total: 20.1s remaining: 25s 41: learn: 0.2817687 test: 0.2908263 best: 0.2908263 (41) total: 20.8s remaining: 24.8s 42: learn: 0.2782495 test: 0.2875382 best: 0.2875382 (42) total: 21.3s remaining: 24.2s 43: learn: 0.2746911 test: 0.2841901 best: 0.2841901 (43) 
total: 21.8s remaining: 23.8s 44: learn: 0.2717135 test: 0.2815465 best: 0.2815465 (44) total: 22.3s remaining: 23.2s 45: learn: 0.2684809 test: 0.2785489 best: 0.2785489 (45) total: 22.8s remaining: 22.8s 46: learn: 0.2645847 test: 0.2749499 best: 0.2749499 (46) total: 23.2s remaining: 22.3s 47: learn: 0.2613533 test: 0.2720557 best: 0.2720557 (47) total: 23.7s remaining: 21.8s 48: learn: 0.2585026 test: 0.2694449 best: 0.2694449 (48) total: 24.2s remaining: 21.3s 49: learn: 0.2557744 test: 0.2669881 best: 0.2669881 (49) total: 24.6s remaining: 20.7s 50: learn: 0.2532376 test: 0.2646392 best: 0.2646392 (50) total: 25.1s remaining: 20.2s 51: learn: 0.2505951 test: 0.2623024 best: 0.2623024 (51) total: 25.6s remaining: 19.7s 52: learn: 0.2482390 test: 0.2601189 best: 0.2601189 (52) total: 26.1s remaining: 19.2s 53: learn: 0.2460733 test: 0.2582967 best: 0.2582967 (53) total: 26.6s remaining: 18.7s 54: learn: 0.2441195 test: 0.2565438 best: 0.2565438 (54) total: 27s remaining: 18.2s 55: learn: 0.2416335 test: 0.2544060 best: 0.2544060 (55) total: 27.6s remaining: 17.7s 56: learn: 0.2389513 test: 0.2520251 best: 0.2520251 (56) total: 28s remaining: 17.2s 57: learn: 0.2369371 test: 0.2502181 best: 0.2502181 (57) total: 28.4s remaining: 16.6s 58: learn: 0.2351696 test: 0.2487436 best: 0.2487436 (58) total: 28.8s remaining: 16.1s 59: learn: 0.2331934 test: 0.2470222 best: 0.2470222 (59) total: 29.3s remaining: 15.6s 60: learn: 0.2312634 test: 0.2453219 best: 0.2453219 (60) total: 29.8s remaining: 15.1s 61: learn: 0.2296976 test: 0.2439229 best: 0.2439229 (61) total: 30.1s remaining: 14.6s 62: learn: 0.2279162 test: 0.2424165 best: 0.2424165 (62) total: 30.6s remaining: 14.1s 63: learn: 0.2264687 test: 0.2412693 best: 0.2412693 (63) total: 31s remaining: 13.6s 64: learn: 0.2247382 test: 0.2398480 best: 0.2398480 (64) total: 31.5s remaining: 13.1s 65: learn: 0.2229521 test: 0.2384014 best: 0.2384014 (65) total: 32s remaining: 12.6s 66: learn: 0.2212590 test: 0.2371046 
best: 0.2371046 (66) total: 32.5s remaining: 12.1s 67: learn: 0.2198720 test: 0.2359322 best: 0.2359322 (67) total: 32.9s remaining: 11.6s 68: learn: 0.2185427 test: 0.2349124 best: 0.2349124 (68) total: 33.4s remaining: 11.1s 69: learn: 0.2168830 test: 0.2335695 best: 0.2335695 (69) total: 33.9s remaining: 10.7s 70: learn: 0.2153950 test: 0.2323859 best: 0.2323859 (70) total: 34.4s remaining: 10.2s 71: learn: 0.2138172 test: 0.2311391 best: 0.2311391 (71) total: 34.9s remaining: 9.7s 72: learn: 0.2124480 test: 0.2300056 best: 0.2300056 (72) total: 35.4s remaining: 9.2s 73: learn: 0.2113851 test: 0.2291471 best: 0.2291471 (73) total: 35.8s remaining: 8.7s 74: learn: 0.2101586 test: 0.2282034 best: 0.2282034 (74) total: 36.2s remaining: 8.21s 75: learn: 0.2089915 test: 0.2272670 best: 0.2272670 (75) total: 36.7s remaining: 7.72s 76: learn: 0.2079239 test: 0.2264616 best: 0.2264616 (76) total: 37.1s remaining: 7.23s 77: learn: 0.2063405 test: 0.2250962 best: 0.2250962 (77) total: 37.6s remaining: 6.75s 78: learn: 0.2053710 test: 0.2244186 best: 0.2244186 (78) total: 38.1s remaining: 6.27s 79: learn: 0.2043383 test: 0.2236071 best: 0.2236071 (79) total: 38.5s remaining: 5.78s 80: learn: 0.2030015 test: 0.2225232 best: 0.2225232 (80) total: 39s remaining: 5.3s 81: learn: 0.2020101 test: 0.2217343 best: 0.2217343 (81) total: 39.4s remaining: 4.81s 82: learn: 0.2008113 test: 0.2207923 best: 0.2207923 (82) total: 40s remaining: 4.34s 83: learn: 0.1994663 test: 0.2197970 best: 0.2197970 (83) total: 40.6s remaining: 3.87s 84: learn: 0.1984942 test: 0.2191497 best: 0.2191497 (84) total: 41.1s remaining: 3.39s 85: learn: 0.1973149 test: 0.2182621 best: 0.2182621 (85) total: 41.6s remaining: 2.9s 86: learn: 0.1964717 test: 0.2176325 best: 0.2176325 (86) total: 42.1s remaining: 2.42s 87: learn: 0.1955205 test: 0.2169809 best: 0.2169809 (87) total: 42.6s remaining: 1.94s 88: learn: 0.1944400 test: 0.2161733 best: 0.2161733 (88) total: 43.1s remaining: 1.45s 89: learn: 0.1935420 
test: 0.2156504 best: 0.2156504 (89) total: 43.6s remaining: 970ms 90: learn: 0.1924376 test: 0.2149023 best: 0.2149023 (90) total: 44.1s remaining: 485ms 91: learn: 0.1915335 test: 0.2142968 best: 0.2142968 (91) total: 44.6s remaining: 0us bestTest = 0.2142968138 bestIteration = 91 Trial 37, Fold 1: Log loss = 0.21422999987221414, Average precision = 0.974891629378225, ROC-AUC = 0.9709085819914637, Elapsed Time = 44.72040569999808 seconds Trial 37, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 37, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986 0: learn: 0.6680672 test: 0.6685114 best: 0.6685114 (0) total: 538ms remaining: 48.9s 1: learn: 0.6464621 test: 0.6471813 best: 0.6471813 (1) total: 1.17s remaining: 52.7s 2: learn: 0.6274987 test: 0.6284067 best: 0.6284067 (2) total: 1.61s remaining: 47.8s 3: learn: 0.6086581 test: 0.6098681 best: 0.6098681 (3) total: 2.04s remaining: 44.8s 4: learn: 0.5916224 test: 0.5930997 best: 0.5930997 (4) total: 2.56s remaining: 44.5s 5: learn: 0.5749000 test: 0.5766753 best: 0.5766753 (5) total: 3.04s remaining: 43.6s 6: learn: 0.5591018 test: 0.5610121 best: 0.5610121 (6) total: 3.52s remaining: 42.8s 7: learn: 0.5443112 test: 0.5464974 best: 0.5464974 (7) total: 4.05s remaining: 42.5s 8: learn: 0.5279658 test: 0.5306216 best: 0.5306216 (8) total: 4.51s remaining: 41.6s 9: learn: 0.5137325 test: 0.5165953 best: 0.5165953 (9) total: 5.04s remaining: 41.3s 10: learn: 0.4996375 test: 0.5026990 best: 0.5026990 (10) total: 5.49s remaining: 40.4s 11: learn: 0.4858085 test: 0.4891075 best: 0.4891075 (11) total: 5.99s remaining: 40s 12: learn: 0.4743355 test: 0.4778293 best: 0.4778293 (12) total: 6.43s remaining: 39.1s 13: learn: 0.4622309 test: 0.4659083 best: 0.4659083 (13) total: 6.95s remaining: 38.7s 14: learn: 0.4513718 test: 0.4552845 best: 0.4552845 (14) total: 7.45s remaining: 38.2s 15: learn: 0.4413814 test: 0.4454104 best: 0.4454104 (15) total: 7.89s 
remaining: 37.5s 16: learn: 0.4314356 test: 0.4356391 best: 0.4356391 (16) total: 8.3s remaining: 36.6s 17: learn: 0.4221801 test: 0.4266060 best: 0.4266060 (17) total: 8.86s remaining: 36.4s 18: learn: 0.4127404 test: 0.4173970 best: 0.4173970 (18) total: 9.4s remaining: 36.1s 19: learn: 0.4039343 test: 0.4087335 best: 0.4087335 (19) total: 9.89s remaining: 35.6s 20: learn: 0.3970722 test: 0.4020470 best: 0.4020470 (20) total: 10.3s remaining: 34.8s 21: learn: 0.3900771 test: 0.3953130 best: 0.3953130 (21) total: 10.8s remaining: 34.3s 22: learn: 0.3831100 test: 0.3886954 best: 0.3886954 (22) total: 11.3s remaining: 33.8s 23: learn: 0.3758403 test: 0.3816954 best: 0.3816954 (23) total: 11.8s remaining: 33.5s 24: learn: 0.3686236 test: 0.3747115 best: 0.3747115 (24) total: 12.3s remaining: 33s 25: learn: 0.3628260 test: 0.3690680 best: 0.3690680 (25) total: 12.7s remaining: 32.3s 26: learn: 0.3567176 test: 0.3630762 best: 0.3630762 (26) total: 13.1s remaining: 31.6s 27: learn: 0.3506960 test: 0.3573471 best: 0.3573471 (27) total: 13.6s remaining: 31.2s 28: learn: 0.3448907 test: 0.3516252 best: 0.3516252 (28) total: 14.1s remaining: 30.6s 29: learn: 0.3394720 test: 0.3463191 best: 0.3463191 (29) total: 14.5s remaining: 30.1s 30: learn: 0.3343826 test: 0.3413268 best: 0.3413268 (30) total: 15s remaining: 29.5s 31: learn: 0.3290519 test: 0.3360251 best: 0.3360251 (31) total: 15.5s remaining: 29.1s 32: learn: 0.3240683 test: 0.3312095 best: 0.3312095 (32) total: 15.9s remaining: 28.5s 33: learn: 0.3191912 test: 0.3266406 best: 0.3266406 (33) total: 16.4s remaining: 28s 34: learn: 0.3147999 test: 0.3224791 best: 0.3224791 (34) total: 16.9s remaining: 27.6s 35: learn: 0.3101748 test: 0.3181688 best: 0.3181688 (35) total: 17.5s remaining: 27.2s 36: learn: 0.3055099 test: 0.3136622 best: 0.3136622 (36) total: 18.1s remaining: 27s 37: learn: 0.3009979 test: 0.3093040 best: 0.3093040 (37) total: 18.6s remaining: 26.5s 38: learn: 0.2965275 test: 0.3049959 best: 0.3049959 
(38) total: 19.2s remaining: 26s 39: learn: 0.2924803 test: 0.3012049 best: 0.3012049 (39) total: 19.8s remaining: 25.8s 40: learn: 0.2883474 test: 0.2974342 best: 0.2974342 (40) total: 20.5s remaining: 25.5s 41: learn: 0.2853011 test: 0.2944523 best: 0.2944523 (41) total: 20.9s remaining: 24.9s 42: learn: 0.2821471 test: 0.2913848 best: 0.2913848 (42) total: 21.5s remaining: 24.5s 43: learn: 0.2785870 test: 0.2879025 best: 0.2879025 (43) total: 22.1s remaining: 24.1s 44: learn: 0.2754472 test: 0.2849429 best: 0.2849429 (44) total: 22.6s remaining: 23.6s 45: learn: 0.2720533 test: 0.2816619 best: 0.2816619 (45) total: 23s remaining: 23s 46: learn: 0.2690534 test: 0.2789042 best: 0.2789042 (46) total: 23.6s remaining: 22.6s 47: learn: 0.2658547 test: 0.2760325 best: 0.2760325 (47) total: 24.1s remaining: 22.1s 48: learn: 0.2631753 test: 0.2735557 best: 0.2735557 (48) total: 24.6s remaining: 21.6s 49: learn: 0.2604163 test: 0.2709927 best: 0.2709927 (49) total: 25.1s remaining: 21.1s 50: learn: 0.2578431 test: 0.2686504 best: 0.2686504 (50) total: 25.6s remaining: 20.6s 51: learn: 0.2550900 test: 0.2660419 best: 0.2660419 (51) total: 26.1s remaining: 20.1s 52: learn: 0.2525847 test: 0.2636286 best: 0.2636286 (52) total: 26.6s remaining: 19.6s 53: learn: 0.2499748 test: 0.2611155 best: 0.2611155 (53) total: 27s remaining: 19s 54: learn: 0.2473812 test: 0.2587450 best: 0.2587450 (54) total: 27.6s remaining: 18.6s 55: learn: 0.2448102 test: 0.2562573 best: 0.2562573 (55) total: 28s remaining: 18s 56: learn: 0.2429315 test: 0.2544590 best: 0.2544590 (56) total: 28.4s remaining: 17.5s 57: learn: 0.2409203 test: 0.2527081 best: 0.2527081 (57) total: 29s remaining: 17s 58: learn: 0.2385148 test: 0.2506229 best: 0.2506229 (58) total: 29.7s remaining: 16.6s 59: learn: 0.2363434 test: 0.2486250 best: 0.2486250 (59) total: 30.2s remaining: 16.1s 60: learn: 0.2344943 test: 0.2469867 best: 0.2469867 (60) total: 30.9s remaining: 15.7s 61: learn: 0.2327531 test: 0.2454317 best: 
0.2454317 (61) total: 31.4s remaining: 15.2s 62: learn: 0.2310856 test: 0.2438988 best: 0.2438988 (62) total: 31.9s remaining: 14.7s 63: learn: 0.2293508 test: 0.2423285 best: 0.2423285 (63) total: 32.4s remaining: 14.2s 64: learn: 0.2274977 test: 0.2406391 best: 0.2406391 (64) total: 32.9s remaining: 13.7s 65: learn: 0.2258504 test: 0.2390753 best: 0.2390753 (65) total: 33.4s remaining: 13.2s 66: learn: 0.2241529 test: 0.2375138 best: 0.2375138 (66) total: 33.9s remaining: 12.7s 67: learn: 0.2225124 test: 0.2360541 best: 0.2360541 (67) total: 34.5s remaining: 12.2s 68: learn: 0.2207254 test: 0.2344144 best: 0.2344144 (68) total: 35.1s remaining: 11.7s 69: learn: 0.2188321 test: 0.2328184 best: 0.2328184 (69) total: 35.6s remaining: 11.2s 70: learn: 0.2175377 test: 0.2316191 best: 0.2316191 (70) total: 36.1s remaining: 10.7s 71: learn: 0.2155827 test: 0.2298878 best: 0.2298878 (71) total: 36.7s remaining: 10.2s 72: learn: 0.2143142 test: 0.2287750 best: 0.2287750 (72) total: 37.2s remaining: 9.69s 73: learn: 0.2126929 test: 0.2272924 best: 0.2272924 (73) total: 37.8s remaining: 9.2s 74: learn: 0.2116740 test: 0.2264067 best: 0.2264067 (74) total: 38.3s remaining: 8.68s 75: learn: 0.2104245 test: 0.2253481 best: 0.2253481 (75) total: 38.8s remaining: 8.17s 76: learn: 0.2091689 test: 0.2242529 best: 0.2242529 (76) total: 39.4s remaining: 7.68s 77: learn: 0.2078912 test: 0.2232362 best: 0.2232362 (77) total: 40s remaining: 7.18s 78: learn: 0.2065118 test: 0.2219469 best: 0.2219469 (78) total: 40.5s remaining: 6.67s 79: learn: 0.2052269 test: 0.2208558 best: 0.2208558 (79) total: 41.1s remaining: 6.17s 80: learn: 0.2042195 test: 0.2200361 best: 0.2200361 (80) total: 41.6s remaining: 5.65s 81: learn: 0.2033175 test: 0.2192757 best: 0.2192757 (81) total: 42.1s remaining: 5.13s 82: learn: 0.2020260 test: 0.2181065 best: 0.2181065 (82) total: 42.6s remaining: 4.62s 83: learn: 0.2008667 test: 0.2171045 best: 0.2171045 (83) total: 43.1s remaining: 4.11s 84: learn: 0.1998757 
test: 0.2163648 best: 0.2163648 (84) total: 43.7s remaining: 3.6s 85: learn: 0.1988037 test: 0.2154911 best: 0.2154911 (85) total: 44.2s remaining: 3.08s 86: learn: 0.1979393 test: 0.2147038 best: 0.2147038 (86) total: 44.7s remaining: 2.57s 87: learn: 0.1971070 test: 0.2141045 best: 0.2141045 (87) total: 45.1s remaining: 2.05s 88: learn: 0.1962276 test: 0.2134936 best: 0.2134936 (88) total: 45.7s remaining: 1.54s 89: learn: 0.1953470 test: 0.2128228 best: 0.2128228 (89) total: 46.1s remaining: 1.02s 90: learn: 0.1943239 test: 0.2119949 best: 0.2119949 (90) total: 46.6s remaining: 512ms 91: learn: 0.1933285 test: 0.2111141 best: 0.2111141 (91) total: 47.2s remaining: 0us bestTest = 0.2111141188 bestIteration = 91 Trial 37, Fold 2: Log loss = 0.2110342609411055, Average precision = 0.9765589419677143, ROC-AUC = 0.9736675102831887, Elapsed Time = 47.38509990000239 seconds Trial 37, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 37, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 0: learn: 0.6706959 test: 0.6707870 best: 0.6707870 (0) total: 437ms remaining: 39.7s 1: learn: 0.6500932 test: 0.6502074 best: 0.6502074 (1) total: 988ms remaining: 44.5s 2: learn: 0.6281080 test: 0.6283853 best: 0.6283853 (2) total: 1.53s remaining: 45.3s 3: learn: 0.6091787 test: 0.6094737 best: 0.6094737 (3) total: 1.98s remaining: 43.5s 4: learn: 0.5904177 test: 0.5908067 best: 0.5908067 (4) total: 2.46s remaining: 42.8s 5: learn: 0.5731525 test: 0.5735716 best: 0.5735716 (5) total: 2.81s remaining: 40.3s 6: learn: 0.5570055 test: 0.5575865 best: 0.5575865 (6) total: 3.38s remaining: 41.1s 7: learn: 0.5411646 test: 0.5416970 best: 0.5416970 (7) total: 3.94s remaining: 41.3s 8: learn: 0.5266588 test: 0.5271083 best: 0.5271083 (8) total: 4.33s remaining: 39.9s 9: learn: 0.5113456 test: 0.5117747 best: 0.5117747 (9) total: 4.83s remaining: 39.6s 10: learn: 0.4985095 test: 0.4990107 best: 0.4990107 (10) total: 5.31s 
remaining: 39.1s 11: learn: 0.4861047 test: 0.4867248 best: 0.4867248 (11) total: 5.86s remaining: 39s 12: learn: 0.4740441 test: 0.4746538 best: 0.4746538 (12) total: 6.33s remaining: 38.5s 13: learn: 0.4626107 test: 0.4632786 best: 0.4632786 (13) total: 6.71s remaining: 37.4s 14: learn: 0.4527191 test: 0.4534364 best: 0.4534364 (14) total: 7.13s remaining: 36.6s 15: learn: 0.4422558 test: 0.4432141 best: 0.4432141 (15) total: 7.67s remaining: 36.4s 16: learn: 0.4329151 test: 0.4339857 best: 0.4339857 (16) total: 8.19s remaining: 36.1s 17: learn: 0.4225167 test: 0.4236440 best: 0.4236440 (17) total: 8.62s remaining: 35.5s 18: learn: 0.4138968 test: 0.4150308 best: 0.4150308 (18) total: 9.01s remaining: 34.6s 19: learn: 0.4055707 test: 0.4068400 best: 0.4068400 (19) total: 9.51s remaining: 34.2s 20: learn: 0.3976421 test: 0.3989304 best: 0.3989304 (20) total: 10s remaining: 33.8s 21: learn: 0.3902846 test: 0.3916461 best: 0.3916461 (21) total: 10.4s remaining: 33.1s 22: learn: 0.3830864 test: 0.3845065 best: 0.3845065 (22) total: 10.8s remaining: 32.3s 23: learn: 0.3755945 test: 0.3771030 best: 0.3771030 (23) total: 11.3s remaining: 32s 24: learn: 0.3679892 test: 0.3697461 best: 0.3697461 (24) total: 11.8s remaining: 31.6s 25: learn: 0.3622658 test: 0.3640830 best: 0.3640830 (25) total: 12.2s remaining: 30.9s 26: learn: 0.3557124 test: 0.3577252 best: 0.3577252 (26) total: 12.7s remaining: 30.6s 27: learn: 0.3496405 test: 0.3517139 best: 0.3517139 (27) total: 13.2s remaining: 30.1s 28: learn: 0.3419458 test: 0.3441985 best: 0.3441985 (28) total: 13.6s remaining: 29.6s 29: learn: 0.3360074 test: 0.3384631 best: 0.3384631 (29) total: 14.2s remaining: 29.3s 30: learn: 0.3309593 test: 0.3335891 best: 0.3335891 (30) total: 14.7s remaining: 28.8s 31: learn: 0.3245423 test: 0.3273543 best: 0.3273543 (31) total: 15.2s remaining: 28.5s 32: learn: 0.3192952 test: 0.3222237 best: 0.3222237 (32) total: 15.8s remaining: 28.3s 33: learn: 0.3147012 test: 0.3177630 best: 0.3177630 
(33) total: 16.2s remaining: 27.7s 34: learn: 0.3102289 test: 0.3133976 best: 0.3133976 (34) total: 16.6s remaining: 27s 35: learn: 0.3065323 test: 0.3098369 best: 0.3098369 (35) total: 17s remaining: 26.5s 36: learn: 0.3021914 test: 0.3056863 best: 0.3056863 (36) total: 17.5s remaining: 26s 37: learn: 0.2981865 test: 0.3018126 best: 0.3018126 (37) total: 18s remaining: 25.5s 38: learn: 0.2938619 test: 0.2977348 best: 0.2977348 (38) total: 18.6s remaining: 25.2s 39: learn: 0.2900896 test: 0.2940625 best: 0.2940625 (39) total: 19s remaining: 24.7s 40: learn: 0.2854379 test: 0.2897081 best: 0.2897081 (40) total: 19.6s remaining: 24.4s 41: learn: 0.2822202 test: 0.2865952 best: 0.2865952 (41) total: 20s remaining: 23.8s 42: learn: 0.2790663 test: 0.2836833 best: 0.2836833 (42) total: 20.5s remaining: 23.4s 43: learn: 0.2750194 test: 0.2799141 best: 0.2799141 (43) total: 21.1s remaining: 23.1s 44: learn: 0.2716043 test: 0.2766697 best: 0.2766697 (44) total: 21.7s remaining: 22.6s 45: learn: 0.2682470 test: 0.2736086 best: 0.2736086 (45) total: 22.3s remaining: 22.3s 46: learn: 0.2646799 test: 0.2703078 best: 0.2703078 (46) total: 22.8s remaining: 21.9s 47: learn: 0.2614198 test: 0.2673368 best: 0.2673368 (47) total: 23.4s remaining: 21.4s 48: learn: 0.2584324 test: 0.2646361 best: 0.2646361 (48) total: 23.9s remaining: 21s 49: learn: 0.2557005 test: 0.2620498 best: 0.2620498 (49) total: 24.5s remaining: 20.5s 50: learn: 0.2526110 test: 0.2591708 best: 0.2591708 (50) total: 25.1s remaining: 20.2s 51: learn: 0.2494158 test: 0.2563637 best: 0.2563637 (51) total: 25.7s remaining: 19.8s 52: learn: 0.2469127 test: 0.2541968 best: 0.2541968 (52) total: 26.3s remaining: 19.4s 53: learn: 0.2449864 test: 0.2523227 best: 0.2523227 (53) total: 26.8s remaining: 18.9s 54: learn: 0.2430776 test: 0.2504706 best: 0.2504706 (54) total: 27.2s remaining: 18.3s 55: learn: 0.2407649 test: 0.2482968 best: 0.2482968 (55) total: 27.7s remaining: 17.8s 56: learn: 0.2384995 test: 0.2461954 best: 
0.2461954 (56) total: 28.2s remaining: 17.3s 57: learn: 0.2363964 test: 0.2443837 best: 0.2443837 (57) total: 28.7s remaining: 16.8s 58: learn: 0.2344271 test: 0.2426072 best: 0.2426072 (58) total: 29.2s remaining: 16.3s 59: learn: 0.2326500 test: 0.2410545 best: 0.2410545 (59) total: 29.7s remaining: 15.8s 60: learn: 0.2308159 test: 0.2394449 best: 0.2394449 (60) total: 30.2s remaining: 15.4s 61: learn: 0.2289155 test: 0.2378222 best: 0.2378222 (61) total: 30.8s remaining: 14.9s 62: learn: 0.2271833 test: 0.2362387 best: 0.2362387 (62) total: 31.2s remaining: 14.4s 63: learn: 0.2255160 test: 0.2348301 best: 0.2348301 (63) total: 31.8s remaining: 13.9s 64: learn: 0.2232194 test: 0.2328630 best: 0.2328630 (64) total: 32.3s remaining: 13.4s 65: learn: 0.2216501 test: 0.2315376 best: 0.2315376 (65) total: 32.9s remaining: 12.9s 66: learn: 0.2198722 test: 0.2299352 best: 0.2299352 (66) total: 33.5s remaining: 12.5s 67: learn: 0.2183483 test: 0.2286438 best: 0.2286438 (67) total: 33.9s remaining: 12s 68: learn: 0.2167242 test: 0.2272596 best: 0.2272596 (68) total: 34.5s remaining: 11.5s 69: learn: 0.2155091 test: 0.2261638 best: 0.2261638 (69) total: 34.9s remaining: 11s 70: learn: 0.2143460 test: 0.2252252 best: 0.2252252 (70) total: 35.3s remaining: 10.4s 71: learn: 0.2131102 test: 0.2242393 best: 0.2242393 (71) total: 35.8s remaining: 9.95s 72: learn: 0.2120079 test: 0.2232377 best: 0.2232377 (72) total: 36.2s remaining: 9.42s 73: learn: 0.2105401 test: 0.2219968 best: 0.2219968 (73) total: 36.6s remaining: 8.91s 74: learn: 0.2092028 test: 0.2209107 best: 0.2209107 (74) total: 37.2s remaining: 8.42s 75: learn: 0.2078946 test: 0.2198308 best: 0.2198308 (75) total: 37.7s remaining: 7.93s 76: learn: 0.2065162 test: 0.2186755 best: 0.2186755 (76) total: 38.2s remaining: 7.45s 77: learn: 0.2054934 test: 0.2177770 best: 0.2177770 (77) total: 38.6s remaining: 6.93s 78: learn: 0.2042004 test: 0.2167094 best: 0.2167094 (78) total: 39.2s remaining: 6.45s 79: learn: 0.2029392 
[CatBoost per-iteration learn/test log-loss traces omitted for brevity; per-fold summaries retained below.]

Trial 37, Fold 3: bestTest = 0.2065610184, bestIteration = 91
Trial 37, Fold 3: Log loss = 0.20671254099511285, Average precision = 0.9754589955346551, ROC-AUC = 0.9736953860118847, Elapsed Time = 46.160713100001885 seconds

Trial 37, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 37, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
Trial 37, Fold 4: bestTest = 0.2112234546, bestIteration = 91
Trial 37, Fold 4: Log loss = 0.21116277335507552, Average precision = 0.9762176575154632, ROC-AUC = 0.9725466509772503, Elapsed Time = 45.66182259999914 seconds

Trial 37, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 37, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
Trial 37, Fold 5: bestTest = 0.2181087983, bestIteration = 91
Trial 37, Fold 5: Log loss = 0.21790760100087853, Average precision = 0.9735336139166868, ROC-AUC = 0.9705873697118331, Elapsed Time = 45.29885459999787 seconds
Optimization Progress: 38% | 38/100 [1:05:16<1:42:49, 99.52s/it]
[CatBoost per-iteration learn/test log-loss traces omitted for brevity; per-fold summaries retained below.]

Trial 38, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 38, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Trial 38, Fold 1: bestTest = 0.3003050242, bestIteration = 70
Trial 38, Fold 1: Log loss = 0.30030502422053895, Average precision = 0.9597965553968761, ROC-AUC = 0.9533512089970988, Elapsed Time = 4.9189482999972824 seconds

Trial 38, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 38, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Trial 38, Fold 2: bestTest = 0.3078334479, bestIteration = 70
Trial 38, Fold 2: Log loss = 0.3078334478933136, Average precision = 0.9590835624220182, ROC-AUC = 0.954372324117488, Elapsed Time = 5.006788000002416 seconds

Trial 38, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 38, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Trial 38, Fold 3: bestTest = 0.2909257083, bestIteration = 70
Trial 38, Fold 3: Log loss = 0.2909257082784127, Average precision = 0.9626860783820593, ROC-AUC = 0.958333433656615, Elapsed Time = 4.805641099999775 seconds

Trial 38, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 38, Fold 4: Validation size = 5182 where 0 = 2646,
1 = 2536, 0/1 = 1.0433753943217665 0: learn: 0.6701365 test: 0.6700297 best: 0.6700297 (0) total: 54.8ms remaining: 3.84s 1: learn: 0.6518699 test: 0.6517810 best: 0.6517810 (1) total: 111ms remaining: 3.85s 2: learn: 0.6328971 test: 0.6328429 best: 0.6328429 (2) total: 171ms remaining: 3.88s 3: learn: 0.6149595 test: 0.6147809 best: 0.6147809 (3) total: 229ms remaining: 3.83s 4: learn: 0.5976393 test: 0.5973294 best: 0.5973294 (4) total: 286ms remaining: 3.78s 5: learn: 0.5860103 test: 0.5858385 best: 0.5858385 (5) total: 347ms remaining: 3.76s 6: learn: 0.5731292 test: 0.5729742 best: 0.5729742 (6) total: 402ms remaining: 3.68s 7: learn: 0.5603904 test: 0.5604265 best: 0.5604265 (7) total: 460ms remaining: 3.62s 8: learn: 0.5504624 test: 0.5504349 best: 0.5504349 (8) total: 521ms remaining: 3.59s 9: learn: 0.5410729 test: 0.5410021 best: 0.5410021 (9) total: 579ms remaining: 3.53s 10: learn: 0.5315916 test: 0.5314319 best: 0.5314319 (10) total: 639ms remaining: 3.48s 11: learn: 0.5206602 test: 0.5206262 best: 0.5206262 (11) total: 699ms remaining: 3.43s 12: learn: 0.5100043 test: 0.5099890 best: 0.5099890 (12) total: 761ms remaining: 3.39s 13: learn: 0.5021718 test: 0.5021030 best: 0.5021030 (13) total: 823ms remaining: 3.35s 14: learn: 0.4923706 test: 0.4922870 best: 0.4922870 (14) total: 883ms remaining: 3.3s 15: learn: 0.4849823 test: 0.4849593 best: 0.4849593 (15) total: 944ms remaining: 3.25s 16: learn: 0.4773886 test: 0.4773757 best: 0.4773757 (16) total: 1.01s remaining: 3.2s 17: learn: 0.4720293 test: 0.4719407 best: 0.4719407 (17) total: 1.06s remaining: 3.14s 18: learn: 0.4665283 test: 0.4663281 best: 0.4663281 (18) total: 1.13s remaining: 3.09s 19: learn: 0.4611998 test: 0.4608986 best: 0.4608986 (19) total: 1.19s remaining: 3.04s 20: learn: 0.4570940 test: 0.4567386 best: 0.4567386 (20) total: 1.25s remaining: 2.98s 21: learn: 0.4516583 test: 0.4511212 best: 0.4511212 (21) total: 1.31s remaining: 2.93s 22: learn: 0.4452226 test: 0.4446760 best: 
0.4446760 (22) total: 1.38s remaining: 2.87s 23: learn: 0.4408118 test: 0.4402433 best: 0.4402433 (23) total: 1.44s remaining: 2.81s 24: learn: 0.4361809 test: 0.4355955 best: 0.4355955 (24) total: 1.5s remaining: 2.77s 25: learn: 0.4303182 test: 0.4297087 best: 0.4297087 (25) total: 1.57s remaining: 2.72s 26: learn: 0.4207233 test: 0.4201194 best: 0.4201194 (26) total: 1.64s remaining: 2.67s 27: learn: 0.4171798 test: 0.4166213 best: 0.4166213 (27) total: 1.7s remaining: 2.61s 28: learn: 0.4141998 test: 0.4136057 best: 0.4136057 (28) total: 1.77s remaining: 2.56s 29: learn: 0.4102277 test: 0.4095632 best: 0.4095632 (29) total: 1.83s remaining: 2.51s 30: learn: 0.4073266 test: 0.4067710 best: 0.4067710 (30) total: 1.9s remaining: 2.45s 31: learn: 0.4017225 test: 0.4013135 best: 0.4013135 (31) total: 1.97s remaining: 2.4s 32: learn: 0.3985585 test: 0.3982040 best: 0.3982040 (32) total: 2.04s remaining: 2.35s 33: learn: 0.3945244 test: 0.3942751 best: 0.3942751 (33) total: 2.11s remaining: 2.29s 34: learn: 0.3883637 test: 0.3881150 best: 0.3881150 (34) total: 2.18s remaining: 2.24s 35: learn: 0.3852884 test: 0.3850010 best: 0.3850010 (35) total: 2.25s remaining: 2.18s 36: learn: 0.3798721 test: 0.3796398 best: 0.3796398 (36) total: 2.31s remaining: 2.13s 37: learn: 0.3781880 test: 0.3779084 best: 0.3779084 (37) total: 2.38s remaining: 2.07s 38: learn: 0.3762074 test: 0.3758416 best: 0.3758416 (38) total: 2.45s remaining: 2.01s 39: learn: 0.3723402 test: 0.3720776 best: 0.3720776 (39) total: 2.52s remaining: 1.95s 40: learn: 0.3706252 test: 0.3703764 best: 0.3703764 (40) total: 2.58s remaining: 1.89s 41: learn: 0.3682378 test: 0.3680002 best: 0.3680002 (41) total: 2.65s remaining: 1.83s 42: learn: 0.3643057 test: 0.3640652 best: 0.3640652 (42) total: 2.71s remaining: 1.77s 43: learn: 0.3587756 test: 0.3584656 best: 0.3584656 (43) total: 2.78s remaining: 1.71s 44: learn: 0.3571552 test: 0.3568030 best: 0.3568030 (44) total: 2.85s remaining: 1.65s 45: learn: 0.3539774 
test: 0.3537141 best: 0.3537141 (45) total: 2.92s remaining: 1.58s 46: learn: 0.3516483 test: 0.3513340 best: 0.3513340 (46) total: 2.98s remaining: 1.52s 47: learn: 0.3492489 test: 0.3490881 best: 0.3490881 (47) total: 3.05s remaining: 1.46s 48: learn: 0.3454056 test: 0.3452127 best: 0.3452127 (48) total: 3.12s remaining: 1.4s 49: learn: 0.3432077 test: 0.3431990 best: 0.3431990 (49) total: 3.18s remaining: 1.34s 50: learn: 0.3419121 test: 0.3418445 best: 0.3418445 (50) total: 3.25s remaining: 1.27s 51: learn: 0.3386347 test: 0.3385040 best: 0.3385040 (51) total: 3.32s remaining: 1.21s 52: learn: 0.3365897 test: 0.3364663 best: 0.3364663 (52) total: 3.38s remaining: 1.15s 53: learn: 0.3342511 test: 0.3342207 best: 0.3342207 (53) total: 3.45s remaining: 1.08s 54: learn: 0.3318595 test: 0.3318567 best: 0.3318567 (54) total: 3.52s remaining: 1.02s 55: learn: 0.3296476 test: 0.3297387 best: 0.3297387 (55) total: 3.58s remaining: 959ms 56: learn: 0.3283772 test: 0.3283936 best: 0.3283936 (56) total: 3.65s remaining: 896ms 57: learn: 0.3265336 test: 0.3265519 best: 0.3265519 (57) total: 3.71s remaining: 833ms 58: learn: 0.3230968 test: 0.3230277 best: 0.3230277 (58) total: 3.78s remaining: 769ms 59: learn: 0.3204285 test: 0.3204079 best: 0.3204079 (59) total: 3.85s remaining: 705ms 60: learn: 0.3193867 test: 0.3193472 best: 0.3193472 (60) total: 3.92s remaining: 642ms 61: learn: 0.3183724 test: 0.3183489 best: 0.3183489 (61) total: 3.98s remaining: 578ms 62: learn: 0.3171241 test: 0.3171957 best: 0.3171957 (62) total: 4.05s remaining: 514ms 63: learn: 0.3157724 test: 0.3158728 best: 0.3158728 (63) total: 4.12s remaining: 450ms 64: learn: 0.3149399 test: 0.3150587 best: 0.3150587 (64) total: 4.18s remaining: 386ms 65: learn: 0.3140486 test: 0.3140716 best: 0.3140716 (65) total: 4.25s remaining: 322ms 66: learn: 0.3129548 test: 0.3129452 best: 0.3129452 (66) total: 4.31s remaining: 257ms 67: learn: 0.3108957 test: 0.3107858 best: 0.3107858 (67) total: 4.38s remaining: 
193ms 68: learn: 0.3102045 test: 0.3102024 best: 0.3102024 (68) total: 4.44s remaining: 129ms 69: learn: 0.3091526 test: 0.3092206 best: 0.3092206 (69) total: 4.51s remaining: 64.4ms 70: learn: 0.3084075 test: 0.3085451 best: 0.3085451 (70) total: 4.57s remaining: 0us bestTest = 0.3085450536 bestIteration = 70 Trial 38, Fold 4: Log loss = 0.3085450535590476, Average precision = 0.959983382412791, ROC-AUC = 0.9535379872243324, Elapsed Time = 4.69804110000041 seconds Trial 38, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 38, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0: learn: 0.6738089 test: 0.6741173 best: 0.6741173 (0) total: 58.5ms remaining: 4.1s 1: learn: 0.6545786 test: 0.6552503 best: 0.6552503 (1) total: 115ms remaining: 3.98s 2: learn: 0.6376745 test: 0.6388893 best: 0.6388893 (2) total: 178ms remaining: 4.03s 3: learn: 0.6229715 test: 0.6247794 best: 0.6247794 (3) total: 236ms remaining: 3.95s 4: learn: 0.6089991 test: 0.6113289 best: 0.6113289 (4) total: 299ms remaining: 3.94s 5: learn: 0.5920798 test: 0.5948466 best: 0.5948466 (5) total: 361ms remaining: 3.91s 6: learn: 0.5807876 test: 0.5840629 best: 0.5840629 (6) total: 420ms remaining: 3.84s 7: learn: 0.5667378 test: 0.5703536 best: 0.5703536 (7) total: 475ms remaining: 3.74s 8: learn: 0.5551309 test: 0.5590570 best: 0.5590570 (8) total: 539ms remaining: 3.71s 9: learn: 0.5433170 test: 0.5476287 best: 0.5476287 (9) total: 601ms remaining: 3.66s 10: learn: 0.5311477 test: 0.5355604 best: 0.5355604 (10) total: 662ms remaining: 3.61s 11: learn: 0.5212188 test: 0.5259125 best: 0.5259125 (11) total: 723ms remaining: 3.55s 12: learn: 0.5082861 test: 0.5131146 best: 0.5131146 (12) total: 787ms remaining: 3.51s 13: learn: 0.4977550 test: 0.5028612 best: 0.5028612 (13) total: 850ms remaining: 3.46s 14: learn: 0.4887968 test: 0.4940619 best: 0.4940619 (14) total: 908ms remaining: 3.39s 15: learn: 0.4801074 test: 0.4854603 best: 
0.4854603 (15) total: 970ms remaining: 3.33s 16: learn: 0.4729969 test: 0.4786991 best: 0.4786991 (16) total: 1.03s remaining: 3.28s 17: learn: 0.4658513 test: 0.4715855 best: 0.4715855 (17) total: 1.09s remaining: 3.22s 18: learn: 0.4554003 test: 0.4613505 best: 0.4613505 (18) total: 1.16s remaining: 3.16s 19: learn: 0.4474205 test: 0.4532238 best: 0.4532238 (19) total: 1.21s remaining: 3.09s 20: learn: 0.4431854 test: 0.4492996 best: 0.4492996 (20) total: 1.27s remaining: 3.03s 21: learn: 0.4344919 test: 0.4407287 best: 0.4407287 (21) total: 1.33s remaining: 2.97s 22: learn: 0.4278509 test: 0.4338204 best: 0.4338204 (22) total: 1.39s remaining: 2.91s 23: learn: 0.4197953 test: 0.4258419 best: 0.4258419 (23) total: 1.45s remaining: 2.85s 24: learn: 0.4156646 test: 0.4218737 best: 0.4218737 (24) total: 1.52s remaining: 2.79s 25: learn: 0.4115957 test: 0.4179499 best: 0.4179499 (25) total: 1.57s remaining: 2.72s 26: learn: 0.4050500 test: 0.4114789 best: 0.4114789 (26) total: 1.63s remaining: 2.66s 27: learn: 0.3994062 test: 0.4058332 best: 0.4058332 (27) total: 1.69s remaining: 2.6s 28: learn: 0.3954729 test: 0.4018111 best: 0.4018111 (28) total: 1.76s remaining: 2.54s 29: learn: 0.3911703 test: 0.3974666 best: 0.3974666 (29) total: 1.82s remaining: 2.49s 30: learn: 0.3873724 test: 0.3936797 best: 0.3936797 (30) total: 1.89s remaining: 2.43s 31: learn: 0.3816950 test: 0.3881149 best: 0.3881149 (31) total: 1.96s remaining: 2.38s 32: learn: 0.3780407 test: 0.3845868 best: 0.3845868 (32) total: 2.03s remaining: 2.33s 33: learn: 0.3726456 test: 0.3793114 best: 0.3793114 (33) total: 2.1s remaining: 2.28s 34: learn: 0.3674266 test: 0.3742329 best: 0.3742329 (34) total: 2.17s remaining: 2.23s 35: learn: 0.3631175 test: 0.3699257 best: 0.3699257 (35) total: 2.24s remaining: 2.17s 36: learn: 0.3599937 test: 0.3668823 best: 0.3668823 (36) total: 2.31s remaining: 2.13s 37: learn: 0.3575279 test: 0.3643783 best: 0.3643783 (37) total: 2.39s remaining: 2.07s 38: learn: 0.3539537 
test: 0.3607964 best: 0.3607964 (38) total: 2.46s remaining: 2.02s 39: learn: 0.3495749 test: 0.3565429 best: 0.3565429 (39) total: 2.54s remaining: 1.97s 40: learn: 0.3462978 test: 0.3532888 best: 0.3532888 (40) total: 2.61s remaining: 1.91s 41: learn: 0.3429492 test: 0.3500236 best: 0.3500236 (41) total: 2.7s remaining: 1.86s 42: learn: 0.3392539 test: 0.3462613 best: 0.3462613 (42) total: 2.78s remaining: 1.81s 43: learn: 0.3357650 test: 0.3428476 best: 0.3428476 (43) total: 2.86s remaining: 1.76s 44: learn: 0.3330687 test: 0.3401201 best: 0.3401201 (44) total: 2.94s remaining: 1.7s 45: learn: 0.3306126 test: 0.3376770 best: 0.3376770 (45) total: 3.02s remaining: 1.64s 46: learn: 0.3293477 test: 0.3364924 best: 0.3364924 (46) total: 3.09s remaining: 1.58s 47: learn: 0.3281742 test: 0.3353866 best: 0.3353866 (47) total: 3.17s remaining: 1.52s 48: learn: 0.3265254 test: 0.3337140 best: 0.3337140 (48) total: 3.24s remaining: 1.45s 49: learn: 0.3240439 test: 0.3312225 best: 0.3312225 (49) total: 3.31s remaining: 1.39s 50: learn: 0.3222532 test: 0.3294249 best: 0.3294249 (50) total: 3.39s remaining: 1.33s 51: learn: 0.3197304 test: 0.3269047 best: 0.3269047 (51) total: 3.46s remaining: 1.26s 52: learn: 0.3185205 test: 0.3256540 best: 0.3256540 (52) total: 3.53s remaining: 1.2s 53: learn: 0.3158038 test: 0.3229696 best: 0.3229696 (53) total: 3.61s remaining: 1.14s 54: learn: 0.3142696 test: 0.3214791 best: 0.3214791 (54) total: 3.68s remaining: 1.07s 55: learn: 0.3135930 test: 0.3209261 best: 0.3209261 (55) total: 3.75s remaining: 1.01s 56: learn: 0.3129404 test: 0.3202941 best: 0.3202941 (56) total: 3.83s remaining: 941ms 57: learn: 0.3121391 test: 0.3195664 best: 0.3195664 (57) total: 3.9s remaining: 874ms 58: learn: 0.3111019 test: 0.3186211 best: 0.3186211 (58) total: 3.97s remaining: 808ms 59: learn: 0.3089493 test: 0.3163972 best: 0.3163972 (59) total: 4.04s remaining: 741ms 60: learn: 0.3080481 test: 0.3155364 best: 0.3155364 (60) total: 4.11s remaining: 675ms 
61: learn: 0.3069081 test: 0.3144857 best: 0.3144857 (61) total: 4.19s remaining: 608ms 62: learn: 0.3056878 test: 0.3133384 best: 0.3133384 (62) total: 4.25s remaining: 540ms 63: learn: 0.3046024 test: 0.3123602 best: 0.3123602 (63) total: 4.32s remaining: 473ms 64: learn: 0.3028777 test: 0.3105599 best: 0.3105599 (64) total: 4.39s remaining: 405ms 65: learn: 0.3020844 test: 0.3098017 best: 0.3098017 (65) total: 4.46s remaining: 338ms 66: learn: 0.3003541 test: 0.3080580 best: 0.3080580 (66) total: 4.52s remaining: 270ms 67: learn: 0.2997295 test: 0.3074424 best: 0.3074424 (67) total: 4.59s remaining: 202ms 68: learn: 0.2987554 test: 0.3064541 best: 0.3064541 (68) total: 4.65s remaining: 135ms 69: learn: 0.2978323 test: 0.3055506 best: 0.3055506 (69) total: 4.71s remaining: 67.3ms 70: learn: 0.2973933 test: 0.3051493 best: 0.3051493 (70) total: 4.78s remaining: 0us bestTest = 0.3051492928 bestIteration = 70 Trial 38, Fold 5: Log loss = 0.3051492927836401, Average precision = 0.9558258237773902, ROC-AUC = 0.9505639596455048, Elapsed Time = 4.897137100000691 seconds
Optimization Progress: 39%|###9 | 39/100 [1:05:48<1:20:41, 79.37s/it]
[per-iteration CatBoost training log trimmed; per-fold summaries retained]
Trial 39, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 39, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
bestTest = 0.1981670521 bestIteration = 77
Trial 39, Fold 1: Log loss = 0.1981670521330898, Average precision = 0.9743004332187335, ROC-AUC = 0.970903578358976, Elapsed Time = 19.041579799999454 seconds
Trial 39, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 39, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
bestTest = 0.1860546548 bestIteration = 76
Shrink model to first 77 iterations.
Trial 39, Fold 2: Log loss = 0.18605465475775707, Average precision = 0.9768771960742806, ROC-AUC = 0.9746746603353444, Elapsed Time = 18.732987100000173 seconds
Trial 39, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 39, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
test: 0.1989604 best: 0.1989604 (38) total: 10.7s remaining: 10.7s 39: learn: 0.1732191 test: 0.1983608 best: 0.1983608 (39) total: 11s remaining: 10.4s 40: learn: 0.1721248 test: 0.1980414 best: 0.1980414 (40) total: 11.2s remaining: 10.1s 41: learn: 0.1707927 test: 0.1982162 best: 0.1980414 (40) total: 11.5s remaining: 9.83s 42: learn: 0.1691279 test: 0.1980128 best: 0.1980128 (42) total: 11.7s remaining: 9.52s 43: learn: 0.1678714 test: 0.1977835 best: 0.1977835 (43) total: 11.9s remaining: 9.23s 44: learn: 0.1669229 test: 0.1972714 best: 0.1972714 (44) total: 12.2s remaining: 8.94s 45: learn: 0.1660052 test: 0.1968572 best: 0.1968572 (45) total: 12.4s remaining: 8.64s 46: learn: 0.1647912 test: 0.1965795 best: 0.1965795 (46) total: 12.7s remaining: 8.36s 47: learn: 0.1638683 test: 0.1964049 best: 0.1964049 (47) total: 12.9s remaining: 8.08s 48: learn: 0.1631630 test: 0.1956648 best: 0.1956648 (48) total: 13.2s remaining: 7.8s 49: learn: 0.1622771 test: 0.1951472 best: 0.1951472 (49) total: 13.4s remaining: 7.52s 50: learn: 0.1614685 test: 0.1948357 best: 0.1948357 (50) total: 13.7s remaining: 7.25s 51: learn: 0.1602802 test: 0.1945997 best: 0.1945997 (51) total: 13.9s remaining: 6.97s 52: learn: 0.1598213 test: 0.1944239 best: 0.1944239 (52) total: 14.2s remaining: 6.68s 53: learn: 0.1591220 test: 0.1939195 best: 0.1939195 (53) total: 14.4s remaining: 6.41s 54: learn: 0.1582018 test: 0.1937785 best: 0.1937785 (54) total: 14.7s remaining: 6.13s 55: learn: 0.1569319 test: 0.1935609 best: 0.1935609 (55) total: 14.9s remaining: 5.86s 56: learn: 0.1560833 test: 0.1929449 best: 0.1929449 (56) total: 15.1s remaining: 5.58s 57: learn: 0.1549230 test: 0.1928849 best: 0.1928849 (57) total: 15.4s remaining: 5.3s 58: learn: 0.1541003 test: 0.1926194 best: 0.1926194 (58) total: 15.6s remaining: 5.03s 59: learn: 0.1527105 test: 0.1927588 best: 0.1926194 (58) total: 15.9s remaining: 4.76s 60: learn: 0.1519819 test: 0.1928305 best: 0.1926194 (58) total: 16.1s remaining: 4.48s 
61: learn: 0.1501230 test: 0.1931749 best: 0.1926194 (58) total: 16.3s remaining: 4.21s 62: learn: 0.1492889 test: 0.1931144 best: 0.1926194 (58) total: 16.6s remaining: 3.95s 63: learn: 0.1483239 test: 0.1928116 best: 0.1926194 (58) total: 16.8s remaining: 3.68s 64: learn: 0.1474556 test: 0.1923216 best: 0.1923216 (64) total: 17.1s remaining: 3.41s 65: learn: 0.1469299 test: 0.1919485 best: 0.1919485 (65) total: 17.3s remaining: 3.14s 66: learn: 0.1463267 test: 0.1917007 best: 0.1917007 (66) total: 17.5s remaining: 2.88s 67: learn: 0.1454366 test: 0.1916769 best: 0.1916769 (67) total: 17.8s remaining: 2.61s 68: learn: 0.1446627 test: 0.1914947 best: 0.1914947 (68) total: 18s remaining: 2.35s 69: learn: 0.1442667 test: 0.1913570 best: 0.1913570 (69) total: 18.2s remaining: 2.08s 70: learn: 0.1438342 test: 0.1913281 best: 0.1913281 (70) total: 18.5s remaining: 1.82s 71: learn: 0.1430224 test: 0.1911377 best: 0.1911377 (71) total: 18.7s remaining: 1.56s 72: learn: 0.1426185 test: 0.1910425 best: 0.1910425 (72) total: 18.9s remaining: 1.3s 73: learn: 0.1422473 test: 0.1910114 best: 0.1910114 (73) total: 19.2s remaining: 1.04s 74: learn: 0.1417248 test: 0.1908013 best: 0.1908013 (74) total: 19.4s remaining: 777ms 75: learn: 0.1409100 test: 0.1909874 best: 0.1908013 (74) total: 19.7s remaining: 517ms 76: learn: 0.1404153 test: 0.1908472 best: 0.1908013 (74) total: 19.9s remaining: 258ms 77: learn: 0.1400520 test: 0.1908106 best: 0.1908013 (74) total: 20.1s remaining: 0us bestTest = 0.1908013315 bestIteration = 74 Shrink model to first 75 iterations. 
Trial 39, Fold 3: Log loss = 0.19080133152470824, Average precision = 0.9761563406455036, ROC-AUC = 0.9728910441005094, Elapsed Time = 20.284400800002913 seconds Trial 39, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 39, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 0: learn: 0.6021292 test: 0.6035799 best: 0.6035799 (0) total: 220ms remaining: 16.9s 1: learn: 0.5295952 test: 0.5319746 best: 0.5319746 (1) total: 441ms remaining: 16.8s 2: learn: 0.4715173 test: 0.4746950 best: 0.4746950 (2) total: 666ms remaining: 16.6s 3: learn: 0.4242696 test: 0.4276632 best: 0.4276632 (3) total: 877ms remaining: 16.2s 4: learn: 0.3859034 test: 0.3906661 best: 0.3906661 (4) total: 1.1s remaining: 16.1s 5: learn: 0.3546202 test: 0.3601044 best: 0.3601044 (5) total: 1.32s remaining: 15.8s 6: learn: 0.3302096 test: 0.3368171 best: 0.3368171 (6) total: 1.55s remaining: 15.7s 7: learn: 0.3088577 test: 0.3162282 best: 0.3162282 (7) total: 1.79s remaining: 15.6s 8: learn: 0.2911990 test: 0.2998578 best: 0.2998578 (8) total: 2.04s remaining: 15.6s 9: learn: 0.2771878 test: 0.2862894 best: 0.2862894 (9) total: 2.28s remaining: 15.5s 10: learn: 0.2643907 test: 0.2744334 best: 0.2744334 (10) total: 2.54s remaining: 15.5s 11: learn: 0.2538555 test: 0.2644441 best: 0.2644441 (11) total: 2.79s remaining: 15.3s 12: learn: 0.2453925 test: 0.2565802 best: 0.2565802 (12) total: 3.02s remaining: 15.1s 13: learn: 0.2377614 test: 0.2492690 best: 0.2492690 (13) total: 3.26s remaining: 14.9s 14: learn: 0.2310925 test: 0.2432068 best: 0.2432068 (14) total: 3.51s remaining: 14.7s 15: learn: 0.2256000 test: 0.2381895 best: 0.2381895 (15) total: 3.75s remaining: 14.5s 16: learn: 0.2202447 test: 0.2337703 best: 0.2337703 (16) total: 4.01s remaining: 14.4s 17: learn: 0.2152910 test: 0.2299411 best: 0.2299411 (17) total: 4.26s remaining: 14.2s 18: learn: 0.2114373 test: 0.2265482 best: 0.2265482 (18) total: 4.5s remaining: 14s 19: 
learn: 0.2078975 test: 0.2236592 best: 0.2236592 (19) total: 4.74s remaining: 13.8s 20: learn: 0.2047922 test: 0.2208891 best: 0.2208891 (20) total: 4.99s remaining: 13.5s 21: learn: 0.2019026 test: 0.2191002 best: 0.2191002 (21) total: 5.23s remaining: 13.3s 22: learn: 0.1988751 test: 0.2176141 best: 0.2176141 (22) total: 5.48s remaining: 13.1s 23: learn: 0.1966466 test: 0.2157214 best: 0.2157214 (23) total: 5.72s remaining: 12.9s 24: learn: 0.1939352 test: 0.2138095 best: 0.2138095 (24) total: 5.97s remaining: 12.7s 25: learn: 0.1919131 test: 0.2120930 best: 0.2120930 (25) total: 6.22s remaining: 12.4s 26: learn: 0.1900418 test: 0.2108355 best: 0.2108355 (26) total: 6.46s remaining: 12.2s 27: learn: 0.1877525 test: 0.2102507 best: 0.2102507 (27) total: 6.71s remaining: 12s 28: learn: 0.1860742 test: 0.2090267 best: 0.2090267 (28) total: 6.95s remaining: 11.8s 29: learn: 0.1845687 test: 0.2076779 best: 0.2076779 (29) total: 7.2s remaining: 11.5s 30: learn: 0.1830185 test: 0.2066755 best: 0.2066755 (30) total: 7.43s remaining: 11.3s 31: learn: 0.1814062 test: 0.2059108 best: 0.2059108 (31) total: 7.68s remaining: 11s 32: learn: 0.1802985 test: 0.2051184 best: 0.2051184 (32) total: 7.92s remaining: 10.8s 33: learn: 0.1787247 test: 0.2051666 best: 0.2051184 (32) total: 8.16s remaining: 10.6s 34: learn: 0.1773843 test: 0.2042251 best: 0.2042251 (34) total: 8.4s remaining: 10.3s 35: learn: 0.1759247 test: 0.2041150 best: 0.2041150 (35) total: 8.65s remaining: 10.1s 36: learn: 0.1745201 test: 0.2035039 best: 0.2035039 (36) total: 8.89s remaining: 9.86s 37: learn: 0.1733342 test: 0.2030247 best: 0.2030247 (37) total: 9.14s remaining: 9.62s 38: learn: 0.1723864 test: 0.2025158 best: 0.2025158 (38) total: 9.39s remaining: 9.39s 39: learn: 0.1713911 test: 0.2024521 best: 0.2024521 (39) total: 9.64s remaining: 9.15s 40: learn: 0.1703080 test: 0.2016794 best: 0.2016794 (40) total: 9.87s remaining: 8.91s 41: learn: 0.1692085 test: 0.2009448 best: 0.2009448 (41) total: 10.1s 
remaining: 8.67s 42: learn: 0.1682496 test: 0.2004039 best: 0.2004039 (42) total: 10.4s remaining: 8.44s 43: learn: 0.1670311 test: 0.1996720 best: 0.1996720 (43) total: 10.6s remaining: 8.2s 44: learn: 0.1664318 test: 0.1993110 best: 0.1993110 (44) total: 10.9s remaining: 7.96s 45: learn: 0.1654384 test: 0.1990926 best: 0.1990926 (45) total: 11.1s remaining: 7.72s 46: learn: 0.1647956 test: 0.1985207 best: 0.1985207 (46) total: 11.3s remaining: 7.48s 47: learn: 0.1637912 test: 0.1983275 best: 0.1983275 (47) total: 11.6s remaining: 7.23s 48: learn: 0.1623757 test: 0.1980495 best: 0.1980495 (48) total: 11.8s remaining: 6.99s 49: learn: 0.1618056 test: 0.1976359 best: 0.1976359 (49) total: 12s remaining: 6.75s 50: learn: 0.1610266 test: 0.1972229 best: 0.1972229 (50) total: 12.3s remaining: 6.51s 51: learn: 0.1603027 test: 0.1970802 best: 0.1970802 (51) total: 12.5s remaining: 6.26s 52: learn: 0.1593220 test: 0.1967487 best: 0.1967487 (52) total: 12.8s remaining: 6.03s 53: learn: 0.1584996 test: 0.1966938 best: 0.1966938 (53) total: 13s remaining: 5.79s 54: learn: 0.1579189 test: 0.1964513 best: 0.1964513 (54) total: 13.3s remaining: 5.55s 55: learn: 0.1575353 test: 0.1963489 best: 0.1963489 (55) total: 13.5s remaining: 5.31s 56: learn: 0.1563596 test: 0.1967357 best: 0.1963489 (55) total: 13.8s remaining: 5.08s 57: learn: 0.1552799 test: 0.1967969 best: 0.1963489 (55) total: 14s remaining: 4.84s 58: learn: 0.1545178 test: 0.1966403 best: 0.1963489 (55) total: 14.3s remaining: 4.6s 59: learn: 0.1535977 test: 0.1963340 best: 0.1963340 (59) total: 14.5s remaining: 4.35s 60: learn: 0.1524430 test: 0.1961098 best: 0.1961098 (60) total: 14.8s remaining: 4.11s 61: learn: 0.1516660 test: 0.1960629 best: 0.1960629 (61) total: 15s remaining: 3.87s 62: learn: 0.1512417 test: 0.1958565 best: 0.1958565 (62) total: 15.2s remaining: 3.63s 63: learn: 0.1500974 test: 0.1957111 best: 0.1957111 (63) total: 15.5s remaining: 3.39s 64: learn: 0.1491227 test: 0.1957201 best: 0.1957111 
(63) total: 15.7s remaining: 3.15s 65: learn: 0.1486268 test: 0.1957244 best: 0.1957111 (63) total: 16s remaining: 2.9s 66: learn: 0.1479368 test: 0.1955850 best: 0.1955850 (66) total: 16.2s remaining: 2.66s 67: learn: 0.1474487 test: 0.1955111 best: 0.1955111 (67) total: 16.5s remaining: 2.42s 68: learn: 0.1469999 test: 0.1953739 best: 0.1953739 (68) total: 16.7s remaining: 2.18s 69: learn: 0.1462170 test: 0.1948827 best: 0.1948827 (69) total: 16.9s remaining: 1.93s 70: learn: 0.1448946 test: 0.1947921 best: 0.1947921 (70) total: 17.2s remaining: 1.69s 71: learn: 0.1442238 test: 0.1947866 best: 0.1947866 (71) total: 17.4s remaining: 1.45s 72: learn: 0.1433972 test: 0.1946638 best: 0.1946638 (72) total: 17.6s remaining: 1.21s 73: learn: 0.1428674 test: 0.1944386 best: 0.1944386 (73) total: 17.9s remaining: 967ms 74: learn: 0.1423622 test: 0.1942635 best: 0.1942635 (74) total: 18.1s remaining: 725ms 75: learn: 0.1413118 test: 0.1943395 best: 0.1942635 (74) total: 18.4s remaining: 483ms 76: learn: 0.1399583 test: 0.1946575 best: 0.1942635 (74) total: 18.6s remaining: 242ms 77: learn: 0.1393212 test: 0.1942272 best: 0.1942272 (77) total: 18.8s remaining: 0us bestTest = 0.1942271736 bestIteration = 77 Trial 39, Fold 4: Log loss = 0.1942271736352091, Average precision = 0.9756894766774887, ROC-AUC = 0.9715121151860675, Elapsed Time = 18.995119199997134 seconds Trial 39, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 39, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0: learn: 0.6020907 test: 0.6048290 best: 0.6048290 (0) total: 223ms remaining: 17.2s 1: learn: 0.5283407 test: 0.5339577 best: 0.5339577 (1) total: 445ms remaining: 16.9s 2: learn: 0.4688248 test: 0.4759303 best: 0.4759303 (2) total: 661ms remaining: 16.5s 3: learn: 0.4219210 test: 0.4303752 best: 0.4303752 (3) total: 877ms remaining: 16.2s 4: learn: 0.3836077 test: 0.3933078 best: 0.3933078 (4) total: 1.1s remaining: 16s 5: learn: 
0.3521352 test: 0.3633947 best: 0.3633947 (5) total: 1.32s remaining: 15.8s 6: learn: 0.3264386 test: 0.3392514 best: 0.3392514 (6) total: 1.55s remaining: 15.7s 7: learn: 0.3059412 test: 0.3196175 best: 0.3196175 (7) total: 1.79s remaining: 15.7s 8: learn: 0.2885682 test: 0.3034293 best: 0.3034293 (8) total: 2.03s remaining: 15.6s 9: learn: 0.2741080 test: 0.2898183 best: 0.2898183 (9) total: 2.28s remaining: 15.5s 10: learn: 0.2623956 test: 0.2788346 best: 0.2788346 (10) total: 2.54s remaining: 15.5s 11: learn: 0.2519449 test: 0.2695736 best: 0.2695736 (11) total: 2.79s remaining: 15.3s 12: learn: 0.2433478 test: 0.2617484 best: 0.2617484 (12) total: 3.02s remaining: 15.1s 13: learn: 0.2359291 test: 0.2551026 best: 0.2551026 (13) total: 3.27s remaining: 14.9s 14: learn: 0.2291329 test: 0.2489371 best: 0.2489371 (14) total: 3.51s remaining: 14.8s 15: learn: 0.2230827 test: 0.2439157 best: 0.2439157 (15) total: 3.76s remaining: 14.6s 16: learn: 0.2178366 test: 0.2394881 best: 0.2394881 (16) total: 4.01s remaining: 14.4s 17: learn: 0.2134946 test: 0.2356768 best: 0.2356768 (17) total: 4.24s remaining: 14.1s 18: learn: 0.2095411 test: 0.2330808 best: 0.2330808 (18) total: 4.49s remaining: 13.9s 19: learn: 0.2065958 test: 0.2301892 best: 0.2301892 (19) total: 4.74s remaining: 13.7s 20: learn: 0.2030599 test: 0.2274848 best: 0.2274848 (20) total: 5s remaining: 13.6s 21: learn: 0.2001597 test: 0.2256374 best: 0.2256374 (21) total: 5.24s remaining: 13.3s 22: learn: 0.1972798 test: 0.2234369 best: 0.2234369 (22) total: 5.48s remaining: 13.1s 23: learn: 0.1947971 test: 0.2217108 best: 0.2217108 (23) total: 5.72s remaining: 12.9s 24: learn: 0.1924986 test: 0.2207028 best: 0.2207028 (24) total: 5.97s remaining: 12.7s 25: learn: 0.1905136 test: 0.2195762 best: 0.2195762 (25) total: 6.21s remaining: 12.4s 26: learn: 0.1886146 test: 0.2185456 best: 0.2185456 (26) total: 6.45s remaining: 12.2s 27: learn: 0.1866784 test: 0.2176217 best: 0.2176217 (27) total: 6.68s remaining: 
11.9s 28: learn: 0.1850942 test: 0.2169603 best: 0.2169603 (28) total: 6.93s remaining: 11.7s 29: learn: 0.1833073 test: 0.2159345 best: 0.2159345 (29) total: 7.17s remaining: 11.5s 30: learn: 0.1816665 test: 0.2151545 best: 0.2151545 (30) total: 7.42s remaining: 11.3s 31: learn: 0.1802034 test: 0.2140397 best: 0.2140397 (31) total: 7.66s remaining: 11s 32: learn: 0.1783996 test: 0.2136616 best: 0.2136616 (32) total: 7.91s remaining: 10.8s 33: learn: 0.1769958 test: 0.2127593 best: 0.2127593 (33) total: 8.16s remaining: 10.6s 34: learn: 0.1755769 test: 0.2116862 best: 0.2116862 (34) total: 8.39s remaining: 10.3s 35: learn: 0.1739043 test: 0.2112386 best: 0.2112386 (35) total: 8.65s remaining: 10.1s 36: learn: 0.1723972 test: 0.2110725 best: 0.2110725 (36) total: 8.9s remaining: 9.86s 37: learn: 0.1713538 test: 0.2104642 best: 0.2104642 (37) total: 9.14s remaining: 9.63s 38: learn: 0.1705287 test: 0.2097934 best: 0.2097934 (38) total: 9.38s remaining: 9.38s 39: learn: 0.1697612 test: 0.2094992 best: 0.2094992 (39) total: 9.61s remaining: 9.13s 40: learn: 0.1688228 test: 0.2092641 best: 0.2092641 (40) total: 9.85s remaining: 8.89s 41: learn: 0.1679192 test: 0.2088667 best: 0.2088667 (41) total: 10.1s remaining: 8.64s 42: learn: 0.1668295 test: 0.2085802 best: 0.2085802 (42) total: 10.3s remaining: 8.4s 43: learn: 0.1660875 test: 0.2079786 best: 0.2079786 (43) total: 10.6s remaining: 8.15s 44: learn: 0.1650841 test: 0.2073371 best: 0.2073371 (44) total: 10.8s remaining: 7.92s 45: learn: 0.1644671 test: 0.2071881 best: 0.2071881 (45) total: 11s remaining: 7.68s 46: learn: 0.1638745 test: 0.2069592 best: 0.2069592 (46) total: 11.3s remaining: 7.44s 47: learn: 0.1631774 test: 0.2068409 best: 0.2068409 (47) total: 11.5s remaining: 7.19s 48: learn: 0.1624252 test: 0.2065829 best: 0.2065829 (48) total: 11.7s remaining: 6.95s 49: learn: 0.1617842 test: 0.2063799 best: 0.2063799 (49) total: 12s remaining: 6.71s 50: learn: 0.1610944 test: 0.2059338 best: 0.2059338 (50) total: 
12.2s remaining: 6.47s 51: learn: 0.1598854 test: 0.2058152 best: 0.2058152 (51) total: 12.5s remaining: 6.23s 52: learn: 0.1592358 test: 0.2055146 best: 0.2055146 (52) total: 12.7s remaining: 5.99s 53: learn: 0.1578726 test: 0.2053810 best: 0.2053810 (53) total: 12.9s remaining: 5.75s 54: learn: 0.1569088 test: 0.2050422 best: 0.2050422 (54) total: 13.2s remaining: 5.51s 55: learn: 0.1559594 test: 0.2052480 best: 0.2050422 (54) total: 13.4s remaining: 5.27s 56: learn: 0.1553592 test: 0.2050440 best: 0.2050422 (54) total: 13.7s remaining: 5.03s 57: learn: 0.1546604 test: 0.2043590 best: 0.2043590 (57) total: 13.9s remaining: 4.79s 58: learn: 0.1541071 test: 0.2043009 best: 0.2043009 (58) total: 14.2s remaining: 4.56s 59: learn: 0.1532057 test: 0.2039202 best: 0.2039202 (59) total: 14.4s remaining: 4.31s 60: learn: 0.1527760 test: 0.2038144 best: 0.2038144 (60) total: 14.6s remaining: 4.07s 61: learn: 0.1521297 test: 0.2036384 best: 0.2036384 (61) total: 14.8s remaining: 3.83s 62: learn: 0.1515324 test: 0.2032938 best: 0.2032938 (62) total: 15.1s remaining: 3.59s 63: learn: 0.1512861 test: 0.2033242 best: 0.2032938 (62) total: 15.3s remaining: 3.35s 64: learn: 0.1503911 test: 0.2032351 best: 0.2032351 (64) total: 15.6s remaining: 3.11s 65: learn: 0.1493628 test: 0.2034860 best: 0.2032351 (64) total: 15.8s remaining: 2.87s 66: learn: 0.1486455 test: 0.2035203 best: 0.2032351 (64) total: 16.1s remaining: 2.63s 67: learn: 0.1483354 test: 0.2034665 best: 0.2032351 (64) total: 16.3s remaining: 2.4s 68: learn: 0.1472864 test: 0.2033173 best: 0.2032351 (64) total: 16.5s remaining: 2.16s 69: learn: 0.1464837 test: 0.2034106 best: 0.2032351 (64) total: 16.8s remaining: 1.92s 70: learn: 0.1459136 test: 0.2030282 best: 0.2030282 (70) total: 17s remaining: 1.68s 71: learn: 0.1448795 test: 0.2029602 best: 0.2029602 (71) total: 17.3s remaining: 1.44s 72: learn: 0.1444419 test: 0.2028237 best: 0.2028237 (72) total: 17.5s remaining: 1.2s 73: learn: 0.1441424 test: 0.2027370 best: 
0.2027370 (73) total: 17.8s remaining: 960ms 74: learn: 0.1432780 test: 0.2028096 best: 0.2027370 (73) total: 18s remaining: 720ms 75: learn: 0.1429771 test: 0.2028358 best: 0.2027370 (73) total: 18.2s remaining: 480ms 76: learn: 0.1424739 test: 0.2027590 best: 0.2027370 (73) total: 18.5s remaining: 240ms 77: learn: 0.1417955 test: 0.2028005 best: 0.2027370 (73) total: 18.7s remaining: 0us bestTest = 0.2027369957 bestIteration = 73 Shrink model to first 74 iterations. Trial 39, Fold 5: Log loss = 0.20273699574091272, Average precision = 0.9736236803525101, ROC-AUC = 0.9700538431525556, Elapsed Time = 18.862699500001327 seconds
Optimization Progress: 40%|#### | 40/100 [1:07:32<1:26:43, 86.72s/it]
[CatBoost per-iteration training output trimmed; per-fold summaries retained below.]

Trial 40, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 40, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
bestTest = 0.2290013564, bestIteration = 94
Trial 40, Fold 1: Log loss = 0.22900135640558722, Average precision = 0.9681151514558485, ROC-AUC = 0.9633356215616831, Elapsed Time = 6.310286399999313 seconds

Trial 40, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 40, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[Fold 2 per-iteration output is truncated at this point in the source.]
remaining: 952ms 81: learn: 0.2293675 test: 0.2301734 best: 0.2301734 (81) total: 5.57s remaining: 883ms 82: learn: 0.2290849 test: 0.2298491 best: 0.2298491 (82) total: 5.64s remaining: 816ms 83: learn: 0.2287951 test: 0.2295357 best: 0.2295357 (83) total: 5.71s remaining: 748ms 84: learn: 0.2281583 test: 0.2289148 best: 0.2289148 (84) total: 5.78s remaining: 680ms 85: learn: 0.2279462 test: 0.2287248 best: 0.2287248 (85) total: 5.85s remaining: 612ms 86: learn: 0.2275926 test: 0.2283845 best: 0.2283845 (86) total: 5.92s remaining: 544ms 87: learn: 0.2274124 test: 0.2282521 best: 0.2282521 (87) total: 5.98s remaining: 476ms 88: learn: 0.2272611 test: 0.2281050 best: 0.2281050 (88) total: 6.05s remaining: 408ms 89: learn: 0.2268149 test: 0.2276489 best: 0.2276489 (89) total: 6.12s remaining: 340ms 90: learn: 0.2265297 test: 0.2273448 best: 0.2273448 (90) total: 6.18s remaining: 272ms 91: learn: 0.2263460 test: 0.2271695 best: 0.2271695 (91) total: 6.25s remaining: 204ms 92: learn: 0.2260842 test: 0.2269171 best: 0.2269171 (92) total: 6.32s remaining: 136ms 93: learn: 0.2258014 test: 0.2266041 best: 0.2266041 (93) total: 6.39s remaining: 67.9ms 94: learn: 0.2255642 test: 0.2263710 best: 0.2263710 (94) total: 6.45s remaining: 0us bestTest = 0.2263709889 bestIteration = 94 Trial 40, Fold 2: Log loss = 0.2263709889023547, Average precision = 0.9694279016777742, ROC-AUC = 0.9650430766438609, Elapsed Time = 6.576813600000605 seconds Trial 40, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 40, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 0: learn: 0.6560642 test: 0.6556384 best: 0.6556384 (0) total: 61.1ms remaining: 5.74s 1: learn: 0.6175190 test: 0.6167491 best: 0.6167491 (1) total: 124ms remaining: 5.75s 2: learn: 0.5879547 test: 0.5862893 best: 0.5862893 (2) total: 186ms remaining: 5.69s 3: learn: 0.5642781 test: 0.5630946 best: 0.5630946 (3) total: 247ms remaining: 5.62s 4: learn: 0.5433727 
test: 0.5425102 best: 0.5425102 (4) total: 314ms remaining: 5.66s 5: learn: 0.5155015 test: 0.5144696 best: 0.5144696 (5) total: 377ms remaining: 5.6s 6: learn: 0.4949600 test: 0.4935536 best: 0.4935536 (6) total: 440ms remaining: 5.53s 7: learn: 0.4801163 test: 0.4781109 best: 0.4781109 (7) total: 502ms remaining: 5.46s 8: learn: 0.4632766 test: 0.4611300 best: 0.4611300 (8) total: 564ms remaining: 5.39s 9: learn: 0.4485793 test: 0.4459886 best: 0.4459886 (9) total: 624ms remaining: 5.31s 10: learn: 0.4355177 test: 0.4332901 best: 0.4332901 (10) total: 686ms remaining: 5.24s 11: learn: 0.4171849 test: 0.4147018 best: 0.4147018 (11) total: 747ms remaining: 5.16s 12: learn: 0.4022644 test: 0.3993861 best: 0.3993861 (12) total: 807ms remaining: 5.09s 13: learn: 0.3942673 test: 0.3910713 best: 0.3910713 (13) total: 870ms remaining: 5.03s 14: learn: 0.3850792 test: 0.3815498 best: 0.3815498 (14) total: 932ms remaining: 4.97s 15: learn: 0.3748859 test: 0.3712141 best: 0.3712141 (15) total: 993ms remaining: 4.9s 16: learn: 0.3653428 test: 0.3615816 best: 0.3615816 (16) total: 1.06s remaining: 4.85s 17: learn: 0.3572388 test: 0.3534510 best: 0.3534510 (17) total: 1.12s remaining: 4.78s 18: learn: 0.3510795 test: 0.3474960 best: 0.3474960 (18) total: 1.18s remaining: 4.72s 19: learn: 0.3438501 test: 0.3401431 best: 0.3401431 (19) total: 1.24s remaining: 4.66s 20: learn: 0.3364491 test: 0.3325370 best: 0.3325370 (20) total: 1.31s remaining: 4.61s 21: learn: 0.3299096 test: 0.3259292 best: 0.3259292 (21) total: 1.37s remaining: 4.54s 22: learn: 0.3249573 test: 0.3206557 best: 0.3206557 (22) total: 1.43s remaining: 4.48s 23: learn: 0.3200144 test: 0.3158003 best: 0.3158003 (23) total: 1.49s remaining: 4.42s 24: learn: 0.3150978 test: 0.3110405 best: 0.3110405 (24) total: 1.56s remaining: 4.36s 25: learn: 0.3096505 test: 0.3056184 best: 0.3056184 (25) total: 1.62s remaining: 4.3s 26: learn: 0.3066761 test: 0.3025587 best: 0.3025587 (26) total: 1.68s remaining: 4.24s 27: learn: 
0.3020708 test: 0.2980430 best: 0.2980430 (27) total: 1.75s remaining: 4.18s 28: learn: 0.2977204 test: 0.2935065 best: 0.2935065 (28) total: 1.81s remaining: 4.12s 29: learn: 0.2939450 test: 0.2896808 best: 0.2896808 (29) total: 1.87s remaining: 4.06s 30: learn: 0.2913272 test: 0.2871232 best: 0.2871232 (30) total: 1.94s remaining: 4s 31: learn: 0.2877351 test: 0.2834974 best: 0.2834974 (31) total: 2s remaining: 3.95s 32: learn: 0.2847964 test: 0.2804130 best: 0.2804130 (32) total: 2.07s remaining: 3.89s 33: learn: 0.2829352 test: 0.2784140 best: 0.2784140 (33) total: 2.13s remaining: 3.83s 34: learn: 0.2802389 test: 0.2755788 best: 0.2755788 (34) total: 2.2s remaining: 3.78s 35: learn: 0.2778462 test: 0.2730680 best: 0.2730680 (35) total: 2.27s remaining: 3.73s 36: learn: 0.2756434 test: 0.2708234 best: 0.2708234 (36) total: 2.35s remaining: 3.68s 37: learn: 0.2735894 test: 0.2689137 best: 0.2689137 (37) total: 2.42s remaining: 3.63s 38: learn: 0.2712126 test: 0.2664095 best: 0.2664095 (38) total: 2.49s remaining: 3.57s 39: learn: 0.2690913 test: 0.2642391 best: 0.2642391 (39) total: 2.56s remaining: 3.52s 40: learn: 0.2669694 test: 0.2620213 best: 0.2620213 (40) total: 2.63s remaining: 3.46s 41: learn: 0.2655985 test: 0.2606043 best: 0.2606043 (41) total: 2.69s remaining: 3.4s 42: learn: 0.2642770 test: 0.2592440 best: 0.2592440 (42) total: 2.76s remaining: 3.34s 43: learn: 0.2625173 test: 0.2573859 best: 0.2573859 (43) total: 2.83s remaining: 3.28s 44: learn: 0.2608577 test: 0.2558542 best: 0.2558542 (44) total: 2.9s remaining: 3.22s 45: learn: 0.2596686 test: 0.2546798 best: 0.2546798 (45) total: 2.97s remaining: 3.16s 46: learn: 0.2583669 test: 0.2532030 best: 0.2532030 (46) total: 3.04s remaining: 3.1s 47: learn: 0.2568786 test: 0.2516461 best: 0.2516461 (47) total: 3.1s remaining: 3.04s 48: learn: 0.2557351 test: 0.2504084 best: 0.2504084 (48) total: 3.17s remaining: 2.98s 49: learn: 0.2546539 test: 0.2494194 best: 0.2494194 (49) total: 3.24s remaining: 
2.92s 50: learn: 0.2533853 test: 0.2482414 best: 0.2482414 (50) total: 3.31s remaining: 2.86s 51: learn: 0.2521075 test: 0.2469245 best: 0.2469245 (51) total: 3.38s remaining: 2.8s 52: learn: 0.2508645 test: 0.2456816 best: 0.2456816 (52) total: 3.45s remaining: 2.74s 53: learn: 0.2494446 test: 0.2443203 best: 0.2443203 (53) total: 3.52s remaining: 2.67s 54: learn: 0.2479774 test: 0.2428088 best: 0.2428088 (54) total: 3.59s remaining: 2.61s 55: learn: 0.2472296 test: 0.2420608 best: 0.2420608 (55) total: 3.66s remaining: 2.55s 56: learn: 0.2466242 test: 0.2415046 best: 0.2415046 (56) total: 3.73s remaining: 2.49s 57: learn: 0.2454131 test: 0.2403900 best: 0.2403900 (57) total: 3.79s remaining: 2.42s 58: learn: 0.2442624 test: 0.2392705 best: 0.2392705 (58) total: 3.86s remaining: 2.36s 59: learn: 0.2431668 test: 0.2381046 best: 0.2381046 (59) total: 3.93s remaining: 2.29s 60: learn: 0.2426458 test: 0.2376036 best: 0.2376036 (60) total: 4s remaining: 2.23s 61: learn: 0.2418700 test: 0.2367324 best: 0.2367324 (61) total: 4.07s remaining: 2.16s 62: learn: 0.2409494 test: 0.2357410 best: 0.2357410 (62) total: 4.13s remaining: 2.1s 63: learn: 0.2401980 test: 0.2349489 best: 0.2349489 (63) total: 4.2s remaining: 2.04s 64: learn: 0.2395062 test: 0.2341871 best: 0.2341871 (64) total: 4.27s remaining: 1.97s 65: learn: 0.2386467 test: 0.2332754 best: 0.2332754 (65) total: 4.34s remaining: 1.91s 66: learn: 0.2380590 test: 0.2327449 best: 0.2327449 (66) total: 4.41s remaining: 1.84s 67: learn: 0.2373504 test: 0.2320561 best: 0.2320561 (67) total: 4.48s remaining: 1.78s 68: learn: 0.2368454 test: 0.2314968 best: 0.2314968 (68) total: 4.55s remaining: 1.71s 69: learn: 0.2359031 test: 0.2305867 best: 0.2305867 (69) total: 4.61s remaining: 1.65s 70: learn: 0.2351524 test: 0.2298733 best: 0.2298733 (70) total: 4.68s remaining: 1.58s 71: learn: 0.2347366 test: 0.2293615 best: 0.2293615 (71) total: 4.75s remaining: 1.52s 72: learn: 0.2339095 test: 0.2285555 best: 0.2285555 (72) 
total: 4.82s remaining: 1.45s 73: learn: 0.2335620 test: 0.2282555 best: 0.2282555 (73) total: 4.88s remaining: 1.39s 74: learn: 0.2328099 test: 0.2274799 best: 0.2274799 (74) total: 4.95s remaining: 1.32s 75: learn: 0.2323867 test: 0.2270623 best: 0.2270623 (75) total: 5.02s remaining: 1.25s 76: learn: 0.2321296 test: 0.2268246 best: 0.2268246 (76) total: 5.09s remaining: 1.19s 77: learn: 0.2316273 test: 0.2263376 best: 0.2263376 (77) total: 5.15s remaining: 1.12s 78: learn: 0.2310505 test: 0.2257778 best: 0.2257778 (78) total: 5.22s remaining: 1.06s 79: learn: 0.2305844 test: 0.2253933 best: 0.2253933 (79) total: 5.28s remaining: 991ms 80: learn: 0.2303789 test: 0.2251972 best: 0.2251972 (80) total: 5.35s remaining: 925ms 81: learn: 0.2297299 test: 0.2245309 best: 0.2245309 (81) total: 5.42s remaining: 859ms 82: learn: 0.2295418 test: 0.2243422 best: 0.2243422 (82) total: 5.48s remaining: 793ms 83: learn: 0.2293271 test: 0.2240859 best: 0.2240859 (83) total: 5.55s remaining: 727ms 84: learn: 0.2291039 test: 0.2239149 best: 0.2239149 (84) total: 5.62s remaining: 661ms 85: learn: 0.2288830 test: 0.2237137 best: 0.2237137 (85) total: 5.68s remaining: 595ms 86: learn: 0.2287420 test: 0.2235534 best: 0.2235534 (86) total: 5.75s remaining: 529ms 87: learn: 0.2281998 test: 0.2229600 best: 0.2229600 (87) total: 5.82s remaining: 463ms 88: learn: 0.2277984 test: 0.2226330 best: 0.2226330 (88) total: 5.89s remaining: 397ms 89: learn: 0.2276426 test: 0.2225052 best: 0.2225052 (89) total: 5.95s remaining: 331ms 90: learn: 0.2275049 test: 0.2223997 best: 0.2223997 (90) total: 6.02s remaining: 264ms 91: learn: 0.2270551 test: 0.2219229 best: 0.2219229 (91) total: 6.08s remaining: 198ms 92: learn: 0.2268558 test: 0.2216974 best: 0.2216974 (92) total: 6.15s remaining: 132ms 93: learn: 0.2267112 test: 0.2215829 best: 0.2215829 (93) total: 6.22s remaining: 66.2ms 94: learn: 0.2264221 test: 0.2212509 best: 0.2212509 (94) total: 6.29s remaining: 0us bestTest = 0.221250859 
bestIteration = 94 Trial 40, Fold 3: Log loss = 0.22125085901596958, Average precision = 0.970532873190614, ROC-AUC = 0.9667763149974478, Elapsed Time = 6.420953900000313 seconds Trial 40, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 40, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 0: learn: 0.6548981 test: 0.6544946 best: 0.6544946 (0) total: 59.3ms remaining: 5.57s 1: learn: 0.6133807 test: 0.6129378 best: 0.6129378 (1) total: 121ms remaining: 5.61s 2: learn: 0.5835735 test: 0.5829211 best: 0.5829211 (2) total: 183ms remaining: 5.61s 3: learn: 0.5510658 test: 0.5506415 best: 0.5506415 (3) total: 244ms remaining: 5.55s 4: learn: 0.5232922 test: 0.5227615 best: 0.5227615 (4) total: 309ms remaining: 5.56s 5: learn: 0.5041622 test: 0.5035307 best: 0.5035307 (5) total: 372ms remaining: 5.51s 6: learn: 0.4844816 test: 0.4838726 best: 0.4838726 (6) total: 432ms remaining: 5.43s 7: learn: 0.4696933 test: 0.4689636 best: 0.4689636 (7) total: 492ms remaining: 5.35s 8: learn: 0.4504013 test: 0.4499718 best: 0.4499718 (8) total: 555ms remaining: 5.31s 9: learn: 0.4330679 test: 0.4325640 best: 0.4325640 (9) total: 616ms remaining: 5.24s 10: learn: 0.4223493 test: 0.4219192 best: 0.4219192 (10) total: 679ms remaining: 5.18s 11: learn: 0.4108416 test: 0.4104806 best: 0.4104806 (11) total: 740ms remaining: 5.12s 12: learn: 0.3959191 test: 0.3955266 best: 0.3955266 (12) total: 802ms remaining: 5.06s 13: learn: 0.3849168 test: 0.3842963 best: 0.3842963 (13) total: 864ms remaining: 5s 14: learn: 0.3767629 test: 0.3760315 best: 0.3760315 (14) total: 928ms remaining: 4.95s 15: learn: 0.3695581 test: 0.3690157 best: 0.3690157 (15) total: 989ms remaining: 4.88s 16: learn: 0.3629444 test: 0.3625586 best: 0.3625586 (16) total: 1.05s remaining: 4.82s 17: learn: 0.3551292 test: 0.3548182 best: 0.3548182 (17) total: 1.11s remaining: 4.76s 18: learn: 0.3479760 test: 0.3476103 best: 0.3476103 (18) total: 1.17s 
remaining: 4.69s 19: learn: 0.3414891 test: 0.3411385 best: 0.3411385 (19) total: 1.24s remaining: 4.63s 20: learn: 0.3340438 test: 0.3336276 best: 0.3336276 (20) total: 1.3s remaining: 4.58s 21: learn: 0.3289947 test: 0.3286409 best: 0.3286409 (21) total: 1.36s remaining: 4.52s 22: learn: 0.3225173 test: 0.3221457 best: 0.3221457 (22) total: 1.43s remaining: 4.46s 23: learn: 0.3179049 test: 0.3175124 best: 0.3175124 (23) total: 1.49s remaining: 4.4s 24: learn: 0.3120635 test: 0.3116462 best: 0.3116462 (24) total: 1.55s remaining: 4.34s 25: learn: 0.3073707 test: 0.3069252 best: 0.3069252 (25) total: 1.61s remaining: 4.28s 26: learn: 0.3047714 test: 0.3044305 best: 0.3044305 (26) total: 1.67s remaining: 4.22s 27: learn: 0.3004567 test: 0.3001956 best: 0.3001956 (27) total: 1.74s remaining: 4.16s 28: learn: 0.2963156 test: 0.2960913 best: 0.2960913 (28) total: 1.8s remaining: 4.09s 29: learn: 0.2926679 test: 0.2923444 best: 0.2923444 (29) total: 1.86s remaining: 4.04s 30: learn: 0.2891965 test: 0.2888877 best: 0.2888877 (30) total: 1.93s remaining: 3.98s 31: learn: 0.2866752 test: 0.2866413 best: 0.2866413 (31) total: 1.99s remaining: 3.93s 32: learn: 0.2833618 test: 0.2832598 best: 0.2832598 (32) total: 2.06s remaining: 3.87s 33: learn: 0.2808592 test: 0.2808203 best: 0.2808203 (33) total: 2.13s remaining: 3.82s 34: learn: 0.2783096 test: 0.2782527 best: 0.2782527 (34) total: 2.19s remaining: 3.76s 35: learn: 0.2762117 test: 0.2763563 best: 0.2763563 (35) total: 2.26s remaining: 3.71s 36: learn: 0.2738955 test: 0.2740801 best: 0.2740801 (36) total: 2.33s remaining: 3.66s 37: learn: 0.2721475 test: 0.2723137 best: 0.2723137 (37) total: 2.4s remaining: 3.6s 38: learn: 0.2703694 test: 0.2705441 best: 0.2705441 (38) total: 2.47s remaining: 3.55s 39: learn: 0.2686651 test: 0.2688604 best: 0.2688604 (39) total: 2.54s remaining: 3.49s 40: learn: 0.2671235 test: 0.2673234 best: 0.2673234 (40) total: 2.61s remaining: 3.44s 41: learn: 0.2657867 test: 0.2660585 best: 
0.2660585 (41) total: 2.68s remaining: 3.38s 42: learn: 0.2637467 test: 0.2641417 best: 0.2641417 (42) total: 2.75s remaining: 3.32s 43: learn: 0.2618994 test: 0.2622723 best: 0.2622723 (43) total: 2.81s remaining: 3.26s 44: learn: 0.2602183 test: 0.2607018 best: 0.2607018 (44) total: 2.88s remaining: 3.2s 45: learn: 0.2581445 test: 0.2586103 best: 0.2586103 (45) total: 2.95s remaining: 3.14s 46: learn: 0.2566264 test: 0.2571434 best: 0.2571434 (46) total: 3.01s remaining: 3.08s 47: learn: 0.2549681 test: 0.2555902 best: 0.2555902 (47) total: 3.08s remaining: 3.02s 48: learn: 0.2538531 test: 0.2545519 best: 0.2545519 (48) total: 3.15s remaining: 2.96s 49: learn: 0.2529063 test: 0.2535729 best: 0.2535729 (49) total: 3.22s remaining: 2.9s 50: learn: 0.2518389 test: 0.2526163 best: 0.2526163 (50) total: 3.28s remaining: 2.83s 51: learn: 0.2506175 test: 0.2513963 best: 0.2513963 (51) total: 3.35s remaining: 2.77s 52: learn: 0.2495125 test: 0.2503464 best: 0.2503464 (52) total: 3.42s remaining: 2.71s 53: learn: 0.2482813 test: 0.2491788 best: 0.2491788 (53) total: 3.49s remaining: 2.65s 54: learn: 0.2469625 test: 0.2478549 best: 0.2478549 (54) total: 3.56s remaining: 2.59s 55: learn: 0.2457640 test: 0.2466991 best: 0.2466991 (55) total: 3.63s remaining: 2.53s 56: learn: 0.2449024 test: 0.2458348 best: 0.2458348 (56) total: 3.69s remaining: 2.46s 57: learn: 0.2443472 test: 0.2453280 best: 0.2453280 (57) total: 3.76s remaining: 2.4s 58: learn: 0.2434303 test: 0.2444743 best: 0.2444743 (58) total: 3.83s remaining: 2.34s 59: learn: 0.2430298 test: 0.2440208 best: 0.2440208 (59) total: 3.9s remaining: 2.27s 60: learn: 0.2420849 test: 0.2431184 best: 0.2431184 (60) total: 3.96s remaining: 2.21s 61: learn: 0.2415947 test: 0.2427628 best: 0.2427628 (61) total: 4.03s remaining: 2.15s 62: learn: 0.2409462 test: 0.2422359 best: 0.2422359 (62) total: 4.1s remaining: 2.08s 63: learn: 0.2402775 test: 0.2416223 best: 0.2416223 (63) total: 4.17s remaining: 2.02s 64: learn: 0.2392709 
test: 0.2405403 best: 0.2405403 (64) total: 4.23s remaining: 1.95s 65: learn: 0.2386514 test: 0.2399731 best: 0.2399731 (65) total: 4.3s remaining: 1.89s 66: learn: 0.2379658 test: 0.2393278 best: 0.2393278 (66) total: 4.37s remaining: 1.83s 67: learn: 0.2369943 test: 0.2383948 best: 0.2383948 (67) total: 4.44s remaining: 1.76s 68: learn: 0.2362377 test: 0.2376423 best: 0.2376423 (68) total: 4.5s remaining: 1.7s 69: learn: 0.2354839 test: 0.2368809 best: 0.2368809 (69) total: 4.57s remaining: 1.63s 70: learn: 0.2346816 test: 0.2361288 best: 0.2361288 (70) total: 4.64s remaining: 1.57s 71: learn: 0.2341420 test: 0.2356637 best: 0.2356637 (71) total: 4.71s remaining: 1.5s 72: learn: 0.2339023 test: 0.2354382 best: 0.2354382 (72) total: 4.77s remaining: 1.44s 73: learn: 0.2336559 test: 0.2351905 best: 0.2351905 (73) total: 4.84s remaining: 1.37s 74: learn: 0.2330508 test: 0.2345459 best: 0.2345459 (74) total: 4.91s remaining: 1.31s 75: learn: 0.2324125 test: 0.2340207 best: 0.2340207 (75) total: 4.97s remaining: 1.24s 76: learn: 0.2320073 test: 0.2336055 best: 0.2336055 (76) total: 5.04s remaining: 1.18s 77: learn: 0.2314113 test: 0.2330181 best: 0.2330181 (77) total: 5.1s remaining: 1.11s 78: learn: 0.2310897 test: 0.2327351 best: 0.2327351 (78) total: 5.17s remaining: 1.05s 79: learn: 0.2306993 test: 0.2323927 best: 0.2323927 (79) total: 5.24s remaining: 982ms 80: learn: 0.2301610 test: 0.2318961 best: 0.2318961 (80) total: 5.3s remaining: 916ms 81: learn: 0.2297527 test: 0.2315625 best: 0.2315625 (81) total: 5.37s remaining: 851ms 82: learn: 0.2295390 test: 0.2313652 best: 0.2313652 (82) total: 5.43s remaining: 786ms 83: learn: 0.2293769 test: 0.2311384 best: 0.2311384 (83) total: 5.5s remaining: 720ms 84: learn: 0.2286490 test: 0.2304440 best: 0.2304440 (84) total: 5.57s remaining: 655ms 85: learn: 0.2285437 test: 0.2304070 best: 0.2304070 (85) total: 5.63s remaining: 589ms 86: learn: 0.2282330 test: 0.2301991 best: 0.2301991 (86) total: 5.7s remaining: 524ms 87: 
learn: 0.2275185 test: 0.2294475 best: 0.2294475 (87) total: 5.76s remaining: 459ms 88: learn: 0.2272462 test: 0.2292154 best: 0.2292154 (88) total: 5.83s remaining: 393ms 89: learn: 0.2270586 test: 0.2290746 best: 0.2290746 (89) total: 5.9s remaining: 328ms 90: learn: 0.2266358 test: 0.2286966 best: 0.2286966 (90) total: 5.97s remaining: 262ms 91: learn: 0.2261864 test: 0.2282191 best: 0.2282191 (91) total: 6.03s remaining: 197ms 92: learn: 0.2260651 test: 0.2280955 best: 0.2280955 (92) total: 6.1s remaining: 131ms 93: learn: 0.2259016 test: 0.2280150 best: 0.2280150 (93) total: 6.17s remaining: 65.6ms 94: learn: 0.2258166 test: 0.2280020 best: 0.2280020 (94) total: 6.23s remaining: 0us bestTest = 0.2280019958 bestIteration = 94 Trial 40, Fold 4: Log loss = 0.22800199576045335, Average precision = 0.9684660577259117, ROC-AUC = 0.9632701047471215, Elapsed Time = 6.358377000000473 seconds Trial 40, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 40, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0: learn: 0.6595833 test: 0.6605801 best: 0.6605801 (0) total: 59.9ms remaining: 5.63s 1: learn: 0.6245109 test: 0.6263132 best: 0.6263132 (1) total: 120ms remaining: 5.56s 2: learn: 0.5978626 test: 0.6002289 best: 0.6002289 (2) total: 181ms remaining: 5.56s 3: learn: 0.5752844 test: 0.5779631 best: 0.5779631 (3) total: 242ms remaining: 5.5s 4: learn: 0.5529517 test: 0.5562943 best: 0.5562943 (4) total: 306ms remaining: 5.51s 5: learn: 0.5339874 test: 0.5379370 best: 0.5379370 (5) total: 368ms remaining: 5.46s 6: learn: 0.5084266 test: 0.5126539 best: 0.5126539 (6) total: 431ms remaining: 5.42s 7: learn: 0.4898426 test: 0.4947626 best: 0.4947626 (7) total: 493ms remaining: 5.36s 8: learn: 0.4723545 test: 0.4775591 best: 0.4775591 (8) total: 555ms remaining: 5.3s 9: learn: 0.4565435 test: 0.4621599 best: 0.4621599 (9) total: 618ms remaining: 5.26s 10: learn: 0.4451414 test: 0.4509637 best: 0.4509637 (10) 
total: 682ms remaining: 5.21s 11: learn: 0.4281808 test: 0.4342048 best: 0.4342048 (11) total: 743ms remaining: 5.14s 12: learn: 0.4184431 test: 0.4249331 best: 0.4249331 (12) total: 806ms remaining: 5.08s 13: learn: 0.4071081 test: 0.4135150 best: 0.4135150 (13) total: 868ms remaining: 5.02s 14: learn: 0.3926788 test: 0.3992361 best: 0.3992361 (14) total: 929ms remaining: 4.96s 15: learn: 0.3837259 test: 0.3905130 best: 0.3905130 (15) total: 992ms remaining: 4.9s 16: learn: 0.3750595 test: 0.3820393 best: 0.3820393 (16) total: 1.05s remaining: 4.84s 17: learn: 0.3645465 test: 0.3718105 best: 0.3718105 (17) total: 1.11s remaining: 4.77s 18: learn: 0.3550727 test: 0.3626581 best: 0.3626581 (18) total: 1.18s remaining: 4.71s 19: learn: 0.3467873 test: 0.3547803 best: 0.3547803 (19) total: 1.24s remaining: 4.64s 20: learn: 0.3404340 test: 0.3486217 best: 0.3486217 (20) total: 1.3s remaining: 4.58s 21: learn: 0.3320645 test: 0.3405104 best: 0.3405104 (21) total: 1.36s remaining: 4.51s 22: learn: 0.3247352 test: 0.3333848 best: 0.3333848 (22) total: 1.42s remaining: 4.46s 23: learn: 0.3190681 test: 0.3278579 best: 0.3278579 (23) total: 1.49s remaining: 4.4s 24: learn: 0.3139722 test: 0.3230321 best: 0.3230321 (24) total: 1.55s remaining: 4.33s 25: learn: 0.3090834 test: 0.3184028 best: 0.3184028 (25) total: 1.61s remaining: 4.28s 26: learn: 0.3049615 test: 0.3143447 best: 0.3143447 (26) total: 1.68s remaining: 4.22s 27: learn: 0.3004348 test: 0.3101269 best: 0.3101269 (27) total: 1.74s remaining: 4.16s 28: learn: 0.2971846 test: 0.3068108 best: 0.3068108 (28) total: 1.8s remaining: 4.1s 29: learn: 0.2935383 test: 0.3032552 best: 0.3032552 (29) total: 1.86s remaining: 4.04s 30: learn: 0.2898845 test: 0.2997187 best: 0.2997187 (30) total: 1.93s remaining: 3.98s 31: learn: 0.2869827 test: 0.2967429 best: 0.2967429 (31) total: 1.99s remaining: 3.92s 32: learn: 0.2839054 test: 0.2937769 best: 0.2937769 (32) total: 2.06s remaining: 3.87s 33: learn: 0.2809406 test: 0.2910533 
best: 0.2910533 (33) total: 2.12s remaining: 3.81s 34: learn: 0.2784504 test: 0.2885206 best: 0.2885206 (34) total: 2.19s remaining: 3.75s 35: learn: 0.2766220 test: 0.2868058 best: 0.2868058 (35) total: 2.26s remaining: 3.7s 36: learn: 0.2742104 test: 0.2844921 best: 0.2844921 (36) total: 2.33s remaining: 3.65s 37: learn: 0.2719154 test: 0.2823532 best: 0.2823532 (37) total: 2.4s remaining: 3.6s 38: learn: 0.2705077 test: 0.2809498 best: 0.2809498 (38) total: 2.47s remaining: 3.55s 39: learn: 0.2686356 test: 0.2790170 best: 0.2790170 (39) total: 2.54s remaining: 3.49s 40: learn: 0.2667973 test: 0.2770869 best: 0.2770869 (40) total: 2.6s remaining: 3.43s 41: learn: 0.2651833 test: 0.2754014 best: 0.2754014 (41) total: 2.67s remaining: 3.38s 42: learn: 0.2634251 test: 0.2737052 best: 0.2737052 (42) total: 2.74s remaining: 3.32s 43: learn: 0.2625952 test: 0.2728129 best: 0.2728129 (43) total: 2.81s remaining: 3.26s 44: learn: 0.2615318 test: 0.2716937 best: 0.2716937 (44) total: 2.88s remaining: 3.2s 45: learn: 0.2597747 test: 0.2701210 best: 0.2701210 (45) total: 2.95s remaining: 3.14s 46: learn: 0.2582854 test: 0.2686196 best: 0.2686196 (46) total: 3.02s remaining: 3.08s 47: learn: 0.2566114 test: 0.2669166 best: 0.2669166 (47) total: 3.08s remaining: 3.02s 48: learn: 0.2548229 test: 0.2652564 best: 0.2652564 (48) total: 3.15s remaining: 2.96s 49: learn: 0.2532644 test: 0.2637989 best: 0.2637989 (49) total: 3.22s remaining: 2.9s 50: learn: 0.2517916 test: 0.2624073 best: 0.2624073 (50) total: 3.28s remaining: 2.83s 51: learn: 0.2505427 test: 0.2612319 best: 0.2612319 (51) total: 3.35s remaining: 2.77s 52: learn: 0.2490586 test: 0.2597246 best: 0.2597246 (52) total: 3.42s remaining: 2.71s 53: learn: 0.2477025 test: 0.2584396 best: 0.2584396 (53) total: 3.49s remaining: 2.65s 54: learn: 0.2466903 test: 0.2575614 best: 0.2575614 (54) total: 3.56s remaining: 2.58s 55: learn: 0.2458617 test: 0.2567295 best: 0.2567295 (55) total: 3.63s remaining: 2.52s 56: learn: 
0.2447531 test: 0.2556396 best: 0.2556396 (56) total: 3.69s remaining: 2.46s 57: learn: 0.2440513 test: 0.2549460 best: 0.2549460 (57) total: 3.76s remaining: 2.4s 58: learn: 0.2433170 test: 0.2542131 best: 0.2542131 (58) total: 3.83s remaining: 2.33s 59: learn: 0.2423195 test: 0.2532707 best: 0.2532707 (59) total: 3.9s remaining: 2.27s 60: learn: 0.2415360 test: 0.2525246 best: 0.2525246 (60) total: 3.96s remaining: 2.21s 61: learn: 0.2409244 test: 0.2519040 best: 0.2519040 (61) total: 4.03s remaining: 2.15s 62: learn: 0.2402360 test: 0.2512990 best: 0.2512990 (62) total: 4.1s remaining: 2.08s 63: learn: 0.2398883 test: 0.2509964 best: 0.2509964 (63) total: 4.16s remaining: 2.02s 64: learn: 0.2391124 test: 0.2502862 best: 0.2502862 (64) total: 4.23s remaining: 1.95s 65: learn: 0.2380702 test: 0.2493625 best: 0.2493625 (65) total: 4.3s remaining: 1.89s 66: learn: 0.2372263 test: 0.2484866 best: 0.2484866 (66) total: 4.36s remaining: 1.82s 67: learn: 0.2366709 test: 0.2478939 best: 0.2478939 (67) total: 4.43s remaining: 1.76s 68: learn: 0.2355915 test: 0.2467506 best: 0.2467506 (68) total: 4.5s remaining: 1.7s 69: learn: 0.2346773 test: 0.2459544 best: 0.2459544 (69) total: 4.57s remaining: 1.63s 70: learn: 0.2342090 test: 0.2455848 best: 0.2455848 (70) total: 4.64s remaining: 1.57s 71: learn: 0.2337939 test: 0.2452960 best: 0.2452960 (71) total: 4.7s remaining: 1.5s 72: learn: 0.2330301 test: 0.2445659 best: 0.2445659 (72) total: 4.77s remaining: 1.44s 73: learn: 0.2327294 test: 0.2442293 best: 0.2442293 (73) total: 4.84s remaining: 1.37s 74: learn: 0.2315475 test: 0.2431797 best: 0.2431797 (74) total: 4.9s remaining: 1.31s 75: learn: 0.2308595 test: 0.2423824 best: 0.2423824 (75) total: 4.97s remaining: 1.24s 76: learn: 0.2303924 test: 0.2419038 best: 0.2419038 (76) total: 5.04s remaining: 1.18s 77: learn: 0.2297102 test: 0.2412015 best: 0.2412015 (77) total: 5.1s remaining: 1.11s 78: learn: 0.2294327 test: 0.2409048 best: 0.2409048 (78) total: 5.17s remaining: 
1.05s 79: learn: 0.2292718 test: 0.2407983 best: 0.2407983 (79) total: 5.23s remaining: 981ms 80: learn: 0.2287824 test: 0.2402820 best: 0.2402820 (80) total: 5.3s remaining: 916ms 81: learn: 0.2283737 test: 0.2398967 best: 0.2398967 (81) total: 5.36s remaining: 850ms 82: learn: 0.2281477 test: 0.2397036 best: 0.2397036 (82) total: 5.43s remaining: 785ms 83: learn: 0.2279342 test: 0.2395012 best: 0.2395012 (83) total: 5.49s remaining: 719ms 84: learn: 0.2275580 test: 0.2391622 best: 0.2391622 (84) total: 5.56s remaining: 654ms 85: learn: 0.2273961 test: 0.2390777 best: 0.2390777 (85) total: 5.62s remaining: 589ms 86: learn: 0.2270790 test: 0.2387924 best: 0.2387924 (86) total: 5.69s remaining: 523ms 87: learn: 0.2268114 test: 0.2385052 best: 0.2385052 (87) total: 5.76s remaining: 458ms 88: learn: 0.2266310 test: 0.2383845 best: 0.2383845 (88) total: 5.82s remaining: 393ms 89: learn: 0.2262644 test: 0.2381007 best: 0.2381007 (89) total: 5.89s remaining: 327ms 90: learn: 0.2254780 test: 0.2373482 best: 0.2373482 (90) total: 5.96s remaining: 262ms 91: learn: 0.2247583 test: 0.2366178 best: 0.2366178 (91) total: 6.03s remaining: 196ms 92: learn: 0.2245986 test: 0.2365103 best: 0.2365103 (92) total: 6.09s remaining: 131ms 93: learn: 0.2243291 test: 0.2362923 best: 0.2362923 (93) total: 6.16s remaining: 65.5ms 94: learn: 0.2241443 test: 0.2361444 best: 0.2361444 (94) total: 6.22s remaining: 0us bestTest = 0.2361444488 bestIteration = 94 Trial 40, Fold 5: Log loss = 0.23614444883432828, Average precision = 0.96622342609293, ROC-AUC = 0.9613915983873066, Elapsed Time = 6.3494046999985585 seconds
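The per-fold "Log loss" values reported here are the mean binary cross-entropy on the validation fold (CatBoost's `Logloss` metric; the notebook itself computes it via `sklearn.metrics.log_loss`). As a self-contained illustration of what that number measures, a minimal pure-Python sketch (the function name and toy data are illustrative, not from the notebook):

```python
import math

def binary_log_loss(y_true, y_prob, eps=1e-15):
    """Mean negative log-likelihood for binary labels,
    the quantity CatBoost reports as 'Logloss' per fold."""
    total = 0.0
    for y, p in zip(y_true, y_prob):
        p = min(max(p, eps), 1 - eps)  # clip to avoid log(0)
        total += y * math.log(p) + (1 - y) * math.log(1 - p)
    return -total / len(y_true)

# Confident predictions on correct labels give a low loss:
print(binary_log_loss([1, 0, 1, 0], [0.9, 0.1, 0.8, 0.2]))  # ~0.1643
```

Lower is better; a value around 0.22-0.24, as seen for Trial 40's folds, corresponds to fairly confident, well-calibrated predictions on this near-balanced validation data.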
Optimization Progress: 41%|####1 | 41/100 [1:08:12<1:11:28, 72.69s/it]
Trial 41, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0398
Trial 41, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0035
[65 per-iteration CatBoost learn/test lines omitted] bestTest = 0.3316168701, bestIteration = 64
Trial 41, Fold 1: Log loss = 0.3317, Average precision = 0.9584, ROC-AUC = 0.9542, Elapsed Time = 1.65 seconds

Trial 41, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0236
Trial 41, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0689
[65 per-iteration CatBoost learn/test lines omitted] bestTest = 0.3317839282, bestIteration = 64
Trial 41, Fold 2: Log loss = 0.3318, Average precision = 0.9594, ROC-AUC = 0.9567, Elapsed Time = 1.60 seconds

Trial 41, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.0346
Trial 41, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235
[65 per-iteration CatBoost learn/test lines omitted] bestTest = 0.3314601398, bestIteration = 64
Trial 41, Fold 3: Log loss = 0.3317, Average precision = 0.9616, ROC-AUC = 0.9589, Elapsed Time = 1.63 seconds

Trial 41, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0297
Trial 41, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0434
[65 per-iteration CatBoost learn/test lines omitted] bestTest = 0.3329169814, bestIteration = 64
Trial 41, Fold 4: Log loss = 0.3330, Average precision = 0.9601, ROC-AUC = 0.9560, Elapsed Time = 1.64 seconds

Trial 41, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0345
Trial 41, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0242
[65 per-iteration CatBoost learn/test lines omitted] bestTest = 0.3384090616, bestIteration = 64
Trial 41, Fold 5: Log loss = 0.3384, Average precision = 0.9561, ROC-AUC = 0.9514, Elapsed Time = 1.63 seconds
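The per-fold summary lines above can be aggregated into a cross-validation score for the trial. A minimal sketch (the notebook itself persists these metrics to the `df_metrics_ClassifierX.csv` output; the regex below assumes the exact log-line format shown above and is illustrative only):

```python
import re
import statistics

# Per-fold summary lines in the format emitted by the training loop above.
log = """\
Trial 41, Fold 1: Log loss = 0.33168902711184856, Average precision = 0.9583766064905149, ROC-AUC = 0.9541974950172781
Trial 41, Fold 2: Log loss = 0.3318118362532073, Average precision = 0.9594057473870764, ROC-AUC = 0.9566941411090517
Trial 41, Fold 3: Log loss = 0.3316725370755788, Average precision = 0.9615542151513036, ROC-AUC = 0.9589315359815469
Trial 41, Fold 4: Log loss = 0.33297531728255003, Average precision = 0.96014019612667, ROC-AUC = 0.9559887133963294
Trial 41, Fold 5: Log loss = 0.33841465723965264, Average precision = 0.9561260913492311, ROC-AUC = 0.9514392731731787
"""

pattern = re.compile(
    r"Trial (\d+), Fold (\d+): Log loss = ([\d.]+), "
    r"Average precision = ([\d.]+), ROC-AUC = ([\d.]+)"
)

rows = [
    {"trial": int(t), "fold": int(f),
     "log_loss": float(ll), "avg_precision": float(ap), "roc_auc": float(auc)}
    for t, f, ll, ap, auc in pattern.findall(log)
]

# Mean across the 5 StratifiedGroupKFold folds -- the quantity Optuna optimizes per trial.
mean_ll = statistics.mean(r["log_loss"] for r in rows)
mean_auc = statistics.mean(r["roc_auc"] for r in rows)
print(f"Trial 41 CV mean log loss = {mean_ll:.4f}, mean ROC-AUC = {mean_auc:.4f}")
# → Trial 41 CV mean log loss = 0.3333, mean ROC-AUC = 0.9555
```

Note that CatBoost's per-iteration learn/test output can be silenced entirely via the `verbose`/`logging_level` fit options, which keeps notebook outputs to the summary lines parsed here.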
Optimization Progress: 42%|####2 | 42/100 [1:08:28<53:51, 55.72s/it]
Trial 42, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0398
Trial 42, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0035
[per-iteration CatBoost learn/test lines omitted; log truncated mid-training at iteration 89]
test: 0.2699003 best: 0.2699003 (89) total: 24.3s remaining: 2.16s 90: learn: 0.2579425 test: 0.2686483 best: 0.2686483 (90) total: 24.5s remaining: 1.89s 91: learn: 0.2565667 test: 0.2673583 best: 0.2673583 (91) total: 24.8s remaining: 1.61s 92: learn: 0.2552880 test: 0.2661796 best: 0.2661796 (92) total: 25s remaining: 1.34s 93: learn: 0.2539267 test: 0.2649071 best: 0.2649071 (93) total: 25.3s remaining: 1.07s 94: learn: 0.2528352 test: 0.2639432 best: 0.2639432 (94) total: 25.6s remaining: 807ms 95: learn: 0.2516303 test: 0.2628180 best: 0.2628180 (95) total: 25.8s remaining: 537ms 96: learn: 0.2505425 test: 0.2619200 best: 0.2619200 (96) total: 26s remaining: 269ms 97: learn: 0.2493141 test: 0.2608320 best: 0.2608320 (97) total: 26.3s remaining: 0us bestTest = 0.2608320017 bestIteration = 97 Trial 42, Fold 1: Log loss = 0.2608320016904406, Average precision = 0.972556999025177, ROC-AUC = 0.968024921375757, Elapsed Time = 26.434395100000984 seconds Trial 42, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 42, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986 0: learn: 0.6796812 test: 0.6798537 best: 0.6798537 (0) total: 260ms remaining: 25.3s 1: learn: 0.6667428 test: 0.6670460 best: 0.6670460 (1) total: 529ms remaining: 25.4s 2: learn: 0.6541239 test: 0.6545730 best: 0.6545730 (2) total: 791ms remaining: 25s 3: learn: 0.6423711 test: 0.6428684 best: 0.6428684 (3) total: 1.06s remaining: 24.8s 4: learn: 0.6306046 test: 0.6311609 best: 0.6311609 (4) total: 1.3s remaining: 24.2s 5: learn: 0.6191641 test: 0.6198325 best: 0.6198325 (5) total: 1.56s remaining: 24s 6: learn: 0.6082645 test: 0.6091399 best: 0.6091399 (6) total: 1.81s remaining: 23.6s 7: learn: 0.5975131 test: 0.5984878 best: 0.5984878 (7) total: 2.09s remaining: 23.5s 8: learn: 0.5870125 test: 0.5880127 best: 0.5880127 (8) total: 2.35s remaining: 23.3s 9: learn: 0.5767511 test: 0.5777264 best: 0.5777264 (9) total: 2.63s remaining: 
23.1s 10: learn: 0.5669034 test: 0.5679243 best: 0.5679243 (10) total: 2.9s remaining: 22.9s 11: learn: 0.5574512 test: 0.5585552 best: 0.5585552 (11) total: 3.15s remaining: 22.6s 12: learn: 0.5481549 test: 0.5493140 best: 0.5493140 (12) total: 3.42s remaining: 22.4s 13: learn: 0.5395274 test: 0.5407083 best: 0.5407083 (13) total: 3.7s remaining: 22.2s 14: learn: 0.5304995 test: 0.5317552 best: 0.5317552 (14) total: 3.95s remaining: 21.9s 15: learn: 0.5225523 test: 0.5238432 best: 0.5238432 (15) total: 4.2s remaining: 21.5s 16: learn: 0.5142305 test: 0.5156523 best: 0.5156523 (16) total: 4.48s remaining: 21.3s 17: learn: 0.5059491 test: 0.5075017 best: 0.5075017 (17) total: 4.72s remaining: 21s 18: learn: 0.4981414 test: 0.4997719 best: 0.4997719 (18) total: 4.98s remaining: 20.7s 19: learn: 0.4906138 test: 0.4922747 best: 0.4922747 (19) total: 5.23s remaining: 20.4s 20: learn: 0.4831893 test: 0.4849932 best: 0.4849932 (20) total: 5.49s remaining: 20.1s 21: learn: 0.4763427 test: 0.4782917 best: 0.4782917 (21) total: 5.76s remaining: 19.9s 22: learn: 0.4692952 test: 0.4712693 best: 0.4712693 (22) total: 6.03s remaining: 19.7s 23: learn: 0.4623041 test: 0.4643265 best: 0.4643265 (23) total: 6.29s remaining: 19.4s 24: learn: 0.4556609 test: 0.4577552 best: 0.4577552 (24) total: 6.55s remaining: 19.1s 25: learn: 0.4493221 test: 0.4514612 best: 0.4514612 (25) total: 6.81s remaining: 18.9s 26: learn: 0.4432552 test: 0.4454570 best: 0.4454570 (26) total: 7.08s remaining: 18.6s 27: learn: 0.4378327 test: 0.4401821 best: 0.4401821 (27) total: 7.34s remaining: 18.3s 28: learn: 0.4321845 test: 0.4346227 best: 0.4346227 (28) total: 7.58s remaining: 18s 29: learn: 0.4267460 test: 0.4292658 best: 0.4292658 (29) total: 7.85s remaining: 17.8s 30: learn: 0.4210860 test: 0.4236506 best: 0.4236506 (30) total: 8.12s remaining: 17.5s 31: learn: 0.4157880 test: 0.4184630 best: 0.4184630 (31) total: 8.37s remaining: 17.3s 32: learn: 0.4105557 test: 0.4132871 best: 0.4132871 (32) total: 
8.63s remaining: 17s 33: learn: 0.4055079 test: 0.4083441 best: 0.4083441 (33) total: 8.9s remaining: 16.8s 34: learn: 0.4007264 test: 0.4036432 best: 0.4036432 (34) total: 9.16s remaining: 16.5s 35: learn: 0.3957825 test: 0.3987778 best: 0.3987778 (35) total: 9.42s remaining: 16.2s 36: learn: 0.3911055 test: 0.3941875 best: 0.3941875 (36) total: 9.69s remaining: 16s 37: learn: 0.3865436 test: 0.3896793 best: 0.3896793 (37) total: 9.96s remaining: 15.7s 38: learn: 0.3826131 test: 0.3858570 best: 0.3858570 (38) total: 10.2s remaining: 15.5s 39: learn: 0.3784928 test: 0.3817796 best: 0.3817796 (39) total: 10.5s remaining: 15.2s 40: learn: 0.3746685 test: 0.3780592 best: 0.3780592 (40) total: 10.8s remaining: 14.9s 41: learn: 0.3708210 test: 0.3742585 best: 0.3742585 (41) total: 11s remaining: 14.7s 42: learn: 0.3670851 test: 0.3705864 best: 0.3705864 (42) total: 11.3s remaining: 14.4s 43: learn: 0.3634266 test: 0.3669347 best: 0.3669347 (43) total: 11.5s remaining: 14.2s 44: learn: 0.3596210 test: 0.3632596 best: 0.3632596 (44) total: 11.8s remaining: 13.9s 45: learn: 0.3557504 test: 0.3595504 best: 0.3595504 (45) total: 12.1s remaining: 13.6s 46: learn: 0.3521553 test: 0.3559916 best: 0.3559916 (46) total: 12.3s remaining: 13.4s 47: learn: 0.3487109 test: 0.3526107 best: 0.3526107 (47) total: 12.6s remaining: 13.1s 48: learn: 0.3455678 test: 0.3495830 best: 0.3495830 (48) total: 12.8s remaining: 12.8s 49: learn: 0.3423179 test: 0.3464546 best: 0.3464546 (49) total: 13.1s remaining: 12.5s 50: learn: 0.3391574 test: 0.3433352 best: 0.3433352 (50) total: 13.3s remaining: 12.3s 51: learn: 0.3360612 test: 0.3403393 best: 0.3403393 (51) total: 13.6s remaining: 12s 52: learn: 0.3332357 test: 0.3375417 best: 0.3375417 (52) total: 13.9s remaining: 11.8s 53: learn: 0.3303263 test: 0.3346626 best: 0.3346626 (53) total: 14.1s remaining: 11.5s 54: learn: 0.3272713 test: 0.3316395 best: 0.3316395 (54) total: 14.4s remaining: 11.2s 55: learn: 0.3242930 test: 0.3287401 best: 
0.3287401 (55) total: 14.7s remaining: 11s 56: learn: 0.3213555 test: 0.3258641 best: 0.3258641 (56) total: 14.9s remaining: 10.7s 57: learn: 0.3187792 test: 0.3233733 best: 0.3233733 (57) total: 15.2s remaining: 10.5s 58: learn: 0.3165084 test: 0.3211912 best: 0.3211912 (58) total: 15.4s remaining: 10.2s 59: learn: 0.3141179 test: 0.3188639 best: 0.3188639 (59) total: 15.7s remaining: 9.95s 60: learn: 0.3116817 test: 0.3165021 best: 0.3165021 (60) total: 16s remaining: 9.69s 61: learn: 0.3092076 test: 0.3141001 best: 0.3141001 (61) total: 16.2s remaining: 9.42s 62: learn: 0.3067999 test: 0.3117599 best: 0.3117599 (62) total: 16.5s remaining: 9.16s 63: learn: 0.3046307 test: 0.3096712 best: 0.3096712 (63) total: 16.7s remaining: 8.89s 64: learn: 0.3025787 test: 0.3076414 best: 0.3076414 (64) total: 17s remaining: 8.63s 65: learn: 0.3002758 test: 0.3054006 best: 0.3054006 (65) total: 17.2s remaining: 8.36s 66: learn: 0.2982278 test: 0.3034584 best: 0.3034584 (66) total: 17.5s remaining: 8.1s 67: learn: 0.2960492 test: 0.3014535 best: 0.3014535 (67) total: 17.8s remaining: 7.83s 68: learn: 0.2941312 test: 0.2995797 best: 0.2995797 (68) total: 18s remaining: 7.57s 69: learn: 0.2920062 test: 0.2974990 best: 0.2974990 (69) total: 18.3s remaining: 7.31s 70: learn: 0.2900748 test: 0.2956409 best: 0.2956409 (70) total: 18.5s remaining: 7.04s 71: learn: 0.2879933 test: 0.2936423 best: 0.2936423 (71) total: 18.8s remaining: 6.78s 72: learn: 0.2863093 test: 0.2920275 best: 0.2920275 (72) total: 19s remaining: 6.52s 73: learn: 0.2845079 test: 0.2902491 best: 0.2902491 (73) total: 19.3s remaining: 6.26s 74: learn: 0.2828356 test: 0.2886682 best: 0.2886682 (74) total: 19.6s remaining: 6s 75: learn: 0.2813911 test: 0.2872690 best: 0.2872690 (75) total: 19.8s remaining: 5.73s 76: learn: 0.2796134 test: 0.2856129 best: 0.2856129 (76) total: 20.1s remaining: 5.47s 77: learn: 0.2780214 test: 0.2840538 best: 0.2840538 (77) total: 20.3s remaining: 5.21s 78: learn: 0.2763227 test: 
0.2824110 best: 0.2824110 (78) total: 20.6s remaining: 4.95s 79: learn: 0.2747382 test: 0.2808841 best: 0.2808841 (79) total: 20.8s remaining: 4.68s 80: learn: 0.2732349 test: 0.2793570 best: 0.2793570 (80) total: 21.1s remaining: 4.42s 81: learn: 0.2716789 test: 0.2778757 best: 0.2778757 (81) total: 21.3s remaining: 4.16s 82: learn: 0.2702194 test: 0.2764718 best: 0.2764718 (82) total: 21.6s remaining: 3.9s 83: learn: 0.2687991 test: 0.2751145 best: 0.2751145 (83) total: 21.9s remaining: 3.64s 84: learn: 0.2674574 test: 0.2738288 best: 0.2738288 (84) total: 22.1s remaining: 3.38s 85: learn: 0.2660461 test: 0.2724595 best: 0.2724595 (85) total: 22.4s remaining: 3.12s 86: learn: 0.2647076 test: 0.2711692 best: 0.2711692 (86) total: 22.6s remaining: 2.86s 87: learn: 0.2632555 test: 0.2697962 best: 0.2697962 (87) total: 22.9s remaining: 2.6s 88: learn: 0.2619602 test: 0.2685729 best: 0.2685729 (88) total: 23.1s remaining: 2.34s 89: learn: 0.2607269 test: 0.2673710 best: 0.2673710 (89) total: 23.4s remaining: 2.08s 90: learn: 0.2594476 test: 0.2661505 best: 0.2661505 (90) total: 23.6s remaining: 1.82s 91: learn: 0.2580777 test: 0.2648101 best: 0.2648101 (91) total: 23.9s remaining: 1.56s 92: learn: 0.2569975 test: 0.2638114 best: 0.2638114 (92) total: 24.1s remaining: 1.3s 93: learn: 0.2558517 test: 0.2626862 best: 0.2626862 (93) total: 24.4s remaining: 1.04s 94: learn: 0.2545200 test: 0.2614256 best: 0.2614256 (94) total: 24.6s remaining: 778ms 95: learn: 0.2534186 test: 0.2603672 best: 0.2603672 (95) total: 24.9s remaining: 519ms 96: learn: 0.2522381 test: 0.2592068 best: 0.2592068 (96) total: 25.1s remaining: 259ms 97: learn: 0.2512421 test: 0.2582550 best: 0.2582550 (97) total: 25.4s remaining: 0us bestTest = 0.2582549624 bestIteration = 97 Trial 42, Fold 2: Log loss = 0.25825496239464546, Average precision = 0.9744488879065488, ROC-AUC = 0.9708409042832634, Elapsed Time = 25.533264300000155 seconds Trial 42, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 
0/1 = 1.034628627643876 Trial 42, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 0: learn: 0.6800046 test: 0.6799664 best: 0.6799664 (0) total: 260ms remaining: 25.3s 1: learn: 0.6673085 test: 0.6672017 best: 0.6672017 (1) total: 534ms remaining: 25.7s 2: learn: 0.6549112 test: 0.6547710 best: 0.6547710 (2) total: 798ms remaining: 25.3s 3: learn: 0.6426231 test: 0.6424465 best: 0.6424465 (3) total: 1.07s remaining: 25.2s 4: learn: 0.6306856 test: 0.6304148 best: 0.6304148 (4) total: 1.33s remaining: 24.8s 5: learn: 0.6190880 test: 0.6188758 best: 0.6188758 (5) total: 1.58s remaining: 24.2s 6: learn: 0.6083937 test: 0.6081359 best: 0.6081359 (6) total: 1.82s remaining: 23.6s 7: learn: 0.5980445 test: 0.5977468 best: 0.5977468 (7) total: 2.07s remaining: 23.3s 8: learn: 0.5876851 test: 0.5873616 best: 0.5873616 (8) total: 2.35s remaining: 23.2s 9: learn: 0.5774807 test: 0.5771254 best: 0.5771254 (9) total: 2.6s remaining: 22.9s 10: learn: 0.5675405 test: 0.5672218 best: 0.5672218 (10) total: 2.84s remaining: 22.5s 11: learn: 0.5576967 test: 0.5573891 best: 0.5573891 (11) total: 3.08s remaining: 22.1s 12: learn: 0.5486601 test: 0.5483323 best: 0.5483323 (12) total: 3.34s remaining: 21.9s 13: learn: 0.5403782 test: 0.5398610 best: 0.5398610 (13) total: 3.59s remaining: 21.5s 14: learn: 0.5317700 test: 0.5313024 best: 0.5313024 (14) total: 3.86s remaining: 21.3s 15: learn: 0.5230629 test: 0.5225951 best: 0.5225951 (15) total: 4.12s remaining: 21.1s 16: learn: 0.5145312 test: 0.5140899 best: 0.5140899 (16) total: 4.38s remaining: 20.8s 17: learn: 0.5063657 test: 0.5059813 best: 0.5059813 (17) total: 4.64s remaining: 20.6s 18: learn: 0.4982443 test: 0.4978296 best: 0.4978296 (18) total: 4.88s remaining: 20.3s 19: learn: 0.4906327 test: 0.4901929 best: 0.4901929 (19) total: 5.13s remaining: 20s 20: learn: 0.4833487 test: 0.4828527 best: 0.4828527 (20) total: 5.38s remaining: 19.7s 21: learn: 0.4770280 test: 0.4765064 best: 0.4765064 (21) 
total: 5.63s remaining: 19.5s 22: learn: 0.4699900 test: 0.4695585 best: 0.4695585 (22) total: 5.9s remaining: 19.2s 23: learn: 0.4633243 test: 0.4628824 best: 0.4628824 (23) total: 6.16s remaining: 19s 24: learn: 0.4570860 test: 0.4566721 best: 0.4566721 (24) total: 6.41s remaining: 18.7s 25: learn: 0.4505760 test: 0.4501388 best: 0.4501388 (25) total: 6.67s remaining: 18.5s 26: learn: 0.4446010 test: 0.4442011 best: 0.4442011 (26) total: 6.93s remaining: 18.2s 27: learn: 0.4388690 test: 0.4384172 best: 0.4384172 (27) total: 7.2s remaining: 18s 28: learn: 0.4329002 test: 0.4324884 best: 0.4324884 (28) total: 7.46s remaining: 17.8s 29: learn: 0.4271076 test: 0.4267083 best: 0.4267083 (29) total: 7.73s remaining: 17.5s 30: learn: 0.4214135 test: 0.4209993 best: 0.4209993 (30) total: 8s remaining: 17.3s 31: learn: 0.4161585 test: 0.4157700 best: 0.4157700 (31) total: 8.26s remaining: 17s 32: learn: 0.4112990 test: 0.4109843 best: 0.4109843 (32) total: 8.5s remaining: 16.7s 33: learn: 0.4060210 test: 0.4056803 best: 0.4056803 (33) total: 8.75s remaining: 16.5s 34: learn: 0.4007951 test: 0.4005239 best: 0.4005239 (34) total: 9s remaining: 16.2s 35: learn: 0.3963071 test: 0.3961375 best: 0.3961375 (35) total: 9.26s remaining: 16s 36: learn: 0.3921575 test: 0.3919765 best: 0.3919765 (36) total: 9.51s remaining: 15.7s 37: learn: 0.3879949 test: 0.3877885 best: 0.3877885 (37) total: 9.77s remaining: 15.4s 38: learn: 0.3835111 test: 0.3833913 best: 0.3833913 (38) total: 10s remaining: 15.2s 39: learn: 0.3791801 test: 0.3791223 best: 0.3791223 (39) total: 10.3s remaining: 14.9s 40: learn: 0.3750076 test: 0.3750186 best: 0.3750186 (40) total: 10.6s remaining: 14.7s 41: learn: 0.3710829 test: 0.3711341 best: 0.3711341 (41) total: 10.8s remaining: 14.4s 42: learn: 0.3670463 test: 0.3671227 best: 0.3671227 (42) total: 11.1s remaining: 14.1s 43: learn: 0.3633255 test: 0.3634045 best: 0.3634045 (43) total: 11.3s remaining: 13.9s 44: learn: 0.3601270 test: 0.3602061 best: 0.3602061 
(44) total: 11.6s remaining: 13.6s 45: learn: 0.3565759 test: 0.3566901 best: 0.3566901 (45) total: 11.8s remaining: 13.4s 46: learn: 0.3533565 test: 0.3535351 best: 0.3535351 (46) total: 12.1s remaining: 13.1s 47: learn: 0.3498880 test: 0.3500895 best: 0.3500895 (47) total: 12.4s remaining: 12.9s 48: learn: 0.3465049 test: 0.3467540 best: 0.3467540 (48) total: 12.6s remaining: 12.6s 49: learn: 0.3432702 test: 0.3436515 best: 0.3436515 (49) total: 12.9s remaining: 12.3s 50: learn: 0.3400814 test: 0.3404594 best: 0.3404594 (50) total: 13.1s remaining: 12.1s 51: learn: 0.3369861 test: 0.3374326 best: 0.3374326 (51) total: 13.4s remaining: 11.8s 52: learn: 0.3342608 test: 0.3347815 best: 0.3347815 (52) total: 13.6s remaining: 11.6s 53: learn: 0.3312786 test: 0.3318622 best: 0.3318622 (53) total: 13.9s remaining: 11.3s 54: learn: 0.3285038 test: 0.3291618 best: 0.3291618 (54) total: 14.1s remaining: 11.1s 55: learn: 0.3259413 test: 0.3266474 best: 0.3266474 (55) total: 14.4s remaining: 10.8s 56: learn: 0.3232294 test: 0.3239744 best: 0.3239744 (56) total: 14.6s remaining: 10.5s 57: learn: 0.3205955 test: 0.3214082 best: 0.3214082 (57) total: 14.9s remaining: 10.3s 58: learn: 0.3180908 test: 0.3189283 best: 0.3189283 (58) total: 15.2s remaining: 10s 59: learn: 0.3156771 test: 0.3166013 best: 0.3166013 (59) total: 15.4s remaining: 9.77s 60: learn: 0.3132229 test: 0.3141730 best: 0.3141730 (60) total: 15.7s remaining: 9.5s 61: learn: 0.3107027 test: 0.3117469 best: 0.3117469 (61) total: 15.9s remaining: 9.24s 62: learn: 0.3081717 test: 0.3093300 best: 0.3093300 (62) total: 16.2s remaining: 8.98s 63: learn: 0.3058418 test: 0.3070816 best: 0.3070816 (63) total: 16.4s remaining: 8.72s 64: learn: 0.3036086 test: 0.3049023 best: 0.3049023 (64) total: 16.7s remaining: 8.47s 65: learn: 0.3015935 test: 0.3029776 best: 0.3029776 (65) total: 16.9s remaining: 8.22s 66: learn: 0.2993485 test: 0.3007989 best: 0.3007989 (66) total: 17.2s remaining: 7.96s 67: learn: 0.2972459 test: 
0.2987182 best: 0.2987182 (67) total: 17.4s remaining: 7.7s 68: learn: 0.2949879 test: 0.2965011 best: 0.2965011 (68) total: 17.7s remaining: 7.44s 69: learn: 0.2927463 test: 0.2943214 best: 0.2943214 (69) total: 18s remaining: 7.19s 70: learn: 0.2908137 test: 0.2924265 best: 0.2924265 (70) total: 18.2s remaining: 6.93s 71: learn: 0.2889855 test: 0.2906617 best: 0.2906617 (71) total: 18.5s remaining: 6.67s 72: learn: 0.2872011 test: 0.2889134 best: 0.2889134 (72) total: 18.7s remaining: 6.42s 73: learn: 0.2852514 test: 0.2870330 best: 0.2870330 (73) total: 19s remaining: 6.16s 74: learn: 0.2834949 test: 0.2853311 best: 0.2853311 (74) total: 19.2s remaining: 5.9s 75: learn: 0.2819065 test: 0.2838154 best: 0.2838154 (75) total: 19.5s remaining: 5.64s 76: learn: 0.2803950 test: 0.2823679 best: 0.2823679 (76) total: 19.7s remaining: 5.38s 77: learn: 0.2789203 test: 0.2809355 best: 0.2809355 (77) total: 20s remaining: 5.13s 78: learn: 0.2771479 test: 0.2793045 best: 0.2793045 (78) total: 20.2s remaining: 4.87s 79: learn: 0.2755355 test: 0.2777182 best: 0.2777182 (79) total: 20.5s remaining: 4.61s 80: learn: 0.2738087 test: 0.2760225 best: 0.2760225 (80) total: 20.7s remaining: 4.35s 81: learn: 0.2720815 test: 0.2744264 best: 0.2744264 (81) total: 21s remaining: 4.09s 82: learn: 0.2706489 test: 0.2731004 best: 0.2731004 (82) total: 21.2s remaining: 3.84s 83: learn: 0.2692719 test: 0.2718565 best: 0.2718565 (83) total: 21.5s remaining: 3.58s 84: learn: 0.2677368 test: 0.2704132 best: 0.2704132 (84) total: 21.8s remaining: 3.33s 85: learn: 0.2665301 test: 0.2692552 best: 0.2692552 (85) total: 22s remaining: 3.07s 86: learn: 0.2652036 test: 0.2679974 best: 0.2679974 (86) total: 22.3s remaining: 2.81s 87: learn: 0.2639717 test: 0.2668232 best: 0.2668232 (87) total: 22.5s remaining: 2.56s 88: learn: 0.2627266 test: 0.2656357 best: 0.2656357 (88) total: 22.8s remaining: 2.3s 89: learn: 0.2613952 test: 0.2644511 best: 0.2644511 (89) total: 23s remaining: 2.05s 90: learn: 
0.2601868 test: 0.2632966 best: 0.2632966 (90) total: 23.3s remaining: 1.79s 91: learn: 0.2588993 test: 0.2621477 best: 0.2621477 (91) total: 23.5s remaining: 1.53s 92: learn: 0.2576150 test: 0.2609539 best: 0.2609539 (92) total: 23.8s remaining: 1.28s 93: learn: 0.2562770 test: 0.2596910 best: 0.2596910 (93) total: 24s remaining: 1.02s 94: learn: 0.2552345 test: 0.2587447 best: 0.2587447 (94) total: 24.3s remaining: 766ms 95: learn: 0.2539939 test: 0.2575641 best: 0.2575641 (95) total: 24.5s remaining: 511ms 96: learn: 0.2528743 test: 0.2565347 best: 0.2565347 (96) total: 24.8s remaining: 255ms 97: learn: 0.2517860 test: 0.2554913 best: 0.2554913 (97) total: 25s remaining: 0us bestTest = 0.2554913437 bestIteration = 97 Trial 42, Fold 3: Log loss = 0.2554913437227735, Average precision = 0.9712237322056307, ROC-AUC = 0.9705600326492088, Elapsed Time = 25.159806300001947 seconds Trial 42, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 42, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 0: learn: 0.6795559 test: 0.6796226 best: 0.6796226 (0) total: 267ms remaining: 25.9s 1: learn: 0.6666331 test: 0.6666840 best: 0.6666840 (1) total: 527ms remaining: 25.3s 2: learn: 0.6540855 test: 0.6541845 best: 0.6541845 (2) total: 782ms remaining: 24.8s 3: learn: 0.6417352 test: 0.6418209 best: 0.6418209 (3) total: 1.03s remaining: 24.1s 4: learn: 0.6301287 test: 0.6301813 best: 0.6301813 (4) total: 1.28s remaining: 23.8s 5: learn: 0.6189876 test: 0.6189789 best: 0.6189789 (5) total: 1.52s remaining: 23.3s 6: learn: 0.6080905 test: 0.6081780 best: 0.6081780 (6) total: 1.78s remaining: 23.2s 7: learn: 0.5973314 test: 0.5973737 best: 0.5973737 (7) total: 2.04s remaining: 22.9s 8: learn: 0.5868876 test: 0.5870342 best: 0.5870342 (8) total: 2.29s remaining: 22.6s 9: learn: 0.5768819 test: 0.5771497 best: 0.5771497 (9) total: 2.55s remaining: 22.4s 10: learn: 0.5670953 test: 0.5674054 best: 0.5674054 (10) total: 
2.8s remaining: 22.2s 11: learn: 0.5575036 test: 0.5579080 best: 0.5579080 (11) total: 3.05s remaining: 21.9s 12: learn: 0.5481445 test: 0.5486357 best: 0.5486357 (12) total: 3.31s remaining: 21.6s 13: learn: 0.5392694 test: 0.5399747 best: 0.5399747 (13) total: 3.58s remaining: 21.5s 14: learn: 0.5302516 test: 0.5310647 best: 0.5310647 (14) total: 3.83s remaining: 21.2s 15: learn: 0.5216812 test: 0.5224967 best: 0.5224967 (15) total: 4.1s remaining: 21s 16: learn: 0.5133394 test: 0.5141765 best: 0.5141765 (16) total: 4.36s remaining: 20.8s 17: learn: 0.5054615 test: 0.5062657 best: 0.5062657 (17) total: 4.61s remaining: 20.5s 18: learn: 0.4976979 test: 0.4985635 best: 0.4985635 (18) total: 4.87s remaining: 20.2s 19: learn: 0.4900923 test: 0.4910827 best: 0.4910827 (19) total: 5.13s remaining: 20s 20: learn: 0.4824773 test: 0.4835263 best: 0.4835263 (20) total: 5.39s remaining: 19.8s 21: learn: 0.4763827 test: 0.4774861 best: 0.4774861 (21) total: 5.64s remaining: 19.5s 22: learn: 0.4694311 test: 0.4706056 best: 0.4706056 (22) total: 5.9s remaining: 19.2s 23: learn: 0.4631164 test: 0.4643536 best: 0.4643536 (23) total: 6.14s remaining: 18.9s 24: learn: 0.4564623 test: 0.4578802 best: 0.4578802 (24) total: 6.4s remaining: 18.7s 25: learn: 0.4497475 test: 0.4512417 best: 0.4512417 (25) total: 6.67s remaining: 18.5s 26: learn: 0.4436883 test: 0.4453835 best: 0.4453835 (26) total: 6.93s remaining: 18.2s 27: learn: 0.4376013 test: 0.4393999 best: 0.4393999 (27) total: 7.18s remaining: 18s 28: learn: 0.4317775 test: 0.4336221 best: 0.4336221 (28) total: 7.44s remaining: 17.7s 29: learn: 0.4262130 test: 0.4281705 best: 0.4281705 (29) total: 7.71s remaining: 17.5s 30: learn: 0.4208328 test: 0.4228091 best: 0.4228091 (30) total: 7.97s remaining: 17.2s 31: learn: 0.4153106 test: 0.4173284 best: 0.4173284 (31) total: 8.21s remaining: 16.9s 32: learn: 0.4101115 test: 0.4122354 best: 0.4122354 (32) total: 8.47s remaining: 16.7s 33: learn: 0.4047326 test: 0.4070097 best: 
0.4070097 (33) total: 8.73s remaining: 16.4s 34: learn: 0.3998219 test: 0.4022318 best: 0.4022318 (34) total: 9.01s remaining: 16.2s 35: learn: 0.3955276 test: 0.3980348 best: 0.3980348 (35) total: 9.28s remaining: 16s 36: learn: 0.3908805 test: 0.3934588 best: 0.3934588 (36) total: 9.54s remaining: 15.7s 37: learn: 0.3868629 test: 0.3894845 best: 0.3894845 (37) total: 9.78s remaining: 15.4s 38: learn: 0.3826502 test: 0.3853424 best: 0.3853424 (38) total: 10s remaining: 15.2s 39: learn: 0.3786178 test: 0.3813585 best: 0.3813585 (39) total: 10.3s remaining: 14.9s 40: learn: 0.3745060 test: 0.3772384 best: 0.3772384 (40) total: 10.5s remaining: 14.6s 41: learn: 0.3703246 test: 0.3731426 best: 0.3731426 (41) total: 10.8s remaining: 14.4s 42: learn: 0.3663578 test: 0.3692259 best: 0.3692259 (42) total: 11s remaining: 14.1s 43: learn: 0.3628588 test: 0.3658283 best: 0.3658283 (43) total: 11.3s remaining: 13.9s 44: learn: 0.3590981 test: 0.3621399 best: 0.3621399 (44) total: 11.6s remaining: 13.6s 45: learn: 0.3554797 test: 0.3586477 best: 0.3586477 (45) total: 11.8s remaining: 13.4s 46: learn: 0.3522603 test: 0.3555176 best: 0.3555176 (46) total: 12.1s remaining: 13.1s 47: learn: 0.3486442 test: 0.3519461 best: 0.3519461 (47) total: 12.3s remaining: 12.8s 48: learn: 0.3451327 test: 0.3485239 best: 0.3485239 (48) total: 12.6s remaining: 12.6s 49: learn: 0.3417185 test: 0.3452786 best: 0.3452786 (49) total: 12.8s remaining: 12.3s 50: learn: 0.3384224 test: 0.3421181 best: 0.3421181 (50) total: 13.1s remaining: 12s 51: learn: 0.3355508 test: 0.3392966 best: 0.3392966 (51) total: 13.3s remaining: 11.8s 52: learn: 0.3324141 test: 0.3362885 best: 0.3362885 (52) total: 13.6s remaining: 11.5s 53: learn: 0.3300036 test: 0.3339915 best: 0.3339915 (53) total: 13.8s remaining: 11.3s 54: learn: 0.3272727 test: 0.3313201 best: 0.3313201 (54) total: 14.1s remaining: 11s 55: learn: 0.3243710 test: 0.3284911 best: 0.3284911 (55) total: 14.3s remaining: 10.7s 56: learn: 0.3217827 test: 
0.3260035 best: 0.3260035 (56) total: 14.6s remaining: 10.5s 57: learn: 0.3191646 test: 0.3234784 best: 0.3234784 (57) total: 14.8s remaining: 10.2s 58: learn: 0.3168437 test: 0.3212078 best: 0.3212078 (58) total: 15.1s remaining: 9.96s 59: learn: 0.3143819 test: 0.3188584 best: 0.3188584 (59) total: 15.3s remaining: 9.71s 60: learn: 0.3121471 test: 0.3165704 best: 0.3165704 (60) total: 15.6s remaining: 9.45s 61: learn: 0.3098505 test: 0.3143590 best: 0.3143590 (61) total: 15.8s remaining: 9.19s 62: learn: 0.3074353 test: 0.3120172 best: 0.3120172 (62) total: 16.1s remaining: 8.94s 63: learn: 0.3050721 test: 0.3097635 best: 0.3097635 (63) total: 16.3s remaining: 8.68s 64: learn: 0.3031000 test: 0.3078794 best: 0.3078794 (64) total: 16.6s remaining: 8.42s 65: learn: 0.3006451 test: 0.3055075 best: 0.3055075 (65) total: 16.8s remaining: 8.16s 66: learn: 0.2984579 test: 0.3034029 best: 0.3034029 (66) total: 17.1s remaining: 7.91s 67: learn: 0.2962236 test: 0.3012515 best: 0.3012515 (67) total: 17.3s remaining: 7.65s 68: learn: 0.2943098 test: 0.2994256 best: 0.2994256 (68) total: 17.6s remaining: 7.39s 69: learn: 0.2926137 test: 0.2978103 best: 0.2978103 (69) total: 17.8s remaining: 7.13s 70: learn: 0.2905746 test: 0.2958246 best: 0.2958246 (70) total: 18.1s remaining: 6.87s 71: learn: 0.2886465 test: 0.2939762 best: 0.2939762 (71) total: 18.3s remaining: 6.62s 72: learn: 0.2867863 test: 0.2923640 best: 0.2923640 (72) total: 18.6s remaining: 6.36s 73: learn: 0.2847925 test: 0.2904686 best: 0.2904686 (73) total: 18.8s remaining: 6.11s 74: learn: 0.2828239 test: 0.2886141 best: 0.2886141 (74) total: 19.1s remaining: 5.86s 75: learn: 0.2811213 test: 0.2869640 best: 0.2869640 (75) total: 19.3s remaining: 5.6s 76: learn: 0.2794963 test: 0.2854421 best: 0.2854421 (76) total: 19.6s remaining: 5.34s 77: learn: 0.2778854 test: 0.2838932 best: 0.2838932 (77) total: 19.8s remaining: 5.08s 78: learn: 0.2763615 test: 0.2824781 best: 0.2824781 (78) total: 20.1s remaining: 4.83s 79: 
learn: 0.2747698 test: 0.2810164 best: 0.2810164 (79) total: 20.3s remaining: 4.58s 80: learn: 0.2730842 test: 0.2794351 best: 0.2794351 (80) total: 20.6s remaining: 4.32s 81: learn: 0.2715510 test: 0.2779535 best: 0.2779535 (81) total: 20.8s remaining: 4.06s 82: learn: 0.2699971 test: 0.2764722 best: 0.2764722 (82) total: 21.1s remaining: 3.81s 83: learn: 0.2687343 test: 0.2752397 best: 0.2752397 (83) total: 21.3s remaining: 3.55s 84: learn: 0.2673886 test: 0.2739850 best: 0.2739850 (84) total: 21.5s remaining: 3.29s 85: learn: 0.2659612 test: 0.2726869 best: 0.2726869 (85) total: 21.8s remaining: 3.04s 86: learn: 0.2644174 test: 0.2712348 best: 0.2712348 (86) total: 22s remaining: 2.79s 87: learn: 0.2629471 test: 0.2698101 best: 0.2698101 (87) total: 22.3s remaining: 2.53s 88: learn: 0.2616389 test: 0.2685866 best: 0.2685866 (88) total: 22.5s remaining: 2.28s 89: learn: 0.2604967 test: 0.2674652 best: 0.2674652 (89) total: 22.8s remaining: 2.02s 90: learn: 0.2593852 test: 0.2664642 best: 0.2664642 (90) total: 23s remaining: 1.77s 91: learn: 0.2582230 test: 0.2654177 best: 0.2654177 (91) total: 23.2s remaining: 1.51s 92: learn: 0.2568674 test: 0.2641458 best: 0.2641458 (92) total: 23.5s remaining: 1.26s 93: learn: 0.2557549 test: 0.2630896 best: 0.2630896 (93) total: 23.7s remaining: 1.01s 94: learn: 0.2545827 test: 0.2619864 best: 0.2619864 (94) total: 24s remaining: 757ms 95: learn: 0.2535946 test: 0.2610587 best: 0.2610587 (95) total: 24.2s remaining: 505ms 96: learn: 0.2524114 test: 0.2599793 best: 0.2599793 (96) total: 24.5s remaining: 252ms 97: learn: 0.2511560 test: 0.2587722 best: 0.2587722 (97) total: 24.7s remaining: 0us bestTest = 0.2587722405 bestIteration = 97 Trial 42, Fold 4: Log loss = 0.25877224052726905, Average precision = 0.9740514368891039, ROC-AUC = 0.9700750612197211, Elapsed Time = 24.857192000003124 seconds Trial 42, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 42, Fold 5: Validation size = 5188 
where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[per-iteration CatBoost learn/test log-loss lines omitted (iterations 0–97)] bestTest = 0.2640062469 bestIteration = 97
Trial 42, Fold 5: Log loss = 0.2640062469134561, Average precision = 0.9716072915204965, ROC-AUC = 0.9683154041952325, Elapsed Time = 24.468977199998335 seconds
Optimization Progress: 43%|####3 | 43/100 [1:10:42<1:15:10, 79.14s/it]
Trial 43, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371 Trial 43, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[per-iteration lines omitted (iterations 0–9)] bestTest = 0.2595717389 bestIteration = 9
Trial 43, Fold 1: Log loss = 0.25955953350980043, Average precision = 0.9663326217392855, ROC-AUC = 0.9626414609053496, Elapsed Time = 0.46879339999941294 seconds
Trial 43, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 43, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[per-iteration lines omitted (iterations 0–9)] bestTest = 0.2604959017 bestIteration = 9
Trial 43, Fold 2: Log loss = 0.2604778162631055, Average precision = 0.9681149562319693, ROC-AUC = 0.9656772306393346, Elapsed Time = 0.4984293000015896 seconds
Trial 43, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 43, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[per-iteration lines omitted (iterations 0–9)] bestTest = 0.2531531862 bestIteration = 9
Trial 43, Fold 3: Log loss = 0.25326300340831376, Average precision = 0.9703199120398129, ROC-AUC = 0.9682479070556963, Elapsed Time = 0.4978806999970402 seconds
Trial 43, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 43, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[per-iteration lines omitted (iterations 0–9)] bestTest = 0.2547683347 bestIteration = 9
Trial 43, Fold 4: Log loss = 0.25477216931620916, Average precision = 0.9663377721735341, ROC-AUC = 0.9653137674628212, Elapsed Time = 0.5030527000017173 seconds
Trial 43, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 43, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[per-iteration lines omitted (iterations 0–9)] bestTest = 0.2599068468 bestIteration = 9
Trial 43, Fold 5: Log loss = 0.25983842623579323, Average precision = 0.9682498962168871, ROC-AUC = 0.9643176337253591, Elapsed Time = 0.49794289999772445 seconds
Optimization Progress: 44%|####4 | 44/100 [1:10:52<54:36, 58.51s/it]
Trial 44, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371 Trial 44, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[per-iteration lines omitted (iterations 0–66)] bestTest = 0.1980012852 bestIteration = 55 Shrink model to first 56 iterations.
Trial 44, Fold 1: Log loss = 0.19800128523381824, Average precision = 0.9753835438388123, ROC-AUC = 0.9718022158175726, Elapsed Time = 9.725621600002341 seconds
Trial 44, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 44, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[per-iteration lines omitted (iterations 0–66)] bestTest = 0.1850807937 bestIteration = 63 Shrink model to first 64 iterations.
Trial 44, Fold 2: Log loss = 0.18508079373799857, Average precision = 0.9772625433742628, ROC-AUC = 0.9745802732127828, Elapsed Time = 9.507290300000022 seconds
Trial 44, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 44, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[per-iteration lines omitted (iterations 0–66)] bestTest = 0.187681891 bestIteration = 66
Trial 44, Fold 3: Log loss = 0.18768189095663948, Average precision = 0.9764593874892866, ROC-AUC = 0.9739831884263852, Elapsed Time = 9.22843020000073 seconds
Trial 44, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 44, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[per-iteration lines omitted; log truncated at iteration 57 — fold summary not present in this excerpt]
1.35s 57: learn: 0.1450468 test: 0.1920437 best: 0.1918739 (55) total: 7.82s remaining: 1.21s 58: learn: 0.1444262 test: 0.1921577 best: 0.1918739 (55) total: 7.94s remaining: 1.08s 59: learn: 0.1438357 test: 0.1922240 best: 0.1918739 (55) total: 8.06s remaining: 940ms 60: learn: 0.1429416 test: 0.1923575 best: 0.1918739 (55) total: 8.21s remaining: 807ms 61: learn: 0.1426589 test: 0.1923551 best: 0.1918739 (55) total: 8.32s remaining: 671ms 62: learn: 0.1422864 test: 0.1922132 best: 0.1918739 (55) total: 8.43s remaining: 535ms 63: learn: 0.1413716 test: 0.1923502 best: 0.1918739 (55) total: 8.57s remaining: 402ms 64: learn: 0.1410270 test: 0.1922263 best: 0.1918739 (55) total: 8.68s remaining: 267ms 65: learn: 0.1405457 test: 0.1923988 best: 0.1918739 (55) total: 8.79s remaining: 133ms 66: learn: 0.1401072 test: 0.1924067 best: 0.1918739 (55) total: 8.9s remaining: 0us bestTest = 0.1918738941 bestIteration = 55 Shrink model to first 56 iterations. Trial 44, Fold 4: Log loss = 0.19187389408416525, Average precision = 0.9766082860624836, ROC-AUC = 0.9728536437357977, Elapsed Time = 9.03114099999948 seconds Trial 44, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 44, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0: learn: 0.5838665 test: 0.5866961 best: 0.5866961 (0) total: 152ms remaining: 10s 1: learn: 0.5002629 test: 0.5056570 best: 0.5056570 (1) total: 285ms remaining: 9.25s 2: learn: 0.4370653 test: 0.4440596 best: 0.4440596 (2) total: 418ms remaining: 8.91s 3: learn: 0.3880570 test: 0.3987871 best: 0.3987871 (3) total: 549ms remaining: 8.64s 4: learn: 0.3485078 test: 0.3609780 best: 0.3609780 (4) total: 695ms remaining: 8.61s 5: learn: 0.3191880 test: 0.3323440 best: 0.3323440 (5) total: 815ms remaining: 8.28s 6: learn: 0.2964611 test: 0.3109227 best: 0.3109227 (6) total: 949ms remaining: 8.13s 7: learn: 0.2765411 test: 0.2931271 best: 0.2931271 (7) total: 1.1s remaining: 8.11s 8: learn: 
0.2610990 test: 0.2789842 best: 0.2789842 (8) total: 1.24s remaining: 7.97s 9: learn: 0.2487167 test: 0.2680170 best: 0.2680170 (9) total: 1.37s remaining: 7.82s 10: learn: 0.2377041 test: 0.2585069 best: 0.2585069 (10) total: 1.52s remaining: 7.74s 11: learn: 0.2297116 test: 0.2518608 best: 0.2518608 (11) total: 1.66s remaining: 7.63s 12: learn: 0.2213496 test: 0.2449375 best: 0.2449375 (12) total: 1.81s remaining: 7.51s 13: learn: 0.2146602 test: 0.2391865 best: 0.2391865 (13) total: 1.94s remaining: 7.33s 14: learn: 0.2087316 test: 0.2344552 best: 0.2344552 (14) total: 2.07s remaining: 7.19s 15: learn: 0.2032532 test: 0.2310314 best: 0.2310314 (15) total: 2.23s remaining: 7.1s 16: learn: 0.1986062 test: 0.2278762 best: 0.2278762 (16) total: 2.38s remaining: 7.01s 17: learn: 0.1946948 test: 0.2249528 best: 0.2249528 (17) total: 2.51s remaining: 6.84s 18: learn: 0.1911759 test: 0.2222946 best: 0.2222946 (18) total: 2.66s remaining: 6.72s 19: learn: 0.1882320 test: 0.2200526 best: 0.2200526 (19) total: 2.79s remaining: 6.56s 20: learn: 0.1849461 test: 0.2174230 best: 0.2174230 (20) total: 2.93s remaining: 6.42s 21: learn: 0.1823308 test: 0.2156790 best: 0.2156790 (21) total: 3.07s remaining: 6.28s 22: learn: 0.1797058 test: 0.2142880 best: 0.2142880 (22) total: 3.24s remaining: 6.2s 23: learn: 0.1771103 test: 0.2126907 best: 0.2126907 (23) total: 3.39s remaining: 6.08s 24: learn: 0.1749412 test: 0.2118615 best: 0.2118615 (24) total: 3.54s remaining: 5.95s 25: learn: 0.1731986 test: 0.2107829 best: 0.2107829 (25) total: 3.69s remaining: 5.82s 26: learn: 0.1711216 test: 0.2097172 best: 0.2097172 (26) total: 3.84s remaining: 5.69s 27: learn: 0.1693818 test: 0.2089074 best: 0.2089074 (27) total: 3.97s remaining: 5.53s 28: learn: 0.1677938 test: 0.2081781 best: 0.2081781 (28) total: 4.11s remaining: 5.38s 29: learn: 0.1663528 test: 0.2074588 best: 0.2074588 (29) total: 4.24s remaining: 5.23s 30: learn: 0.1651187 test: 0.2067221 best: 0.2067221 (30) total: 4.37s 
remaining: 5.07s 31: learn: 0.1643877 test: 0.2064909 best: 0.2064909 (31) total: 4.48s remaining: 4.9s 32: learn: 0.1630495 test: 0.2062229 best: 0.2062229 (32) total: 4.63s remaining: 4.76s 33: learn: 0.1622212 test: 0.2055407 best: 0.2055407 (33) total: 4.75s remaining: 4.61s 34: learn: 0.1612946 test: 0.2052015 best: 0.2052015 (34) total: 4.87s remaining: 4.46s 35: learn: 0.1602617 test: 0.2046599 best: 0.2046599 (35) total: 5s remaining: 4.31s 36: learn: 0.1588533 test: 0.2038150 best: 0.2038150 (36) total: 5.14s remaining: 4.17s 37: learn: 0.1577594 test: 0.2034102 best: 0.2034102 (37) total: 5.28s remaining: 4.03s 38: learn: 0.1567877 test: 0.2029234 best: 0.2029234 (38) total: 5.42s remaining: 3.89s 39: learn: 0.1556440 test: 0.2026293 best: 0.2026293 (39) total: 5.55s remaining: 3.75s 40: learn: 0.1547987 test: 0.2022407 best: 0.2022407 (40) total: 5.68s remaining: 3.6s 41: learn: 0.1540339 test: 0.2016760 best: 0.2016760 (41) total: 5.81s remaining: 3.46s 42: learn: 0.1528353 test: 0.2014272 best: 0.2014272 (42) total: 5.96s remaining: 3.33s 43: learn: 0.1522384 test: 0.2013670 best: 0.2013670 (43) total: 6.08s remaining: 3.18s 44: learn: 0.1516184 test: 0.2011607 best: 0.2011607 (44) total: 6.21s remaining: 3.04s 45: learn: 0.1507578 test: 0.2007823 best: 0.2007823 (45) total: 6.34s remaining: 2.89s 46: learn: 0.1500332 test: 0.2005323 best: 0.2005323 (46) total: 6.46s remaining: 2.75s 47: learn: 0.1491526 test: 0.2003765 best: 0.2003765 (47) total: 6.59s remaining: 2.61s 48: learn: 0.1484978 test: 0.2000776 best: 0.2000776 (48) total: 6.72s remaining: 2.47s 49: learn: 0.1478543 test: 0.1998068 best: 0.1998068 (49) total: 6.84s remaining: 2.33s 50: learn: 0.1472020 test: 0.1998250 best: 0.1998068 (49) total: 6.96s remaining: 2.18s 51: learn: 0.1462165 test: 0.1997678 best: 0.1997678 (51) total: 7.1s remaining: 2.05s 52: learn: 0.1455680 test: 0.1997869 best: 0.1997678 (51) total: 7.22s remaining: 1.91s 53: learn: 0.1449036 test: 0.1996060 best: 0.1996060 
(53) total: 7.35s remaining: 1.77s 54: learn: 0.1444468 test: 0.1994599 best: 0.1994599 (54) total: 7.46s remaining: 1.63s 55: learn: 0.1438483 test: 0.1994641 best: 0.1994599 (54) total: 7.59s remaining: 1.49s 56: learn: 0.1436126 test: 0.1993642 best: 0.1993642 (56) total: 7.69s remaining: 1.35s 57: learn: 0.1431328 test: 0.1993341 best: 0.1993341 (57) total: 7.8s remaining: 1.21s 58: learn: 0.1427390 test: 0.1992997 best: 0.1992997 (58) total: 7.92s remaining: 1.07s 59: learn: 0.1422496 test: 0.1993652 best: 0.1992997 (58) total: 8.04s remaining: 938ms 60: learn: 0.1417998 test: 0.1992251 best: 0.1992251 (60) total: 8.16s remaining: 803ms 61: learn: 0.1412986 test: 0.1992721 best: 0.1992251 (60) total: 8.28s remaining: 668ms 62: learn: 0.1407968 test: 0.1991017 best: 0.1991017 (62) total: 8.41s remaining: 534ms 63: learn: 0.1403263 test: 0.1991060 best: 0.1991017 (62) total: 8.53s remaining: 400ms 64: learn: 0.1398457 test: 0.1993109 best: 0.1991017 (62) total: 8.65s remaining: 266ms 65: learn: 0.1394701 test: 0.1991725 best: 0.1991017 (62) total: 8.76s remaining: 133ms 66: learn: 0.1388442 test: 0.1991123 best: 0.1991017 (62) total: 8.89s remaining: 0us bestTest = 0.1991017119 bestIteration = 62 Shrink model to first 63 iterations. Trial 44, Fold 5: Log loss = 0.19910171185297648, Average precision = 0.9742595684718001, ROC-AUC = 0.9715143340207718, Elapsed Time = 9.025208800001565 seconds
Optimization Progress: 45%|####5 | 45/100 [1:11:47<52:33, 57.34s/it]
Trial 45, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 45, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[per-iteration CatBoost training log omitted]
bestTest = 0.3144693066 bestIteration = 11
Trial 45, Fold 1: Log loss = 0.31500063407156714, Average precision = 0.9716785851155554, ROC-AUC = 0.966898655979505, Elapsed Time = 2.856975099999545 seconds
Trial 45, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 45, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[per-iteration CatBoost training log omitted]
bestTest = 0.3157569564 bestIteration = 11
Trial 45, Fold 2: Log loss = 0.31605413706255214, Average precision = 0.9714380352115993, ROC-AUC = 0.9691789625368102, Elapsed Time = 2.890101799999684 seconds
Trial 45, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 45, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[per-iteration CatBoost training log omitted]
bestTest = 0.3100266025 bestIteration = 11
Trial 45, Fold 3: Log loss = 0.31064000572944606, Average precision = 0.9726096230830306, ROC-AUC = 0.9704358073455905, Elapsed Time = 2.8207452000024205 seconds
Trial 45, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 45, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[per-iteration CatBoost training log omitted]
bestTest = 0.3141035657 bestIteration = 11
Trial 45, Fold 4: Log loss = 0.3144815270072962, Average precision = 0.9740361505912015, ROC-AUC = 0.9705727322474731, Elapsed Time = 2.58415949999835 seconds
Trial 45, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 45, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[per-iteration CatBoost training log omitted]
bestTest = 0.317325027 bestIteration = 11
Trial 45, Fold 5: Log loss = 0.31767776482020044, Average precision = 0.9733426175452514, ROC-AUC = 0.9698287349274474, Elapsed Time = 2.7899317999981577 seconds
Optimization Progress: 46%|####6 | 46/100 [1:12:08<41:52, 46.54s/it]
Trial 46, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 46, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[per-iteration CatBoost training log omitted]
bestTest = 0.2990953855 bestIteration = 43
Trial 46, Fold 1: Log loss = 0.2990953855100339, Average precision = 0.9635913039585033, ROC-AUC = 0.9600948360122931, Elapsed Time = 1.564544400000159 seconds
Trial 46, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 46, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[per-iteration CatBoost training log omitted]
bestTest = 0.3004452474 bestIteration = 43
Trial 46, Fold 2: Log loss = 0.3004452473628985, Average precision = 0.9624983249447823, ROC-AUC = 0.9603160572526184, Elapsed Time = 1.5966767000027176 seconds
Trial 46, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 46, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[per-iteration CatBoost training log omitted]
bestTest = 0.2950397348 bestIteration = 43
Trial 46, Fold 3: Log loss = 0.295039734799017, Average precision = 0.9602635223581758, ROC-AUC = 0.960549926081806, Elapsed Time = 1.5994404000011855 seconds
Trial 46, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 46, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[per-iteration CatBoost training log omitted]
bestTest = 0.3004626167 bestIteration = 43
Trial 46, Fold 4: Log loss = 0.300462616736596, Average precision = 0.9612037416076539, ROC-AUC = 0.9587895752412426, Elapsed Time = 1.5848167000003741 seconds
Trial 46, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 46, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[per-iteration CatBoost training log omitted]
326ms remaining: 1.11s 10: learn: 0.5031138 test: 0.5077863 best: 0.5077863 (10) total: 361ms remaining: 1.08s 11: learn: 0.4913432 test: 0.4961155 best: 0.4961155 (11) total: 397ms remaining: 1.06s 12: learn: 0.4790545 test: 0.4840457 best: 0.4840457 (12) total: 430ms remaining: 1.02s 13: learn: 0.4681061 test: 0.4732615 best: 0.4732615 (13) total: 464ms remaining: 993ms 14: learn: 0.4564892 test: 0.4618515 best: 0.4618515 (14) total: 497ms remaining: 962ms 15: learn: 0.4470885 test: 0.4525829 best: 0.4525829 (15) total: 532ms remaining: 930ms 16: learn: 0.4370321 test: 0.4425974 best: 0.4425974 (16) total: 566ms remaining: 899ms 17: learn: 0.4285418 test: 0.4341681 best: 0.4341681 (17) total: 600ms remaining: 867ms 18: learn: 0.4195090 test: 0.4252614 best: 0.4252614 (18) total: 634ms remaining: 834ms 19: learn: 0.4104234 test: 0.4162096 best: 0.4162096 (19) total: 669ms remaining: 803ms 20: learn: 0.4026207 test: 0.4085743 best: 0.4085743 (20) total: 705ms remaining: 772ms 21: learn: 0.3950663 test: 0.4011606 best: 0.4011606 (21) total: 740ms remaining: 740ms 22: learn: 0.3873054 test: 0.3938408 best: 0.3938408 (22) total: 774ms remaining: 707ms 23: learn: 0.3809603 test: 0.3875872 best: 0.3875872 (23) total: 808ms remaining: 674ms 24: learn: 0.3751576 test: 0.3818428 best: 0.3818428 (24) total: 842ms remaining: 640ms 25: learn: 0.3687970 test: 0.3755448 best: 0.3755448 (25) total: 876ms remaining: 607ms 26: learn: 0.3623953 test: 0.3692443 best: 0.3692443 (26) total: 911ms remaining: 574ms 27: learn: 0.3563806 test: 0.3632429 best: 0.3632429 (27) total: 945ms remaining: 540ms 28: learn: 0.3505915 test: 0.3576108 best: 0.3576108 (28) total: 979ms remaining: 507ms 29: learn: 0.3451202 test: 0.3522391 best: 0.3522391 (29) total: 1.01s remaining: 473ms 30: learn: 0.3403513 test: 0.3476609 best: 0.3476609 (30) total: 1.05s remaining: 439ms 31: learn: 0.3363464 test: 0.3437153 best: 0.3437153 (31) total: 1.08s remaining: 406ms 32: learn: 0.3314972 test: 0.3390185 
best: 0.3390185 (32) total: 1.11s remaining: 372ms 33: learn: 0.3270353 test: 0.3346947 best: 0.3346947 (33) total: 1.15s remaining: 338ms 34: learn: 0.3226307 test: 0.3303193 best: 0.3303193 (34) total: 1.18s remaining: 304ms 35: learn: 0.3186624 test: 0.3264282 best: 0.3264282 (35) total: 1.22s remaining: 271ms 36: learn: 0.3149332 test: 0.3227905 best: 0.3227905 (36) total: 1.25s remaining: 237ms 37: learn: 0.3108251 test: 0.3187572 best: 0.3187572 (37) total: 1.29s remaining: 203ms 38: learn: 0.3072792 test: 0.3152973 best: 0.3152973 (38) total: 1.32s remaining: 170ms 39: learn: 0.3044311 test: 0.3125553 best: 0.3125553 (39) total: 1.36s remaining: 136ms 40: learn: 0.3009517 test: 0.3091550 best: 0.3091550 (40) total: 1.39s remaining: 102ms 41: learn: 0.2978727 test: 0.3061865 best: 0.3061865 (41) total: 1.43s remaining: 67.9ms 42: learn: 0.2948707 test: 0.3032722 best: 0.3032722 (42) total: 1.46s remaining: 34ms 43: learn: 0.2919695 test: 0.3005000 best: 0.3005000 (43) total: 1.49s remaining: 0us bestTest = 0.3004999589 bestIteration = 43 Trial 46, Fold 5: Log loss = 0.3004999589374852, Average precision = 0.9605533844627435, ROC-AUC = 0.9575139252735818, Elapsed Time = 1.5927054999992833 seconds
Optimization Progress: 47%|####6 | 47/100 [1:12:24<32:52, 37.22s/it]
Trial 47, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 47, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[per-iteration CatBoost learn/test log truncated; losses diverged from iteration 11]
Training has stopped (degenerate solution on iteration 19, probably too small l2-regularization, try to increase it)
bestTest = 0.2187628502, bestIteration = 10. Shrink model to first 11 iterations.
Trial 47, Fold 1: Log loss = 0.2188074000879337, Average precision = 0.9724157484850913, ROC-AUC = 0.9689162400883268, Elapsed Time = 3.7517687999970804 seconds
Trial 47, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 47, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[per-iteration CatBoost learn/test log truncated; losses diverged from iteration 18]
Training has stopped (degenerate solution on iteration 27, probably too small l2-regularization, try to increase it)
bestTest = 0.189708114, bestIteration = 17. Shrink model to first 18 iterations.
Trial 47, Fold 2: Log loss = 0.18946390033828614, Average precision = 0.9756765924089011, ROC-AUC = 0.9731557408766955, Elapsed Time = 5.012145599997893 seconds
Trial 47, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 47, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[per-iteration CatBoost learn/test log truncated; losses diverged from iteration 9]
Training has stopped (degenerate solution on iteration 34, probably too small l2-regularization, try to increase it)
bestTest = 0.2192733625, bestIteration = 8. Shrink model to first 9 iterations.
Trial 47, Fold 3: Log loss = 0.2196358003113448, Average precision = 0.9733511790412094, ROC-AUC = 0.9703974336903237, Elapsed Time = 6.92220449999877 seconds
Trial 47, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 47, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[per-iteration CatBoost learn/test log truncated; losses diverged from iteration 9]
Training has stopped (degenerate solution on iteration 19, probably too small l2-regularization, try to increase it)
bestTest = 0.2314777689, bestIteration = 8. Shrink model to first 9 iterations.
Trial 47, Fold 4: Log loss = 0.23151441709257464, Average precision = 0.9722849244760045, ROC-AUC = 0.9682106018011831, Elapsed Time = 4.109465400000772 seconds
Trial 47, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 47, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[per-iteration CatBoost learn/test log truncated; losses diverged from iteration 8]
Training has stopped (degenerate solution on iteration 17, probably too small l2-regularization, try to increase it)
bestTest = 0.2425344631, bestIteration = 7. Shrink model to first 8 iterations.
Trial 47, Fold 5: Log loss = 0.242535836249544, Average precision = 0.9710069158723872, ROC-AUC = 0.9665398807201383, Elapsed Time = 4.201541700000234 seconds
Optimization Progress: 48%|####8 | 48/100 [1:12:56<30:53, 35.65s/it]
Trial 48, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 48, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[per-iteration CatBoost learn/test log truncated]
bestTest = 0.2844267237, bestIteration = 72
Trial 48, Fold 1: Log loss = 0.28439988595821813, Average precision = 0.9742608270115632, ROC-AUC = 0.9698202545848209, Elapsed Time = 17.062217900001997 seconds
Trial 48, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 48, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[per-iteration CatBoost learn/test log truncated]
bestTest = 0.2815582753, bestIteration = 72
Trial 48, Fold 2: Log loss = 0.28151055643296685, Average precision = 0.9746979821037248, ROC-AUC = 0.9718453228752806, Elapsed Time = 16.923168199999054 seconds
Trial 48, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 48, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[per-iteration CatBoost learn/test log truncated]
test: 0.4352478 best: 0.4352478 (26) total: 6.1s remaining: 10.4s 27: learn: 0.4257941 test: 0.4291243 best: 0.4291243 (27) total: 6.36s remaining: 10.2s 28: learn: 0.4197338 test: 0.4232439 best: 0.4232439 (28) total: 6.61s remaining: 10s 29: learn: 0.4139080 test: 0.4175015 best: 0.4175015 (29) total: 6.84s remaining: 9.8s 30: learn: 0.4082429 test: 0.4119446 best: 0.4119446 (30) total: 7.06s remaining: 9.56s 31: learn: 0.4028723 test: 0.4066439 best: 0.4066439 (31) total: 7.28s remaining: 9.32s 32: learn: 0.3973834 test: 0.4013054 best: 0.4013054 (32) total: 7.53s remaining: 9.13s 33: learn: 0.3921879 test: 0.3962196 best: 0.3962196 (33) total: 7.74s remaining: 8.88s 34: learn: 0.3871122 test: 0.3913502 best: 0.3913502 (34) total: 7.97s remaining: 8.65s 35: learn: 0.3825162 test: 0.3868196 best: 0.3868196 (35) total: 8.19s remaining: 8.42s 36: learn: 0.3778032 test: 0.3821493 best: 0.3821493 (36) total: 8.4s remaining: 8.18s 37: learn: 0.3732124 test: 0.3776427 best: 0.3776427 (37) total: 8.64s remaining: 7.96s 38: learn: 0.3686511 test: 0.3731441 best: 0.3731441 (38) total: 8.87s remaining: 7.73s 39: learn: 0.3645209 test: 0.3691017 best: 0.3691017 (39) total: 9.09s remaining: 7.5s 40: learn: 0.3604007 test: 0.3651519 best: 0.3651519 (40) total: 9.33s remaining: 7.28s 41: learn: 0.3564049 test: 0.3612585 best: 0.3612585 (41) total: 9.56s remaining: 7.05s 42: learn: 0.3525361 test: 0.3574718 best: 0.3574718 (42) total: 9.78s remaining: 6.82s 43: learn: 0.3486737 test: 0.3537507 best: 0.3537507 (43) total: 9.98s remaining: 6.58s 44: learn: 0.3449764 test: 0.3501505 best: 0.3501505 (44) total: 10.2s remaining: 6.36s 45: learn: 0.3412476 test: 0.3465144 best: 0.3465144 (45) total: 10.4s remaining: 6.13s 46: learn: 0.3375417 test: 0.3429076 best: 0.3429076 (46) total: 10.7s remaining: 5.9s 47: learn: 0.3339357 test: 0.3394299 best: 0.3394299 (47) total: 10.9s remaining: 5.68s 48: learn: 0.3306758 test: 0.3362639 best: 0.3362639 (48) total: 11.2s remaining: 5.47s 49: 
learn: 0.3274558 test: 0.3330779 best: 0.3330779 (49) total: 11.4s remaining: 5.23s 50: learn: 0.3242868 test: 0.3300388 best: 0.3300388 (50) total: 11.6s remaining: 5s 51: learn: 0.3211294 test: 0.3269523 best: 0.3269523 (51) total: 11.8s remaining: 4.78s 52: learn: 0.3179708 test: 0.3239305 best: 0.3239305 (52) total: 12s remaining: 4.54s 53: learn: 0.3151278 test: 0.3211931 best: 0.3211931 (53) total: 12.3s remaining: 4.32s 54: learn: 0.3121333 test: 0.3183166 best: 0.3183166 (54) total: 12.5s remaining: 4.09s 55: learn: 0.3094168 test: 0.3157047 best: 0.3157047 (55) total: 12.7s remaining: 3.87s 56: learn: 0.3065775 test: 0.3129036 best: 0.3129036 (56) total: 13s remaining: 3.64s 57: learn: 0.3038841 test: 0.3102528 best: 0.3102528 (57) total: 13.2s remaining: 3.42s 58: learn: 0.3012710 test: 0.3077265 best: 0.3077265 (58) total: 13.4s remaining: 3.18s 59: learn: 0.2986046 test: 0.3051299 best: 0.3051299 (59) total: 13.6s remaining: 2.95s 60: learn: 0.2960474 test: 0.3026900 best: 0.3026900 (60) total: 13.8s remaining: 2.72s 61: learn: 0.2937762 test: 0.3005270 best: 0.3005270 (61) total: 14.1s remaining: 2.49s 62: learn: 0.2913932 test: 0.2982000 best: 0.2982000 (62) total: 14.3s remaining: 2.27s 63: learn: 0.2891455 test: 0.2961048 best: 0.2961048 (63) total: 14.5s remaining: 2.04s 64: learn: 0.2869643 test: 0.2940765 best: 0.2940765 (64) total: 14.7s remaining: 1.81s 65: learn: 0.2846921 test: 0.2919187 best: 0.2919187 (65) total: 14.9s remaining: 1.58s 66: learn: 0.2826163 test: 0.2900306 best: 0.2900306 (66) total: 15.1s remaining: 1.35s 67: learn: 0.2804072 test: 0.2878951 best: 0.2878951 (67) total: 15.4s remaining: 1.13s 68: learn: 0.2785007 test: 0.2860802 best: 0.2860802 (68) total: 15.6s remaining: 905ms 69: learn: 0.2764114 test: 0.2841366 best: 0.2841366 (69) total: 15.9s remaining: 680ms 70: learn: 0.2745131 test: 0.2824005 best: 0.2824005 (70) total: 16.1s remaining: 453ms 71: learn: 0.2725973 test: 0.2805474 best: 0.2805474 (71) total: 16.3s 
remaining: 226ms 72: learn: 0.2707414 test: 0.2787546 best: 0.2787546 (72) total: 16.5s remaining: 0us bestTest = 0.278754565 bestIteration = 72 Trial 48, Fold 3: Log loss = 0.2789076748905186, Average precision = 0.9750188025336488, ROC-AUC = 0.9727258116554787, Elapsed Time = 16.703614400001243 seconds Trial 48, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 48, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 0: learn: 0.6792331 test: 0.6794337 best: 0.6794337 (0) total: 229ms remaining: 16.5s 1: learn: 0.6656807 test: 0.6661415 best: 0.6661415 (1) total: 446ms remaining: 15.8s 2: learn: 0.6525918 test: 0.6531860 best: 0.6531860 (2) total: 640ms remaining: 14.9s 3: learn: 0.6399953 test: 0.6407287 best: 0.6407287 (3) total: 828ms remaining: 14.3s 4: learn: 0.6276797 test: 0.6286784 best: 0.6286784 (4) total: 1.05s remaining: 14.2s 5: learn: 0.6157389 test: 0.6168702 best: 0.6168702 (5) total: 1.27s remaining: 14.1s 6: learn: 0.6041087 test: 0.6053720 best: 0.6053720 (6) total: 1.49s remaining: 14.1s 7: learn: 0.5926903 test: 0.5941922 best: 0.5941922 (7) total: 1.74s remaining: 14.1s 8: learn: 0.5817271 test: 0.5834198 best: 0.5834198 (8) total: 1.96s remaining: 13.9s 9: learn: 0.5711487 test: 0.5730243 best: 0.5730243 (9) total: 2.19s remaining: 13.8s 10: learn: 0.5609047 test: 0.5629844 best: 0.5629844 (10) total: 2.41s remaining: 13.6s 11: learn: 0.5508864 test: 0.5531332 best: 0.5531332 (11) total: 2.63s remaining: 13.4s 12: learn: 0.5411433 test: 0.5435154 best: 0.5435154 (12) total: 2.87s remaining: 13.2s 13: learn: 0.5317281 test: 0.5341254 best: 0.5341254 (13) total: 3.09s remaining: 13s 14: learn: 0.5225877 test: 0.5252805 best: 0.5252805 (14) total: 3.31s remaining: 12.8s 15: learn: 0.5137963 test: 0.5167062 best: 0.5167062 (15) total: 3.53s remaining: 12.6s 16: learn: 0.5052267 test: 0.5083532 best: 0.5083532 (16) total: 3.75s remaining: 12.4s 17: learn: 0.4967492 test: 0.5000784 
best: 0.5000784 (17) total: 4.01s remaining: 12.2s 18: learn: 0.4886448 test: 0.4920886 best: 0.4920886 (18) total: 4.24s remaining: 12.1s 19: learn: 0.4809454 test: 0.4845033 best: 0.4845033 (19) total: 4.45s remaining: 11.8s 20: learn: 0.4731813 test: 0.4769070 best: 0.4769070 (20) total: 4.69s remaining: 11.6s 21: learn: 0.4656755 test: 0.4695231 best: 0.4695231 (21) total: 4.92s remaining: 11.4s 22: learn: 0.4585060 test: 0.4624962 best: 0.4624962 (22) total: 5.14s remaining: 11.2s 23: learn: 0.4514931 test: 0.4556544 best: 0.4556544 (23) total: 5.37s remaining: 11s 24: learn: 0.4448987 test: 0.4491517 best: 0.4491517 (24) total: 5.57s remaining: 10.7s 25: learn: 0.4383370 test: 0.4427507 best: 0.4427507 (25) total: 5.77s remaining: 10.4s 26: learn: 0.4321004 test: 0.4366940 best: 0.4366940 (26) total: 6s remaining: 10.2s 27: learn: 0.4257952 test: 0.4305572 best: 0.4305572 (27) total: 6.23s remaining: 10s 28: learn: 0.4198104 test: 0.4247273 best: 0.4247273 (28) total: 6.46s remaining: 9.8s 29: learn: 0.4141235 test: 0.4192160 best: 0.4192160 (29) total: 6.66s remaining: 9.54s 30: learn: 0.4084160 test: 0.4137630 best: 0.4137630 (30) total: 6.89s remaining: 9.33s 31: learn: 0.4027841 test: 0.4083390 best: 0.4083390 (31) total: 7.13s remaining: 9.14s 32: learn: 0.3974496 test: 0.4031283 best: 0.4031283 (32) total: 7.35s remaining: 8.91s 33: learn: 0.3923401 test: 0.3981409 best: 0.3981409 (33) total: 7.54s remaining: 8.65s 34: learn: 0.3872150 test: 0.3931732 best: 0.3931732 (34) total: 7.77s remaining: 8.43s 35: learn: 0.3823968 test: 0.3885269 best: 0.3885269 (35) total: 8.02s remaining: 8.24s 36: learn: 0.3775861 test: 0.3839148 best: 0.3839148 (36) total: 8.24s remaining: 8.02s 37: learn: 0.3729769 test: 0.3793673 best: 0.3793673 (37) total: 8.47s remaining: 7.8s 38: learn: 0.3684501 test: 0.3750540 best: 0.3750540 (38) total: 8.69s remaining: 7.57s 39: learn: 0.3640452 test: 0.3707877 best: 0.3707877 (39) total: 8.91s remaining: 7.35s 40: learn: 0.3597413 
test: 0.3666743 best: 0.3666743 (40) total: 9.14s remaining: 7.13s 41: learn: 0.3557101 test: 0.3627713 best: 0.3627713 (41) total: 9.35s remaining: 6.9s 42: learn: 0.3516065 test: 0.3587557 best: 0.3587557 (42) total: 9.56s remaining: 6.67s 43: learn: 0.3476444 test: 0.3554795 best: 0.3554795 (43) total: 9.78s remaining: 6.44s 44: learn: 0.3439734 test: 0.3519141 best: 0.3519141 (44) total: 10s remaining: 6.22s 45: learn: 0.3402567 test: 0.3483713 best: 0.3483713 (45) total: 10.2s remaining: 6s 46: learn: 0.3366449 test: 0.3448624 best: 0.3448624 (46) total: 10.4s remaining: 5.77s 47: learn: 0.3331779 test: 0.3415814 best: 0.3415814 (47) total: 10.7s remaining: 5.55s 48: learn: 0.3296502 test: 0.3382598 best: 0.3382598 (48) total: 10.9s remaining: 5.34s 49: learn: 0.3263606 test: 0.3350810 best: 0.3350810 (49) total: 11.1s remaining: 5.12s 50: learn: 0.3231516 test: 0.3320265 best: 0.3320265 (50) total: 11.4s remaining: 4.9s 51: learn: 0.3199475 test: 0.3289319 best: 0.3289319 (51) total: 11.6s remaining: 4.68s 52: learn: 0.3170580 test: 0.3261521 best: 0.3261521 (52) total: 11.8s remaining: 4.46s 53: learn: 0.3142497 test: 0.3235947 best: 0.3235947 (53) total: 12.1s remaining: 4.24s 54: learn: 0.3113788 test: 0.3209404 best: 0.3209404 (54) total: 12.3s remaining: 4.02s 55: learn: 0.3085223 test: 0.3181457 best: 0.3181457 (55) total: 12.5s remaining: 3.8s 56: learn: 0.3058770 test: 0.3156412 best: 0.3156412 (56) total: 12.7s remaining: 3.57s 57: learn: 0.3033397 test: 0.3132673 best: 0.3132673 (57) total: 12.9s remaining: 3.34s 58: learn: 0.3007797 test: 0.3108559 best: 0.3108559 (58) total: 13.2s remaining: 3.12s 59: learn: 0.2982330 test: 0.3084003 best: 0.3084003 (59) total: 13.4s remaining: 2.9s 60: learn: 0.2956737 test: 0.3059604 best: 0.3059604 (60) total: 13.6s remaining: 2.68s 61: learn: 0.2932326 test: 0.3037593 best: 0.3037593 (61) total: 13.9s remaining: 2.46s 62: learn: 0.2908608 test: 0.3015022 best: 0.3015022 (62) total: 14.1s remaining: 2.24s 63: 
learn: 0.2886927 test: 0.2994848 best: 0.2994848 (63) total: 14.3s remaining: 2.02s 64: learn: 0.2863284 test: 0.2973044 best: 0.2973044 (64) total: 14.6s remaining: 1.79s 65: learn: 0.2842094 test: 0.2953304 best: 0.2953304 (65) total: 14.8s remaining: 1.57s 66: learn: 0.2820517 test: 0.2932272 best: 0.2932272 (66) total: 15s remaining: 1.34s 67: learn: 0.2800311 test: 0.2913253 best: 0.2913253 (67) total: 15.2s remaining: 1.12s 68: learn: 0.2780728 test: 0.2894924 best: 0.2894924 (68) total: 15.4s remaining: 896ms 69: learn: 0.2762756 test: 0.2878037 best: 0.2878037 (69) total: 15.6s remaining: 671ms 70: learn: 0.2742482 test: 0.2859245 best: 0.2859245 (70) total: 15.9s remaining: 448ms 71: learn: 0.2723758 test: 0.2841739 best: 0.2841739 (71) total: 16.1s remaining: 224ms 72: learn: 0.2703251 test: 0.2823284 best: 0.2823284 (72) total: 16.4s remaining: 0us bestTest = 0.2823284056 bestIteration = 72 Trial 48, Fold 4: Log loss = 0.28233987263072025, Average precision = 0.9758039650097482, ROC-AUC = 0.971943171765727, Elapsed Time = 16.542822800001886 seconds Trial 48, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 48, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0: learn: 0.6792173 test: 0.6794923 best: 0.6794923 (0) total: 211ms remaining: 15.2s 1: learn: 0.6656142 test: 0.6661876 best: 0.6661876 (1) total: 430ms remaining: 15.3s 2: learn: 0.6523629 test: 0.6532637 best: 0.6532637 (2) total: 645ms remaining: 15.1s 3: learn: 0.6395157 test: 0.6407677 best: 0.6407677 (3) total: 878ms remaining: 15.1s 4: learn: 0.6271343 test: 0.6286737 best: 0.6286737 (4) total: 1.09s remaining: 14.9s 5: learn: 0.6152945 test: 0.6171451 best: 0.6171451 (5) total: 1.3s remaining: 14.5s 6: learn: 0.6034603 test: 0.6056821 best: 0.6056821 (6) total: 1.5s remaining: 14.2s 7: learn: 0.5920637 test: 0.5945726 best: 0.5945726 (7) total: 1.71s remaining: 13.9s 8: learn: 0.5811503 test: 0.5839474 best: 0.5839474 (8) 
total: 1.94s remaining: 13.8s 9: learn: 0.5704441 test: 0.5735130 best: 0.5735130 (9) total: 2.17s remaining: 13.7s 10: learn: 0.5599748 test: 0.5636345 best: 0.5636345 (10) total: 2.41s remaining: 13.6s 11: learn: 0.5498069 test: 0.5536905 best: 0.5536905 (11) total: 2.64s remaining: 13.4s 12: learn: 0.5400985 test: 0.5442664 best: 0.5442664 (12) total: 2.87s remaining: 13.3s 13: learn: 0.5306734 test: 0.5350534 best: 0.5350534 (13) total: 3.1s remaining: 13s 14: learn: 0.5215674 test: 0.5261675 best: 0.5261675 (14) total: 3.31s remaining: 12.8s 15: learn: 0.5127713 test: 0.5176103 best: 0.5176103 (15) total: 3.51s remaining: 12.5s 16: learn: 0.5041716 test: 0.5092156 best: 0.5092156 (16) total: 3.75s remaining: 12.4s 17: learn: 0.4958170 test: 0.5011020 best: 0.5011020 (17) total: 3.99s remaining: 12.2s 18: learn: 0.4876758 test: 0.4932074 best: 0.4932074 (18) total: 4.2s remaining: 11.9s 19: learn: 0.4797709 test: 0.4855820 best: 0.4855820 (19) total: 4.42s remaining: 11.7s 20: learn: 0.4721193 test: 0.4782114 best: 0.4782114 (20) total: 4.65s remaining: 11.5s 21: learn: 0.4646545 test: 0.4709478 best: 0.4709478 (21) total: 4.87s remaining: 11.3s 22: learn: 0.4573776 test: 0.4639255 best: 0.4639255 (22) total: 5.11s remaining: 11.1s 23: learn: 0.4503902 test: 0.4571177 best: 0.4571177 (23) total: 5.32s remaining: 10.9s 24: learn: 0.4436714 test: 0.4506761 best: 0.4506761 (24) total: 5.57s remaining: 10.7s 25: learn: 0.4371344 test: 0.4443633 best: 0.4443633 (25) total: 5.8s remaining: 10.5s 26: learn: 0.4308142 test: 0.4382771 best: 0.4382771 (26) total: 6.02s remaining: 10.3s 27: learn: 0.4245507 test: 0.4322628 best: 0.4322628 (27) total: 6.24s remaining: 10s 28: learn: 0.4184588 test: 0.4264193 best: 0.4264193 (28) total: 6.46s remaining: 9.79s 29: learn: 0.4125302 test: 0.4208039 best: 0.4208039 (29) total: 6.68s remaining: 9.57s 30: learn: 0.4067240 test: 0.4152380 best: 0.4152380 (30) total: 6.94s remaining: 9.4s 31: learn: 0.4011304 test: 0.4099027 best: 
0.4099027 (31) total: 7.16s remaining: 9.18s 32: learn: 0.3959868 test: 0.4049627 best: 0.4049627 (32) total: 7.39s remaining: 8.95s 33: learn: 0.3908796 test: 0.4000417 best: 0.4000417 (33) total: 7.59s remaining: 8.7s 34: learn: 0.3860987 test: 0.3954617 best: 0.3954617 (34) total: 7.81s remaining: 8.48s 35: learn: 0.3812117 test: 0.3907520 best: 0.3907520 (35) total: 8.03s remaining: 8.25s 36: learn: 0.3763276 test: 0.3860921 best: 0.3860921 (36) total: 8.25s remaining: 8.03s 37: learn: 0.3716550 test: 0.3816564 best: 0.3816564 (37) total: 8.47s remaining: 7.8s 38: learn: 0.3670928 test: 0.3772935 best: 0.3772935 (38) total: 8.67s remaining: 7.56s 39: learn: 0.3626501 test: 0.3730593 best: 0.3730593 (39) total: 8.91s remaining: 7.35s 40: learn: 0.3584818 test: 0.3691134 best: 0.3691134 (40) total: 9.13s remaining: 7.12s 41: learn: 0.3543284 test: 0.3652104 best: 0.3652104 (41) total: 9.32s remaining: 6.88s 42: learn: 0.3503406 test: 0.3615224 best: 0.3615224 (42) total: 9.53s remaining: 6.65s 43: learn: 0.3464777 test: 0.3579468 best: 0.3579468 (43) total: 9.74s remaining: 6.42s 44: learn: 0.3426976 test: 0.3544208 best: 0.3544208 (44) total: 9.97s remaining: 6.2s 45: learn: 0.3392147 test: 0.3511057 best: 0.3511057 (45) total: 10.2s remaining: 5.99s 46: learn: 0.3354534 test: 0.3475707 best: 0.3475707 (46) total: 10.4s remaining: 5.76s 47: learn: 0.3319131 test: 0.3442160 best: 0.3442160 (47) total: 10.6s remaining: 5.55s 48: learn: 0.3285812 test: 0.3411620 best: 0.3411620 (48) total: 10.9s remaining: 5.34s 49: learn: 0.3251910 test: 0.3380956 best: 0.3380956 (49) total: 11.1s remaining: 5.12s 50: learn: 0.3219366 test: 0.3350509 best: 0.3350509 (50) total: 11.3s remaining: 4.89s 51: learn: 0.3189141 test: 0.3321853 best: 0.3321853 (51) total: 11.6s remaining: 4.67s 52: learn: 0.3159461 test: 0.3293339 best: 0.3293339 (52) total: 11.8s remaining: 4.45s 53: learn: 0.3129010 test: 0.3265113 best: 0.3265113 (53) total: 12s remaining: 4.23s 54: learn: 0.3100719 
test: 0.3239609 best: 0.3239609 (54) total: 12.3s remaining: 4.01s 55: learn: 0.3073563 test: 0.3214622 best: 0.3214622 (55) total: 12.5s remaining: 3.8s 56: learn: 0.3045117 test: 0.3189111 best: 0.3189111 (56) total: 12.8s remaining: 3.59s 57: learn: 0.3019439 test: 0.3165368 best: 0.3165368 (57) total: 13s remaining: 3.36s 58: learn: 0.2994345 test: 0.3142230 best: 0.3142230 (58) total: 13.2s remaining: 3.14s 59: learn: 0.2967488 test: 0.3117157 best: 0.3117157 (59) total: 13.4s remaining: 2.91s 60: learn: 0.2942201 test: 0.3093819 best: 0.3093819 (60) total: 13.7s remaining: 2.69s 61: learn: 0.2916999 test: 0.3071223 best: 0.3071223 (61) total: 13.9s remaining: 2.47s 62: learn: 0.2892894 test: 0.3048789 best: 0.3048789 (62) total: 14.1s remaining: 2.24s 63: learn: 0.2871084 test: 0.3028761 best: 0.3028761 (63) total: 14.3s remaining: 2.02s 64: learn: 0.2850708 test: 0.3010042 best: 0.3010042 (64) total: 14.5s remaining: 1.79s 65: learn: 0.2829801 test: 0.2991534 best: 0.2991534 (65) total: 14.8s remaining: 1.57s 66: learn: 0.2807205 test: 0.2970837 best: 0.2970837 (66) total: 15s remaining: 1.34s 67: learn: 0.2785599 test: 0.2950504 best: 0.2950504 (67) total: 15.2s remaining: 1.12s 68: learn: 0.2765154 test: 0.2931936 best: 0.2931936 (68) total: 15.4s remaining: 894ms 69: learn: 0.2745022 test: 0.2913628 best: 0.2913628 (69) total: 15.7s remaining: 671ms 70: learn: 0.2725467 test: 0.2896314 best: 0.2896314 (70) total: 15.9s remaining: 448ms 71: learn: 0.2706860 test: 0.2879669 best: 0.2879669 (71) total: 16.1s remaining: 224ms 72: learn: 0.2688673 test: 0.2863312 best: 0.2863312 (72) total: 16.4s remaining: 0us bestTest = 0.2863312174 bestIteration = 72 Trial 48, Fold 5: Log loss = 0.2862084415915731, Average precision = 0.9728769473834451, ROC-AUC = 0.9700563699533657, Elapsed Time = 16.5084491000016 seconds
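Before Optuna can compare trials, the per-fold metrics logged above are typically aggregated into a single cross-validation score. A minimal sketch of that aggregation, using only the Trial 48 fold values visible in this log (folds 2–5; fold 1 is reported earlier in the notebook), is:

```python
from statistics import mean, stdev

# Per-fold validation metrics copied from the Trial 48 log output above
# (folds 2-5 only; fold 1 appears earlier in the notebook).
log_losses = [0.28151055643296685, 0.2789076748905186,
              0.28233987263072025, 0.2862084415915731]
roc_aucs = [0.9718453228752806, 0.9727258116554787,
            0.971943171765727, 0.9700563699533657]

# Mean +/- standard deviation across folds, the usual CV summary
print(f"Log loss: {mean(log_losses):.4f} +/- {stdev(log_losses):.4f}")
print(f"ROC-AUC : {mean(roc_aucs):.4f} +/- {stdev(roc_aucs):.4f}")
```

The fold-to-fold spread here is small (log loss ranges from roughly 0.279 to 0.286), which suggests the Trial 48 estimate is stable across the grouped CV splits.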
Optimization Progress: 49%|####9 | 49/100 [1:14:29<44:50, 52.75s/it]
[CatBoost per-iteration training output elided; per-fold best scores and summary metrics retained.]

Trial 49, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 49, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Trial 49, Fold 1: bestTest = 0.2025761643, bestIteration = 44
Trial 49, Fold 1: Log loss = 0.20257616434370945, Average precision = 0.9755856730081887, ROC-AUC = 0.9710382283496557, Elapsed Time = 10.18093030000091 seconds
Trial 49, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 49, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Trial 49, Fold 2: bestTest = 0.1894978519, bestIteration = 44
Trial 49, Fold 2: Log loss = 0.18949785187423762, Average precision = 0.9774370561633179, ROC-AUC = 0.9746546296276625, Elapsed Time = 11.948648300000059 seconds
Trial 49, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 49, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Trial 49, Fold 3: bestTest = 0.1947248803, bestIteration = 44
Trial 49, Fold 3: Log loss = 0.19472488028293558, Average precision = 0.974079074312862, ROC-AUC = 0.9726829234525335, Elapsed Time = 10.543397699999332 seconds
Trial 49, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 49, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
7.64s 10: learn: 0.2695880 test: 0.2950903 best: 0.2950903 (10) total: 2.39s remaining: 7.4s 11: learn: 0.2564686 test: 0.2836018 best: 0.2836018 (11) total: 2.6s remaining: 7.15s 12: learn: 0.2444637 test: 0.2730861 best: 0.2730861 (12) total: 2.82s remaining: 6.94s 13: learn: 0.2329613 test: 0.2640508 best: 0.2640508 (13) total: 3.05s remaining: 6.75s 14: learn: 0.2235898 test: 0.2565114 best: 0.2565114 (14) total: 3.26s remaining: 6.52s 15: learn: 0.2155184 test: 0.2501213 best: 0.2501213 (15) total: 3.51s remaining: 6.36s 16: learn: 0.2079015 test: 0.2446642 best: 0.2446642 (16) total: 3.79s remaining: 6.25s 17: learn: 0.2007018 test: 0.2392839 best: 0.2392839 (17) total: 4.05s remaining: 6.07s 18: learn: 0.1945272 test: 0.2349739 best: 0.2349739 (18) total: 4.29s remaining: 5.87s 19: learn: 0.1886551 test: 0.2308173 best: 0.2308173 (19) total: 4.52s remaining: 5.65s 20: learn: 0.1838068 test: 0.2273460 best: 0.2273460 (20) total: 4.75s remaining: 5.43s 21: learn: 0.1789844 test: 0.2240422 best: 0.2240422 (21) total: 4.98s remaining: 5.21s 22: learn: 0.1745986 test: 0.2211972 best: 0.2211972 (22) total: 5.22s remaining: 5s 23: learn: 0.1705014 test: 0.2186301 best: 0.2186301 (23) total: 5.47s remaining: 4.79s 24: learn: 0.1666038 test: 0.2162652 best: 0.2162652 (24) total: 5.71s remaining: 4.57s 25: learn: 0.1629173 test: 0.2143531 best: 0.2143531 (25) total: 6s remaining: 4.38s 26: learn: 0.1595638 test: 0.2122058 best: 0.2122058 (26) total: 6.22s remaining: 4.14s 27: learn: 0.1566077 test: 0.2102043 best: 0.2102043 (27) total: 6.44s remaining: 3.91s 28: learn: 0.1535058 test: 0.2086202 best: 0.2086202 (28) total: 6.67s remaining: 3.68s 29: learn: 0.1507324 test: 0.2075581 best: 0.2075581 (29) total: 6.93s remaining: 3.46s 30: learn: 0.1486030 test: 0.2061454 best: 0.2061454 (30) total: 7.14s remaining: 3.22s 31: learn: 0.1464960 test: 0.2051166 best: 0.2051166 (31) total: 7.37s remaining: 2.99s 32: learn: 0.1441395 test: 0.2037382 best: 0.2037382 (32) total: 
7.58s remaining: 2.76s 33: learn: 0.1418832 test: 0.2025083 best: 0.2025083 (33) total: 7.82s remaining: 2.53s 34: learn: 0.1400176 test: 0.2017073 best: 0.2017073 (34) total: 8.03s remaining: 2.29s 35: learn: 0.1380953 test: 0.2008600 best: 0.2008600 (35) total: 8.26s remaining: 2.06s 36: learn: 0.1362735 test: 0.2000743 best: 0.2000743 (36) total: 8.45s remaining: 1.83s 37: learn: 0.1346921 test: 0.1992842 best: 0.1992842 (37) total: 8.67s remaining: 1.6s 38: learn: 0.1328996 test: 0.1984923 best: 0.1984923 (38) total: 8.88s remaining: 1.37s 39: learn: 0.1314115 test: 0.1979776 best: 0.1979776 (39) total: 9.06s remaining: 1.13s 40: learn: 0.1297846 test: 0.1977512 best: 0.1977512 (40) total: 9.29s remaining: 907ms 41: learn: 0.1285544 test: 0.1973727 best: 0.1973727 (41) total: 9.51s remaining: 679ms 42: learn: 0.1271677 test: 0.1968014 best: 0.1968014 (42) total: 9.72s remaining: 452ms 43: learn: 0.1250798 test: 0.1963742 best: 0.1963742 (43) total: 9.97s remaining: 227ms 44: learn: 0.1235936 test: 0.1960739 best: 0.1960739 (44) total: 10.2s remaining: 0us bestTest = 0.1960738703 bestIteration = 44 Trial 49, Fold 4: Log loss = 0.19607387033436352, Average precision = 0.9766321767544495, ROC-AUC = 0.9725493334382473, Elapsed Time = 10.381020600001648 seconds Trial 49, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 49, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0: learn: 0.6156308 test: 0.6196338 best: 0.6196338 (0) total: 253ms remaining: 11.1s 1: learn: 0.5504350 test: 0.5575470 best: 0.5575470 (1) total: 457ms remaining: 9.82s 2: learn: 0.4945676 test: 0.5054286 best: 0.5054286 (2) total: 675ms remaining: 9.45s 3: learn: 0.4482627 test: 0.4623910 best: 0.4623910 (3) total: 867ms remaining: 8.89s 4: learn: 0.4086851 test: 0.4251317 best: 0.4251317 (4) total: 1.06s remaining: 8.48s 5: learn: 0.3747766 test: 0.3944977 best: 0.3944977 (5) total: 1.29s remaining: 8.41s 6: learn: 0.3486637 
test: 0.3714950 best: 0.3714950 (6) total: 1.55s remaining: 8.42s 7: learn: 0.3240233 test: 0.3489605 best: 0.3489605 (7) total: 1.8s remaining: 8.33s 8: learn: 0.3033992 test: 0.3307749 best: 0.3307749 (8) total: 2.01s remaining: 8.03s 9: learn: 0.2856555 test: 0.3154434 best: 0.3154434 (9) total: 2.22s remaining: 7.76s 10: learn: 0.2695219 test: 0.3016684 best: 0.3016684 (10) total: 2.44s remaining: 7.55s 11: learn: 0.2560445 test: 0.2904053 best: 0.2904053 (11) total: 2.66s remaining: 7.31s 12: learn: 0.2437788 test: 0.2796185 best: 0.2796185 (12) total: 2.86s remaining: 7.03s 13: learn: 0.2324151 test: 0.2703316 best: 0.2703316 (13) total: 3.13s remaining: 6.93s 14: learn: 0.2233228 test: 0.2633097 best: 0.2633097 (14) total: 3.35s remaining: 6.69s 15: learn: 0.2147534 test: 0.2568609 best: 0.2568609 (15) total: 3.58s remaining: 6.49s 16: learn: 0.2074001 test: 0.2512269 best: 0.2512269 (16) total: 3.81s remaining: 6.28s 17: learn: 0.2000579 test: 0.2461105 best: 0.2461105 (17) total: 4.09s remaining: 6.13s 18: learn: 0.1944049 test: 0.2420900 best: 0.2420900 (18) total: 4.33s remaining: 5.92s 19: learn: 0.1891513 test: 0.2381164 best: 0.2381164 (19) total: 4.54s remaining: 5.67s 20: learn: 0.1840565 test: 0.2344102 best: 0.2344102 (20) total: 4.75s remaining: 5.42s 21: learn: 0.1790848 test: 0.2309405 best: 0.2309405 (21) total: 4.96s remaining: 5.18s 22: learn: 0.1748791 test: 0.2281043 best: 0.2281043 (22) total: 5.17s remaining: 4.95s 23: learn: 0.1709632 test: 0.2258300 best: 0.2258300 (23) total: 5.42s remaining: 4.74s 24: learn: 0.1667749 test: 0.2230758 best: 0.2230758 (24) total: 5.67s remaining: 4.53s 25: learn: 0.1637605 test: 0.2214190 best: 0.2214190 (25) total: 5.9s remaining: 4.31s 26: learn: 0.1606904 test: 0.2193879 best: 0.2193879 (26) total: 6.11s remaining: 4.07s 27: learn: 0.1573390 test: 0.2175506 best: 0.2175506 (27) total: 6.33s remaining: 3.84s 28: learn: 0.1546147 test: 0.2159142 best: 0.2159142 (28) total: 6.57s remaining: 3.63s 29: 
learn: 0.1514815 test: 0.2144689 best: 0.2144689 (29) total: 6.83s remaining: 3.42s 30: learn: 0.1490835 test: 0.2131217 best: 0.2131217 (30) total: 7.04s remaining: 3.18s 31: learn: 0.1459701 test: 0.2119443 best: 0.2119443 (31) total: 7.31s remaining: 2.97s 32: learn: 0.1436473 test: 0.2107649 best: 0.2107649 (32) total: 7.54s remaining: 2.74s 33: learn: 0.1416501 test: 0.2097770 best: 0.2097770 (33) total: 7.74s remaining: 2.5s 34: learn: 0.1397390 test: 0.2087928 best: 0.2087928 (34) total: 7.96s remaining: 2.27s 35: learn: 0.1376093 test: 0.2080114 best: 0.2080114 (35) total: 8.24s remaining: 2.06s 36: learn: 0.1357996 test: 0.2077369 best: 0.2077369 (36) total: 8.48s remaining: 1.83s 37: learn: 0.1340893 test: 0.2069862 best: 0.2069862 (37) total: 8.66s remaining: 1.6s 38: learn: 0.1322499 test: 0.2062572 best: 0.2062572 (38) total: 8.9s remaining: 1.37s 39: learn: 0.1305891 test: 0.2057270 best: 0.2057270 (39) total: 9.12s remaining: 1.14s 40: learn: 0.1288804 test: 0.2050312 best: 0.2050312 (40) total: 9.35s remaining: 913ms 41: learn: 0.1270981 test: 0.2044872 best: 0.2044872 (41) total: 9.59s remaining: 685ms 42: learn: 0.1257052 test: 0.2040819 best: 0.2040819 (42) total: 9.81s remaining: 456ms 43: learn: 0.1239099 test: 0.2035169 best: 0.2035169 (43) total: 10.1s remaining: 229ms 44: learn: 0.1224298 test: 0.2032652 best: 0.2032652 (44) total: 10.3s remaining: 0us bestTest = 0.2032651634 bestIteration = 44 Trial 49, Fold 5: Log loss = 0.20326516342903195, Average precision = 0.9748217651359093, ROC-AUC = 0.971378184047712, Elapsed Time = 10.45471489999909 seconds
Optimization Progress: 50%|##### | 50/100 [1:15:30<46:09, 55.38s/it]
Trial 50, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371 Trial 50, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913 0: learn: 0.6181454 test: 0.6197185 best: 0.6197185 (0) total: 239ms remaining: 10.8s 1: learn: 0.5546949 test: 0.5578788 best: 0.5578788 (1) total: 498ms remaining: 11s 2: learn: 0.5013938 test: 0.5058207 best: 0.5058207 (2) total: 739ms remaining: 10.6s 3: learn: 0.4570279 test: 0.4628527 best: 0.4628527 (3) total: 1.01s remaining: 10.6s 4: learn: 0.4187699 test: 0.4258721 best: 0.4258721 (4) total: 1.28s remaining: 10.5s 5: learn: 0.3866805 test: 0.3948618 best: 0.3948618 (5) total: 1.53s remaining: 10.2s 6: learn: 0.3595621 test: 0.3692960 best: 0.3692960 (6) total: 1.81s remaining: 10.1s 7: learn: 0.3369295 test: 0.3480191 best: 0.3480191 (7) total: 2.14s remaining: 10.1s 8: learn: 0.3171539 test: 0.3292654 best: 0.3292654 (8) total: 2.45s remaining: 10.1s 9: learn: 0.3004260 test: 0.3133805 best: 0.3133805 (9) total: 2.75s remaining: 9.89s 10: learn: 0.2860069 test: 0.3006652 best: 0.3006652 (10) total: 3.11s remaining: 9.89s 11: learn: 0.2740425 test: 0.2900591 best: 0.2900591 (11) total: 3.4s remaining: 9.62s 12: learn: 0.2638088 test: 0.2811734 best: 0.2811734 (12) total: 3.68s remaining: 9.34s 13: learn: 0.2536472 test: 0.2721084 best: 0.2721084 (13) total: 4.01s remaining: 9.16s 14: learn: 0.2446004 test: 0.2638588 best: 0.2638588 (14) total: 4.3s remaining: 8.89s 15: learn: 0.2375223 test: 0.2578960 best: 0.2578960 (15) total: 4.62s remaining: 8.65s 16: learn: 0.2298698 test: 0.2514143 best: 0.2514143 (16) total: 4.94s remaining: 8.43s 17: learn: 0.2240451 test: 0.2466109 best: 0.2466109 (17) total: 5.23s remaining: 8.14s 18: learn: 0.2191506 test: 0.2426219 best: 0.2426219 (18) total: 5.53s remaining: 7.85s 19: learn: 0.2139284 test: 0.2385362 best: 0.2385362 (19) total: 5.83s remaining: 7.58s 20: learn: 0.2097266 test: 0.2350522 best: 0.2350522 (20) total: 6.13s remaining: 
7.3s 21: learn: 0.2055683 test: 0.2315831 best: 0.2315831 (21) total: 6.42s remaining: 7.01s 22: learn: 0.2017544 test: 0.2284884 best: 0.2284884 (22) total: 6.69s remaining: 6.69s 23: learn: 0.1985995 test: 0.2261466 best: 0.2261466 (23) total: 6.99s remaining: 6.41s 24: learn: 0.1954786 test: 0.2239580 best: 0.2239580 (24) total: 7.27s remaining: 6.11s 25: learn: 0.1925442 test: 0.2221230 best: 0.2221230 (25) total: 7.59s remaining: 5.84s 26: learn: 0.1893985 test: 0.2202646 best: 0.2202646 (26) total: 7.89s remaining: 5.55s 27: learn: 0.1869130 test: 0.2187299 best: 0.2187299 (27) total: 8.2s remaining: 5.27s 28: learn: 0.1846106 test: 0.2173238 best: 0.2173238 (28) total: 8.47s remaining: 4.96s 29: learn: 0.1823291 test: 0.2157230 best: 0.2157230 (29) total: 8.74s remaining: 4.66s 30: learn: 0.1803582 test: 0.2145821 best: 0.2145821 (30) total: 9.03s remaining: 4.37s 31: learn: 0.1783213 test: 0.2132169 best: 0.2132169 (31) total: 9.32s remaining: 4.08s 32: learn: 0.1762650 test: 0.2120514 best: 0.2120514 (32) total: 9.62s remaining: 3.79s 33: learn: 0.1742082 test: 0.2108452 best: 0.2108452 (33) total: 9.93s remaining: 3.5s 34: learn: 0.1726935 test: 0.2101329 best: 0.2101329 (34) total: 10.2s remaining: 3.22s 35: learn: 0.1710609 test: 0.2092036 best: 0.2092036 (35) total: 10.5s remaining: 2.92s 36: learn: 0.1692490 test: 0.2081049 best: 0.2081049 (36) total: 10.8s remaining: 2.63s 37: learn: 0.1678820 test: 0.2073260 best: 0.2073260 (37) total: 11.1s remaining: 2.33s 38: learn: 0.1663079 test: 0.2064458 best: 0.2064458 (38) total: 11.4s remaining: 2.04s 39: learn: 0.1652061 test: 0.2058535 best: 0.2058535 (39) total: 11.6s remaining: 1.74s 40: learn: 0.1637068 test: 0.2055528 best: 0.2055528 (40) total: 11.9s remaining: 1.45s 41: learn: 0.1625866 test: 0.2049924 best: 0.2049924 (41) total: 12.2s remaining: 1.16s 42: learn: 0.1612104 test: 0.2044702 best: 0.2044702 (42) total: 12.5s remaining: 870ms 43: learn: 0.1598716 test: 0.2040626 best: 0.2040626 (43) 
total: 12.8s remaining: 580ms 44: learn: 0.1586672 test: 0.2036446 best: 0.2036446 (44) total: 13.1s remaining: 290ms 45: learn: 0.1575951 test: 0.2031164 best: 0.2031164 (45) total: 13.3s remaining: 0us bestTest = 0.2031163996 bestIteration = 45 Trial 50, Fold 1: Log loss = 0.20281439260644013, Average precision = 0.9747480648415343, ROC-AUC = 0.9713088725904896, Elapsed Time = 13.479710599996906 seconds Trial 50, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 50, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986 0: learn: 0.6179873 test: 0.6192485 best: 0.6192485 (0) total: 273ms remaining: 12.3s 1: learn: 0.5555514 test: 0.5581638 best: 0.5581638 (1) total: 557ms remaining: 12.2s 2: learn: 0.5026688 test: 0.5066382 best: 0.5066382 (2) total: 835ms remaining: 12s 3: learn: 0.4576852 test: 0.4628809 best: 0.4628809 (3) total: 1.14s remaining: 12s 4: learn: 0.4201382 test: 0.4265560 best: 0.4265560 (4) total: 1.43s remaining: 11.7s 5: learn: 0.3880516 test: 0.3956203 best: 0.3956203 (5) total: 1.75s remaining: 11.7s 6: learn: 0.3607390 test: 0.3696203 best: 0.3696203 (6) total: 2.09s remaining: 11.6s 7: learn: 0.3377555 test: 0.3473652 best: 0.3473652 (7) total: 2.37s remaining: 11.3s 8: learn: 0.3181891 test: 0.3288797 best: 0.3288797 (8) total: 2.66s remaining: 11s 9: learn: 0.3018292 test: 0.3136081 best: 0.3136081 (9) total: 2.97s remaining: 10.7s 10: learn: 0.2873064 test: 0.2997662 best: 0.2997662 (10) total: 3.26s remaining: 10.4s 11: learn: 0.2752035 test: 0.2883015 best: 0.2883015 (11) total: 3.56s remaining: 10.1s 12: learn: 0.2636362 test: 0.2776845 best: 0.2776845 (12) total: 3.84s remaining: 9.76s 13: learn: 0.2544559 test: 0.2694235 best: 0.2694235 (13) total: 4.13s remaining: 9.45s 14: learn: 0.2457755 test: 0.2613501 best: 0.2613501 (14) total: 4.42s remaining: 9.13s 15: learn: 0.2378435 test: 0.2544026 best: 0.2544026 (15) total: 4.72s remaining: 8.85s 16: learn: 0.2317414 test: 
0.2488112 best: 0.2488112 (16) total: 5.01s remaining: 8.55s 17: learn: 0.2254761 test: 0.2432227 best: 0.2432227 (17) total: 5.29s remaining: 8.23s 18: learn: 0.2200039 test: 0.2385637 best: 0.2385637 (18) total: 5.6s remaining: 7.95s 19: learn: 0.2147972 test: 0.2341731 best: 0.2341731 (19) total: 5.9s remaining: 7.67s 20: learn: 0.2104088 test: 0.2300730 best: 0.2300730 (20) total: 6.18s remaining: 7.35s 21: learn: 0.2062512 test: 0.2262538 best: 0.2262538 (21) total: 6.43s remaining: 7.02s 22: learn: 0.2026519 test: 0.2232805 best: 0.2232805 (22) total: 6.71s remaining: 6.71s 23: learn: 0.1989272 test: 0.2202128 best: 0.2202128 (23) total: 7.02s remaining: 6.43s 24: learn: 0.1959673 test: 0.2181388 best: 0.2181388 (24) total: 7.31s remaining: 6.14s 25: learn: 0.1931703 test: 0.2156466 best: 0.2156466 (25) total: 7.58s remaining: 5.83s 26: learn: 0.1910157 test: 0.2137302 best: 0.2137302 (26) total: 7.85s remaining: 5.53s 27: learn: 0.1889961 test: 0.2122552 best: 0.2122552 (27) total: 8.12s remaining: 5.22s 28: learn: 0.1859937 test: 0.2100790 best: 0.2100790 (28) total: 8.43s remaining: 4.94s 29: learn: 0.1832680 test: 0.2082171 best: 0.2082171 (29) total: 8.72s remaining: 4.65s 30: learn: 0.1805123 test: 0.2064843 best: 0.2064843 (30) total: 9.04s remaining: 4.37s 31: learn: 0.1785625 test: 0.2048673 best: 0.2048673 (31) total: 9.3s remaining: 4.07s 32: learn: 0.1766840 test: 0.2036018 best: 0.2036018 (32) total: 9.58s remaining: 3.77s 33: learn: 0.1748022 test: 0.2023486 best: 0.2023486 (33) total: 9.9s remaining: 3.49s 34: learn: 0.1729100 test: 0.2015756 best: 0.2015756 (34) total: 10.2s remaining: 3.21s 35: learn: 0.1712452 test: 0.2004161 best: 0.2004161 (35) total: 10.5s remaining: 2.91s 36: learn: 0.1692729 test: 0.1994155 best: 0.1994155 (36) total: 10.8s remaining: 2.62s 37: learn: 0.1675864 test: 0.1988043 best: 0.1988043 (37) total: 11.1s remaining: 2.34s 38: learn: 0.1661034 test: 0.1978774 best: 0.1978774 (38) total: 11.4s remaining: 2.04s 39: 
learn: 0.1647142 test: 0.1971105 best: 0.1971105 (39) total: 11.7s remaining: 1.75s 40: learn: 0.1633068 test: 0.1963006 best: 0.1963006 (40) total: 12s remaining: 1.46s 41: learn: 0.1619309 test: 0.1955668 best: 0.1955668 (41) total: 12.3s remaining: 1.17s 42: learn: 0.1604109 test: 0.1946884 best: 0.1946884 (42) total: 12.6s remaining: 876ms 43: learn: 0.1591892 test: 0.1939680 best: 0.1939680 (43) total: 12.8s remaining: 583ms 44: learn: 0.1579988 test: 0.1934174 best: 0.1934174 (44) total: 13.1s remaining: 291ms 45: learn: 0.1570315 test: 0.1929078 best: 0.1929078 (45) total: 13.4s remaining: 0us bestTest = 0.1929078196 bestIteration = 45 Trial 50, Fold 2: Log loss = 0.1926998253958043, Average precision = 0.9767160857624644, ROC-AUC = 0.9742849720222608, Elapsed Time = 13.507884499998909 seconds Trial 50, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 50, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 0: learn: 0.6178544 test: 0.6190754 best: 0.6190754 (0) total: 290ms remaining: 13s 1: learn: 0.5549311 test: 0.5571483 best: 0.5571483 (1) total: 552ms remaining: 12.1s 2: learn: 0.5017875 test: 0.5053256 best: 0.5053256 (2) total: 823ms remaining: 11.8s 3: learn: 0.4571424 test: 0.4619925 best: 0.4619925 (3) total: 1.08s remaining: 11.3s 4: learn: 0.4188859 test: 0.4244106 best: 0.4244106 (4) total: 1.35s remaining: 11.1s 5: learn: 0.3868612 test: 0.3935517 best: 0.3935517 (5) total: 1.65s remaining: 11s 6: learn: 0.3599702 test: 0.3676090 best: 0.3676090 (6) total: 1.95s remaining: 10.9s 7: learn: 0.3378926 test: 0.3464450 best: 0.3464450 (7) total: 2.25s remaining: 10.7s 8: learn: 0.3184316 test: 0.3278061 best: 0.3278061 (8) total: 2.51s remaining: 10.3s 9: learn: 0.3017212 test: 0.3124041 best: 0.3124041 (9) total: 2.82s remaining: 10.2s 10: learn: 0.2865931 test: 0.2984180 best: 0.2984180 (10) total: 3.15s remaining: 10s 11: learn: 0.2738820 test: 0.2863556 best: 0.2863556 (11) total: 
3.42s remaining: 9.7s 12: learn: 0.2625004 test: 0.2760126 best: 0.2760126 (12) total: 3.74s remaining: 9.5s 13: learn: 0.2529927 test: 0.2674119 best: 0.2674119 (13) total: 4.01s remaining: 9.18s 14: learn: 0.2442810 test: 0.2601336 best: 0.2601336 (14) total: 4.3s remaining: 8.89s 15: learn: 0.2368400 test: 0.2533279 best: 0.2533279 (15) total: 4.57s remaining: 8.58s 16: learn: 0.2299836 test: 0.2474844 best: 0.2474844 (16) total: 4.9s remaining: 8.36s 17: learn: 0.2235342 test: 0.2419973 best: 0.2419973 (17) total: 5.18s remaining: 8.06s 18: learn: 0.2183713 test: 0.2376473 best: 0.2376473 (18) total: 5.45s remaining: 7.74s 19: learn: 0.2129099 test: 0.2330156 best: 0.2330156 (19) total: 5.74s remaining: 7.47s 20: learn: 0.2088255 test: 0.2298258 best: 0.2298258 (20) total: 6.03s remaining: 7.18s 21: learn: 0.2043511 test: 0.2262325 best: 0.2262325 (21) total: 6.31s remaining: 6.89s 22: learn: 0.2002883 test: 0.2229310 best: 0.2229310 (22) total: 6.59s remaining: 6.59s 23: learn: 0.1968979 test: 0.2204908 best: 0.2204908 (23) total: 6.91s remaining: 6.34s 24: learn: 0.1938405 test: 0.2179331 best: 0.2179331 (24) total: 7.17s remaining: 6.02s 25: learn: 0.1909117 test: 0.2156939 best: 0.2156939 (25) total: 7.43s remaining: 5.71s 26: learn: 0.1877819 test: 0.2132574 best: 0.2132574 (26) total: 7.7s remaining: 5.42s 27: learn: 0.1854910 test: 0.2116935 best: 0.2116935 (27) total: 7.97s remaining: 5.12s 28: learn: 0.1827133 test: 0.2096202 best: 0.2096202 (28) total: 8.26s remaining: 4.84s 29: learn: 0.1805936 test: 0.2078156 best: 0.2078156 (29) total: 8.51s remaining: 4.54s 30: learn: 0.1784780 test: 0.2062052 best: 0.2062052 (30) total: 8.8s remaining: 4.26s 31: learn: 0.1761777 test: 0.2048994 best: 0.2048994 (31) total: 9.09s remaining: 3.98s 32: learn: 0.1745002 test: 0.2036959 best: 0.2036959 (32) total: 9.34s remaining: 3.68s 33: learn: 0.1727701 test: 0.2026426 best: 0.2026426 (33) total: 9.6s remaining: 3.39s 34: learn: 0.1709083 test: 0.2013716 best: 
0.2013716 (34) total: 9.89s remaining: 3.11s 35: learn: 0.1690686 test: 0.2003829 best: 0.2003829 (35) total: 10.2s remaining: 2.83s 36: learn: 0.1676516 test: 0.1995139 best: 0.1995139 (36) total: 10.4s remaining: 2.54s 37: learn: 0.1660741 test: 0.1987830 best: 0.1987830 (37) total: 10.8s remaining: 2.26s 38: learn: 0.1645369 test: 0.1980025 best: 0.1980025 (38) total: 11s remaining: 1.98s 39: learn: 0.1633015 test: 0.1973022 best: 0.1973022 (39) total: 11.3s remaining: 1.69s 40: learn: 0.1621575 test: 0.1965038 best: 0.1965038 (40) total: 11.5s remaining: 1.4s 41: learn: 0.1609752 test: 0.1957273 best: 0.1957273 (41) total: 11.7s remaining: 1.12s 42: learn: 0.1597722 test: 0.1953870 best: 0.1953870 (42) total: 12s remaining: 838ms 43: learn: 0.1584105 test: 0.1949930 best: 0.1949930 (43) total: 12.3s remaining: 558ms 44: learn: 0.1571810 test: 0.1944654 best: 0.1944654 (44) total: 12.5s remaining: 278ms 45: learn: 0.1562715 test: 0.1940207 best: 0.1940207 (45) total: 12.8s remaining: 0us bestTest = 0.1940206681 bestIteration = 45 Trial 50, Fold 3: Log loss = 0.19387414893267466, Average precision = 0.9772232770454468, ROC-AUC = 0.9739894335506737, Elapsed Time = 12.906083999998373 seconds Trial 50, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 50, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 0: learn: 0.6183698 test: 0.6197240 best: 0.6197240 (0) total: 266ms remaining: 12s 1: learn: 0.5555576 test: 0.5578863 best: 0.5578863 (1) total: 526ms remaining: 11.6s 2: learn: 0.5028698 test: 0.5067444 best: 0.5067444 (2) total: 815ms remaining: 11.7s 3: learn: 0.4579173 test: 0.4630229 best: 0.4630229 (3) total: 1.08s remaining: 11.4s 4: learn: 0.4197925 test: 0.4259875 best: 0.4259875 (4) total: 1.39s remaining: 11.4s 5: learn: 0.3884763 test: 0.3956923 best: 0.3956923 (5) total: 1.64s remaining: 10.9s 6: learn: 0.3617651 test: 0.3700823 best: 0.3700823 (6) total: 1.92s remaining: 10.7s 7: 
learn: 0.3389121 test: 0.3480724 best: 0.3480724 (7) total: 2.18s remaining: 10.4s 8: learn: 0.3193144 test: 0.3292847 best: 0.3292847 (8) total: 2.46s remaining: 10.1s 9: learn: 0.3028246 test: 0.3139350 best: 0.3139350 (9) total: 2.75s remaining: 9.91s 10: learn: 0.2880849 test: 0.3001490 best: 0.3001490 (10) total: 3.02s remaining: 9.6s 11: learn: 0.2748970 test: 0.2881450 best: 0.2881450 (11) total: 3.3s remaining: 9.36s 12: learn: 0.2639638 test: 0.2783061 best: 0.2783061 (12) total: 3.59s remaining: 9.12s 13: learn: 0.2548115 test: 0.2703312 best: 0.2703312 (13) total: 3.86s remaining: 8.82s 14: learn: 0.2462065 test: 0.2629370 best: 0.2629370 (14) total: 4.14s remaining: 8.55s 15: learn: 0.2386249 test: 0.2558654 best: 0.2558654 (15) total: 4.42s remaining: 8.28s 16: learn: 0.2317221 test: 0.2499081 best: 0.2499081 (16) total: 4.7s remaining: 8.02s 17: learn: 0.2262458 test: 0.2451600 best: 0.2451600 (17) total: 4.95s remaining: 7.71s 18: learn: 0.2204625 test: 0.2401054 best: 0.2401054 (18) total: 5.22s remaining: 7.42s 19: learn: 0.2152167 test: 0.2360322 best: 0.2360322 (19) total: 5.52s remaining: 7.17s 20: learn: 0.2112656 test: 0.2327668 best: 0.2327668 (20) total: 5.77s remaining: 6.87s 21: learn: 0.2069863 test: 0.2296762 best: 0.2296762 (21) total: 6.05s remaining: 6.6s 22: learn: 0.2028777 test: 0.2264273 best: 0.2264273 (22) total: 6.32s remaining: 6.32s 23: learn: 0.1991225 test: 0.2235330 best: 0.2235330 (23) total: 6.58s remaining: 6.03s 24: learn: 0.1960530 test: 0.2209623 best: 0.2209623 (24) total: 6.86s remaining: 5.76s 25: learn: 0.1928089 test: 0.2184083 best: 0.2184083 (25) total: 7.13s remaining: 5.49s 26: learn: 0.1899297 test: 0.2161699 best: 0.2161699 (26) total: 7.41s remaining: 5.22s 27: learn: 0.1875146 test: 0.2141628 best: 0.2141628 (27) total: 7.65s remaining: 4.92s 28: learn: 0.1854085 test: 0.2125825 best: 0.2125825 (28) total: 7.92s remaining: 4.64s 29: learn: 0.1830041 test: 0.2105225 best: 0.2105225 (29) total: 8.17s 
remaining: 4.36s 30: learn: 0.1808694 test: 0.2091833 best: 0.2091833 (30) total: 8.45s remaining: 4.09s 31: learn: 0.1789685 test: 0.2078399 best: 0.2078399 (31) total: 8.72s remaining: 3.82s 32: learn: 0.1771306 test: 0.2066955 best: 0.2066955 (32) total: 8.98s remaining: 3.54s 33: learn: 0.1752228 test: 0.2053706 best: 0.2053706 (33) total: 9.25s remaining: 3.27s 34: learn: 0.1735439 test: 0.2042049 best: 0.2042049 (34) total: 9.51s remaining: 2.99s 35: learn: 0.1717815 test: 0.2031189 best: 0.2031189 (35) total: 9.77s remaining: 2.71s 36: learn: 0.1700256 test: 0.2019351 best: 0.2019351 (36) total: 10.1s remaining: 2.44s 37: learn: 0.1685526 test: 0.2011771 best: 0.2011771 (37) total: 10.3s remaining: 2.17s 38: learn: 0.1670850 test: 0.2005026 best: 0.2005026 (38) total: 10.6s remaining: 1.9s 39: learn: 0.1656754 test: 0.1999724 best: 0.1999724 (39) total: 10.9s remaining: 1.63s 40: learn: 0.1644005 test: 0.1992514 best: 0.1992514 (40) total: 11.1s remaining: 1.36s 41: learn: 0.1631135 test: 0.1986401 best: 0.1986401 (41) total: 11.4s remaining: 1.09s 42: learn: 0.1619106 test: 0.1979694 best: 0.1979694 (42) total: 11.7s remaining: 815ms 43: learn: 0.1608369 test: 0.1975967 best: 0.1975967 (43) total: 11.9s remaining: 543ms 44: learn: 0.1595419 test: 0.1970484 best: 0.1970484 (44) total: 12.2s remaining: 271ms 45: learn: 0.1583545 test: 0.1966812 best: 0.1966812 (45) total: 12.5s remaining: 0us bestTest = 0.1966811628 bestIteration = 45 Trial 50, Fold 4: Log loss = 0.19644371453445866, Average precision = 0.976562927945438, ROC-AUC = 0.9727146028407858, Elapsed Time = 12.629328699997131 seconds Trial 50, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 50, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0: learn: 0.6177552 test: 0.6202089 best: 0.6202089 (0) total: 294ms remaining: 13.2s 1: learn: 0.5550832 test: 0.5592067 best: 0.5592067 (1) total: 533ms remaining: 11.7s 2: learn: 0.5021079 
test: 0.5082167 best: 0.5082167 (2) total: 788ms remaining: 11.3s 3: learn: 0.4566007 test: 0.4651797 best: 0.4651797 (3) total: 1.08s remaining: 11.3s 4: learn: 0.4188748 test: 0.4290758 best: 0.4290758 (4) total: 1.32s remaining: 10.8s 5: learn: 0.3863935 test: 0.3977348 best: 0.3977348 (5) total: 1.58s remaining: 10.5s 6: learn: 0.3592003 test: 0.3719025 best: 0.3719025 (6) total: 1.82s remaining: 10.2s 7: learn: 0.3364689 test: 0.3507489 best: 0.3507489 (7) total: 2.09s remaining: 9.92s 8: learn: 0.3173569 test: 0.3327574 best: 0.3327574 (8) total: 2.38s remaining: 9.77s 9: learn: 0.3010131 test: 0.3177034 best: 0.3177034 (9) total: 2.67s remaining: 9.62s 10: learn: 0.2859704 test: 0.3039735 best: 0.3039735 (10) total: 2.96s remaining: 9.4s 11: learn: 0.2733486 test: 0.2926140 best: 0.2926140 (11) total: 3.23s remaining: 9.16s 12: learn: 0.2623045 test: 0.2825992 best: 0.2825992 (12) total: 3.52s remaining: 8.94s 13: learn: 0.2523075 test: 0.2737423 best: 0.2737423 (13) total: 3.8s remaining: 8.69s 14: learn: 0.2445109 test: 0.2670103 best: 0.2670103 (14) total: 4.07s remaining: 8.42s 15: learn: 0.2364628 test: 0.2602902 best: 0.2602902 (15) total: 4.37s remaining: 8.19s 16: learn: 0.2293516 test: 0.2543034 best: 0.2543034 (16) total: 4.65s remaining: 7.93s 17: learn: 0.2233743 test: 0.2492695 best: 0.2492695 (17) total: 4.94s remaining: 7.68s 18: learn: 0.2176815 test: 0.2446697 best: 0.2446697 (18) total: 5.21s remaining: 7.41s 19: learn: 0.2126295 test: 0.2408479 best: 0.2408479 (19) total: 5.47s remaining: 7.11s 20: learn: 0.2082821 test: 0.2372255 best: 0.2372255 (20) total: 5.73s remaining: 6.83s 21: learn: 0.2039655 test: 0.2340990 best: 0.2340990 (21) total: 6.05s remaining: 6.61s 22: learn: 0.2000550 test: 0.2311210 best: 0.2311210 (22) total: 6.34s remaining: 6.34s 23: learn: 0.1967629 test: 0.2288106 best: 0.2288106 (23) total: 6.61s remaining: 6.05s 24: learn: 0.1932770 test: 0.2262336 best: 0.2262336 (24) total: 6.9s remaining: 5.79s 25: learn: 
0.1905242 test: 0.2241351 best: 0.2241351 (25) total: 7.17s remaining: 5.51s 26: learn: 0.1872845 test: 0.2217717 best: 0.2217717 (26) total: 7.45s remaining: 5.25s 27: learn: 0.1846244 test: 0.2199451 best: 0.2199451 (27) total: 7.75s remaining: 4.98s 28: learn: 0.1821237 test: 0.2183201 best: 0.2183201 (28) total: 8.02s remaining: 4.7s 29: learn: 0.1797850 test: 0.2167902 best: 0.2167902 (29) total: 8.29s remaining: 4.42s 30: learn: 0.1772771 test: 0.2152870 best: 0.2152870 (30) total: 8.56s remaining: 4.14s 31: learn: 0.1752306 test: 0.2140020 best: 0.2140020 (31) total: 8.82s remaining: 3.86s 32: learn: 0.1732370 test: 0.2127603 best: 0.2127603 (32) total: 9.12s remaining: 3.59s 33: learn: 0.1713963 test: 0.2118372 best: 0.2118372 (33) total: 9.41s remaining: 3.32s 34: learn: 0.1697455 test: 0.2110212 best: 0.2110212 (34) total: 9.66s remaining: 3.04s 35: learn: 0.1681545 test: 0.2102244 best: 0.2102244 (35) total: 9.93s remaining: 2.76s 36: learn: 0.1664058 test: 0.2093842 best: 0.2093842 (36) total: 10.2s remaining: 2.48s 37: learn: 0.1648714 test: 0.2086543 best: 0.2086543 (37) total: 10.5s remaining: 2.21s 38: learn: 0.1638382 test: 0.2081024 best: 0.2081024 (38) total: 10.7s remaining: 1.93s 39: learn: 0.1626299 test: 0.2075727 best: 0.2075727 (39) total: 11s remaining: 1.65s 40: learn: 0.1615390 test: 0.2071212 best: 0.2071212 (40) total: 11.2s remaining: 1.37s 41: learn: 0.1604275 test: 0.2069183 best: 0.2069183 (41) total: 11.5s remaining: 1.09s 42: learn: 0.1592751 test: 0.2063189 best: 0.2063189 (42) total: 11.7s remaining: 819ms 43: learn: 0.1578361 test: 0.2059229 best: 0.2059229 (43) total: 12s remaining: 546ms 44: learn: 0.1567747 test: 0.2055044 best: 0.2055044 (44) total: 12.3s remaining: 273ms 45: learn: 0.1556959 test: 0.2051194 best: 0.2051194 (45) total: 12.5s remaining: 0us bestTest = 0.2051194058 bestIteration = 45 Trial 50, Fold 5: Log loss = 0.20477899093252747, Average precision = 0.9745866292732474, ROC-AUC = 0.9716729279304387, 
Elapsed Time = 12.676016099998378 seconds
Optimization Progress: 51%|#####1 | 51/100 [1:16:43<49:39, 60.80s/it]
Trial 51 (CatBoost): per-iteration learn/test log omitted; every fold trained the full 71 iterations, with bestIteration = 70. Fold summaries:
- Fold 1: train 20663 (0: 10533, 1: 10130, 0/1 = 1.0398); validation 5175 (0: 2592, 1: 2583, 0/1 = 1.0035). Log loss = 0.2501, average precision = 0.9652, ROC-AUC = 0.9593, elapsed 5.46 s.
- Fold 2: train 20701 (0: 10471, 1: 10230, 0/1 = 1.0236); validation 5137 (0: 2654, 1: 2483, 0/1 = 1.0689). Log loss = 0.2514, average precision = 0.9648, ROC-AUC = 0.9614, elapsed 5.18 s.
- Fold 3: train 20682 (0: 10517, 1: 10165, 0/1 = 1.0346); validation 5156 (0: 2608, 1: 2548, 0/1 = 1.0235). Log loss = 0.2425, average precision = 0.9671, ROC-AUC = 0.9633, elapsed 4.67 s.
- Fold 4: train 20656 (0: 10479, 1: 10177, 0/1 = 1.0297); validation 5182 (0: 2646, 1: 2536, 0/1 = 1.0434). Log loss = 0.2468, average precision = 0.9671, ROC-AUC = 0.9614, elapsed 4.60 s.
- Fold 5: train 20650 (0: 10500, 1: 10150, 0/1 = 1.0345); validation 5188 (0: 2625, 1: 2563, 0/1 = 1.0242). Log loss = 0.2555, average precision = 0.9648, ROC-AUC = 0.9591, elapsed 4.60 s.
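Each trial's five fold-level metrics ultimately collapse into per-trial values for Optuna to compare. A minimal sketch of that aggregation, using the Trial 51 numbers from the log above (`trial_objective` is a hypothetical helper, not the notebook's actual objective function):

```python
from statistics import mean

# Per-fold validation metrics for Trial 51, copied from the log above.
fold_metrics = [
    {"log_loss": 0.2501, "roc_auc": 0.9593},
    {"log_loss": 0.2514, "roc_auc": 0.9614},
    {"log_loss": 0.2425, "roc_auc": 0.9633},
    {"log_loss": 0.2468, "roc_auc": 0.9614},
    {"log_loss": 0.2555, "roc_auc": 0.9591},
]

def trial_objective(folds):
    """Aggregate per-fold CV metrics into the per-trial values a study
    would compare: mean log loss (minimize) and mean ROC-AUC (maximize)."""
    return (
        mean(f["log_loss"] for f in folds),
        mean(f["roc_auc"] for f in folds),
    )

cv_log_loss, cv_roc_auc = trial_objective(fold_metrics)
print(f"Trial 51: mean CV log loss = {cv_log_loss:.4f}, mean ROC-AUC = {cv_roc_auc:.4f}")
```

Averaging across the StratifiedGroupKFold folds smooths out the fold-to-fold spread visible above (log loss 0.2425 to 0.2555) before trials are ranked.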
Optimization Progress: 52%|#####2 | 52/100 [1:17:16<41:50, 52.30s/it]
Trial 52, Fold 1: train 20663 (0: 10533, 1: 10130, 0/1 = 1.0398); validation 5175 (0: 2592, 1: 2583, 0/1 = 1.0035). Per-iteration log omitted through iteration 21 (learn loss descending normally from 0.6007 to 0.1940); from iteration 22 the learn loss diverges (15.28 at iteration 22, climbing to 706.88 by iteration 32, with test loss reaching 1529.01), while the best test loss stays at 0.2108 from iteration 27.
Training has stopped (degenerate solution on iteration 33, probably too small l2-regularization, try to increase it)
bestTest = 0.2108272576 bestIteration = 27 Shrink model to first 28 iterations. Trial 52, Fold 1: Log loss = 0.21069644814043154, Average precision = 0.9717256097227264, ROC-AUC = 0.9694559602672747, Elapsed Time = 9.182990499997686 seconds Trial 52, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 52, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986 0: learn: 0.6111835 test: 0.6226610 best: 0.6226610 (0) total: 218ms remaining: 17s 1: learn: 0.5394007 test: 0.5589982 best: 0.5589982 (1) total: 442ms remaining: 17s 2: learn: 0.4878466 test: 0.5128214 best: 0.5128214 (2) total: 690ms remaining: 17.5s 3: learn: 0.4439685 test: 0.4699998 best: 0.4699998 (3) total: 959ms remaining: 18s 4: learn: 0.4043180 test: 0.4323217 best: 0.4323217 (4) total: 1.23s remaining: 18.1s 5: learn: 0.3712810 test: 0.3995021 best: 0.3995021 (5) total: 1.48s remaining: 18s 6: learn: 0.3457900 test: 0.3719430 best: 0.3719430 (6) total: 1.73s remaining: 17.8s 7: learn: 0.3252993 test: 0.3516343 best: 0.3516343 (7) total: 2.03s remaining: 18s 8: learn: 0.3061276 test: 0.3321531 best: 0.3321531 (8) total: 2.3s remaining: 17.9s 9: learn: 0.2908269 test: 0.3159480 best: 0.3159480 (9) total: 2.56s remaining: 17.7s 10: learn: 0.2763067 test: 0.3005335 best: 0.3005335 (10) total: 2.85s remaining: 17.6s 11: learn: 0.2641114 test: 0.2892182 best: 0.2892182 (11) total: 3.14s remaining: 17.5s 12: learn: 0.2537885 test: 0.2786214 best: 0.2786214 (12) total: 3.39s remaining: 17.2s 13: learn: 0.2433435 test: 0.2680905 best: 0.2680905 (13) total: 3.67s remaining: 17s 14: learn: 0.2352450 test: 0.2601812 best: 0.2601812 (14) total: 3.96s remaining: 16.9s 15: learn: 0.2266276 test: 0.2520925 best: 0.2520925 (15) total: 4.26s remaining: 16.8s 16: learn: 0.2201787 test: 0.2459260 best: 0.2459260 (16) total: 4.53s remaining: 16.5s 17: learn: 0.2145855 test: 0.2408975 best: 0.2408975 (17) total: 4.8s remaining: 16.3s 18: learn: 0.2090800 test: 
0.2352024 best: 0.2352024 (18) total: 5.06s remaining: 16s 19: learn: 0.2042282 test: 0.2307075 best: 0.2307075 (19) total: 5.31s remaining: 15.7s 20: learn: 0.1994770 test: 0.2263185 best: 0.2263185 (20) total: 5.58s remaining: 15.4s 21: learn: 0.1953308 test: 0.2231280 best: 0.2231280 (21) total: 5.87s remaining: 15.2s 22: learn: 0.1916367 test: 0.2201898 best: 0.2201898 (22) total: 6.13s remaining: 14.9s 23: learn: 0.1887684 test: 0.2178409 best: 0.2178409 (23) total: 6.4s remaining: 14.7s 24: learn: 0.1855452 test: 0.2149730 best: 0.2149730 (24) total: 6.66s remaining: 14.4s
Training has stopped (degenerate solution on iteration 25, probably too small l2-regularization, try to increase it)
bestTest = 0.2149729902 bestIteration = 24 Shrink model to first 25 iterations. Trial 52, Fold 2: Log loss = 0.2148990726437246, Average precision = 0.9747406236409003, ROC-AUC = 0.9717614063499165, Elapsed Time = 7.07240389999788 seconds Trial 52, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 52, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 0: learn: 0.6062642 test: 0.6156540 best: 0.6156540 (0) total: 234ms remaining: 18.2s 1: learn: 0.5407220 test: 0.5564034 best: 0.5564034 (1) total: 448ms remaining: 17.3s 2: learn: 0.4881608 test: 0.5081456 best: 0.5081456 (2) total: 652ms remaining: 16.5s 3: learn: 0.4459415 test: 0.4694549 best: 0.4694549 (3) total: 878ms remaining: 16.5s 4: learn: 0.4080858 test: 0.4328094 best: 0.4328094 (4) total: 1.2s remaining: 17.8s 5: learn: 0.3812365 test: 0.4041893 best: 0.4041893 (5) total: 1.56s remaining: 19s 6: learn: 0.3562090 test: 0.3787593 best: 0.3787593 (6) total: 1.9s remaining: 19.5s 7: learn: 0.3371529 test: 0.3594174 best: 0.3594174 (7) total: 2.19s remaining: 19.5s 8: learn: 0.3182969 test: 0.3389844 best: 0.3389844 (8) total: 2.52s remaining: 19.6s 9: learn: 0.3045150 test: 0.3246838 best: 0.3246838 (9) total: 2.82s remaining: 19.4s 10: learn: 0.2910198 test: 0.3107458 best: 0.3107458 (10) total: 3.09s remaining: 19.1s 11: learn: 0.2771032 test: 0.2968483 best: 0.2968483 (11) total: 3.38s remaining: 18.8s 12: learn: 0.2662847 test: 0.2859029 best: 0.2859029 (12) total: 3.66s remaining: 18.6s 13: learn: 0.2560237 test: 0.2756404 best: 0.2756404 (13) total: 3.95s remaining: 18.3s 14: learn: 0.2486171 test: 0.2681738 best: 0.2681738 (14) total: 4.23s remaining: 18.1s 15: learn: 0.2398348 test: 0.2599397 best: 0.2599397 (15) total: 4.57s remaining: 18s 16: learn: 12.2802762 test: 0.2528253 best: 0.2528253 (16) total: 4.88s remaining: 17.8s 17: learn: 12.2724880 test: 0.2462363 best: 0.2462363 (17) total: 5.18s remaining: 17.6s 18: learn: 
12.2672607 test: 0.2420455 best: 0.2420455 (18) total: 5.49s remaining: 17.3s 19: learn: 149.6040500 test: 191.8418295 best: 0.2420455 (18) total: 5.76s remaining: 17s
Training has stopped (degenerate solution on iteration 20, probably too small l2-regularization, try to increase it)
bestTest = 0.2420455077 bestIteration = 18 Shrink model to first 19 iterations. Trial 52, Fold 3: Log loss = 0.24250937468924788, Average precision = 0.9713262805111159, ROC-AUC = 0.9697353752732806, Elapsed Time = 6.117950899999414 seconds Trial 52, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 52, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 0: learn: 0.6133301 test: 0.6155228 best: 0.6155228 (0) total: 199ms remaining: 15.5s 1: learn: 0.5476824 test: 0.5555812 best: 0.5555812 (1) total: 412ms remaining: 15.9s 2: learn: 0.4942530 test: 0.5052748 best: 0.5052748 (2) total: 677ms remaining: 17.2s 3: learn: 0.4517225 test: 0.4647773 best: 0.4647773 (3) total: 916ms remaining: 17.2s 4: learn: 0.4181274 test: 0.4355127 best: 0.4355127 (4) total: 1.15s remaining: 17s 5: learn: 0.3860266 test: 0.4063332 best: 0.4063332 (5) total: 1.39s remaining: 16.9s 6: learn: 0.3623489 test: 0.3837529 best: 0.3837529 (6) total: 1.63s remaining: 16.7s 7: learn: 0.3391849 test: 0.3613684 best: 0.3613684 (7) total: 1.87s remaining: 16.6s 8: learn: 0.3218588 test: 0.3451343 best: 0.3451343 (8) total: 2.14s remaining: 16.6s 9: learn: 0.3038652 test: 0.3273818 best: 0.3273818 (9) total: 2.37s remaining: 16.4s 10: learn: 0.2925763 test: 0.3161195 best: 0.3161195 (10) total: 2.62s remaining: 16.2s 11: learn: 0.2816033 test: 0.3054718 best: 0.3054718 (11) total: 2.89s remaining: 16.2s 12: learn: 0.2721571 test: 0.2965127 best: 0.2965127 (12) total: 3.12s remaining: 15.9s 13: learn: 0.2619911 test: 0.2870278 best: 0.2870278 (13) total: 3.37s remaining: 15.7s 14: learn: 0.2519138 test: 0.2772422 best: 0.2772422 (14) total: 3.63s remaining: 15.5s 15: learn: 0.2426619 test: 0.2688409 best: 0.2688409 (15) total: 3.88s remaining: 15.3s 16: learn: 0.2345629 test: 0.2611969 best: 0.2611969 (16) total: 4.12s remaining: 15s 17: learn: 0.2285093 test: 0.2550710 best: 0.2550710 (17) total: 4.35s remaining: 14.7s 18: learn: 
0.2227166 test: 0.2499288 best: 0.2499288 (18) total: 4.62s remaining: 14.6s 19: learn: 0.2181477 test: 0.2455096 best: 0.2455096 (19) total: 4.87s remaining: 14.4s 20: learn: 0.2130815 test: 0.2414888 best: 0.2414888 (20) total: 5.12s remaining: 14.1s 21: learn: 0.2078739 test: 0.2350795 best: 0.2350795 (21) total: 5.37s remaining: 13.9s 22: learn: 0.2034048 test: 0.2312612 best: 0.2312612 (22) total: 5.62s remaining: 13.7s 23: learn: 0.1986027 test: 0.2272488 best: 0.2272488 (23) total: 5.89s remaining: 13.5s 24: learn: 24.1715773 test: 0.2234134 best: 0.2234134 (24) total: 6.14s remaining: 13.3s 25: learn: 576.7273153 test: 0.2199385 best: 0.2199385 (25) total: 6.4s remaining: 13.1s
Training has stopped (degenerate solution on iteration 26, probably too small l2-regularization, try to increase it)
bestTest = 0.2199384653 bestIteration = 25 Shrink model to first 26 iterations. Trial 52, Fold 4: Log loss = 0.21992247379741797, Average precision = 0.9736665888102406, ROC-AUC = 0.9709873960099286, Elapsed Time = 6.823154199999408 seconds Trial 52, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 52, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0: learn: 0.5963745 test: 0.5999505 best: 0.5999505 (0) total: 227ms remaining: 17.7s 1: learn: 0.5325007 test: 0.5355169 best: 0.5355169 (1) total: 422ms remaining: 16.3s 2: learn: 0.4842094 test: 0.4896419 best: 0.4896419 (2) total: 630ms remaining: 16s 3: learn: 0.4379176 test: 0.4479523 best: 0.4479523 (3) total: 835ms remaining: 15.7s 4: learn: 0.4048628 test: 0.4157127 best: 0.4157127 (4) total: 1.06s remaining: 15.7s 5: learn: 0.3719044 test: 0.3856948 best: 0.3856948 (5) total: 1.27s remaining: 15.4s 6: learn: 0.3470246 test: 0.3631662 best: 0.3631662 (6) total: 1.5s remaining: 15.5s 7: learn: 0.3249294 test: 0.3441988 best: 0.3441988 (7) total: 1.73s remaining: 15.4s 8: learn: 0.3059625 test: 0.3265686 best: 0.3265686 (8) total: 1.97s remaining: 15.3s 9: learn: 0.2913277 test: 0.3134018 best: 0.3134018 (9) total: 2.23s remaining: 15.4s 10: learn: 0.2787450 test: 0.3021593 best: 0.3021593 (10) total: 2.47s remaining: 15.3s 11: learn: 0.2662502 test: 0.2911137 best: 0.2911137 (11) total: 2.71s remaining: 15.1s 12: learn: 0.2552073 test: 0.2808602 best: 0.2808602 (12) total: 2.98s remaining: 15.1s 13: learn: 0.2453852 test: 0.2722385 best: 0.2722385 (13) total: 3.27s remaining: 15.2s 14: learn: 0.2364326 test: 0.2643418 best: 0.2643418 (14) total: 3.54s remaining: 15.1s 15: learn: 0.2295043 test: 0.2583914 best: 0.2583914 (15) total: 3.78s remaining: 14.9s 16: learn: 0.2228040 test: 0.2525632 best: 0.2525632 (16) total: 4.02s remaining: 14.7s 17: learn: 0.2153877 test: 0.2461256 best: 0.2461256 (17) total: 4.26s remaining: 14.4s 18: learn: 
0.2091170 test: 0.2406134 best: 0.2406134 (18) total: 4.5s remaining: 14.2s 19: learn: 0.2038073 test: 0.2363168 best: 0.2363168 (19) total: 4.74s remaining: 14s 20: learn: 0.1995256 test: 0.2329093 best: 0.2329093 (20) total: 4.97s remaining: 13.7s 21: learn: 0.1960483 test: 0.2303052 best: 0.2303052 (21) total: 5.2s remaining: 13.5s 22: learn: 0.1930546 test: 0.2280277 best: 0.2280277 (22) total: 5.42s remaining: 13.2s 23: learn: 0.1891087 test: 0.2260067 best: 0.2260067 (23) total: 5.67s remaining: 13s 24: learn: 0.1851263 test: 0.2233147 best: 0.2233147 (24) total: 5.93s remaining: 12.8s 25: learn: 0.1814988 test: 0.2208883 best: 0.2208883 (25) total: 6.23s remaining: 12.7s 26: learn: 0.1783751 test: 0.2188261 best: 0.2188261 (26) total: 6.48s remaining: 12.5s 27: learn: 0.1758105 test: 0.2174333 best: 0.2174333 (27) total: 6.73s remaining: 12.3s 28: learn: 0.1732660 test: 0.2156153 best: 0.2156153 (28) total: 6.97s remaining: 12s 29: learn: 0.1709430 test: 0.2147208 best: 0.2147208 (29) total: 7.25s remaining: 11.8s 30: learn: 0.1688678 test: 0.2135383 best: 0.2135383 (30) total: 7.5s remaining: 11.6s 31: learn: 0.1664114 test: 0.2122529 best: 0.2122529 (31) total: 7.75s remaining: 11.4s 32: learn: 0.1641024 test: 0.2107476 best: 0.2107476 (32) total: 8.01s remaining: 11.2s 33: learn: 0.1622695 test: 0.2101651 best: 0.2101651 (33) total: 8.29s remaining: 11s 34: learn: 0.1604694 test: 0.2092465 best: 0.2092465 (34) total: 8.53s remaining: 10.7s 35: learn: 626.6661337 test: 0.2086638 best: 0.2086638 (35) total: 8.78s remaining: 10.5s 36: learn: 626.6631965 test: 0.2081746 best: 0.2081746 (36) total: 9.05s remaining: 10.3s 37: learn: 1679.6308291 test: 289.2247117 best: 0.2081746 (36) total: 9.31s remaining: 10s
Training has stopped (degenerate solution on iteration 38, probably too small l2-regularization, try to increase it)
bestTest = 0.2081745998 bestIteration = 36 Shrink model to first 37 iterations. Trial 52, Fold 5: Log loss = 0.20776672363436344, Average precision = 0.9726535945074268, ROC-AUC = 0.9700186908942274, Elapsed Time = 9.64951579999979 seconds
Optimization Progress: 53%|#####3 | 53/100 [1:18:03<39:49, 50.84s/it]
Trial 53, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371 Trial 53, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913 0: learn: 0.6207185 test: 0.6257933 best: 0.6257933 (0) total: 2.42s remaining: 2m 5s 1: learn: 0.5582711 test: 0.5678019 best: 0.5678019 (1) total: 4.87s remaining: 2m 4s 2: learn: 0.5048060 test: 0.5189623 best: 0.5189623 (2) total: 7.32s remaining: 2m 1s 3: learn: 0.4599540 test: 0.4771747 best: 0.4771747 (3) total: 9.92s remaining: 2m 1s 4: learn: 0.4205278 test: 0.4418999 best: 0.4418999 (4) total: 12.5s remaining: 1m 59s 5: learn: 0.3878088 test: 0.4116257 best: 0.4116257 (5) total: 15s remaining: 1m 57s 6: learn: 0.3597961 test: 0.3862717 best: 0.3862717 (6) total: 17.6s remaining: 1m 55s 7: learn: 0.3349984 test: 0.3647314 best: 0.3647314 (7) total: 20.2s remaining: 1m 53s 8: learn: 0.3143271 test: 0.3458138 best: 0.3458138 (8) total: 22.6s remaining: 1m 50s 9: learn: 0.2950605 test: 0.3300087 best: 0.3300087 (9) total: 25.2s remaining: 1m 48s 10: learn: 0.2791298 test: 0.3158813 best: 0.3158813 (10) total: 27.6s remaining: 1m 45s 11: learn: 0.2651529 test: 0.3043534 best: 0.3043534 (11) total: 30.2s remaining: 1m 43s 12: learn: 0.2522737 test: 0.2936236 best: 0.2936236 (12) total: 32.7s remaining: 1m 40s 13: learn: 0.2414369 test: 0.2846476 best: 0.2846476 (13) total: 35.1s remaining: 1m 37s 14: learn: 0.2328277 test: 0.2765607 best: 0.2765607 (14) total: 37.4s remaining: 1m 34s 15: learn: 0.2230532 test: 0.2698914 best: 0.2698914 (15) total: 39.9s remaining: 1m 32s 16: learn: 0.2137646 test: 0.2642645 best: 0.2642645 (16) total: 42.3s remaining: 1m 29s 17: learn: 0.2066047 test: 0.2590235 best: 0.2590235 (17) total: 44.8s remaining: 1m 27s 18: learn: 0.2003673 test: 0.2539948 best: 0.2539948 (18) total: 47.8s remaining: 1m 25s 19: learn: 0.1952243 test: 0.2496210 best: 0.2496210 (19) total: 50.5s remaining: 1m 23s 20: learn: 0.1902493 test: 0.2454872 best: 0.2454872 (20) 
total: 53.4s remaining: 1m 21s 21: learn: 0.1850423 test: 0.2421304 best: 0.2421304 (21) total: 56s remaining: 1m 18s 22: learn: 0.1794941 test: 0.2389688 best: 0.2389688 (22) total: 58.4s remaining: 1m 16s 23: learn: 0.1755843 test: 0.2364259 best: 0.2364259 (23) total: 1m remaining: 1m 13s 24: learn: 0.1725238 test: 0.2339140 best: 0.2339140 (24) total: 1m 3s remaining: 1m 10s 25: learn: 0.1690154 test: 0.2316403 best: 0.2316403 (25) total: 1m 5s remaining: 1m 7s 26: learn: 0.1651181 test: 0.2296852 best: 0.2296852 (26) total: 1m 7s remaining: 1m 5s 27: learn: 0.1620411 test: 0.2276850 best: 0.2276850 (27) total: 1m 10s remaining: 1m 2s 28: learn: 0.1590206 test: 0.2260489 best: 0.2260489 (28) total: 1m 12s remaining: 1m 29: learn: 0.1554470 test: 0.2245657 best: 0.2245657 (29) total: 1m 14s remaining: 57.4s 30: learn: 0.1512784 test: 0.2238373 best: 0.2238373 (30) total: 1m 17s remaining: 54.9s 31: learn: 0.1486838 test: 0.2225154 best: 0.2225154 (31) total: 1m 19s remaining: 52.3s 32: learn: 0.1454425 test: 0.2216379 best: 0.2216379 (32) total: 1m 22s remaining: 49.7s 33: learn: 0.1428554 test: 0.2207837 best: 0.2207837 (33) total: 1m 24s remaining: 47.2s 34: learn: 0.1397071 test: 0.2199237 best: 0.2199237 (34) total: 1m 26s remaining: 44.7s 35: learn: 0.1379830 test: 0.2192089 best: 0.2192089 (35) total: 1m 29s remaining: 42.2s 36: learn: 0.1354645 test: 0.2184105 best: 0.2184105 (36) total: 1m 31s remaining: 39.7s 37: learn: 0.1333465 test: 0.2177159 best: 0.2177159 (37) total: 1m 34s remaining: 37.1s 38: learn: 0.1311951 test: 0.2169725 best: 0.2169725 (38) total: 1m 36s remaining: 34.6s 39: learn: 0.1297000 test: 0.2164504 best: 0.2164504 (39) total: 1m 38s remaining: 32s 40: learn: 0.1282341 test: 0.2156503 best: 0.2156503 (40) total: 1m 40s remaining: 29.5s 41: learn: 0.1256098 test: 0.2150655 best: 0.2150655 (41) total: 1m 43s remaining: 27.1s 42: learn: 0.1237077 test: 0.2147156 best: 0.2147156 (42) total: 1m 45s remaining: 24.6s 43: learn: 0.1215307 
test: 0.2144153 best: 0.2144153 (43) total: 1m 48s remaining: 22.2s 44: learn: 0.1192771 test: 0.2139899 best: 0.2139899 (44) total: 1m 50s remaining: 19.7s 45: learn: 0.1162142 test: 0.2137357 best: 0.2137357 (45) total: 1m 53s remaining: 17.2s 46: learn: 0.1152737 test: 0.2131631 best: 0.2131631 (46) total: 1m 56s remaining: 14.8s 47: learn: 0.1134304 test: 0.2129513 best: 0.2129513 (47) total: 1m 58s remaining: 12.4s 48: learn: 0.1124897 test: 0.2126272 best: 0.2126272 (48) total: 2m 1s remaining: 9.88s 49: learn: 0.1112737 test: 0.2124180 best: 0.2124180 (49) total: 2m 3s remaining: 7.41s 50: learn: 0.1100341 test: 0.2120586 best: 0.2120586 (50) total: 2m 5s remaining: 4.93s 51: learn: 0.1088174 test: 0.2119238 best: 0.2119238 (51) total: 2m 8s remaining: 2.46s 52: learn: 0.1074396 test: 0.2117996 best: 0.2117996 (52) total: 2m 10s remaining: 0us bestTest = 0.2117996017 bestIteration = 52 Trial 53, Fold 1: Log loss = 0.21179960173124454, Average precision = 0.9733106009457305, ROC-AUC = 0.9679481492235557, Elapsed Time = 130.75538059999963 seconds Trial 53, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 53, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986 0: learn: 0.6200862 test: 0.6262834 best: 0.6262834 (0) total: 2.13s remaining: 1m 51s 1: learn: 0.5581055 test: 0.5692718 best: 0.5692718 (1) total: 4.64s remaining: 1m 58s 2: learn: 0.5053283 test: 0.5201026 best: 0.5201026 (2) total: 7.06s remaining: 1m 57s 3: learn: 0.4601452 test: 0.4790113 best: 0.4790113 (3) total: 9.52s remaining: 1m 56s 4: learn: 0.4214242 test: 0.4435762 best: 0.4435762 (4) total: 12s remaining: 1m 54s 5: learn: 0.3885419 test: 0.4131455 best: 0.4131455 (5) total: 14.3s remaining: 1m 52s 6: learn: 0.3587735 test: 0.3880919 best: 0.3880919 (6) total: 16.7s remaining: 1m 49s 7: learn: 0.3344026 test: 0.3662800 best: 0.3662800 (7) total: 19.1s remaining: 1m 47s 8: learn: 0.3139304 test: 0.3471202 best: 0.3471202 (8) 
total: 21.4s remaining: 1m 44s 9: learn: 0.2949417 test: 0.3309400 best: 0.3309400 (9) total: 23.8s remaining: 1m 42s 10: learn: 0.2788629 test: 0.3166213 best: 0.3166213 (10) total: 26.2s remaining: 1m 39s 11: learn: 0.2641766 test: 0.3045219 best: 0.3045219 (11) total: 28.6s remaining: 1m 37s 12: learn: 0.2525080 test: 0.2938232 best: 0.2938232 (12) total: 31.3s remaining: 1m 36s 13: learn: 0.2419877 test: 0.2843671 best: 0.2843671 (13) total: 33.7s remaining: 1m 33s 14: learn: 0.2325390 test: 0.2765417 best: 0.2765417 (14) total: 36s remaining: 1m 31s 15: learn: 0.2241042 test: 0.2698607 best: 0.2698607 (15) total: 38.5s remaining: 1m 29s 16: learn: 0.2166072 test: 0.2634676 best: 0.2634676 (16) total: 40.9s remaining: 1m 26s 17: learn: 0.2095813 test: 0.2580119 best: 0.2580119 (17) total: 43.2s remaining: 1m 24s 18: learn: 0.2016243 test: 0.2529380 best: 0.2529380 (18) total: 45.6s remaining: 1m 21s 19: learn: 0.1959133 test: 0.2482167 best: 0.2482167 (19) total: 48s remaining: 1m 19s 20: learn: 0.1905227 test: 0.2444682 best: 0.2444682 (20) total: 50.3s remaining: 1m 16s 21: learn: 0.1851725 test: 0.2406459 best: 0.2406459 (21) total: 52.7s remaining: 1m 14s 22: learn: 0.1813708 test: 0.2378602 best: 0.2378602 (22) total: 55.1s remaining: 1m 11s 23: learn: 0.1778537 test: 0.2350174 best: 0.2350174 (23) total: 57.4s remaining: 1m 9s 24: learn: 0.1736335 test: 0.2326699 best: 0.2326699 (24) total: 59.8s remaining: 1m 6s 25: learn: 0.1695663 test: 0.2305564 best: 0.2305564 (25) total: 1m 2s remaining: 1m 4s 26: learn: 0.1660837 test: 0.2285181 best: 0.2285181 (26) total: 1m 4s remaining: 1m 2s 27: learn: 0.1625335 test: 0.2262644 best: 0.2262644 (27) total: 1m 6s remaining: 59.8s 28: learn: 0.1585491 test: 0.2246073 best: 0.2246073 (28) total: 1m 9s remaining: 57.4s 29: learn: 0.1548434 test: 0.2233372 best: 0.2233372 (29) total: 1m 11s remaining: 55s 30: learn: 0.1497689 test: 0.2224872 best: 0.2224872 (30) total: 1m 14s remaining: 52.7s 31: learn: 0.1470547 
test: 0.2210643 best: 0.2210643 (31) total: 1m 16s remaining: 50.3s 32: learn: 0.1433129 test: 0.2197515 best: 0.2197515 (32) total: 1m 18s remaining: 47.8s 33: learn: 0.1398370 test: 0.2187102 best: 0.2187102 (33) total: 1m 21s remaining: 45.4s 34: learn: 0.1379761 test: 0.2174019 best: 0.2174019 (34) total: 1m 23s remaining: 43s 35: learn: 0.1349754 test: 0.2163838 best: 0.2163838 (35) total: 1m 25s remaining: 40.5s 36: learn: 0.1332328 test: 0.2154186 best: 0.2154186 (36) total: 1m 28s remaining: 38.1s 37: learn: 0.1309944 test: 0.2147691 best: 0.2147691 (37) total: 1m 30s remaining: 35.7s 38: learn: 0.1292921 test: 0.2139301 best: 0.2139301 (38) total: 1m 32s remaining: 33.3s 39: learn: 0.1268856 test: 0.2131510 best: 0.2131510 (39) total: 1m 35s remaining: 31s 40: learn: 0.1252697 test: 0.2125212 best: 0.2125212 (40) total: 1m 37s remaining: 28.6s 41: learn: 0.1220830 test: 0.2122499 best: 0.2122499 (41) total: 1m 40s remaining: 26.2s 42: learn: 0.1201587 test: 0.2114249 best: 0.2114249 (42) total: 1m 42s remaining: 23.9s 43: learn: 0.1183791 test: 0.2111057 best: 0.2111057 (43) total: 1m 44s remaining: 21.5s 44: learn: 0.1161141 test: 0.2105275 best: 0.2105275 (44) total: 1m 47s remaining: 19.1s 45: learn: 0.1148241 test: 0.2098555 best: 0.2098555 (45) total: 1m 49s remaining: 16.7s 46: learn: 0.1120179 test: 0.2095684 best: 0.2095684 (46) total: 1m 52s remaining: 14.3s 47: learn: 0.1108150 test: 0.2092366 best: 0.2092366 (47) total: 1m 54s remaining: 11.9s 48: learn: 0.1093121 test: 0.2088062 best: 0.2088062 (48) total: 1m 56s remaining: 9.55s 49: learn: 0.1084675 test: 0.2084267 best: 0.2084267 (49) total: 1m 59s remaining: 7.16s 50: learn: 0.1071668 test: 0.2081714 best: 0.2081714 (50) total: 2m 1s remaining: 4.77s 51: learn: 0.1061700 test: 0.2079255 best: 0.2079255 (51) total: 2m 4s remaining: 2.38s 52: learn: 0.1038074 test: 0.2077162 best: 0.2077162 (52) total: 2m 6s remaining: 0us bestTest = 0.2077161998 bestIteration = 52 Trial 53, Fold 2: Log loss = 
0.20771619979730535, Average precision = 0.9717465272076953, ROC-AUC = 0.9676972971594939, Elapsed Time = 126.69987220000257 seconds Trial 53, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 53, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 0: learn: 0.6213692 test: 0.6258139 best: 0.6258139 (0) total: 2.19s remaining: 1m 54s 1: learn: 0.5597700 test: 0.5684427 best: 0.5684427 (1) total: 4.6s remaining: 1m 57s 2: learn: 0.5061759 test: 0.5197090 best: 0.5197090 (2) total: 6.96s remaining: 1m 56s 3: learn: 0.4616705 test: 0.4776925 best: 0.4776925 (3) total: 9.4s remaining: 1m 55s 4: learn: 0.4228975 test: 0.4428072 best: 0.4428072 (4) total: 11.8s remaining: 1m 53s 5: learn: 0.3902093 test: 0.4126804 best: 0.4126804 (5) total: 14.2s remaining: 1m 50s 6: learn: 0.3620884 test: 0.3867353 best: 0.3867353 (6) total: 16.6s remaining: 1m 48s 7: learn: 0.3378133 test: 0.3649104 best: 0.3649104 (7) total: 18.9s remaining: 1m 46s 8: learn: 0.3174279 test: 0.3463072 best: 0.3463072 (8) total: 21.4s remaining: 1m 44s 9: learn: 0.2994333 test: 0.3298378 best: 0.3298378 (9) total: 23.8s remaining: 1m 42s 10: learn: 0.2839580 test: 0.3159761 best: 0.3159761 (10) total: 26s remaining: 1m 39s 11: learn: 0.2682302 test: 0.3040613 best: 0.3040613 (11) total: 28.3s remaining: 1m 36s 12: learn: 0.2563493 test: 0.2937775 best: 0.2937775 (12) total: 30.6s remaining: 1m 34s 13: learn: 0.2458650 test: 0.2840162 best: 0.2840162 (13) total: 32.9s remaining: 1m 31s 14: learn: 0.2359060 test: 0.2765858 best: 0.2765858 (14) total: 35.2s remaining: 1m 29s 15: learn: 0.2273516 test: 0.2690805 best: 0.2690805 (15) total: 37.4s remaining: 1m 26s 16: learn: 0.2185780 test: 0.2624788 best: 0.2624788 (16) total: 39.7s remaining: 1m 24s 17: learn: 0.2101907 test: 0.2576758 best: 0.2576758 (17) total: 42.1s remaining: 1m 21s 18: learn: 0.2031882 test: 0.2532085 best: 0.2532085 (18) total: 44.5s remaining: 1m 19s 19: learn: 
0.1962847 test: 0.2489770 best: 0.2489770 (19) total: 46.8s remaining: 1m 17s 20: learn: 0.1885276 test: 0.2450903 best: 0.2450903 (20) total: 49.3s remaining: 1m 15s 21: learn: 0.1833086 test: 0.2416185 best: 0.2416185 (21) total: 51.7s remaining: 1m 12s 22: learn: 0.1792076 test: 0.2384930 best: 0.2384930 (22) total: 54.1s remaining: 1m 10s 23: learn: 0.1751273 test: 0.2358345 best: 0.2358345 (23) total: 56.5s remaining: 1m 8s 24: learn: 0.1710857 test: 0.2333968 best: 0.2333968 (24) total: 58.9s remaining: 1m 5s 25: learn: 0.1682680 test: 0.2306660 best: 0.2306660 (25) total: 1m 1s remaining: 1m 3s 26: learn: 0.1636661 test: 0.2288144 best: 0.2288144 (26) total: 1m 3s remaining: 1m 1s 27: learn: 0.1606286 test: 0.2271894 best: 0.2271894 (27) total: 1m 6s remaining: 59.1s 28: learn: 0.1578918 test: 0.2252291 best: 0.2252291 (28) total: 1m 8s remaining: 56.7s 29: learn: 0.1537235 test: 0.2238850 best: 0.2238850 (29) total: 1m 10s remaining: 54.4s 30: learn: 0.1512156 test: 0.2222526 best: 0.2222526 (30) total: 1m 13s remaining: 52.1s 31: learn: 0.1488832 test: 0.2206733 best: 0.2206733 (31) total: 1m 15s remaining: 49.7s 32: learn: 0.1443441 test: 0.2195360 best: 0.2195360 (32) total: 1m 18s remaining: 47.3s 33: learn: 0.1424247 test: 0.2181128 best: 0.2181128 (33) total: 1m 20s remaining: 44.9s 34: learn: 0.1400640 test: 0.2170195 best: 0.2170195 (34) total: 1m 22s remaining: 42.6s 35: learn: 0.1370108 test: 0.2164908 best: 0.2164908 (35) total: 1m 25s remaining: 40.3s 36: learn: 0.1343946 test: 0.2156326 best: 0.2156326 (36) total: 1m 27s remaining: 37.9s 37: learn: 0.1323168 test: 0.2148902 best: 0.2148902 (37) total: 1m 30s remaining: 35.5s 38: learn: 0.1306528 test: 0.2138766 best: 0.2138766 (38) total: 1m 32s remaining: 33.2s 39: learn: 0.1285335 test: 0.2131011 best: 0.2131011 (39) total: 1m 34s remaining: 30.8s 40: learn: 0.1246090 test: 0.2128485 best: 0.2128485 (40) total: 1m 37s remaining: 28.4s 41: learn: 0.1221187 test: 0.2121695 best: 0.2121695 (41) 
total: 1m 39s remaining: 26.1s 42: learn: 0.1207424 test: 0.2116648 best: 0.2116648 (42) total: 1m 41s remaining: 23.7s 43: learn: 0.1191103 test: 0.2110926 best: 0.2110926 (43) total: 1m 44s remaining: 21.3s 44: learn: 0.1168393 test: 0.2108154 best: 0.2108154 (44) total: 1m 46s remaining: 19s 45: learn: 0.1155065 test: 0.2101448 best: 0.2101448 (45) total: 1m 49s remaining: 16.6s 46: learn: 0.1139519 test: 0.2099105 best: 0.2099105 (46) total: 1m 51s remaining: 14.2s 47: learn: 0.1112916 test: 0.2094349 best: 0.2094349 (47) total: 1m 53s remaining: 11.8s 48: learn: 0.1100960 test: 0.2090742 best: 0.2090742 (48) total: 1m 56s remaining: 9.47s 49: learn: 0.1088066 test: 0.2088259 best: 0.2088259 (49) total: 1m 58s remaining: 7.1s 50: learn: 0.1069822 test: 0.2086522 best: 0.2086522 (50) total: 2m remaining: 4.73s 51: learn: 0.1058120 test: 0.2087468 best: 0.2086522 (50) total: 2m 3s remaining: 2.37s 52: learn: 0.1044536 test: 0.2082622 best: 0.2082622 (52) total: 2m 5s remaining: 0us bestTest = 0.2082622218 bestIteration = 52 Trial 53, Fold 3: Log loss = 0.208262221827831, Average precision = 0.9721844815214287, ROC-AUC = 0.9681268419354528, Elapsed Time = 125.71645570000328 seconds Trial 53, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 53, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 0: learn: 0.6207817 test: 0.6262071 best: 0.6262071 (0) total: 2.28s remaining: 1m 58s 1: learn: 0.5592344 test: 0.5683394 best: 0.5683394 (1) total: 4.68s remaining: 1m 59s 2: learn: 0.5066457 test: 0.5197907 best: 0.5197907 (2) total: 7.12s remaining: 1m 58s 3: learn: 0.4619709 test: 0.4782127 best: 0.4782127 (3) total: 9.57s remaining: 1m 57s 4: learn: 0.4230715 test: 0.4440764 best: 0.4440764 (4) total: 12s remaining: 1m 55s 5: learn: 0.3899920 test: 0.4146163 best: 0.4146163 (5) total: 14.5s remaining: 1m 53s 6: learn: 0.3615243 test: 0.3886795 best: 0.3886795 (6) total: 16.8s remaining: 1m 50s 7: 
[CatBoost per-iteration log (learn/test loss per boosting round) omitted; per-fold summaries retained below.]
bestTest = 0.2088546992 bestIteration = 52
Trial 53, Fold 4: Log loss = 0.2088546991897949, Average precision = 0.9735387786389698, ROC-AUC = 0.9687572277421308, Elapsed Time = 127.79895599999873 seconds
Trial 53, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 53, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
bestTest = 0.217586627 bestIteration = 52
Trial 53, Fold 5: Log loss = 0.21758662702271073, Average precision = 0.9715107242558519, ROC-AUC = 0.9662566560763987, Elapsed Time = 126.38266299999668 seconds
Optimization Progress: 54%|#####4 | 54/100 [1:28:49<2:55:41, 229.16s/it]
[CatBoost per-iteration log omitted; per-fold summaries retained below.]
Trial 54, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 54, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
bestTest = 0.2593645059 bestIteration = 70
Trial 54, Fold 1: Log loss = 0.2593645058885701, Average precision = 0.9648842884960731, ROC-AUC = 0.9599296414591131, Elapsed Time = 4.751763500000379 seconds
Trial 54, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 54, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
bestTest = 0.2580698523 bestIteration = 70
Trial 54, Fold 2: Log loss = 0.25806985231369184, Average precision = 0.9644703927296976, ROC-AUC = 0.9615634240491712, Elapsed Time = 5.203042400000413 seconds
Trial 54, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 54, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
bestTest = 0.2524868978 bestIteration = 70
Trial 54, Fold 3: Log loss = 0.25248689782596256, Average precision = 0.9654262617139904, ROC-AUC = 0.9628300736292629, Elapsed Time = 5.105439099999785 seconds
Trial 54, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 54, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
bestTest = 0.2563374062 bestIteration = 70
Trial 54, Fold 4: Log loss = 0.25633740624047024, Average precision = 0.965722052564221, ROC-AUC = 0.96159975118684, Elapsed Time = 5.0999941999980365 seconds
Trial 54, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 54, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[per-iteration log for Trial 54, Fold 5 omitted]
learn: 0.2627427 test: 0.2716349 best: 0.2716349 (61) total: 4.27s remaining: 619ms 62: learn: 0.2614148 test: 0.2703465 best: 0.2703465 (62) total: 4.34s remaining: 552ms 63: learn: 0.2602221 test: 0.2692351 best: 0.2692351 (63) total: 4.43s remaining: 484ms 64: learn: 0.2590356 test: 0.2681233 best: 0.2681233 (64) total: 4.51s remaining: 417ms 65: learn: 0.2579619 test: 0.2670878 best: 0.2670878 (65) total: 4.6s remaining: 348ms 66: learn: 0.2568592 test: 0.2659949 best: 0.2659949 (66) total: 4.67s remaining: 279ms 67: learn: 0.2557707 test: 0.2649574 best: 0.2649574 (67) total: 4.76s remaining: 210ms 68: learn: 0.2548173 test: 0.2640438 best: 0.2640438 (68) total: 4.84s remaining: 140ms 69: learn: 0.2541071 test: 0.2633139 best: 0.2633139 (69) total: 4.93s remaining: 70.4ms 70: learn: 0.2533542 test: 0.2626023 best: 0.2626023 (70) total: 5.01s remaining: 0us bestTest = 0.2626023064 bestIteration = 70 Trial 54, Fold 5: Log loss = 0.26260230635432746, Average precision = 0.9619306863444705, ROC-AUC = 0.957235382643108, Elapsed Time = 5.132205799996882 seconds
Optimization Progress: 55%|#####5 | 55/100 [1:29:22<2:07:48, 170.40s/it]
Trial 55, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371 Trial 55, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913 0: learn: 0.6718912 test: 0.6723943 best: 0.6723943 (0) total: 383ms remaining: 34.8s 1: learn: 0.6517942 test: 0.6525750 best: 0.6525750 (1) total: 760ms remaining: 34.2s 2: learn: 0.6324438 test: 0.6334824 best: 0.6334824 (2) total: 1.15s remaining: 34.1s 3: learn: 0.6160213 test: 0.6173741 best: 0.6173741 (3) total: 1.53s remaining: 33.7s 4: learn: 0.5980892 test: 0.5995899 best: 0.5995899 (4) total: 1.91s remaining: 33.2s 5: learn: 0.5810802 test: 0.5827271 best: 0.5827271 (5) total: 2.28s remaining: 32.7s 6: learn: 0.5661755 test: 0.5681185 best: 0.5681185 (6) total: 2.68s remaining: 32.6s 7: learn: 0.5521967 test: 0.5544954 best: 0.5544954 (7) total: 3.09s remaining: 32.4s 8: learn: 0.5380939 test: 0.5409797 best: 0.5409797 (8) total: 3.46s remaining: 31.9s 9: learn: 0.5233331 test: 0.5265428 best: 0.5265428 (9) total: 3.84s remaining: 31.5s 10: learn: 0.5092089 test: 0.5126584 best: 0.5126584 (10) total: 4.22s remaining: 31.1s 11: learn: 0.4959848 test: 0.4997060 best: 0.4997060 (11) total: 4.59s remaining: 30.6s 12: learn: 0.4851260 test: 0.4891555 best: 0.4891555 (12) total: 4.98s remaining: 30.2s 13: learn: 0.4745763 test: 0.4789078 best: 0.4789078 (13) total: 5.35s remaining: 29.8s 14: learn: 0.4649177 test: 0.4694451 best: 0.4694451 (14) total: 5.7s remaining: 29.3s 15: learn: 0.4541934 test: 0.4589710 best: 0.4589710 (15) total: 6.07s remaining: 28.8s 16: learn: 0.4436817 test: 0.4489013 best: 0.4489013 (16) total: 6.44s remaining: 28.4s 17: learn: 0.4334641 test: 0.4391580 best: 0.4391580 (17) total: 6.8s remaining: 28s 18: learn: 0.4243479 test: 0.4303659 best: 0.4303659 (18) total: 7.18s remaining: 27.6s 19: learn: 0.4146536 test: 0.4210036 best: 0.4210036 (19) total: 7.56s remaining: 27.2s 20: learn: 0.4066480 test: 0.4133648 best: 0.4133648 (20) total: 7.95s remaining: 
26.9s 21: learn: 0.3986952 test: 0.4057223 best: 0.4057223 (21) total: 8.33s remaining: 26.5s 22: learn: 0.3919208 test: 0.3992694 best: 0.3992694 (22) total: 8.75s remaining: 26.3s 23: learn: 0.3853772 test: 0.3931271 best: 0.3931271 (23) total: 9.11s remaining: 25.8s 24: learn: 0.3777878 test: 0.3858486 best: 0.3858486 (24) total: 9.5s remaining: 25.5s 25: learn: 0.3702762 test: 0.3787323 best: 0.3787323 (25) total: 9.87s remaining: 25.1s 26: learn: 0.3628392 test: 0.3716803 best: 0.3716803 (26) total: 10.2s remaining: 24.7s 27: learn: 0.3568648 test: 0.3659786 best: 0.3659786 (27) total: 10.6s remaining: 24.3s 28: learn: 0.3507688 test: 0.3602775 best: 0.3602775 (28) total: 11s remaining: 23.9s 29: learn: 0.3446013 test: 0.3543357 best: 0.3543357 (29) total: 11.3s remaining: 23.4s 30: learn: 0.3391813 test: 0.3493097 best: 0.3493097 (30) total: 11.7s remaining: 23.1s 31: learn: 0.3339450 test: 0.3444538 best: 0.3444538 (31) total: 12.1s remaining: 22.7s 32: learn: 0.3293787 test: 0.3402625 best: 0.3402625 (32) total: 12.5s remaining: 22.3s 33: learn: 0.3234997 test: 0.3345653 best: 0.3345653 (33) total: 12.9s remaining: 22s 34: learn: 0.3188727 test: 0.3302502 best: 0.3302502 (34) total: 13.2s remaining: 21.6s 35: learn: 0.3142771 test: 0.3260331 best: 0.3260331 (35) total: 13.6s remaining: 21.2s 36: learn: 0.3095462 test: 0.3215975 best: 0.3215975 (36) total: 14s remaining: 20.8s 37: learn: 0.3054351 test: 0.3179260 best: 0.3179260 (37) total: 14.4s remaining: 20.4s 38: learn: 0.3009080 test: 0.3137089 best: 0.3137089 (38) total: 14.7s remaining: 20s 39: learn: 0.2966942 test: 0.3098393 best: 0.3098393 (39) total: 15.1s remaining: 19.6s 40: learn: 0.2929747 test: 0.3064133 best: 0.3064133 (40) total: 15.5s remaining: 19.2s 41: learn: 0.2897175 test: 0.3035005 best: 0.3035005 (41) total: 15.9s remaining: 18.9s 42: learn: 0.2859193 test: 0.3000823 best: 0.3000823 (42) total: 16.2s remaining: 18.5s 43: learn: 0.2824697 test: 0.2970360 best: 0.2970360 (43) total: 
16.6s remaining: 18.1s 44: learn: 0.2796027 test: 0.2945302 best: 0.2945302 (44) total: 17s remaining: 17.7s 45: learn: 0.2769327 test: 0.2921691 best: 0.2921691 (45) total: 17.4s remaining: 17.4s 46: learn: 0.2740482 test: 0.2895785 best: 0.2895785 (46) total: 17.7s remaining: 17s 47: learn: 0.2712214 test: 0.2870654 best: 0.2870654 (47) total: 18.1s remaining: 16.6s 48: learn: 0.2686942 test: 0.2849725 best: 0.2849725 (48) total: 18.5s remaining: 16.2s 49: learn: 0.2657453 test: 0.2823581 best: 0.2823581 (49) total: 18.9s remaining: 15.8s 50: learn: 0.2627303 test: 0.2795840 best: 0.2795840 (50) total: 19.2s remaining: 15.5s 51: learn: 0.2600971 test: 0.2773279 best: 0.2773279 (51) total: 19.6s remaining: 15.1s 52: learn: 0.2578542 test: 0.2753471 best: 0.2753471 (52) total: 20s remaining: 14.7s 53: learn: 0.2553936 test: 0.2731306 best: 0.2731306 (53) total: 20.3s remaining: 14.3s 54: learn: 0.2529237 test: 0.2708980 best: 0.2708980 (54) total: 20.7s remaining: 13.9s 55: learn: 0.2507435 test: 0.2690790 best: 0.2690790 (55) total: 21s remaining: 13.5s 56: learn: 0.2486297 test: 0.2673443 best: 0.2673443 (56) total: 21.4s remaining: 13.2s 57: learn: 0.2465933 test: 0.2656639 best: 0.2656639 (57) total: 21.8s remaining: 12.8s 58: learn: 0.2444989 test: 0.2640392 best: 0.2640392 (58) total: 22.1s remaining: 12.4s 59: learn: 0.2424901 test: 0.2623564 best: 0.2623564 (59) total: 22.5s remaining: 12s 60: learn: 0.2403923 test: 0.2606294 best: 0.2606294 (60) total: 22.9s remaining: 11.6s 61: learn: 0.2379892 test: 0.2586250 best: 0.2586250 (61) total: 23.3s remaining: 11.3s 62: learn: 0.2356928 test: 0.2565076 best: 0.2565076 (62) total: 23.6s remaining: 10.9s 63: learn: 0.2339797 test: 0.2550858 best: 0.2550858 (63) total: 24s remaining: 10.5s 64: learn: 0.2317075 test: 0.2531323 best: 0.2531323 (64) total: 24.4s remaining: 10.1s 65: learn: 0.2300955 test: 0.2517619 best: 0.2517619 (65) total: 24.7s remaining: 9.74s 66: learn: 0.2278667 test: 0.2499057 best: 0.2499057 
(66) total: 25.1s remaining: 9.37s 67: learn: 0.2259195 test: 0.2482909 best: 0.2482909 (67) total: 25.5s remaining: 9s 68: learn: 0.2244190 test: 0.2470751 best: 0.2470751 (68) total: 25.9s remaining: 8.63s 69: learn: 0.2230378 test: 0.2460152 best: 0.2460152 (69) total: 26.2s remaining: 8.25s 70: learn: 0.2215265 test: 0.2448072 best: 0.2448072 (70) total: 26.6s remaining: 7.86s 71: learn: 0.2200290 test: 0.2437015 best: 0.2437015 (71) total: 27s remaining: 7.49s 72: learn: 0.2187733 test: 0.2426863 best: 0.2426863 (72) total: 27.3s remaining: 7.12s 73: learn: 0.2174128 test: 0.2416904 best: 0.2416904 (73) total: 27.7s remaining: 6.74s 74: learn: 0.2161421 test: 0.2406833 best: 0.2406833 (74) total: 28.1s remaining: 6.37s 75: learn: 0.2150347 test: 0.2398557 best: 0.2398557 (75) total: 28.5s remaining: 5.99s 76: learn: 0.2135878 test: 0.2388496 best: 0.2388496 (76) total: 28.8s remaining: 5.61s 77: learn: 0.2122906 test: 0.2378536 best: 0.2378536 (77) total: 29.2s remaining: 5.24s 78: learn: 0.2108823 test: 0.2367725 best: 0.2367725 (78) total: 29.6s remaining: 4.86s 79: learn: 0.2096640 test: 0.2358403 best: 0.2358403 (79) total: 29.9s remaining: 4.49s 80: learn: 0.2085659 test: 0.2350838 best: 0.2350838 (80) total: 30.3s remaining: 4.12s 81: learn: 0.2074624 test: 0.2342699 best: 0.2342699 (81) total: 30.7s remaining: 3.74s 82: learn: 0.2063891 test: 0.2334558 best: 0.2334558 (82) total: 31.1s remaining: 3.37s 83: learn: 0.2052005 test: 0.2325175 best: 0.2325175 (83) total: 31.4s remaining: 2.99s 84: learn: 0.2038341 test: 0.2314944 best: 0.2314944 (84) total: 31.8s remaining: 2.62s 85: learn: 0.2023693 test: 0.2302526 best: 0.2302526 (85) total: 32.2s remaining: 2.24s 86: learn: 0.2010996 test: 0.2292121 best: 0.2292121 (86) total: 32.5s remaining: 1.87s 87: learn: 0.2001041 test: 0.2286033 best: 0.2286033 (87) total: 32.9s remaining: 1.5s 88: learn: 0.1991452 test: 0.2280109 best: 0.2280109 (88) total: 33.3s remaining: 1.12s 89: learn: 0.1981240 test: 
0.2272529 best: 0.2272529 (89) total: 33.7s remaining: 748ms 90: learn: 0.1968318 test: 0.2261929 best: 0.2261929 (90) total: 34s remaining: 374ms 91: learn: 0.1955144 test: 0.2251410 best: 0.2251410 (91) total: 34.4s remaining: 0us bestTest = 0.2251409612 bestIteration = 91 Trial 55, Fold 1: Log loss = 0.22514096118599683, Average precision = 0.9732269855250595, ROC-AUC = 0.9696902348212195, Elapsed Time = 34.570546200000535 seconds Trial 55, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 55, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986 0: learn: 0.6735994 test: 0.6737935 best: 0.6737935 (0) total: 372ms remaining: 33.8s 1: learn: 0.6547616 test: 0.6554527 best: 0.6554527 (1) total: 762ms remaining: 34.3s 2: learn: 0.6357105 test: 0.6368406 best: 0.6368406 (2) total: 1.17s remaining: 34.6s 3: learn: 0.6159597 test: 0.6174777 best: 0.6174777 (3) total: 1.56s remaining: 34.4s 4: learn: 0.5992010 test: 0.6011809 best: 0.6011809 (4) total: 1.95s remaining: 34s 5: learn: 0.5828050 test: 0.5851654 best: 0.5851654 (5) total: 2.32s remaining: 33.3s 6: learn: 0.5676791 test: 0.5703358 best: 0.5703358 (6) total: 2.7s remaining: 32.7s 7: learn: 0.5528806 test: 0.5559622 best: 0.5559622 (7) total: 3.1s remaining: 32.5s 8: learn: 0.5386166 test: 0.5418927 best: 0.5418927 (8) total: 3.51s remaining: 32.4s 9: learn: 0.5246756 test: 0.5283244 best: 0.5283244 (9) total: 3.9s remaining: 32s 10: learn: 0.5107294 test: 0.5145697 best: 0.5145697 (10) total: 4.28s remaining: 31.5s 11: learn: 0.4988257 test: 0.5029076 best: 0.5029076 (11) total: 4.65s remaining: 31s 12: learn: 0.4860846 test: 0.4904251 best: 0.4904251 (12) total: 5.03s remaining: 30.6s 13: learn: 0.4739532 test: 0.4786481 best: 0.4786481 (13) total: 5.45s remaining: 30.3s 14: learn: 0.4638092 test: 0.4688171 best: 0.4688171 (14) total: 5.84s remaining: 30s 15: learn: 0.4530370 test: 0.4584019 best: 0.4584019 (15) total: 6.23s remaining: 29.6s 
16: learn: 0.4422340 test: 0.4479023 best: 0.4479023 (16) total: 6.61s remaining: 29.2s 17: learn: 0.4327822 test: 0.4387007 best: 0.4387007 (17) total: 7.01s remaining: 28.8s 18: learn: 0.4234990 test: 0.4296686 best: 0.4296686 (18) total: 7.38s remaining: 28.3s 19: learn: 0.4145776 test: 0.4210540 best: 0.4210540 (19) total: 7.78s remaining: 28s 20: learn: 0.4059128 test: 0.4125666 best: 0.4125666 (20) total: 8.15s remaining: 27.6s 21: learn: 0.3979624 test: 0.4048855 best: 0.4048855 (21) total: 8.52s remaining: 27.1s 22: learn: 0.3903919 test: 0.3975539 best: 0.3975539 (22) total: 8.89s remaining: 26.7s 23: learn: 0.3820066 test: 0.3893092 best: 0.3893092 (23) total: 9.29s remaining: 26.3s 24: learn: 0.3756117 test: 0.3831415 best: 0.3831415 (24) total: 9.65s remaining: 25.9s 25: learn: 0.3690913 test: 0.3769204 best: 0.3769204 (25) total: 10s remaining: 25.4s 26: learn: 0.3626305 test: 0.3707285 best: 0.3707285 (26) total: 10.4s remaining: 25s 27: learn: 0.3569293 test: 0.3652685 best: 0.3652685 (27) total: 10.8s remaining: 24.7s 28: learn: 0.3512047 test: 0.3597998 best: 0.3597998 (28) total: 11.2s remaining: 24.3s 29: learn: 0.3441672 test: 0.3530837 best: 0.3530837 (29) total: 11.5s remaining: 23.9s 30: learn: 0.3387876 test: 0.3479568 best: 0.3479568 (30) total: 11.9s remaining: 23.5s 31: learn: 0.3336221 test: 0.3430208 best: 0.3430208 (31) total: 12.3s remaining: 23.1s 32: learn: 0.3290825 test: 0.3387012 best: 0.3387012 (32) total: 12.7s remaining: 22.7s 33: learn: 0.3243257 test: 0.3341305 best: 0.3341305 (33) total: 13.1s remaining: 22.3s 34: learn: 0.3197384 test: 0.3297987 best: 0.3297987 (34) total: 13.5s remaining: 21.9s 35: learn: 0.3144980 test: 0.3247445 best: 0.3247445 (35) total: 13.8s remaining: 21.5s 36: learn: 0.3104176 test: 0.3207620 best: 0.3207620 (36) total: 14.2s remaining: 21.1s 37: learn: 0.3064011 test: 0.3170097 best: 0.3170097 (37) total: 14.6s remaining: 20.7s 38: learn: 0.3022884 test: 0.3131003 best: 0.3131003 (38) total: 15s 
remaining: 20.3s 39: learn: 0.2974137 test: 0.3084367 best: 0.3084367 (39) total: 15.4s remaining: 20s 40: learn: 0.2939222 test: 0.3052780 best: 0.3052780 (40) total: 15.8s remaining: 19.6s 41: learn: 0.2905493 test: 0.3021459 best: 0.3021459 (41) total: 16.1s remaining: 19.2s 42: learn: 0.2874153 test: 0.2992398 best: 0.2992398 (42) total: 16.5s remaining: 18.8s 43: learn: 0.2844590 test: 0.2964356 best: 0.2964356 (43) total: 16.9s remaining: 18.4s 44: learn: 0.2809953 test: 0.2931622 best: 0.2931622 (44) total: 17.2s remaining: 18s 45: learn: 0.2777977 test: 0.2901817 best: 0.2901817 (45) total: 17.6s remaining: 17.6s 46: learn: 0.2749486 test: 0.2875661 best: 0.2875661 (46) total: 18s remaining: 17.2s 47: learn: 0.2717032 test: 0.2845389 best: 0.2845389 (47) total: 18.4s remaining: 16.9s 48: learn: 0.2692685 test: 0.2822394 best: 0.2822394 (48) total: 18.8s remaining: 16.5s 49: learn: 0.2658792 test: 0.2790753 best: 0.2790753 (49) total: 19.1s remaining: 16.1s 50: learn: 0.2633950 test: 0.2767950 best: 0.2767950 (50) total: 19.5s remaining: 15.7s 51: learn: 0.2607929 test: 0.2744215 best: 0.2744215 (51) total: 19.9s remaining: 15.3s 52: learn: 0.2584713 test: 0.2722678 best: 0.2722678 (52) total: 20.3s remaining: 14.9s 53: learn: 0.2554457 test: 0.2693812 best: 0.2693812 (53) total: 20.7s remaining: 14.6s 54: learn: 0.2532468 test: 0.2673298 best: 0.2673298 (54) total: 21.1s remaining: 14.2s 55: learn: 0.2509793 test: 0.2653045 best: 0.2653045 (55) total: 21.4s remaining: 13.8s 56: learn: 0.2490800 test: 0.2636448 best: 0.2636448 (56) total: 21.8s remaining: 13.4s 57: learn: 0.2461466 test: 0.2608911 best: 0.2608911 (57) total: 22.2s remaining: 13s 58: learn: 0.2441942 test: 0.2591011 best: 0.2591011 (58) total: 22.6s remaining: 12.6s 59: learn: 0.2424098 test: 0.2575236 best: 0.2575236 (59) total: 23s remaining: 12.2s 60: learn: 0.2404090 test: 0.2557176 best: 0.2557176 (60) total: 23.3s remaining: 11.9s 61: learn: 0.2385724 test: 0.2540804 best: 0.2540804 
(61) total: 23.7s remaining: 11.5s 62: learn: 0.2369003 test: 0.2526896 best: 0.2526896 (62) total: 24.1s remaining: 11.1s 63: learn: 0.2353158 test: 0.2512948 best: 0.2512948 (63) total: 24.5s remaining: 10.7s 64: learn: 0.2336345 test: 0.2498754 best: 0.2498754 (64) total: 24.9s remaining: 10.3s 65: learn: 0.2320242 test: 0.2485259 best: 0.2485259 (65) total: 25.2s remaining: 9.94s 66: learn: 0.2305616 test: 0.2473285 best: 0.2473285 (66) total: 25.6s remaining: 9.55s 67: learn: 0.2290539 test: 0.2460250 best: 0.2460250 (67) total: 25.9s remaining: 9.15s 68: learn: 0.2275488 test: 0.2447087 best: 0.2447087 (68) total: 26.3s remaining: 8.78s 69: learn: 0.2260428 test: 0.2433265 best: 0.2433265 (69) total: 26.7s remaining: 8.39s 70: learn: 0.2246490 test: 0.2422135 best: 0.2422135 (70) total: 27.1s remaining: 8.01s 71: learn: 0.2232735 test: 0.2410337 best: 0.2410337 (71) total: 27.5s remaining: 7.63s 72: learn: 0.2218223 test: 0.2397669 best: 0.2397669 (72) total: 27.9s remaining: 7.25s 73: learn: 0.2202008 test: 0.2384779 best: 0.2384779 (73) total: 28.2s remaining: 6.87s 74: learn: 0.2189671 test: 0.2373582 best: 0.2373582 (74) total: 28.6s remaining: 6.48s 75: learn: 0.2171355 test: 0.2357349 best: 0.2357349 (75) total: 29s remaining: 6.1s 76: learn: 0.2159082 test: 0.2347702 best: 0.2347702 (76) total: 29.4s remaining: 5.72s 77: learn: 0.2143863 test: 0.2333988 best: 0.2333988 (77) total: 29.7s remaining: 5.34s 78: learn: 0.2128058 test: 0.2321045 best: 0.2321045 (78) total: 30.1s remaining: 4.96s 79: learn: 0.2116441 test: 0.2311477 best: 0.2311477 (79) total: 30.5s remaining: 4.57s 80: learn: 0.2105510 test: 0.2302285 best: 0.2302285 (80) total: 30.9s remaining: 4.19s 81: learn: 0.2092155 test: 0.2291634 best: 0.2291634 (81) total: 31.2s remaining: 3.81s 82: learn: 0.2078683 test: 0.2280150 best: 0.2280150 (82) total: 31.6s remaining: 3.43s 83: learn: 0.2067975 test: 0.2271591 best: 0.2271591 (83) total: 32s remaining: 3.04s 84: learn: 0.2056684 test: 
0.2262720 best: 0.2262720 (84) total: 32.4s remaining: 2.67s 85: learn: 0.2044468 test: 0.2252325 best: 0.2252325 (85) total: 32.7s remaining: 2.28s 86: learn: 0.2035034 test: 0.2244757 best: 0.2244757 (86) total: 33.1s remaining: 1.9s 87: learn: 0.2021020 test: 0.2232416 best: 0.2232416 (87) total: 33.5s remaining: 1.52s 88: learn: 0.2008266 test: 0.2221648 best: 0.2221648 (88) total: 33.9s remaining: 1.14s 89: learn: 0.1996821 test: 0.2211758 best: 0.2211758 (89) total: 34.2s remaining: 760ms 90: learn: 0.1986918 test: 0.2203788 best: 0.2203788 (90) total: 34.6s remaining: 380ms 91: learn: 0.1977903 test: 0.2196930 best: 0.2196930 (91) total: 35s remaining: 0us bestTest = 0.2196930218 bestIteration = 91 Trial 55, Fold 2: Log loss = 0.21969302183474357, Average precision = 0.9750632291154587, ROC-AUC = 0.9728057346095119, Elapsed Time = 35.137603999999556 seconds Trial 55, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 55, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 0: learn: 0.6720699 test: 0.6722819 best: 0.6722819 (0) total: 398ms remaining: 36.2s 1: learn: 0.6524002 test: 0.6528826 best: 0.6528826 (1) total: 776ms remaining: 34.9s 2: learn: 0.6341741 test: 0.6348532 best: 0.6348532 (2) total: 1.14s remaining: 33.8s 3: learn: 0.6163326 test: 0.6171109 best: 0.6171109 (3) total: 1.5s remaining: 33s 4: learn: 0.5977230 test: 0.5985396 best: 0.5985396 (4) total: 1.9s remaining: 33.1s 5: learn: 0.5807214 test: 0.5817190 best: 0.5817190 (5) total: 2.28s remaining: 32.6s 6: learn: 0.5650637 test: 0.5661958 best: 0.5661958 (6) total: 2.67s remaining: 32.4s 7: learn: 0.5493403 test: 0.5505199 best: 0.5505199 (7) total: 3.05s remaining: 32.1s 8: learn: 0.5354757 test: 0.5367497 best: 0.5367497 (8) total: 3.43s remaining: 31.6s 9: learn: 0.5208318 test: 0.5223868 best: 0.5223868 (9) total: 3.82s remaining: 31.4s 10: learn: 0.5077314 test: 0.5092698 best: 0.5092698 (10) total: 4.21s remaining: 31s 
11: learn: 0.4957632 test: 0.4974860 best: 0.4974860 (11) total: 4.58s remaining: 30.5s 12: learn: 0.4833900 test: 0.4852568 best: 0.4852568 (12) total: 4.96s remaining: 30.1s 13: learn: 0.4709952 test: 0.4730511 best: 0.4730511 (13) total: 5.37s remaining: 29.9s 14: learn: 0.4604029 test: 0.4627819 best: 0.4627819 (14) total: 5.74s remaining: 29.5s 15: learn: 0.4500792 test: 0.4525053 best: 0.4525053 (15) total: 6.12s remaining: 29.1s 16: learn: 0.4402504 test: 0.4430349 best: 0.4430349 (16) total: 6.5s remaining: 28.7s 17: learn: 0.4316307 test: 0.4345363 best: 0.4345363 (17) total: 6.88s remaining: 28.3s 18: learn: 0.4231138 test: 0.4262691 best: 0.4262691 (18) total: 7.25s remaining: 27.9s 19: learn: 0.4150150 test: 0.4185374 best: 0.4185374 (19) total: 7.64s remaining: 27.5s 20: learn: 0.4060830 test: 0.4098607 best: 0.4098607 (20) total: 8.02s remaining: 27.1s 21: learn: 0.3982747 test: 0.4023381 best: 0.4023381 (21) total: 8.4s remaining: 26.7s 22: learn: 0.3906392 test: 0.3948204 best: 0.3948204 (22) total: 8.78s remaining: 26.3s 23: learn: 0.3829990 test: 0.3874167 best: 0.3874167 (23) total: 9.16s remaining: 26s 24: learn: 0.3752934 test: 0.3799345 best: 0.3799345 (24) total: 9.53s remaining: 25.5s 25: learn: 0.3690482 test: 0.3740073 best: 0.3740073 (25) total: 9.91s remaining: 25.2s 26: learn: 0.3617537 test: 0.3669654 best: 0.3669654 (26) total: 10.3s remaining: 24.9s 27: learn: 0.3563761 test: 0.3618121 best: 0.3618121 (27) total: 10.7s remaining: 24.4s 28: learn: 0.3508755 test: 0.3565553 best: 0.3565553 (28) total: 11.1s remaining: 24.1s 29: learn: 0.3459112 test: 0.3516785 best: 0.3516785 (29) total: 11.5s remaining: 23.7s 30: learn: 0.3392570 test: 0.3451939 best: 0.3451939 (30) total: 11.8s remaining: 23.3s 31: learn: 0.3325867 test: 0.3387179 best: 0.3387179 (31) total: 12.2s remaining: 22.9s 32: learn: 0.3268802 test: 0.3333494 best: 0.3333494 (32) total: 12.6s remaining: 22.5s 33: learn: 0.3220758 test: 0.3287829 best: 0.3287829 (33) total: 
13s remaining: 22.2s 34: learn: 0.3165216 test: 0.3233067 best: 0.3233067 (34) total: 13.4s remaining: 21.8s 35: learn: 0.3112703 test: 0.3181579 best: 0.3181579 (35) total: 13.7s remaining: 21.4s 36: learn: 0.3071746 test: 0.3144242 best: 0.3144242 (36) total: 14.1s remaining: 21s 37: learn: 0.3032471 test: 0.3107277 best: 0.3107277 (37) total: 14.5s remaining: 20.6s 38: learn: 0.2996241 test: 0.3073080 best: 0.3073080 (38) total: 14.9s remaining: 20.2s 39: learn: 0.2962887 test: 0.3042826 best: 0.3042826 (39) total: 15.2s remaining: 19.8s 40: learn: 0.2927182 test: 0.3010523 best: 0.3010523 (40) total: 15.6s remaining: 19.5s 41: learn: 0.2894051 test: 0.2979200 best: 0.2979200 (41) total: 16s remaining: 19.1s 42: learn: 0.2860331 test: 0.2947096 best: 0.2947096 (42) total: 16.4s remaining: 18.7s 43: learn: 0.2829961 test: 0.2919477 best: 0.2919477 (43) total: 16.8s remaining: 18.3s 44: learn: 0.2802370 test: 0.2894754 best: 0.2894754 (44) total: 17.1s remaining: 17.9s 45: learn: 0.2774060 test: 0.2869348 best: 0.2869348 (45) total: 17.5s remaining: 17.5s 46: learn: 0.2741343 test: 0.2839610 best: 0.2839610 (46) total: 17.9s remaining: 17.1s 47: learn: 0.2714316 test: 0.2814745 best: 0.2814745 (47) total: 18.2s remaining: 16.7s 48: learn: 0.2687829 test: 0.2792050 best: 0.2792050 (48) total: 18.6s remaining: 16.3s 49: learn: 0.2656648 test: 0.2763160 best: 0.2763160 (49) total: 19s remaining: 15.9s 50: learn: 0.2633650 test: 0.2742709 best: 0.2742709 (50) total: 19.3s remaining: 15.5s 51: learn: 0.2610507 test: 0.2722327 best: 0.2722327 (51) total: 19.7s remaining: 15.2s 52: learn: 0.2577096 test: 0.2690543 best: 0.2690543 (52) total: 20.1s remaining: 14.8s 53: learn: 0.2552925 test: 0.2668511 best: 0.2668511 (53) total: 20.4s remaining: 14.4s 54: learn: 0.2527606 test: 0.2645204 best: 0.2645204 (54) total: 20.8s remaining: 14s 55: learn: 0.2506322 test: 0.2625216 best: 0.2625216 (55) total: 21.2s remaining: 13.6s 56: learn: 0.2485885 test: 0.2607861 best: 
0.2607861 (56) total: 21.5s remaining: 13.2s 57: learn: 0.2466171 test: 0.2590352 best: 0.2590352 (57) total: 21.9s remaining: 12.8s 58: learn: 0.2446265 test: 0.2573359 best: 0.2573359 (58) total: 22.3s remaining: 12.5s 59: learn: 0.2424653 test: 0.2553850 best: 0.2553850 (59) total: 22.6s remaining: 12.1s 60: learn: 0.2406675 test: 0.2538345 best: 0.2538345 (60) total: 23s remaining: 11.7s 61: learn: 0.2389328 test: 0.2523017 best: 0.2523017 (61) total: 23.4s remaining: 11.3s 62: learn: 0.2369972 test: 0.2506667 best: 0.2506667 (62) total: 23.7s remaining: 10.9s 63: learn: 0.2346918 test: 0.2485763 best: 0.2485763 (63) total: 24.1s remaining: 10.5s 64: learn: 0.2325911 test: 0.2467976 best: 0.2467976 (64) total: 24.5s remaining: 10.2s 65: learn: 0.2301358 test: 0.2446754 best: 0.2446754 (65) total: 24.8s remaining: 9.79s 66: learn: 0.2285044 test: 0.2432607 best: 0.2432607 (66) total: 25.2s remaining: 9.42s 67: learn: 0.2266949 test: 0.2417780 best: 0.2417780 (67) total: 25.6s remaining: 9.05s 68: learn: 0.2251062 test: 0.2404663 best: 0.2404663 (68) total: 26s remaining: 8.67s 69: learn: 0.2236954 test: 0.2392940 best: 0.2392940 (69) total: 26.4s remaining: 8.29s 70: learn: 0.2220066 test: 0.2378271 best: 0.2378271 (70) total: 26.8s remaining: 7.92s 71: learn: 0.2200595 test: 0.2360830 best: 0.2360830 (71) total: 27.2s remaining: 7.54s 72: learn: 0.2188014 test: 0.2351471 best: 0.2351471 (72) total: 27.5s remaining: 7.16s 73: learn: 0.2175321 test: 0.2342410 best: 0.2342410 (73) total: 27.9s remaining: 6.78s 74: learn: 0.2160097 test: 0.2330271 best: 0.2330271 (74) total: 28.3s remaining: 6.41s 75: learn: 0.2144919 test: 0.2317068 best: 0.2317068 (75) total: 28.6s remaining: 6.03s 76: learn: 0.2133585 test: 0.2307528 best: 0.2307528 (76) total: 29s remaining: 5.64s 77: learn: 0.2121562 test: 0.2298916 best: 0.2298916 (77) total: 29.4s remaining: 5.27s 78: learn: 0.2108320 test: 0.2288670 best: 0.2288670 (78) total: 29.7s remaining: 4.89s 79: learn: 0.2098137 
[CatBoost per-iteration training log omitted (learn/test log loss, iterations up to 91)]
bestTest = 0.218675371, bestIteration = 91
Trial 55, Fold 3: Log loss = 0.21867537101758067, Average precision = 0.973053933839598, ROC-AUC = 0.9723832327291464, Elapsed Time = 34.787642700001015 seconds
Trial 55, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 55, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[CatBoost per-iteration training log omitted (iterations 0-91)]
bestTest = 0.2201698928, bestIteration = 91
Trial 55, Fold 4: Log loss = 0.22016989283450483, Average precision = 0.9753153670135544, ROC-AUC = 0.9728019318487998, Elapsed Time = 35.669709100002365 seconds
Trial 55, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 55, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[CatBoost per-iteration training log omitted (iterations 0-91)]
bestTest = 0.226825692, bestIteration = 91
Trial 55, Fold 5: Log loss = 0.22682569196956784, Average precision = 0.9742588788968436, ROC-AUC = 0.9710420452223029, Elapsed Time = 34.77915720000237 seconds
Optimization Progress: 56%|#####6 | 56/100 [1:32:26<2:07:52, 174.37s/it]
Trial 56, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 56, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[CatBoost per-iteration training log omitted (iterations 0-67)]
bestTest = 0.3076964181, bestIteration = 67
Trial 56, Fold 1: Log loss = 0.3076299540603475, Average precision = 0.9646151856433388, ROC-AUC = 0.9606843684728734, Elapsed Time = 11.31044860000111 seconds
Trial 56, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 56, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[CatBoost per-iteration training log omitted (iterations 0-67)]
bestTest = 0.3110251809, bestIteration = 67
Trial 56, Fold 2: Log loss = 0.3110085670418255, Average precision = 0.966155746447845, ROC-AUC = 0.9614176703012285, Elapsed Time = 11.003359300000739 seconds
Trial 56, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 56, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[CatBoost per-iteration training log omitted (iterations 0-67)]
bestTest = 0.3059884176, bestIteration = 67
Trial 56, Fold 3: Log loss = 0.3060164081762603, Average precision = 0.9668191328922311, ROC-AUC = 0.9625206013859059, Elapsed Time = 10.77089270000215 seconds
Trial 56, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 56, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[CatBoost per-iteration training log truncated mid-output]
remaining: 5.52s 32: learn: 0.4120900 test: 0.4134540 best: 0.4134540 (32) total: 5.07s remaining: 5.37s 33: learn: 0.4068857 test: 0.4082861 best: 0.4082861 (33) total: 5.2s remaining: 5.2s 34: learn: 0.4020015 test: 0.4034792 best: 0.4034792 (34) total: 5.35s remaining: 5.04s 35: learn: 0.3971931 test: 0.3986641 best: 0.3986641 (35) total: 5.5s remaining: 4.89s 36: learn: 0.3932913 test: 0.3947390 best: 0.3947390 (36) total: 5.66s remaining: 4.74s 37: learn: 0.3893527 test: 0.3907869 best: 0.3907869 (37) total: 5.8s remaining: 4.58s 38: learn: 0.3843995 test: 0.3858602 best: 0.3858602 (38) total: 5.98s remaining: 4.44s 39: learn: 0.3806854 test: 0.3821521 best: 0.3821521 (39) total: 6.15s remaining: 4.3s 40: learn: 0.3770847 test: 0.3786104 best: 0.3786104 (40) total: 6.31s remaining: 4.16s 41: learn: 0.3741662 test: 0.3756967 best: 0.3756967 (41) total: 6.46s remaining: 4s 42: learn: 0.3712274 test: 0.3727402 best: 0.3727402 (42) total: 6.6s remaining: 3.84s 43: learn: 0.3675675 test: 0.3690948 best: 0.3690948 (43) total: 6.76s remaining: 3.69s 44: learn: 0.3642995 test: 0.3658065 best: 0.3658065 (44) total: 6.91s remaining: 3.53s 45: learn: 0.3607180 test: 0.3622853 best: 0.3622853 (45) total: 7.07s remaining: 3.38s 46: learn: 0.3570004 test: 0.3586005 best: 0.3586005 (46) total: 7.22s remaining: 3.22s 47: learn: 0.3534716 test: 0.3551523 best: 0.3551523 (47) total: 7.38s remaining: 3.07s 48: learn: 0.3505519 test: 0.3522630 best: 0.3522630 (48) total: 7.51s remaining: 2.91s 49: learn: 0.3476102 test: 0.3493941 best: 0.3493941 (49) total: 7.66s remaining: 2.76s 50: learn: 0.3441962 test: 0.3460708 best: 0.3460708 (50) total: 7.82s remaining: 2.61s 51: learn: 0.3413055 test: 0.3432366 best: 0.3432366 (51) total: 7.97s remaining: 2.45s 52: learn: 0.3380331 test: 0.3399288 best: 0.3399288 (52) total: 8.14s remaining: 2.3s 53: learn: 0.3349779 test: 0.3369219 best: 0.3369219 (53) total: 8.29s remaining: 2.15s 54: learn: 0.3323009 test: 0.3342774 best: 0.3342774 
(54) total: 8.43s remaining: 1.99s 55: learn: 0.3300190 test: 0.3319810 best: 0.3319810 (55) total: 8.57s remaining: 1.84s 56: learn: 0.3274544 test: 0.3294987 best: 0.3294987 (56) total: 8.71s remaining: 1.68s 57: learn: 0.3249507 test: 0.3270198 best: 0.3270198 (57) total: 8.87s remaining: 1.53s 58: learn: 0.3226158 test: 0.3245947 best: 0.3245947 (58) total: 9.04s remaining: 1.38s 59: learn: 0.3205162 test: 0.3225719 best: 0.3225719 (59) total: 9.19s remaining: 1.23s 60: learn: 0.3182433 test: 0.3203310 best: 0.3203310 (60) total: 9.35s remaining: 1.07s 61: learn: 0.3161497 test: 0.3182589 best: 0.3182589 (61) total: 9.52s remaining: 922ms 62: learn: 0.3140110 test: 0.3161791 best: 0.3161791 (62) total: 9.69s remaining: 769ms 63: learn: 0.3119093 test: 0.3139860 best: 0.3139860 (63) total: 9.87s remaining: 617ms 64: learn: 0.3095444 test: 0.3117100 best: 0.3117100 (64) total: 10s remaining: 463ms 65: learn: 0.3076049 test: 0.3097604 best: 0.3097604 (65) total: 10.2s remaining: 309ms 66: learn: 0.3052070 test: 0.3074961 best: 0.3074961 (66) total: 10.4s remaining: 155ms 67: learn: 0.3032752 test: 0.3056005 best: 0.3056005 (67) total: 10.6s remaining: 0us bestTest = 0.3056005275 bestIteration = 67 Trial 56, Fold 4: Log loss = 0.3055728903934642, Average precision = 0.9658589619601456, ROC-AUC = 0.9609524584456988, Elapsed Time = 10.732753100000991 seconds Trial 56, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 56, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0: learn: 0.6802565 test: 0.6803544 best: 0.6803544 (0) total: 181ms remaining: 12.1s 1: learn: 0.6671606 test: 0.6675148 best: 0.6675148 (1) total: 346ms remaining: 11.4s 2: learn: 0.6548884 test: 0.6555948 best: 0.6555948 (2) total: 482ms remaining: 10.4s 3: learn: 0.6426484 test: 0.6436608 best: 0.6436608 (3) total: 627ms remaining: 10s 4: learn: 0.6313052 test: 0.6324014 best: 0.6324014 (4) total: 760ms remaining: 9.57s 5: learn: 
0.6181097 test: 0.6195516 best: 0.6195516 (5) total: 910ms remaining: 9.41s 6: learn: 0.6075908 test: 0.6092289 best: 0.6092289 (6) total: 1.05s remaining: 9.2s 7: learn: 0.5953148 test: 0.5972783 best: 0.5972783 (7) total: 1.2s remaining: 9.01s 8: learn: 0.5851338 test: 0.5873130 best: 0.5873130 (8) total: 1.36s remaining: 8.91s 9: learn: 0.5755903 test: 0.5779119 best: 0.5779119 (9) total: 1.52s remaining: 8.8s 10: learn: 0.5656021 test: 0.5681484 best: 0.5681484 (10) total: 1.68s remaining: 8.69s 11: learn: 0.5567064 test: 0.5593104 best: 0.5593104 (11) total: 1.82s remaining: 8.51s 12: learn: 0.5472675 test: 0.5499506 best: 0.5499506 (12) total: 1.97s remaining: 8.35s 13: learn: 0.5393090 test: 0.5420868 best: 0.5420868 (13) total: 2.13s remaining: 8.21s 14: learn: 0.5309739 test: 0.5339824 best: 0.5339824 (14) total: 2.29s remaining: 8.1s 15: learn: 0.5226066 test: 0.5257270 best: 0.5257270 (15) total: 2.43s remaining: 7.91s 16: learn: 0.5141022 test: 0.5174533 best: 0.5174533 (16) total: 2.61s remaining: 7.83s 17: learn: 0.5055827 test: 0.5090518 best: 0.5090518 (17) total: 2.76s remaining: 7.66s 18: learn: 0.4979467 test: 0.5015277 best: 0.5015277 (18) total: 2.93s remaining: 7.56s 19: learn: 0.4892354 test: 0.4930908 best: 0.4930908 (19) total: 3.08s remaining: 7.38s 20: learn: 0.4813948 test: 0.4854349 best: 0.4854349 (20) total: 3.21s remaining: 7.2s 21: learn: 0.4748540 test: 0.4790525 best: 0.4790525 (21) total: 3.37s remaining: 7.05s 22: learn: 0.4683203 test: 0.4727451 best: 0.4727451 (22) total: 3.53s remaining: 6.9s 23: learn: 0.4623641 test: 0.4668757 best: 0.4668757 (23) total: 3.68s remaining: 6.75s 24: learn: 0.4571473 test: 0.4618874 best: 0.4618874 (24) total: 3.83s remaining: 6.59s 25: learn: 0.4499105 test: 0.4548772 best: 0.4548772 (25) total: 3.98s remaining: 6.43s 26: learn: 0.4444047 test: 0.4494997 best: 0.4494997 (26) total: 4.14s remaining: 6.28s 27: learn: 0.4380521 test: 0.4433250 best: 0.4433250 (27) total: 4.3s remaining: 6.15s 
28: learn: 0.4324074 test: 0.4378602 best: 0.4378602 (28) total: 4.45s remaining: 5.98s 29: learn: 0.4272110 test: 0.4328242 best: 0.4328242 (29) total: 4.59s remaining: 5.82s 30: learn: 0.4216020 test: 0.4273384 best: 0.4273384 (30) total: 4.74s remaining: 5.66s 31: learn: 0.4165855 test: 0.4224257 best: 0.4224257 (31) total: 4.89s remaining: 5.5s 32: learn: 0.4114807 test: 0.4175373 best: 0.4175373 (32) total: 5.04s remaining: 5.35s 33: learn: 0.4073207 test: 0.4134301 best: 0.4134301 (33) total: 5.2s remaining: 5.2s 34: learn: 0.4023218 test: 0.4086766 best: 0.4086766 (34) total: 5.37s remaining: 5.06s 35: learn: 0.3979833 test: 0.4044906 best: 0.4044906 (35) total: 5.5s remaining: 4.89s 36: learn: 0.3936065 test: 0.4001911 best: 0.4001911 (36) total: 5.67s remaining: 4.75s 37: learn: 0.3892489 test: 0.3959660 best: 0.3959660 (37) total: 5.82s remaining: 4.59s 38: learn: 0.3844413 test: 0.3913463 best: 0.3913463 (38) total: 5.99s remaining: 4.45s 39: learn: 0.3799398 test: 0.3869712 best: 0.3869712 (39) total: 6.15s remaining: 4.3s 40: learn: 0.3759241 test: 0.3829813 best: 0.3829813 (40) total: 6.31s remaining: 4.16s 41: learn: 0.3721404 test: 0.3793284 best: 0.3793284 (41) total: 6.47s remaining: 4s 42: learn: 0.3688314 test: 0.3762351 best: 0.3762351 (42) total: 6.63s remaining: 3.85s 43: learn: 0.3654816 test: 0.3730526 best: 0.3730526 (43) total: 6.79s remaining: 3.71s 44: learn: 0.3618685 test: 0.3695522 best: 0.3695522 (44) total: 6.96s remaining: 3.56s 45: learn: 0.3585119 test: 0.3663458 best: 0.3663458 (45) total: 7.12s remaining: 3.41s 46: learn: 0.3552800 test: 0.3631963 best: 0.3631963 (46) total: 7.28s remaining: 3.25s 47: learn: 0.3523136 test: 0.3602786 best: 0.3602786 (47) total: 7.45s remaining: 3.1s 48: learn: 0.3484370 test: 0.3565295 best: 0.3565295 (48) total: 7.62s remaining: 2.95s 49: learn: 0.3452702 test: 0.3534811 best: 0.3534811 (49) total: 7.8s remaining: 2.81s 50: learn: 0.3417925 test: 0.3502167 best: 0.3502167 (50) total: 7.95s 
remaining: 2.65s 51: learn: 0.3390114 test: 0.3475395 best: 0.3475395 (51) total: 8.1s remaining: 2.49s 52: learn: 0.3362965 test: 0.3449788 best: 0.3449788 (52) total: 8.27s remaining: 2.34s 53: learn: 0.3332084 test: 0.3420653 best: 0.3420653 (53) total: 8.43s remaining: 2.19s 54: learn: 0.3308636 test: 0.3397704 best: 0.3397704 (54) total: 8.58s remaining: 2.03s 55: learn: 0.3282695 test: 0.3373069 best: 0.3373069 (55) total: 8.73s remaining: 1.87s 56: learn: 0.3254079 test: 0.3345830 best: 0.3345830 (56) total: 8.87s remaining: 1.71s 57: learn: 0.3227287 test: 0.3319664 best: 0.3319664 (57) total: 9.01s remaining: 1.55s 58: learn: 0.3202519 test: 0.3296861 best: 0.3296861 (58) total: 9.18s remaining: 1.4s 59: learn: 0.3180658 test: 0.3276391 best: 0.3276391 (59) total: 9.34s remaining: 1.24s 60: learn: 0.3155948 test: 0.3252877 best: 0.3252877 (60) total: 9.49s remaining: 1.09s 61: learn: 0.3136324 test: 0.3234546 best: 0.3234546 (61) total: 9.64s remaining: 933ms 62: learn: 0.3116128 test: 0.3215623 best: 0.3215623 (62) total: 9.81s remaining: 779ms 63: learn: 0.3097830 test: 0.3197007 best: 0.3197007 (63) total: 9.96s remaining: 623ms 64: learn: 0.3080723 test: 0.3180555 best: 0.3180555 (64) total: 10.1s remaining: 467ms 65: learn: 0.3060438 test: 0.3161168 best: 0.3161168 (65) total: 10.3s remaining: 311ms 66: learn: 0.3039538 test: 0.3140971 best: 0.3140971 (66) total: 10.4s remaining: 156ms 67: learn: 0.3019237 test: 0.3121814 best: 0.3121814 (67) total: 10.6s remaining: 0us bestTest = 0.3121814389 bestIteration = 67 Trial 56, Fold 5: Log loss = 0.3120946344441767, Average precision = 0.9619429643461269, ROC-AUC = 0.9570932129387066, Elapsed Time = 10.751953400002094 seconds
Optimization Progress: 57%|#####6 | 57/100 [1:33:28<1:40:56, 140.84s/it]
Trial 57, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 57, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[... per-iteration CatBoost learn/test log truncated ...]
bestTest = 0.1978562267, bestIteration = 92. Shrink model to first 93 iterations.
Trial 57, Fold 1: Log loss = 0.1975462884624243, Average precision = 0.9758262951181361, ROC-AUC = 0.9721045248371354, Elapsed Time = 37.68159860000014 seconds
Trial 57, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 57, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[... per-iteration CatBoost learn/test log truncated ...]
bestTest = 0.1827692326, bestIteration = 93
Trial 57, Fold 2: Log loss = 0.18251450060014623, Average precision = 0.9773483000479155, ROC-AUC = 0.974676784804341, Elapsed Time = 38.283101299999544 seconds
Trial 57, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 57, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[... per-iteration CatBoost learn/test log truncated ...]
52: learn: 0.1373386 test: 0.1896740 best: 0.1895988 (51) total: 22.6s remaining: 17.5s 53: learn: 0.1365308 test: 0.1897136 best: 0.1895988 (51) total: 23s remaining: 17s 54: learn: 0.1356989 test: 0.1896062 best: 0.1895988 (51) total: 23.4s remaining: 16.6s 55: learn: 0.1350417 test: 0.1895041 best: 0.1895041 (55) total: 23.7s remaining: 16.1s 56: learn: 0.1341734 test: 0.1892705 best: 0.1892705 (56) total: 24.1s remaining: 15.6s 57: learn: 0.1334312 test: 0.1891896 best: 0.1891896 (57) total: 24.4s remaining: 15.2s 58: learn: 0.1330013 test: 0.1892167 best: 0.1891896 (57) total: 24.7s remaining: 14.6s 59: learn: 0.1323122 test: 0.1891607 best: 0.1891607 (59) total: 25s remaining: 14.2s 60: learn: 0.1316044 test: 0.1891669 best: 0.1891607 (59) total: 25.4s remaining: 13.7s 61: learn: 0.1311254 test: 0.1889954 best: 0.1889954 (61) total: 25.7s remaining: 13.3s 62: learn: 0.1305394 test: 0.1889695 best: 0.1889695 (62) total: 26.1s remaining: 12.8s 63: learn: 0.1299955 test: 0.1885962 best: 0.1885962 (63) total: 26.4s remaining: 12.4s 64: learn: 0.1287483 test: 0.1882442 best: 0.1882442 (64) total: 26.9s remaining: 12s 65: learn: 0.1279399 test: 0.1883770 best: 0.1882442 (64) total: 27.3s remaining: 11.6s 66: learn: 0.1275245 test: 0.1883809 best: 0.1882442 (64) total: 27.6s remaining: 11.1s 67: learn: 0.1266324 test: 0.1883165 best: 0.1882442 (64) total: 28s remaining: 10.7s 68: learn: 0.1262620 test: 0.1885000 best: 0.1882442 (64) total: 28.3s remaining: 10.2s 69: learn: 0.1254047 test: 0.1885067 best: 0.1882442 (64) total: 28.7s remaining: 9.84s 70: learn: 0.1247048 test: 0.1884326 best: 0.1882442 (64) total: 29.1s remaining: 9.43s 71: learn: 0.1240529 test: 0.1882513 best: 0.1882442 (64) total: 29.5s remaining: 9.01s 72: learn: 0.1231799 test: 0.1882634 best: 0.1882442 (64) total: 30s remaining: 8.62s 73: learn: 0.1223953 test: 0.1883437 best: 0.1882442 (64) total: 30.4s remaining: 8.22s 74: learn: 0.1215914 test: 0.1882004 best: 0.1882004 (74) total: 30.8s 
remaining: 7.81s 75: learn: 0.1210155 test: 0.1881109 best: 0.1881109 (75) total: 31.2s remaining: 7.38s 76: learn: 0.1204380 test: 0.1880521 best: 0.1880521 (76) total: 31.6s remaining: 6.97s 77: learn: 0.1200310 test: 0.1880305 best: 0.1880305 (77) total: 31.9s remaining: 6.55s 78: learn: 0.1192955 test: 0.1878620 best: 0.1878620 (78) total: 32.4s remaining: 6.16s 79: learn: 0.1187549 test: 0.1878588 best: 0.1878588 (79) total: 32.7s remaining: 5.73s 80: learn: 0.1180847 test: 0.1878683 best: 0.1878588 (79) total: 33.1s remaining: 5.31s 81: learn: 0.1177833 test: 0.1878580 best: 0.1878580 (81) total: 33.3s remaining: 4.87s 82: learn: 0.1171217 test: 0.1878967 best: 0.1878580 (81) total: 33.7s remaining: 4.46s 83: learn: 0.1169610 test: 0.1879252 best: 0.1878580 (81) total: 33.9s remaining: 4.03s 84: learn: 0.1163429 test: 0.1878341 best: 0.1878341 (84) total: 34.2s remaining: 3.62s 85: learn: 0.1157455 test: 0.1878266 best: 0.1878266 (85) total: 34.7s remaining: 3.23s 86: learn: 0.1153676 test: 0.1877917 best: 0.1877917 (86) total: 35s remaining: 2.81s 87: learn: 0.1149597 test: 0.1876916 best: 0.1876916 (87) total: 35.3s remaining: 2.41s 88: learn: 0.1144851 test: 0.1876907 best: 0.1876907 (88) total: 35.7s remaining: 2s 89: learn: 0.1141814 test: 0.1876340 best: 0.1876340 (89) total: 35.9s remaining: 1.59s 90: learn: 0.1135851 test: 0.1875826 best: 0.1875826 (90) total: 36.3s remaining: 1.2s 91: learn: 0.1128295 test: 0.1878644 best: 0.1875826 (90) total: 36.7s remaining: 798ms 92: learn: 0.1123844 test: 0.1880469 best: 0.1875826 (90) total: 37s remaining: 398ms 93: learn: 0.1119322 test: 0.1880237 best: 0.1875826 (90) total: 37.3s remaining: 0us bestTest = 0.1875826345 bestIteration = 90 Shrink model to first 91 iterations. 
Trial 57, Fold 3: Log loss = 0.18739240902453494, Average precision = 0.9752750355044638, ROC-AUC = 0.9735378282979072, Elapsed Time = 37.481389799999306 seconds
Trial 57, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 57, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[... CatBoost per-iteration log truncated: iterations 0–93 ...]
bestTest = 0.1875994878 bestIteration = 89
Shrink model to first 90 iterations.
Trial 57, Fold 4: Log loss = 0.18731344372084563, Average precision = 0.9773447682522916, ROC-AUC = 0.9738364676399828, Elapsed Time = 39.21282539999811 seconds
Trial 57, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 57, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[... CatBoost per-iteration log truncated: iterations 0–93 ...]
bestTest = 0.1995030142 bestIteration = 81
Shrink model to first 82 iterations.
Trial 57, Fold 5: Log loss = 0.19910033921792109, Average precision = 0.9734823221915858, ROC-AUC = 0.9713457815432064, Elapsed Time = 36.97576919999847 seconds
Optimization Progress: 58%|#####8 | 58/100 [1:36:47<1:50:41, 158.12s/it]
Trial 58, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 58, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[... CatBoost per-iteration log truncated: iterations 0–35 ...]
bestTest = 0.2191011988 bestIteration = 35
Trial 58, Fold 1: Log loss = 0.21910119880217663, Average precision = 0.9662186733191698, ROC-AUC = 0.9648683014056771, Elapsed Time = 69.89396879999913 seconds
Trial 58, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 58, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[... CatBoost per-iteration log truncated: iterations 0–35 ...]
bestTest = 0.2145036872 bestIteration = 35
Trial 58, Fold 2: Log loss = 0.21450368718369395, Average precision = 0.9710319913053642, ROC-AUC = 0.9667554745289826, Elapsed Time = 71.19416329999876 seconds
Trial 58, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 58, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[... CatBoost per-iteration log truncated: iterations 0–35 ...]
bestTest = 0.2155194589 bestIteration = 35
Trial 58, Fold 3: Log loss = 0.21551945885426307, Average precision = 0.9711899260410743, ROC-AUC = 0.9678798961774422, Elapsed Time = 69.95388779999848 seconds
Trial 58, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 58, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[... CatBoost per-iteration log truncated: iterations 0–35 ...]
bestTest = 0.2132698901 bestIteration = 35
Trial 58, Fold 4: Log loss = 0.21326989013844874,
Average precision = 0.9721579843426332, ROC-AUC = 0.9667626689652378, Elapsed Time = 72.46899029999986 seconds Trial 58, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 58, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0: learn: 0.6440754 test: 0.6461490 best: 0.6461490 (0) total: 1.91s remaining: 1m 6s 1: learn: 0.5932618 test: 0.5967554 best: 0.5967554 (1) total: 4.07s remaining: 1m 9s 2: learn: 0.5588492 test: 0.5631507 best: 0.5631507 (2) total: 5.08s remaining: 55.8s 3: learn: 0.5178741 test: 0.5241465 best: 0.5241465 (3) total: 7.25s remaining: 58s 4: learn: 0.4858240 test: 0.4938035 best: 0.4938035 (4) total: 9.35s remaining: 58s 5: learn: 0.4505198 test: 0.4613625 best: 0.4613625 (5) total: 11.4s remaining: 57.2s 6: learn: 0.4237993 test: 0.4367367 best: 0.4367367 (6) total: 13.5s remaining: 56s 7: learn: 0.3980840 test: 0.4140162 best: 0.4140162 (7) total: 15.6s remaining: 54.5s 8: learn: 0.3730052 test: 0.3907331 best: 0.3907331 (8) total: 17.7s remaining: 53.1s 9: learn: 0.3499876 test: 0.3699463 best: 0.3699463 (9) total: 19.8s remaining: 51.5s 10: learn: 0.3307022 test: 0.3524800 best: 0.3524800 (10) total: 21.9s remaining: 49.8s 11: learn: 0.3092963 test: 0.3336498 best: 0.3336498 (11) total: 24s remaining: 48s 12: learn: 0.2928545 test: 0.3189097 best: 0.3189097 (12) total: 26.2s remaining: 46.3s 13: learn: 0.2821173 test: 0.3089769 best: 0.3089769 (13) total: 28.1s remaining: 44.2s 14: learn: 0.2669321 test: 0.2961673 best: 0.2961673 (14) total: 30.1s remaining: 42.2s 15: learn: 0.2565516 test: 0.2870455 best: 0.2870455 (15) total: 32.1s remaining: 40.2s 16: learn: 0.2446360 test: 0.2782737 best: 0.2782737 (16) total: 34.2s remaining: 38.2s 17: learn: 0.2380095 test: 0.2720392 best: 0.2720392 (17) total: 36.1s remaining: 36.1s 18: learn: 0.2307940 test: 0.2661322 best: 0.2661322 (18) total: 38.3s remaining: 34.2s 19: learn: 0.2273150 test: 0.2627326 best: 0.2627326 (19) total: 
40.1s remaining: 32.1s 20: learn: 0.2209691 test: 0.2583977 best: 0.2583977 (20) total: 42.2s remaining: 30.1s 21: learn: 0.2142393 test: 0.2525253 best: 0.2525253 (21) total: 44.2s remaining: 28.1s 22: learn: 0.2097855 test: 0.2482746 best: 0.2482746 (22) total: 46.2s remaining: 26.1s 23: learn: 0.2061727 test: 0.2454219 best: 0.2454219 (23) total: 48.4s remaining: 24.2s 24: learn: 0.2013565 test: 0.2421026 best: 0.2421026 (24) total: 50.5s remaining: 22.2s 25: learn: 0.1988066 test: 0.2404214 best: 0.2404214 (25) total: 52.6s remaining: 20.2s 26: learn: 0.1958193 test: 0.2383234 best: 0.2383234 (26) total: 54.5s remaining: 18.2s 27: learn: 0.1911997 test: 0.2355670 best: 0.2355670 (27) total: 56.5s remaining: 16.1s 28: learn: 0.1870194 test: 0.2324766 best: 0.2324766 (28) total: 58.6s remaining: 14.1s 29: learn: 0.1826177 test: 0.2300928 best: 0.2300928 (29) total: 1m remaining: 12.1s 30: learn: 0.1816922 test: 0.2294487 best: 0.2294487 (30) total: 1m 2s remaining: 10.1s 31: learn: 0.1808795 test: 0.2288042 best: 0.2288042 (31) total: 1m 2s remaining: 7.84s 32: learn: 0.1773129 test: 0.2268317 best: 0.2268317 (32) total: 1m 4s remaining: 5.9s 33: learn: 0.1740476 test: 0.2256837 best: 0.2256837 (33) total: 1m 7s remaining: 3.97s 34: learn: 0.1691995 test: 0.2240668 best: 0.2240668 (34) total: 1m 9s remaining: 2s 35: learn: 0.1673609 test: 0.2231378 best: 0.2231378 (35) total: 1m 12s remaining: 0us bestTest = 0.223137802 bestIteration = 35 Trial 58, Fold 5: Log loss = 0.22313780199889557, Average precision = 0.9697573158890176, ROC-AUC = 0.9646580086580087, Elapsed Time = 73.19685559999925 seconds
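Each fold summary above reports log loss, average precision, and ROC-AUC on the held-out fold. A minimal sketch of how these metrics can be computed with scikit-learn; `y_val` and `p_val` are synthetic stand-ins for the notebook's fold-level validation labels and predicted probabilities:

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

rng = np.random.default_rng(0)
y_val = rng.integers(0, 2, size=1000)  # synthetic binary labels
# Synthetic probabilities correlated with the label, clipped away from 0/1
p_val = np.clip(y_val * 0.6 + rng.random(1000) * 0.4, 1e-6, 1 - 1e-6)

ll = log_loss(y_val, p_val)                   # lower is better
ap = average_precision_score(y_val, p_val)    # area under the PR curve
auc = roc_auc_score(y_val, p_val)             # area under the ROC curve
print(f"Log loss = {ll}, Average precision = {ap}, ROC-AUC = {auc}")
```

With CatBoost, `p_val` would come from `model.predict_proba(X_val)[:, 1]` on the fold's validation pool; the metric calls are the same.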
Optimization Progress: 59%|#####8 | 59/100 [1:42:51<2:30:23, 220.09s/it]
Trial 59, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 59, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[CatBoost per-iteration training output elided; per-fold results retained below]
Trial 59, Fold 1: bestTest = 0.3478792629, bestIteration = 44
Trial 59, Fold 1: Log loss = 0.34800591869926284, Average precision = 0.9747018812788043, ROC-AUC = 0.9704799125813129, Elapsed Time = 23.01954359999945 seconds
Trial 59, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 59, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Trial 59, Fold 2: bestTest = 0.3465552565, bestIteration = 44
Trial 59, Fold 2: Log loss = 0.3466080520059207, Average precision = 0.9748791852635729, ROC-AUC = 0.9719432001969079, Elapsed Time = 24.161008299997775 seconds
Trial 59, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 59, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Trial 59, Fold 3: bestTest = 0.3445660829, bestIteration = 44
Trial 59, Fold 3: Log loss = 0.344823465758147, Average precision = 0.9747567286077528, ROC-AUC = 0.9717709095790275, Elapsed Time = 23.662727099999756 seconds
Trial 59, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 59, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
Trial 59, Fold 4: bestTest = 0.3466321647, bestIteration = 44
Trial 59, Fold 4: Log loss = 0.34674210336313677, Average precision = 0.975791471702802, ROC-AUC = 0.9719213395137235, Elapsed Time = 21.92226539999683 seconds
Trial 59, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 59, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
Trial 59, Fold 5: bestTest = 0.3496842704, bestIteration = 44
Trial 59, Fold 5: Log loss = 0.34972687949873343, Average precision = 0.973546495446978, ROC-AUC = 0.9693380896642698, Elapsed Time = 22.15098339999895 seconds
Optimization Progress: 60%|###### | 60/100 [1:44:56<2:07:40, 191.52s/it]
Trial 60, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 60, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[CatBoost per-iteration training output elided; per-fold results retained below]
Trial 60, Fold 1: bestTest = 0.2557080153, bestIteration = 36
Trial 60, Fold 1: Log loss = 0.2557080153293043, Average precision = 0.9666998600671489, ROC-AUC = 0.9599315831672426, Elapsed Time = 21.708156699998653 seconds
Trial 60, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 60, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Trial 60, Fold 2: bestTest = 0.2450266556, bestIteration = 36
Trial 60, Fold 2: Log loss = 0.24502665558572578, Average precision = 0.967378661128013, ROC-AUC = 0.9627475120191834, Elapsed Time = 18.81169299999965 seconds
Trial 60, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 60, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[output truncated]
remaining: 8.16s 7: learn: 0.4734201 test: 0.4715804 best: 0.4715804 (7) total: 1.94s remaining: 7.03s 8: learn: 0.4540761 test: 0.4521849 best: 0.4521849 (8) total: 2.04s remaining: 6.35s 9: learn: 0.4345103 test: 0.4327809 best: 0.4327809 (9) total: 2.2s remaining: 5.93s 10: learn: 0.4191090 test: 0.4171904 best: 0.4171904 (10) total: 2.27s remaining: 5.36s 11: learn: 0.4077537 test: 0.4060432 best: 0.4060432 (11) total: 2.35s remaining: 4.89s 12: learn: 0.3875249 test: 0.3864551 best: 0.3864551 (12) total: 3.97s remaining: 7.33s 13: learn: 0.3699512 test: 0.3695710 best: 0.3695710 (13) total: 5.52s remaining: 9.07s 14: learn: 0.3597504 test: 0.3592775 best: 0.3592775 (14) total: 5.67s remaining: 8.32s 15: learn: 0.3432861 test: 0.3429983 best: 0.3429983 (15) total: 5.94s remaining: 7.79s 16: learn: 0.3376492 test: 0.3371944 best: 0.3371944 (16) total: 5.98s remaining: 7.03s 17: learn: 0.3353276 test: 0.3347472 best: 0.3347472 (17) total: 6.01s remaining: 6.34s 18: learn: 0.3248798 test: 0.3245554 best: 0.3245554 (18) total: 6.05s remaining: 5.73s 19: learn: 0.3154389 test: 0.3155176 best: 0.3155176 (19) total: 6.31s remaining: 5.36s 20: learn: 0.3069301 test: 0.3068898 best: 0.3068898 (20) total: 6.46s remaining: 4.92s 21: learn: 0.2978190 test: 0.2979877 best: 0.2979877 (21) total: 6.61s remaining: 4.5s 22: learn: 0.2901937 test: 0.2909450 best: 0.2909450 (22) total: 6.76s remaining: 4.11s 23: learn: 0.2824676 test: 0.2839734 best: 0.2839734 (23) total: 8.49s remaining: 4.6s 24: learn: 0.2745594 test: 0.2771194 best: 0.2771194 (24) total: 8.96s remaining: 4.3s 25: learn: 0.2696678 test: 0.2730696 best: 0.2730696 (25) total: 10s remaining: 4.24s 26: learn: 0.2656806 test: 0.2693603 best: 0.2693603 (26) total: 10.1s remaining: 3.73s 27: learn: 0.2602056 test: 0.2645170 best: 0.2645170 (27) total: 11.5s remaining: 3.71s 28: learn: 0.2595660 test: 0.2638414 best: 0.2638414 (28) total: 11.6s remaining: 3.19s 29: learn: 0.2553395 test: 0.2595942 best: 0.2595942 (29) 
total: 11.6s remaining: 2.71s 30: learn: 0.2535263 test: 0.2578080 best: 0.2578080 (30) total: 11.7s remaining: 2.26s 31: learn: 0.2497116 test: 0.2542769 best: 0.2542769 (31) total: 11.9s remaining: 1.86s 32: learn: 0.2462814 test: 0.2517107 best: 0.2517107 (32) total: 13s remaining: 1.57s 33: learn: 0.2458571 test: 0.2512928 best: 0.2512928 (33) total: 13s remaining: 1.15s 34: learn: 0.2407878 test: 0.2480634 best: 0.2480634 (34) total: 14.3s remaining: 816ms 35: learn: 0.2398007 test: 0.2471224 best: 0.2471224 (35) total: 14.3s remaining: 398ms 36: learn: 0.2345810 test: 0.2441826 best: 0.2441826 (36) total: 15.7s remaining: 0us bestTest = 0.2441825604 bestIteration = 36 Trial 60, Fold 3: Log loss = 0.24418256040899053, Average precision = 0.9683457883783508, ROC-AUC = 0.9630323253652571, Elapsed Time = 15.805479599999671 seconds Trial 60, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 60, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 0: learn: 0.6574676 test: 0.6572983 best: 0.6572983 (0) total: 33.7ms remaining: 1.21s 1: learn: 0.6182762 test: 0.6181347 best: 0.6181347 (1) total: 68.5ms remaining: 1.2s 2: learn: 0.5845651 test: 0.5844408 best: 0.5844408 (2) total: 106ms remaining: 1.2s 3: learn: 0.5611499 test: 0.5609818 best: 0.5609818 (3) total: 155ms remaining: 1.28s 4: learn: 0.5371533 test: 0.5372228 best: 0.5372228 (4) total: 428ms remaining: 2.74s 5: learn: 0.5072072 test: 0.5072931 best: 0.5072931 (5) total: 537ms remaining: 2.77s 6: learn: 0.4864527 test: 0.4863553 best: 0.4863553 (6) total: 604ms remaining: 2.59s 7: learn: 0.4518773 test: 0.4528886 best: 0.4528886 (7) total: 2s remaining: 7.23s 8: learn: 0.4343471 test: 0.4354716 best: 0.4354716 (8) total: 2.03s remaining: 6.32s 9: learn: 0.4169959 test: 0.4182098 best: 0.4182098 (9) total: 2.07s remaining: 5.6s 10: learn: 0.3976837 test: 0.3990857 best: 0.3990857 (10) total: 2.12s remaining: 5.02s 11: learn: 0.3845413 test: 
0.3858483 best: 0.3858483 (11) total: 2.17s remaining: 4.53s 12: learn: 0.3662193 test: 0.3676189 best: 0.3676189 (12) total: 2.25s remaining: 4.15s 13: learn: 0.3512467 test: 0.3526482 best: 0.3526482 (13) total: 2.51s remaining: 4.13s 14: learn: 0.3399024 test: 0.3417811 best: 0.3417811 (14) total: 2.75s remaining: 4.03s 15: learn: 0.3251858 test: 0.3280744 best: 0.3280744 (15) total: 4.49s remaining: 5.9s 16: learn: 0.3129415 test: 0.3165803 best: 0.3165803 (16) total: 4.75s remaining: 5.59s 17: learn: 0.3017338 test: 0.3068040 best: 0.3068040 (17) total: 6.26s remaining: 6.61s 18: learn: 0.2905798 test: 0.2968210 best: 0.2968210 (18) total: 7.82s remaining: 7.41s 19: learn: 0.2862307 test: 0.2928759 best: 0.2928759 (19) total: 7.89s remaining: 6.71s 20: learn: 0.2771021 test: 0.2849247 best: 0.2849247 (20) total: 8.4s remaining: 6.4s 21: learn: 0.2712880 test: 0.2791376 best: 0.2791376 (21) total: 8.46s remaining: 5.76s 22: learn: 0.2644558 test: 0.2732310 best: 0.2732310 (22) total: 10.1s remaining: 6.17s 23: learn: 0.2618922 test: 0.2708126 best: 0.2708126 (23) total: 10.2s remaining: 5.55s 24: learn: 0.2603178 test: 0.2692047 best: 0.2692047 (24) total: 10.3s remaining: 4.93s 25: learn: 0.2539343 test: 0.2648315 best: 0.2648315 (25) total: 11.9s remaining: 5.03s 26: learn: 0.2499462 test: 0.2609698 best: 0.2609698 (26) total: 12s remaining: 4.44s 27: learn: 0.2482950 test: 0.2592730 best: 0.2592730 (27) total: 12s remaining: 3.87s 28: learn: 0.2446846 test: 0.2557489 best: 0.2557489 (28) total: 12.2s remaining: 3.36s 29: learn: 0.2428156 test: 0.2537874 best: 0.2537874 (29) total: 12.2s remaining: 2.85s 30: learn: 0.2389014 test: 0.2506735 best: 0.2506735 (30) total: 13.9s remaining: 2.69s 31: learn: 0.2375219 test: 0.2492950 best: 0.2492950 (31) total: 14s remaining: 2.18s 32: learn: 0.2345016 test: 0.2466344 best: 0.2466344 (32) total: 15.4s remaining: 1.87s 33: learn: 0.2313273 test: 0.2444394 best: 0.2444394 (33) total: 15.7s remaining: 1.39s 34: learn: 
0.2290705 test: 0.2422995 best: 0.2422995 (34) total: 15.9s remaining: 910ms 35: learn: 0.2283370 test: 0.2415466 best: 0.2415466 (35) total: 16s remaining: 444ms 36: learn: 0.2272679 test: 0.2405275 best: 0.2405275 (36) total: 16s remaining: 0us bestTest = 0.2405275154 bestIteration = 36 Trial 60, Fold 4: Log loss = 0.2405275153543953, Average precision = 0.9692593498615643, ROC-AUC = 0.9630999920122273, Elapsed Time = 16.162527300002694 seconds Trial 60, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 60, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0: learn: 0.6576798 test: 0.6585479 best: 0.6585479 (0) total: 31.5ms remaining: 1.13s 1: learn: 0.6151640 test: 0.6176017 best: 0.6176017 (1) total: 1.21s remaining: 21.1s 2: learn: 0.5614830 test: 0.5655334 best: 0.5655334 (2) total: 2.25s remaining: 25.5s 3: learn: 0.5202086 test: 0.5251524 best: 0.5251524 (3) total: 3.34s remaining: 27.6s 4: learn: 0.4829995 test: 0.4896788 best: 0.4896788 (4) total: 4.92s remaining: 31.5s 5: learn: 0.4566853 test: 0.4636357 best: 0.4636357 (5) total: 4.97s remaining: 25.7s 6: learn: 0.4388278 test: 0.4464226 best: 0.4464226 (6) total: 5.02s remaining: 21.5s 7: learn: 0.4152344 test: 0.4234051 best: 0.4234051 (7) total: 5.16s remaining: 18.7s 8: learn: 0.3971550 test: 0.4057560 best: 0.4057560 (8) total: 5.21s remaining: 16.2s 9: learn: 0.3747583 test: 0.3851649 best: 0.3851649 (9) total: 5.71s remaining: 15.4s 10: learn: 0.3642525 test: 0.3751092 best: 0.3751092 (10) total: 5.75s remaining: 13.6s 11: learn: 0.3536658 test: 0.3647984 best: 0.3647984 (11) total: 5.83s remaining: 12.1s 12: learn: 0.3493382 test: 0.3608830 best: 0.3608830 (12) total: 5.87s remaining: 10.8s 13: learn: 0.3402022 test: 0.3520801 best: 0.3520801 (13) total: 5.91s remaining: 9.71s 14: learn: 0.3362336 test: 0.3485150 best: 0.3485150 (14) total: 5.95s remaining: 8.72s 15: learn: 0.3194443 test: 0.3335837 best: 0.3335837 (15) total: 
7.5s remaining: 9.84s 16: learn: 0.3151391 test: 0.3294809 best: 0.3294809 (16) total: 7.54s remaining: 8.87s 17: learn: 0.3036042 test: 0.3183527 best: 0.3183527 (17) total: 7.8s remaining: 8.23s 18: learn: 0.2961694 test: 0.3110668 best: 0.3110668 (18) total: 7.86s remaining: 7.44s 19: learn: 0.2891830 test: 0.3042728 best: 0.3042728 (19) total: 7.93s remaining: 6.74s 20: learn: 0.2791945 test: 0.2955062 best: 0.2955062 (20) total: 9.38s remaining: 7.15s 21: learn: 0.2768493 test: 0.2934302 best: 0.2934302 (21) total: 9.42s remaining: 6.42s 22: learn: 0.2712844 test: 0.2886088 best: 0.2886088 (22) total: 11s remaining: 6.69s 23: learn: 0.2685592 test: 0.2859805 best: 0.2859805 (23) total: 11s remaining: 5.97s 24: learn: 0.2666814 test: 0.2840956 best: 0.2840956 (24) total: 11.1s remaining: 5.31s 25: learn: 0.2624431 test: 0.2805231 best: 0.2805231 (25) total: 12.7s remaining: 5.37s 26: learn: 0.2574608 test: 0.2760920 best: 0.2760920 (26) total: 13s remaining: 4.8s 27: learn: 0.2543037 test: 0.2729270 best: 0.2729270 (27) total: 13s remaining: 4.19s 28: learn: 0.2483865 test: 0.2696507 best: 0.2696507 (28) total: 14.7s remaining: 4.06s 29: learn: 0.2470795 test: 0.2684145 best: 0.2684145 (29) total: 14.8s remaining: 3.44s 30: learn: 0.2434520 test: 0.2652954 best: 0.2652954 (30) total: 16.4s remaining: 3.17s 31: learn: 0.2390298 test: 0.2610622 best: 0.2610622 (31) total: 16.5s remaining: 2.58s 32: learn: 0.2366013 test: 0.2588209 best: 0.2588209 (32) total: 17.5s remaining: 2.13s 33: learn: 0.2343291 test: 0.2566679 best: 0.2566679 (33) total: 17.7s remaining: 1.56s 34: learn: 0.2315951 test: 0.2541974 best: 0.2541974 (34) total: 18.2s remaining: 1.04s 35: learn: 0.2291459 test: 0.2517521 best: 0.2517521 (35) total: 18.2s remaining: 506ms 36: learn: 0.2280693 test: 0.2507450 best: 0.2507450 (36) total: 18.3s remaining: 0us bestTest = 0.2507450171 bestIteration = 36 Trial 60, Fold 5: Log loss = 0.2507450170550699, Average precision = 0.9648030042260477, ROC-AUC = 
0.9588524608438771, Elapsed Time = 18.41302600000199 seconds
Optimization Progress: 61% | 61/100 trials [1:46:35 elapsed < 1:46:27 remaining, 163.78 s/it]
Trial 61 — 5-fold CV summary (CatBoost, 21 boosting iterations per fold; validation loss starts rising before the last iteration, so each model is shrunk to its best iteration; per-iteration curves omitted):

| Fold | Train size (0 / 1) | Validation size (0 / 1) | Best iter. | Log loss | Avg. precision | ROC-AUC | Time (s) |
|------|--------------------|-------------------------|------------|----------|----------------|---------|----------|
| 1 | 20663 (10533 / 10130) | 5175 (2592 / 2583) | 13 | 0.2109 | 0.9736 | 0.9699 | 2.6 |
| 2 | 20701 (10471 / 10230) | 5137 (2654 / 2483) | 9 | 0.2284 | 0.9730 | 0.9717 | 2.7 |
| 3 | 20682 (10517 / 10165) | 5156 (2608 / 2548) | 18 | 0.1945 | 0.9749 | 0.9730 | 2.6 |
| 4 | 20656 (10479 / 10177) | 5182 (2646 / 2536) | 17 | 0.1964 | 0.9766 | 0.9729 | 2.6 |
| 5 | 20650 (10500 / 10150) | 5188 (2625 / 2563) | 15 | 0.2073 | 0.9741 | 0.9713 | 2.6 |
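In Trial 61 the "Shrink model to first N iterations" messages mean CatBoost kept only the trees up to the iteration with the lowest validation loss (its `use_best_model` behavior with an eval set). The selection logic amounts to the following sketch, with an illustrative loss curve rather than the trial's exact per-iteration values:

```python
import numpy as np

# Illustrative validation log-loss curve: improves steadily, then overfits
# (shaped like Trial 61 Fold 1, where the best iteration was 13).
val_loss = [0.53, 0.43, 0.37, 0.33, 0.29, 0.27, 0.26, 0.24,
            0.236, 0.228, 0.221, 0.216, 0.213, 0.211, 0.293, 0.291]

best_iteration = int(np.argmin(val_loss))
# The ensemble is truncated to best_iteration + 1 trees.
kept_trees = best_iteration + 1
print(best_iteration, kept_trees)  # prints: 13 14
```

The reported per-fold log loss is then the validation loss at `best_iteration`, not at the final boosting round.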
Optimization Progress: 62% | 62/100 trials [1:46:56 elapsed < 1:16:38 remaining, 121.02 s/it]
Trial 62 — 5-fold CV, partial log (CatBoost, 57 boosting iterations in Fold 1 with best iteration = 56, the last; the Fold 2 log is cut off mid-run around iteration 50, and the remaining folds are not captured in this excerpt):

| Fold | Train size (0 / 1) | Validation size (0 / 1) | Log loss | Avg. precision | ROC-AUC | Time (s) |
|------|--------------------|-------------------------|----------|----------------|---------|----------|
| 1 | 20663 (10533 / 10130) | 5175 (2592 / 2583) | 0.4018 | 0.9567 | 0.9516 | 1.6 |
0.4358789 (50) total: 1.3s remaining: 153ms 51: learn: 0.4320317 test: 0.4335147 best: 0.4335147 (51) total: 1.33s remaining: 128ms 52: learn: 0.4281554 test: 0.4296511 best: 0.4296511 (52) total: 1.35s remaining: 102ms 53: learn: 0.4257941 test: 0.4273153 best: 0.4273153 (53) total: 1.38s remaining: 76.9ms 54: learn: 0.4228094 test: 0.4243174 best: 0.4243174 (54) total: 1.42s remaining: 51.5ms 55: learn: 0.4200455 test: 0.4216072 best: 0.4216072 (55) total: 1.44s remaining: 25.7ms 56: learn: 0.4186743 test: 0.4202287 best: 0.4202287 (56) total: 1.47s remaining: 0us bestTest = 0.4202286886 bestIteration = 56 Trial 62, Fold 2: Log loss = 0.42030386935676656, Average precision = 0.950718282227725, ROC-AUC = 0.9486870933349034, Elapsed Time = 1.580957899997884 seconds Trial 62, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 62, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 0: learn: 0.6819556 test: 0.6816162 best: 0.6816162 (0) total: 19.3ms remaining: 1.08s 1: learn: 0.6729345 test: 0.6723292 best: 0.6723292 (1) total: 36.6ms remaining: 1.01s 2: learn: 0.6671088 test: 0.6663788 best: 0.6663788 (2) total: 59.6ms remaining: 1.07s 3: learn: 0.6597291 test: 0.6590262 best: 0.6590262 (3) total: 82.4ms remaining: 1.09s 4: learn: 0.6521875 test: 0.6513198 best: 0.6513198 (4) total: 101ms remaining: 1.05s 5: learn: 0.6447769 test: 0.6436859 best: 0.6436859 (5) total: 120ms remaining: 1.02s 6: learn: 0.6374699 test: 0.6361990 best: 0.6361990 (6) total: 145ms remaining: 1.03s 7: learn: 0.6295766 test: 0.6281131 best: 0.6281131 (7) total: 170ms remaining: 1.04s 8: learn: 0.6233813 test: 0.6216913 best: 0.6216913 (8) total: 198ms remaining: 1.06s 9: learn: 0.6156266 test: 0.6137028 best: 0.6137028 (9) total: 226ms remaining: 1.06s 10: learn: 0.6048216 test: 0.6027999 best: 0.6027999 (10) total: 248ms remaining: 1.04s 11: learn: 0.5963890 test: 0.5941048 best: 0.5941048 (11) total: 276ms remaining: 1.03s 
12: learn: 0.5900673 test: 0.5876073 best: 0.5876073 (12) total: 303ms remaining: 1.02s 13: learn: 0.5831045 test: 0.5805199 best: 0.5805199 (13) total: 328ms remaining: 1.01s 14: learn: 0.5774735 test: 0.5747050 best: 0.5747050 (14) total: 355ms remaining: 993ms 15: learn: 0.5720756 test: 0.5691686 best: 0.5691686 (15) total: 385ms remaining: 986ms 16: learn: 0.5670059 test: 0.5639827 best: 0.5639827 (16) total: 411ms remaining: 968ms 17: learn: 0.5588010 test: 0.5556987 best: 0.5556987 (17) total: 439ms remaining: 952ms 18: learn: 0.5518780 test: 0.5486533 best: 0.5486533 (18) total: 468ms remaining: 936ms 19: learn: 0.5464889 test: 0.5431083 best: 0.5431083 (19) total: 496ms remaining: 917ms 20: learn: 0.5421099 test: 0.5386110 best: 0.5386110 (20) total: 523ms remaining: 897ms 21: learn: 0.5361915 test: 0.5328287 best: 0.5328287 (21) total: 551ms remaining: 877ms 22: learn: 0.5319021 test: 0.5284385 best: 0.5284385 (22) total: 572ms remaining: 846ms 23: learn: 0.5270080 test: 0.5234577 best: 0.5234577 (23) total: 600ms remaining: 825ms 24: learn: 0.5230434 test: 0.5194024 best: 0.5194024 (24) total: 627ms remaining: 803ms 25: learn: 0.5189714 test: 0.5152078 best: 0.5152078 (25) total: 655ms remaining: 780ms 26: learn: 0.5150280 test: 0.5111401 best: 0.5111401 (26) total: 680ms remaining: 755ms 27: learn: 0.5117267 test: 0.5079295 best: 0.5079295 (27) total: 704ms remaining: 729ms 28: learn: 0.5081798 test: 0.5042959 best: 0.5042959 (28) total: 731ms remaining: 706ms 29: learn: 0.5047832 test: 0.5008160 best: 0.5008160 (29) total: 756ms remaining: 681ms 30: learn: 0.4999235 test: 0.4958459 best: 0.4958459 (30) total: 784ms remaining: 657ms 31: learn: 0.4961796 test: 0.4919997 best: 0.4919997 (31) total: 810ms remaining: 633ms 32: learn: 0.4929416 test: 0.4886738 best: 0.4886738 (32) total: 834ms remaining: 607ms 33: learn: 0.4891720 test: 0.4848169 best: 0.4848169 (33) total: 859ms remaining: 581ms 34: learn: 0.4859680 test: 0.4815276 best: 0.4815276 (34) 
total: 888ms remaining: 558ms 35: learn: 0.4834883 test: 0.4789750 best: 0.4789750 (35) total: 912ms remaining: 532ms 36: learn: 0.4786576 test: 0.4740956 best: 0.4740956 (36) total: 937ms remaining: 506ms 37: learn: 0.4756080 test: 0.4709019 best: 0.4709019 (37) total: 962ms remaining: 481ms 38: learn: 0.4721452 test: 0.4673995 best: 0.4673995 (38) total: 992ms remaining: 458ms 39: learn: 0.4694039 test: 0.4645519 best: 0.4645519 (39) total: 1.01s remaining: 432ms 40: learn: 0.4655533 test: 0.4605994 best: 0.4605994 (40) total: 1.04s remaining: 406ms 41: learn: 0.4619251 test: 0.4570407 best: 0.4570407 (41) total: 1.06s remaining: 380ms 42: learn: 0.4562705 test: 0.4514286 best: 0.4514286 (42) total: 1.09s remaining: 356ms 43: learn: 0.4523774 test: 0.4476321 best: 0.4476321 (43) total: 1.12s remaining: 330ms 44: learn: 0.4471276 test: 0.4424472 best: 0.4424472 (44) total: 1.15s remaining: 306ms 45: learn: 0.4450267 test: 0.4402370 best: 0.4402370 (45) total: 1.18s remaining: 282ms 46: learn: 0.4399645 test: 0.4352154 best: 0.4352154 (46) total: 1.2s remaining: 256ms 47: learn: 0.4374143 test: 0.4326148 best: 0.4326148 (47) total: 1.23s remaining: 230ms 48: learn: 0.4354107 test: 0.4305587 best: 0.4305587 (48) total: 1.25s remaining: 204ms 49: learn: 0.4335519 test: 0.4286323 best: 0.4286323 (49) total: 1.27s remaining: 179ms 50: learn: 0.4316269 test: 0.4266384 best: 0.4266384 (50) total: 1.3s remaining: 153ms 51: learn: 0.4296197 test: 0.4245141 best: 0.4245141 (51) total: 1.33s remaining: 128ms 52: learn: 0.4275533 test: 0.4223441 best: 0.4223441 (52) total: 1.35s remaining: 102ms 53: learn: 0.4231141 test: 0.4179516 best: 0.4179516 (53) total: 1.38s remaining: 76.6ms 54: learn: 0.4204749 test: 0.4152823 best: 0.4152823 (54) total: 1.41s remaining: 51.1ms 55: learn: 0.4168541 test: 0.4117985 best: 0.4117985 (55) total: 1.43s remaining: 25.6ms 56: learn: 0.4136381 test: 0.4084437 best: 0.4084437 (56) total: 1.46s remaining: 0us bestTest = 0.4084436948 
bestIteration = 56 Trial 62, Fold 3: Log loss = 0.4086846132751064, Average precision = 0.9557043540753563, ROC-AUC = 0.9529976897554681, Elapsed Time = 1.5760173000016948 seconds Trial 62, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 62, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 0: learn: 0.6849780 test: 0.6850639 best: 0.6850639 (0) total: 21ms remaining: 1.17s 1: learn: 0.6761562 test: 0.6762125 best: 0.6762125 (1) total: 37.2ms remaining: 1.02s 2: learn: 0.6654400 test: 0.6654820 best: 0.6654820 (2) total: 57.8ms remaining: 1.04s 3: learn: 0.6580534 test: 0.6580031 best: 0.6580031 (3) total: 78.2ms remaining: 1.03s 4: learn: 0.6501611 test: 0.6499968 best: 0.6499968 (4) total: 100ms remaining: 1.04s 5: learn: 0.6430145 test: 0.6429569 best: 0.6429569 (5) total: 124ms remaining: 1.05s 6: learn: 0.6322366 test: 0.6321974 best: 0.6321974 (6) total: 148ms remaining: 1.05s 7: learn: 0.6245800 test: 0.6244558 best: 0.6244558 (7) total: 175ms remaining: 1.07s 8: learn: 0.6178026 test: 0.6177018 best: 0.6177018 (8) total: 196ms remaining: 1.04s 9: learn: 0.6106578 test: 0.6104352 best: 0.6104352 (9) total: 219ms remaining: 1.03s 10: learn: 0.6045109 test: 0.6042018 best: 0.6042018 (10) total: 242ms remaining: 1.01s 11: learn: 0.5947061 test: 0.5943683 best: 0.5943683 (11) total: 266ms remaining: 998ms 12: learn: 0.5850603 test: 0.5847923 best: 0.5847923 (12) total: 299ms remaining: 1.01s 13: learn: 0.5784438 test: 0.5781397 best: 0.5781397 (13) total: 321ms remaining: 986ms 14: learn: 0.5718706 test: 0.5716125 best: 0.5716125 (14) total: 347ms remaining: 971ms 15: learn: 0.5665375 test: 0.5662766 best: 0.5662766 (15) total: 375ms remaining: 960ms 16: learn: 0.5612055 test: 0.5608658 best: 0.5608658 (16) total: 402ms remaining: 946ms 17: learn: 0.5541889 test: 0.5538814 best: 0.5538814 (17) total: 427ms remaining: 924ms 18: learn: 0.5464861 test: 0.5461887 best: 0.5461887 (18) total: 451ms 
remaining: 902ms 19: learn: 0.5427317 test: 0.5423257 best: 0.5423257 (19) total: 478ms remaining: 884ms 20: learn: 0.5353877 test: 0.5349885 best: 0.5349885 (20) total: 505ms remaining: 866ms 21: learn: 0.5278798 test: 0.5273545 best: 0.5273545 (21) total: 530ms remaining: 844ms 22: learn: 0.5236708 test: 0.5230482 best: 0.5230482 (22) total: 558ms remaining: 824ms 23: learn: 0.5191580 test: 0.5186224 best: 0.5186224 (23) total: 585ms remaining: 804ms 24: learn: 0.5127289 test: 0.5122266 best: 0.5122266 (24) total: 610ms remaining: 781ms 25: learn: 0.5084212 test: 0.5079288 best: 0.5079288 (25) total: 638ms remaining: 761ms 26: learn: 0.5018419 test: 0.5012786 best: 0.5012786 (26) total: 665ms remaining: 739ms 27: learn: 0.4978111 test: 0.4972533 best: 0.4972533 (27) total: 690ms remaining: 715ms 28: learn: 0.4943613 test: 0.4937576 best: 0.4937576 (28) total: 712ms remaining: 687ms 29: learn: 0.4911137 test: 0.4904654 best: 0.4904654 (29) total: 739ms remaining: 665ms 30: learn: 0.4879576 test: 0.4872719 best: 0.4872719 (30) total: 766ms remaining: 643ms 31: learn: 0.4811457 test: 0.4804823 best: 0.4804823 (31) total: 794ms remaining: 620ms 32: learn: 0.4767181 test: 0.4761209 best: 0.4761209 (32) total: 822ms remaining: 598ms 33: learn: 0.4737205 test: 0.4730788 best: 0.4730788 (33) total: 850ms remaining: 575ms 34: learn: 0.4706649 test: 0.4700268 best: 0.4700268 (34) total: 874ms remaining: 550ms 35: learn: 0.4654778 test: 0.4648103 best: 0.4648103 (35) total: 899ms remaining: 525ms 36: learn: 0.4621608 test: 0.4614062 best: 0.4614062 (36) total: 926ms remaining: 501ms 37: learn: 0.4594719 test: 0.4587885 best: 0.4587885 (37) total: 952ms remaining: 476ms 38: learn: 0.4537970 test: 0.4530431 best: 0.4530431 (38) total: 979ms remaining: 452ms 39: learn: 0.4490165 test: 0.4482612 best: 0.4482612 (39) total: 1s remaining: 427ms 40: learn: 0.4463579 test: 0.4456286 best: 0.4456286 (40) total: 1.02s remaining: 400ms 41: learn: 0.4423026 test: 0.4415950 best: 
0.4415950 (41) total: 1.05s remaining: 375ms 42: learn: 0.4398741 test: 0.4391455 best: 0.4391455 (42) total: 1.08s remaining: 351ms 43: learn: 0.4368065 test: 0.4359741 best: 0.4359741 (43) total: 1.1s remaining: 327ms 44: learn: 0.4323122 test: 0.4315753 best: 0.4315753 (44) total: 1.13s remaining: 303ms 45: learn: 0.4300955 test: 0.4293257 best: 0.4293257 (45) total: 1.16s remaining: 277ms 46: learn: 0.4257685 test: 0.4249084 best: 0.4249084 (46) total: 1.19s remaining: 253ms 47: learn: 0.4227030 test: 0.4217852 best: 0.4217852 (47) total: 1.22s remaining: 228ms 48: learn: 0.4190050 test: 0.4180588 best: 0.4180588 (48) total: 1.24s remaining: 203ms 49: learn: 0.4173033 test: 0.4163104 best: 0.4163104 (49) total: 1.27s remaining: 178ms 50: learn: 0.4153973 test: 0.4143868 best: 0.4143868 (50) total: 1.3s remaining: 153ms 51: learn: 0.4133780 test: 0.4123860 best: 0.4123860 (51) total: 1.32s remaining: 127ms 52: learn: 0.4106398 test: 0.4097067 best: 0.4097067 (52) total: 1.35s remaining: 102ms 53: learn: 0.4067822 test: 0.4057772 best: 0.4057772 (53) total: 1.38s remaining: 76.5ms 54: learn: 0.4032527 test: 0.4022200 best: 0.4022200 (54) total: 1.41s remaining: 51.1ms 55: learn: 0.4017007 test: 0.4007388 best: 0.4007388 (55) total: 1.43s remaining: 25.5ms 56: learn: 0.3987116 test: 0.3977191 best: 0.3977191 (56) total: 1.45s remaining: 0us bestTest = 0.3977190798 bestIteration = 56 Trial 62, Fold 4: Log loss = 0.39783009426448285, Average precision = 0.9591092684337645, ROC-AUC = 0.9544344507869745, Elapsed Time = 1.5617993999985629 seconds Trial 62, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 62, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0: learn: 0.6842402 test: 0.6844893 best: 0.6844893 (0) total: 22.2ms remaining: 1.24s 1: learn: 0.6748753 test: 0.6754273 best: 0.6754273 (1) total: 42.8ms remaining: 1.18s 2: learn: 0.6644785 test: 0.6653432 best: 0.6653432 (2) total: 63.4ms 
remaining: 1.14s 3: learn: 0.6560476 test: 0.6572208 best: 0.6572208 (3) total: 86.2ms remaining: 1.14s 4: learn: 0.6457130 test: 0.6470372 best: 0.6470372 (4) total: 103ms remaining: 1.07s 5: learn: 0.6384728 test: 0.6400769 best: 0.6400769 (5) total: 128ms remaining: 1.08s 6: learn: 0.6315051 test: 0.6333965 best: 0.6333965 (6) total: 151ms remaining: 1.08s 7: learn: 0.6237370 test: 0.6257859 best: 0.6257859 (7) total: 172ms remaining: 1.05s 8: learn: 0.6169018 test: 0.6192245 best: 0.6192245 (8) total: 191ms remaining: 1.02s 9: learn: 0.6106755 test: 0.6131926 best: 0.6131926 (9) total: 211ms remaining: 991ms 10: learn: 0.6040687 test: 0.6068431 best: 0.6068431 (10) total: 234ms remaining: 979ms 11: learn: 0.5971477 test: 0.6000698 best: 0.6000698 (11) total: 262ms remaining: 983ms 12: learn: 0.5897439 test: 0.5928634 best: 0.5928634 (12) total: 286ms remaining: 966ms 13: learn: 0.5843326 test: 0.5877101 best: 0.5877101 (13) total: 313ms remaining: 961ms 14: learn: 0.5784701 test: 0.5821013 best: 0.5821013 (14) total: 341ms remaining: 956ms 15: learn: 0.5733667 test: 0.5771958 best: 0.5771958 (15) total: 368ms remaining: 943ms 16: learn: 0.5682577 test: 0.5723083 best: 0.5723083 (16) total: 391ms remaining: 920ms 17: learn: 0.5632854 test: 0.5675362 best: 0.5675362 (17) total: 415ms remaining: 899ms 18: learn: 0.5586820 test: 0.5630924 best: 0.5630924 (18) total: 442ms remaining: 883ms 19: learn: 0.5534579 test: 0.5580657 best: 0.5580657 (19) total: 469ms remaining: 867ms 20: learn: 0.5481473 test: 0.5529267 best: 0.5529267 (20) total: 497ms remaining: 852ms 21: learn: 0.5433126 test: 0.5481296 best: 0.5481296 (21) total: 523ms remaining: 833ms 22: learn: 0.5381286 test: 0.5431141 best: 0.5431141 (22) total: 551ms remaining: 815ms 23: learn: 0.5347737 test: 0.5400058 best: 0.5400058 (23) total: 576ms remaining: 792ms 24: learn: 0.5273075 test: 0.5326939 best: 0.5326939 (24) total: 599ms remaining: 767ms 25: learn: 0.5231513 test: 0.5287239 best: 0.5287239 (25) 
total: 626ms remaining: 747ms 26: learn: 0.5185622 test: 0.5242852 best: 0.5242852 (26) total: 650ms remaining: 723ms 27: learn: 0.5140901 test: 0.5199637 best: 0.5199637 (27) total: 675ms remaining: 699ms 28: learn: 0.5101711 test: 0.5161930 best: 0.5161930 (28) total: 700ms remaining: 676ms 29: learn: 0.5055569 test: 0.5116351 best: 0.5116351 (29) total: 728ms remaining: 655ms 30: learn: 0.5023930 test: 0.5086003 best: 0.5086003 (30) total: 754ms remaining: 633ms 31: learn: 0.4989227 test: 0.5051676 best: 0.5051676 (31) total: 782ms remaining: 611ms 32: learn: 0.4944888 test: 0.5008166 best: 0.5008166 (32) total: 809ms remaining: 589ms 33: learn: 0.4914496 test: 0.4979285 best: 0.4979285 (33) total: 837ms remaining: 566ms 34: learn: 0.4880270 test: 0.4946482 best: 0.4946482 (34) total: 862ms remaining: 542ms 35: learn: 0.4845903 test: 0.4912936 best: 0.4912936 (35) total: 886ms remaining: 517ms 36: learn: 0.4804014 test: 0.4871101 best: 0.4871101 (36) total: 913ms remaining: 494ms 37: learn: 0.4773104 test: 0.4841678 best: 0.4841678 (37) total: 930ms remaining: 465ms 38: learn: 0.4746221 test: 0.4815598 best: 0.4815598 (38) total: 958ms remaining: 442ms 39: learn: 0.4721708 test: 0.4792345 best: 0.4792345 (39) total: 982ms remaining: 417ms 40: learn: 0.4697485 test: 0.4769337 best: 0.4769337 (40) total: 1.01s remaining: 394ms 41: learn: 0.4671643 test: 0.4743983 best: 0.4743983 (41) total: 1.04s remaining: 370ms 42: learn: 0.4638981 test: 0.4711440 best: 0.4711440 (42) total: 1.06s remaining: 347ms 43: learn: 0.4604199 test: 0.4677828 best: 0.4677828 (43) total: 1.09s remaining: 322ms 44: learn: 0.4580254 test: 0.4654503 best: 0.4654503 (44) total: 1.11s remaining: 297ms 45: learn: 0.4550526 test: 0.4625775 best: 0.4625775 (45) total: 1.14s remaining: 273ms 46: learn: 0.4525546 test: 0.4601993 best: 0.4601993 (46) total: 1.17s remaining: 249ms 47: learn: 0.4497918 test: 0.4575001 best: 0.4575001 (47) total: 1.19s remaining: 224ms 48: learn: 0.4476767 test: 
0.4554394 best: 0.4554394 (48) total: 1.22s remaining: 200ms 49: learn: 0.4443855 test: 0.4520277 best: 0.4520277 (49) total: 1.25s remaining: 175ms 50: learn: 0.4415130 test: 0.4492025 best: 0.4492025 (50) total: 1.27s remaining: 150ms 51: learn: 0.4398201 test: 0.4474874 best: 0.4474874 (51) total: 1.3s remaining: 125ms 52: learn: 0.4366313 test: 0.4442909 best: 0.4442909 (52) total: 1.33s remaining: 100ms 53: learn: 0.4342614 test: 0.4419116 best: 0.4419116 (53) total: 1.35s remaining: 75.2ms 54: learn: 0.4319667 test: 0.4396911 best: 0.4396911 (54) total: 1.38s remaining: 50.1ms 55: learn: 0.4289022 test: 0.4365646 best: 0.4365646 (55) total: 1.41s remaining: 25.1ms 56: learn: 0.4264440 test: 0.4341871 best: 0.4341871 (56) total: 1.43s remaining: 0us bestTest = 0.4341871003 bestIteration = 56 Trial 62, Fold 5: Log loss = 0.4341949787366909, Average precision = 0.9490793838424075, ROC-AUC = 0.9438020177247646, Elapsed Time = 1.5355920000001788 seconds
Optimization Progress: 63%|######3 | 63/100 [1:47:14<55:23, 89.83s/it]
[Per-iteration CatBoost training output elided; per-fold summaries retained below.]

Trial 63, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 63, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Trial 63, Fold 1: bestTest = 0.2041637214, bestIteration = 84 (model shrunk to first 85 iterations)
Trial 63, Fold 1: Log loss = 0.20358020977818503, Average precision = 0.9731543004024821, ROC-AUC = 0.9684258094234381, Elapsed Time = 8.812033399997745 seconds
Trial 63, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 63, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Trial 63, Fold 2: bestTest = 0.1965028818, bestIteration = 85
Trial 63, Fold 2: Log loss = 0.19611343832119293, Average precision = 0.9738883789691233, ROC-AUC = 0.9704723089123599, Elapsed Time = 8.637516799997684 seconds
Trial 63, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 63, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
remaining: 6.77s 20: learn: 0.2197370 test: 0.2201992 best: 0.2201992 (20) total: 2.15s remaining: 6.66s 21: learn: 0.2186603 test: 0.2191882 best: 0.2191882 (21) total: 2.26s remaining: 6.57s 22: learn: 0.2174513 test: 0.2180618 best: 0.2180618 (22) total: 2.33s remaining: 6.38s 23: learn: 0.2162758 test: 0.2171330 best: 0.2171330 (23) total: 2.43s remaining: 6.27s 24: learn: 0.2157749 test: 0.2164684 best: 0.2164684 (24) total: 2.53s remaining: 6.17s 25: learn: 0.2141586 test: 0.2153620 best: 0.2153620 (25) total: 2.69s remaining: 6.2s 26: learn: 0.2123524 test: 0.2140425 best: 0.2140425 (26) total: 2.95s remaining: 6.44s 27: learn: 0.2117615 test: 0.2134066 best: 0.2134066 (27) total: 3.08s remaining: 6.39s 28: learn: 0.2110809 test: 0.2129553 best: 0.2129553 (28) total: 3.25s remaining: 6.38s 29: learn: 0.2093459 test: 0.2113312 best: 0.2113312 (29) total: 3.44s remaining: 6.42s 30: learn: 0.2089189 test: 0.2111788 best: 0.2111788 (30) total: 3.62s remaining: 6.43s 31: learn: 0.2084471 test: 0.2108646 best: 0.2108646 (31) total: 3.82s remaining: 6.44s 32: learn: 0.2072287 test: 0.2098730 best: 0.2098730 (32) total: 3.98s remaining: 6.39s 33: learn: 0.2068074 test: 0.2099073 best: 0.2098730 (32) total: 4.15s remaining: 6.34s 34: learn: 0.2061713 test: 0.2097143 best: 0.2097143 (34) total: 4.31s remaining: 6.28s 35: learn: 0.2054889 test: 0.2092225 best: 0.2092225 (35) total: 4.47s remaining: 6.21s 36: learn: 0.2045969 test: 0.2080438 best: 0.2080438 (36) total: 4.61s remaining: 6.11s 37: learn: 0.2042833 test: 0.2077922 best: 0.2077922 (37) total: 4.78s remaining: 6.04s 38: learn: 0.2033338 test: 0.2070626 best: 0.2070626 (38) total: 4.91s remaining: 5.92s 39: learn: 0.2016143 test: 0.2059369 best: 0.2059369 (39) total: 5.05s remaining: 5.81s 40: learn: 0.2013923 test: 0.2058081 best: 0.2058081 (40) total: 5.18s remaining: 5.69s 41: learn: 0.2009060 test: 0.2054106 best: 0.2054106 (41) total: 5.33s remaining: 5.58s 42: learn: 0.2003771 test: 0.2051089 best: 
0.2051089 (42) total: 5.46s remaining: 5.46s 43: learn: 0.1995587 test: 0.2043586 best: 0.2043586 (43) total: 5.59s remaining: 5.33s 44: learn: 0.1991234 test: 0.2042142 best: 0.2042142 (44) total: 5.72s remaining: 5.21s 45: learn: 0.1984813 test: 0.2037164 best: 0.2037164 (45) total: 5.85s remaining: 5.09s 46: learn: 0.1983653 test: 0.2036667 best: 0.2036667 (46) total: 5.97s remaining: 4.96s 47: learn: 0.1975470 test: 0.2029834 best: 0.2029834 (47) total: 6.11s remaining: 4.83s 48: learn: 0.1972514 test: 0.2028342 best: 0.2028342 (48) total: 6.28s remaining: 4.74s 49: learn: 0.1966891 test: 0.2024200 best: 0.2024200 (49) total: 6.43s remaining: 4.63s 50: learn: 0.1963984 test: 0.2022405 best: 0.2022405 (50) total: 6.56s remaining: 4.5s 51: learn: 0.1962267 test: 0.2021764 best: 0.2021764 (51) total: 6.71s remaining: 4.39s 52: learn: 0.1957632 test: 0.2018087 best: 0.2018087 (52) total: 6.86s remaining: 4.27s 53: learn: 0.1956863 test: 0.2017098 best: 0.2017098 (53) total: 6.92s remaining: 4.1s 54: learn: 0.1954333 test: 0.2016210 best: 0.2016210 (54) total: 7.05s remaining: 3.97s 55: learn: 0.1949562 test: 0.2015010 best: 0.2015010 (55) total: 7.18s remaining: 3.85s 56: learn: 0.1943369 test: 0.2009436 best: 0.2009436 (56) total: 7.32s remaining: 3.72s 57: learn: 0.1940304 test: 0.2009331 best: 0.2009331 (57) total: 7.44s remaining: 3.59s 58: learn: 0.1937302 test: 0.2008444 best: 0.2008444 (58) total: 7.57s remaining: 3.46s 59: learn: 0.1932980 test: 0.2004368 best: 0.2004368 (59) total: 7.7s remaining: 3.34s 60: learn: 0.1927615 test: 0.2003157 best: 0.2003157 (60) total: 7.83s remaining: 3.21s 61: learn: 0.1925009 test: 0.2002186 best: 0.2002186 (61) total: 7.95s remaining: 3.08s 62: learn: 0.1924670 test: 0.2001665 best: 0.2001665 (62) total: 8s remaining: 2.92s 63: learn: 0.1916854 test: 0.1996045 best: 0.1996045 (63) total: 8.11s remaining: 2.79s 64: learn: 0.1912526 test: 0.1992100 best: 0.1992100 (64) total: 8.22s remaining: 2.65s 65: learn: 0.1909905 
test: 0.1992077 best: 0.1992077 (65) total: 8.33s remaining: 2.52s 66: learn: 0.1908117 test: 0.1992462 best: 0.1992077 (65) total: 8.44s remaining: 2.39s 67: learn: 0.1904889 test: 0.1988857 best: 0.1988857 (67) total: 8.55s remaining: 2.26s 68: learn: 0.1899648 test: 0.1986352 best: 0.1986352 (68) total: 8.66s remaining: 2.13s 69: learn: 0.1896288 test: 0.1983401 best: 0.1983401 (69) total: 8.78s remaining: 2.01s 70: learn: 0.1892977 test: 0.1983172 best: 0.1983172 (70) total: 8.89s remaining: 1.88s 71: learn: 0.1889933 test: 0.1980998 best: 0.1980998 (71) total: 9s remaining: 1.75s 72: learn: 0.1886349 test: 0.1980802 best: 0.1980802 (72) total: 9.12s remaining: 1.62s 73: learn: 0.1877480 test: 0.1979089 best: 0.1979089 (73) total: 9.24s remaining: 1.5s 74: learn: 0.1872486 test: 0.1975548 best: 0.1975548 (74) total: 9.34s remaining: 1.37s 75: learn: 0.1868367 test: 0.1975200 best: 0.1975200 (75) total: 9.45s remaining: 1.24s 76: learn: 0.1864179 test: 0.1973609 best: 0.1973609 (76) total: 9.57s remaining: 1.12s 77: learn: 0.1862191 test: 0.1972272 best: 0.1972272 (77) total: 9.68s remaining: 992ms 78: learn: 0.1860182 test: 0.1971213 best: 0.1971213 (78) total: 9.78s remaining: 867ms 79: learn: 0.1857414 test: 0.1971089 best: 0.1971089 (79) total: 9.89s remaining: 742ms 80: learn: 0.1856597 test: 0.1970215 best: 0.1970215 (80) total: 9.97s remaining: 615ms 81: learn: 0.1853358 test: 0.1967921 best: 0.1967921 (81) total: 10.1s remaining: 491ms 82: learn: 0.1849444 test: 0.1966289 best: 0.1966289 (82) total: 10.2s remaining: 368ms 83: learn: 0.1844655 test: 0.1962964 best: 0.1962964 (83) total: 10.3s remaining: 245ms 84: learn: 0.1840193 test: 0.1961947 best: 0.1961947 (84) total: 10.4s remaining: 122ms 85: learn: 0.1835652 test: 0.1961769 best: 0.1961769 (85) total: 10.5s remaining: 0us bestTest = 0.1961768734 bestIteration = 85 Trial 63, Fold 3: Log loss = 0.19586252378887894, Average precision = 0.9733974224892061, ROC-AUC = 0.9702815602999104, Elapsed Time = 
10.650340999996843 seconds Trial 63, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 63, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 0: learn: 0.6192880 test: 0.6196967 best: 0.6196967 (0) total: 95.2ms remaining: 8.09s 1: learn: 0.5615765 test: 0.5619809 best: 0.5619809 (1) total: 148ms remaining: 6.24s 2: learn: 0.4852288 test: 0.4851356 best: 0.4851356 (2) total: 242ms remaining: 6.7s 3: learn: 0.4165740 test: 0.4169075 best: 0.4169075 (3) total: 346ms remaining: 7.1s 4: learn: 0.3656330 test: 0.3665144 best: 0.3665144 (4) total: 455ms remaining: 7.38s 5: learn: 0.3328923 test: 0.3345014 best: 0.3345014 (5) total: 560ms remaining: 7.47s 6: learn: 0.3134941 test: 0.3154414 best: 0.3154414 (6) total: 664ms remaining: 7.49s 7: learn: 0.2932356 test: 0.2956937 best: 0.2956937 (7) total: 770ms remaining: 7.51s 8: learn: 0.2810444 test: 0.2836899 best: 0.2836899 (8) total: 877ms remaining: 7.5s 9: learn: 0.2717582 test: 0.2742203 best: 0.2742203 (9) total: 980ms remaining: 7.45s 10: learn: 0.2636067 test: 0.2665253 best: 0.2665253 (10) total: 1.09s remaining: 7.42s 11: learn: 0.2516264 test: 0.2546851 best: 0.2546851 (11) total: 1.2s remaining: 7.4s 12: learn: 0.2460666 test: 0.2493207 best: 0.2493207 (12) total: 1.31s remaining: 7.34s 13: learn: 0.2419546 test: 0.2451504 best: 0.2451504 (13) total: 1.41s remaining: 7.28s 14: learn: 0.2399869 test: 0.2430951 best: 0.2430951 (14) total: 1.52s remaining: 7.2s 15: learn: 0.2360870 test: 0.2390302 best: 0.2390302 (15) total: 1.63s remaining: 7.14s 16: learn: 0.2338301 test: 0.2369226 best: 0.2369226 (16) total: 1.74s remaining: 7.07s 17: learn: 0.2300798 test: 0.2331828 best: 0.2331828 (17) total: 1.85s remaining: 6.98s 18: learn: 0.2274113 test: 0.2310356 best: 0.2310356 (18) total: 1.96s remaining: 6.91s 19: learn: 0.2241949 test: 0.2278271 best: 0.2278271 (19) total: 2.08s remaining: 6.85s 20: learn: 0.2216083 test: 0.2254544 best: 0.2254544 
(20) total: 2.19s remaining: 6.78s 21: learn: 0.2204429 test: 0.2245326 best: 0.2245326 (21) total: 2.3s remaining: 6.71s 22: learn: 0.2192237 test: 0.2233462 best: 0.2233462 (22) total: 2.42s remaining: 6.63s 23: learn: 0.2180655 test: 0.2222449 best: 0.2222449 (23) total: 2.53s remaining: 6.53s 24: learn: 0.2154222 test: 0.2197801 best: 0.2197801 (24) total: 2.63s remaining: 6.43s 25: learn: 0.2145923 test: 0.2190996 best: 0.2190996 (25) total: 2.74s remaining: 6.33s 26: learn: 0.2136924 test: 0.2183905 best: 0.2183905 (26) total: 2.85s remaining: 6.22s 27: learn: 0.2131914 test: 0.2179488 best: 0.2179488 (27) total: 2.95s remaining: 6.12s 28: learn: 0.2121085 test: 0.2172023 best: 0.2172023 (28) total: 3.06s remaining: 6.01s 29: learn: 0.2103283 test: 0.2155587 best: 0.2155587 (29) total: 3.17s remaining: 5.91s 30: learn: 0.2086479 test: 0.2144529 best: 0.2144529 (30) total: 3.28s remaining: 5.82s 31: learn: 0.2074177 test: 0.2135983 best: 0.2135983 (31) total: 3.39s remaining: 5.71s 32: learn: 0.2066712 test: 0.2128203 best: 0.2128203 (32) total: 3.49s remaining: 5.61s 33: learn: 0.2059991 test: 0.2123001 best: 0.2123001 (33) total: 3.6s remaining: 5.5s 34: learn: 0.2055859 test: 0.2121079 best: 0.2121079 (34) total: 3.7s remaining: 5.39s 35: learn: 0.2044817 test: 0.2110463 best: 0.2110463 (35) total: 3.81s remaining: 5.29s 36: learn: 0.2043034 test: 0.2108693 best: 0.2108693 (36) total: 3.86s remaining: 5.11s 37: learn: 0.2034672 test: 0.2103239 best: 0.2103239 (37) total: 3.95s remaining: 4.99s 38: learn: 0.2028888 test: 0.2098612 best: 0.2098612 (38) total: 4.05s remaining: 4.89s 39: learn: 0.2027138 test: 0.2097732 best: 0.2097732 (39) total: 4.16s remaining: 4.79s 40: learn: 0.2024494 test: 0.2096626 best: 0.2096626 (40) total: 4.27s remaining: 4.68s 41: learn: 0.2018892 test: 0.2092890 best: 0.2092890 (41) total: 4.37s remaining: 4.58s 42: learn: 0.2012638 test: 0.2089522 best: 0.2089522 (42) total: 4.48s remaining: 4.48s 43: learn: 0.2006402 test: 
0.2086808 best: 0.2086808 (43) total: 4.58s remaining: 4.37s 44: learn: 0.2001926 test: 0.2084644 best: 0.2084644 (44) total: 4.69s remaining: 4.27s 45: learn: 0.1992393 test: 0.2082711 best: 0.2082711 (45) total: 4.79s remaining: 4.17s 46: learn: 0.1987966 test: 0.2080133 best: 0.2080133 (46) total: 4.9s remaining: 4.06s 47: learn: 0.1976210 test: 0.2072354 best: 0.2072354 (47) total: 5s remaining: 3.96s 48: learn: 0.1970955 test: 0.2069980 best: 0.2069980 (48) total: 5.11s remaining: 3.86s 49: learn: 0.1970186 test: 0.2069589 best: 0.2069589 (49) total: 5.16s remaining: 3.71s 50: learn: 0.1967735 test: 0.2068429 best: 0.2068429 (50) total: 5.25s remaining: 3.6s 51: learn: 0.1963594 test: 0.2065120 best: 0.2065120 (51) total: 5.36s remaining: 3.5s 52: learn: 0.1954486 test: 0.2058760 best: 0.2058760 (52) total: 5.46s remaining: 3.4s 53: learn: 0.1950528 test: 0.2056873 best: 0.2056873 (53) total: 5.57s remaining: 3.3s 54: learn: 0.1946611 test: 0.2056088 best: 0.2056088 (54) total: 5.68s remaining: 3.2s 55: learn: 0.1945330 test: 0.2054820 best: 0.2054820 (55) total: 5.76s remaining: 3.08s 56: learn: 0.1941831 test: 0.2052171 best: 0.2052171 (56) total: 5.87s remaining: 2.98s 57: learn: 0.1941205 test: 0.2052138 best: 0.2052138 (57) total: 5.94s remaining: 2.87s 58: learn: 0.1937511 test: 0.2050478 best: 0.2050478 (58) total: 6.04s remaining: 2.76s 59: learn: 0.1937068 test: 0.2050329 best: 0.2050329 (59) total: 6.09s remaining: 2.64s 60: learn: 0.1932134 test: 0.2046329 best: 0.2046329 (60) total: 6.19s remaining: 2.54s 61: learn: 0.1931889 test: 0.2046001 best: 0.2046001 (61) total: 6.24s remaining: 2.42s 62: learn: 0.1931581 test: 0.2045791 best: 0.2045791 (62) total: 6.29s remaining: 2.3s 63: learn: 0.1927110 test: 0.2042666 best: 0.2042666 (63) total: 6.39s remaining: 2.19s 64: learn: 0.1926006 test: 0.2041744 best: 0.2041744 (64) total: 6.5s remaining: 2.1s 65: learn: 0.1923024 test: 0.2041570 best: 0.2041570 (65) total: 6.6s remaining: 2s 66: learn: 
0.1921060 test: 0.2039753 best: 0.2039753 (66) total: 6.71s remaining: 1.9s 67: learn: 0.1915725 test: 0.2035319 best: 0.2035319 (67) total: 6.82s remaining: 1.8s 68: learn: 0.1913803 test: 0.2034287 best: 0.2034287 (68) total: 6.93s remaining: 1.71s 69: learn: 0.1910272 test: 0.2033852 best: 0.2033852 (69) total: 7.04s remaining: 1.61s 70: learn: 0.1903462 test: 0.2032270 best: 0.2032270 (70) total: 7.15s remaining: 1.51s 71: learn: 0.1898625 test: 0.2029354 best: 0.2029354 (71) total: 7.26s remaining: 1.41s 72: learn: 0.1891290 test: 0.2025208 best: 0.2025208 (72) total: 7.37s remaining: 1.31s 73: learn: 0.1887191 test: 0.2025388 best: 0.2025208 (72) total: 7.47s remaining: 1.21s 74: learn: 0.1884832 test: 0.2023020 best: 0.2023020 (74) total: 7.58s remaining: 1.11s 75: learn: 0.1877998 test: 0.2017973 best: 0.2017973 (75) total: 7.68s remaining: 1.01s 76: learn: 0.1875746 test: 0.2017044 best: 0.2017044 (76) total: 7.79s remaining: 911ms 77: learn: 0.1872923 test: 0.2015372 best: 0.2015372 (77) total: 7.9s remaining: 810ms 78: learn: 0.1870680 test: 0.2014064 best: 0.2014064 (78) total: 8s remaining: 709ms 79: learn: 0.1869238 test: 0.2013210 best: 0.2013210 (79) total: 8.11s remaining: 608ms 80: learn: 0.1863689 test: 0.2011748 best: 0.2011748 (80) total: 8.21s remaining: 507ms 81: learn: 0.1857803 test: 0.2009863 best: 0.2009863 (81) total: 8.32s remaining: 406ms 82: learn: 0.1855220 test: 0.2011513 best: 0.2009863 (81) total: 8.42s remaining: 304ms 83: learn: 0.1855220 test: 0.2011513 best: 0.2009863 (81) total: 8.45s remaining: 201ms 84: learn: 0.1853702 test: 0.2012014 best: 0.2009863 (81) total: 8.54s remaining: 101ms 85: learn: 0.1851354 test: 0.2010169 best: 0.2009863 (81) total: 8.64s remaining: 0us bestTest = 0.2009862839 bestIteration = 81 Shrink model to first 82 iterations. 
Trial 63, Fold 4: Log loss = 0.20050170080730612, Average precision = 0.9738741510547041, ROC-AUC = 0.9689064023786872, Elapsed Time = 8.791355500001373 seconds Trial 63, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 63, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0: learn: 0.5979033 test: 0.6005158 best: 0.6005158 (0) total: 91.6ms remaining: 7.79s 1: learn: 0.5626567 test: 0.5667608 best: 0.5667608 (1) total: 134ms remaining: 5.64s 2: learn: 0.4921501 test: 0.4965468 best: 0.4965468 (2) total: 194ms remaining: 5.37s 3: learn: 0.4213797 test: 0.4276038 best: 0.4276038 (3) total: 294ms remaining: 6.04s 4: learn: 0.3698140 test: 0.3771198 best: 0.3771198 (4) total: 397ms remaining: 6.43s 5: learn: 0.3417551 test: 0.3494085 best: 0.3494085 (5) total: 498ms remaining: 6.64s 6: learn: 0.3127740 test: 0.3205417 best: 0.3205417 (6) total: 603ms remaining: 6.81s 7: learn: 0.2979547 test: 0.3063482 best: 0.3063482 (7) total: 707ms remaining: 6.89s 8: learn: 0.2892801 test: 0.2976995 best: 0.2976995 (8) total: 776ms remaining: 6.64s 9: learn: 0.2752184 test: 0.2844984 best: 0.2844984 (9) total: 875ms remaining: 6.65s 10: learn: 0.2635984 test: 0.2726078 best: 0.2726078 (10) total: 982ms remaining: 6.7s 11: learn: 0.2564741 test: 0.2661300 best: 0.2661300 (11) total: 1.09s remaining: 6.7s 12: learn: 0.2500230 test: 0.2597303 best: 0.2597303 (12) total: 1.19s remaining: 6.68s 13: learn: 0.2460981 test: 0.2564590 best: 0.2564590 (13) total: 1.29s remaining: 6.66s 14: learn: 0.2399163 test: 0.2508022 best: 0.2508022 (14) total: 1.4s remaining: 6.64s 15: learn: 0.2360865 test: 0.2478164 best: 0.2478164 (15) total: 1.51s remaining: 6.6s 16: learn: 0.2319929 test: 0.2443877 best: 0.2443877 (16) total: 1.61s remaining: 6.55s 17: learn: 0.2283240 test: 0.2407539 best: 0.2407539 (17) total: 1.72s remaining: 6.5s 18: learn: 0.2259706 test: 0.2388175 best: 0.2388175 (18) total: 1.82s remaining: 6.43s 19: 
learn: 0.2245792 test: 0.2377076 best: 0.2377076 (19) total: 1.93s remaining: 6.36s 20: learn: 0.2209844 test: 0.2346308 best: 0.2346308 (20) total: 2.03s remaining: 6.29s 21: learn: 0.2191507 test: 0.2330636 best: 0.2330636 (21) total: 2.13s remaining: 6.21s 22: learn: 0.2180720 test: 0.2320537 best: 0.2320537 (22) total: 2.24s remaining: 6.13s 23: learn: 0.2153594 test: 0.2295815 best: 0.2295815 (23) total: 2.34s remaining: 6.05s 24: learn: 0.2138734 test: 0.2280836 best: 0.2280836 (24) total: 2.45s remaining: 5.97s 25: learn: 0.2114920 test: 0.2259509 best: 0.2259509 (25) total: 2.55s remaining: 5.89s 26: learn: 0.2104974 test: 0.2250912 best: 0.2250912 (26) total: 2.66s remaining: 5.8s 27: learn: 0.2091365 test: 0.2240127 best: 0.2240127 (27) total: 2.76s remaining: 5.72s 28: learn: 0.2082966 test: 0.2233029 best: 0.2233029 (28) total: 2.87s remaining: 5.63s 29: learn: 0.2077282 test: 0.2228154 best: 0.2228154 (29) total: 2.97s remaining: 5.54s 30: learn: 0.2070164 test: 0.2222519 best: 0.2222519 (30) total: 3.08s remaining: 5.46s 31: learn: 0.2067729 test: 0.2220166 best: 0.2220166 (31) total: 3.15s remaining: 5.31s 32: learn: 0.2053466 test: 0.2207401 best: 0.2207401 (32) total: 3.24s remaining: 5.21s 33: learn: 0.2043966 test: 0.2201634 best: 0.2201634 (33) total: 3.35s remaining: 5.12s 34: learn: 0.2037916 test: 0.2197420 best: 0.2197420 (34) total: 3.45s remaining: 5.03s 35: learn: 0.2035122 test: 0.2194491 best: 0.2194491 (35) total: 3.55s remaining: 4.94s 36: learn: 0.2029293 test: 0.2190018 best: 0.2190018 (36) total: 3.66s remaining: 4.85s 37: learn: 0.2019929 test: 0.2186435 best: 0.2186435 (37) total: 3.77s remaining: 4.76s 38: learn: 0.2010629 test: 0.2179152 best: 0.2179152 (38) total: 3.87s remaining: 4.67s 39: learn: 0.2002659 test: 0.2173831 best: 0.2173831 (39) total: 3.97s remaining: 4.57s 40: learn: 0.1996017 test: 0.2169002 best: 0.2169002 (40) total: 4.08s remaining: 4.47s 41: learn: 0.1985519 test: 0.2163113 best: 0.2163113 (41) total: 
4.18s remaining: 4.38s 42: learn: 0.1979200 test: 0.2158798 best: 0.2158798 (42) total: 4.29s remaining: 4.29s 43: learn: 0.1976123 test: 0.2156860 best: 0.2156860 (43) total: 4.36s remaining: 4.16s 44: learn: 0.1970949 test: 0.2154350 best: 0.2154350 (44) total: 4.46s remaining: 4.06s 45: learn: 0.1967186 test: 0.2150095 best: 0.2150095 (45) total: 4.56s remaining: 3.97s 46: learn: 0.1962395 test: 0.2146003 best: 0.2146003 (46) total: 4.67s remaining: 3.87s 47: learn: 0.1957286 test: 0.2143300 best: 0.2143300 (47) total: 4.77s remaining: 3.77s 48: learn: 0.1951756 test: 0.2139618 best: 0.2139618 (48) total: 4.87s remaining: 3.67s 49: learn: 0.1950695 test: 0.2138999 best: 0.2138999 (49) total: 4.97s remaining: 3.58s 50: learn: 0.1945825 test: 0.2136010 best: 0.2136010 (50) total: 5.07s remaining: 3.48s 51: learn: 0.1942956 test: 0.2135817 best: 0.2135817 (51) total: 5.18s remaining: 3.38s 52: learn: 0.1939033 test: 0.2133979 best: 0.2133979 (52) total: 5.28s remaining: 3.29s 53: learn: 0.1933951 test: 0.2130336 best: 0.2130336 (53) total: 5.39s remaining: 3.19s 54: learn: 0.1933930 test: 0.2130345 best: 0.2130336 (53) total: 5.43s remaining: 3.06s 55: learn: 0.1930437 test: 0.2128624 best: 0.2128624 (55) total: 5.52s remaining: 2.96s 56: learn: 0.1925356 test: 0.2124037 best: 0.2124037 (56) total: 5.63s remaining: 2.86s 57: learn: 0.1921698 test: 0.2122858 best: 0.2122858 (57) total: 5.73s remaining: 2.77s 58: learn: 0.1920974 test: 0.2123300 best: 0.2122858 (57) total: 5.83s remaining: 2.67s 59: learn: 0.1918895 test: 0.2121759 best: 0.2121759 (59) total: 5.9s remaining: 2.56s 60: learn: 0.1916659 test: 0.2121013 best: 0.2121013 (60) total: 6s remaining: 2.46s 61: learn: 0.1912379 test: 0.2116263 best: 0.2116263 (61) total: 6.11s remaining: 2.36s 62: learn: 0.1909771 test: 0.2116146 best: 0.2116146 (62) total: 6.21s remaining: 2.27s 63: learn: 0.1905805 test: 0.2115535 best: 0.2115535 (63) total: 6.32s remaining: 2.17s 64: learn: 0.1902846 test: 0.2114919 best: 
0.2114919 (64) total: 6.42s remaining: 2.07s 65: learn: 0.1899002 test: 0.2113644 best: 0.2113644 (65) total: 6.52s remaining: 1.98s 66: learn: 0.1898717 test: 0.2113108 best: 0.2113108 (66) total: 6.58s remaining: 1.86s 67: learn: 0.1895051 test: 0.2112399 best: 0.2112399 (67) total: 6.67s remaining: 1.77s 68: learn: 0.1889101 test: 0.2108586 best: 0.2108586 (68) total: 6.78s remaining: 1.67s 69: learn: 0.1884465 test: 0.2104404 best: 0.2104404 (69) total: 6.88s remaining: 1.57s 70: learn: 0.1883711 test: 0.2104199 best: 0.2104199 (70) total: 6.95s remaining: 1.47s 71: learn: 0.1883562 test: 0.2103965 best: 0.2103965 (71) total: 7s remaining: 1.36s 72: learn: 0.1875140 test: 0.2095830 best: 0.2095830 (72) total: 7.09s remaining: 1.26s 73: learn: 0.1874157 test: 0.2094952 best: 0.2094952 (73) total: 7.16s remaining: 1.16s 74: learn: 0.1872260 test: 0.2093804 best: 0.2093804 (74) total: 7.25s remaining: 1.06s 75: learn: 0.1871302 test: 0.2092478 best: 0.2092478 (75) total: 7.35s remaining: 968ms 76: learn: 0.1864556 test: 0.2086884 best: 0.2086884 (76) total: 7.46s remaining: 872ms 77: learn: 0.1861498 test: 0.2086053 best: 0.2086053 (77) total: 7.56s remaining: 776ms 78: learn: 0.1856220 test: 0.2084091 best: 0.2084091 (78) total: 7.66s remaining: 679ms 79: learn: 0.1854471 test: 0.2082673 best: 0.2082673 (79) total: 7.77s remaining: 583ms 80: learn: 0.1850585 test: 0.2082012 best: 0.2082012 (80) total: 7.87s remaining: 486ms 81: learn: 0.1846068 test: 0.2077918 best: 0.2077918 (81) total: 7.98s remaining: 389ms 82: learn: 0.1841956 test: 0.2075242 best: 0.2075242 (82) total: 8.08s remaining: 292ms 83: learn: 0.1841956 test: 0.2075241 best: 0.2075241 (83) total: 8.12s remaining: 193ms 84: learn: 0.1839335 test: 0.2073266 best: 0.2073266 (84) total: 8.21s remaining: 96.6ms 85: learn: 0.1838526 test: 0.2072912 best: 0.2072912 (85) total: 8.32s remaining: 0us bestTest = 0.2072912112 bestIteration = 85 Trial 63, Fold 5: Log loss = 0.20660221002593426, Average precision 
= 0.971973721771268, ROC-AUC = 0.9688050833286885, Elapsed Time = 8.45616439999867 seconds
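The fold-level log loss and ROC-AUC figures reported above are computed from the validation-fold predictions; the notebook itself uses `sklearn.metrics.log_loss` and `roc_auc_score`. A minimal stdlib-only sketch of the two formulas, for reference (function names here are illustrative, not from the notebook):

```python
import math

def binary_log_loss(y_true, y_prob):
    # mean negative log-likelihood over the fold, clipped for stability
    eps = 1e-15
    total = 0.0
    for y, p in zip(y_true, y_prob):
        p = min(max(p, eps), 1 - eps)
        total += -(y * math.log(p) + (1 - y) * math.log(1 - p))
    return total / len(y_true)

def roc_auc(y_true, y_score):
    # Mann-Whitney U formulation: probability that a random positive
    # outranks a random negative (ties count as 0.5)
    pos = [s for y, s in zip(y_true, y_score) if y == 1]
    neg = [s for y, s in zip(y_true, y_score) if y == 0]
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos) * len(neg))

y = [1, 0, 1, 0, 1]
p = [0.9, 0.2, 0.3, 0.4, 0.6]
print(binary_log_loss(y, p), roc_auc(y, p))
```

On real folds the O(pos×neg) AUC loop is impractical; `roc_auc_score` uses a rank-based equivalent, but the two agree numerically.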
Optimization Progress: 64%|######4 | 64/100 [1:48:07<47:23, 78.97s/it]
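The `Optimization Progress` bar above is tqdm output advanced once per completed Optuna trial. A sketch of one plausible way to wire that up (the helper `make_progress_callback` is an assumption, not taken from the notebook; Optuna's `study.optimize` accepts a `callbacks` list invoked after each trial):

```python
from tqdm import tqdm

def make_progress_callback(n_trials):
    # hypothetical helper: returns a tqdm bar plus a callback that
    # advances it, yielding output like "64%|######4 | 64/100 [...s/it]"
    pbar = tqdm(total=n_trials, desc="Optimization Progress")
    def callback(study, trial):
        pbar.update(1)
    return pbar, callback

# usage sketch:
# pbar, cb = make_progress_callback(100)
# study.optimize(objective, n_trials=100, callbacks=[cb])
# pbar.close()
```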
[Per-iteration CatBoost training log trimmed; best validation scores and fold summaries retained.]
Trial 64, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 64, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Trial 64, Fold 1: bestTest = 0.2119891636, bestIteration = 52
Trial 64, Fold 1: Log loss = 0.21198916364723694, Average precision = 0.973417810367677, ROC-AUC = 0.9688073550709053, Elapsed Time = 3.6468724999976985 seconds
Trial 64, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 64, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Trial 64, Fold 2: bestTest = 0.2072443174, bestIteration = 52
Trial 64, Fold 2: Log loss = 0.20724431744109836, Average precision = 0.9740108750986209, ROC-AUC = 0.9711155677749617, Elapsed Time = 3.70022660000177 seconds
Trial 64, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 64, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
learn: 0.2449155 test: 0.2469899 best: 0.2469899 (20) total: 1.49s remaining: 2.27s 21: learn: 0.2405949 test: 0.2431077 best: 0.2431077 (21) total: 1.56s remaining: 2.21s 22: learn: 0.2371925 test: 0.2400052 best: 0.2400052 (22) total: 1.64s remaining: 2.14s 23: learn: 0.2351421 test: 0.2382663 best: 0.2382663 (23) total: 1.71s remaining: 2.07s 24: learn: 0.2318596 test: 0.2350464 best: 0.2350464 (24) total: 1.79s remaining: 2s 25: learn: 0.2286539 test: 0.2320221 best: 0.2320221 (25) total: 1.87s remaining: 1.94s 26: learn: 0.2260558 test: 0.2296686 best: 0.2296686 (26) total: 1.95s remaining: 1.88s 27: learn: 0.2241248 test: 0.2281492 best: 0.2281492 (27) total: 2.03s remaining: 1.81s 28: learn: 0.2215471 test: 0.2257034 best: 0.2257034 (28) total: 2.11s remaining: 1.74s 29: learn: 0.2198882 test: 0.2242730 best: 0.2242730 (29) total: 2.19s remaining: 1.68s 30: learn: 0.2180032 test: 0.2226205 best: 0.2226205 (30) total: 2.26s remaining: 1.6s 31: learn: 0.2164570 test: 0.2213149 best: 0.2213149 (31) total: 2.34s remaining: 1.54s 32: learn: 0.2151077 test: 0.2201764 best: 0.2201764 (32) total: 2.42s remaining: 1.47s 33: learn: 0.2134166 test: 0.2184726 best: 0.2184726 (33) total: 2.5s remaining: 1.4s 34: learn: 0.2118418 test: 0.2171079 best: 0.2171079 (34) total: 2.58s remaining: 1.32s 35: learn: 0.2108189 test: 0.2163008 best: 0.2163008 (35) total: 2.65s remaining: 1.25s 36: learn: 0.2094582 test: 0.2153173 best: 0.2153173 (36) total: 2.73s remaining: 1.18s 37: learn: 0.2081152 test: 0.2142304 best: 0.2142304 (37) total: 2.81s remaining: 1.11s 38: learn: 0.2069796 test: 0.2130909 best: 0.2130909 (38) total: 2.88s remaining: 1.03s 39: learn: 0.2062282 test: 0.2127021 best: 0.2127021 (39) total: 2.95s remaining: 960ms 40: learn: 0.2049457 test: 0.2117386 best: 0.2117386 (40) total: 3.02s remaining: 885ms 41: learn: 0.2040769 test: 0.2114416 best: 0.2114416 (41) total: 3.1s remaining: 811ms 42: learn: 0.2033186 test: 0.2109323 best: 0.2109323 (42) total: 3.17s 
remaining: 736ms 43: learn: 0.2023917 test: 0.2105225 best: 0.2105225 (43) total: 3.23s remaining: 662ms 44: learn: 0.2013334 test: 0.2097433 best: 0.2097433 (44) total: 3.31s remaining: 588ms 45: learn: 0.2002781 test: 0.2089615 best: 0.2089615 (45) total: 3.38s remaining: 514ms 46: learn: 0.1994922 test: 0.2085245 best: 0.2085245 (46) total: 3.44s remaining: 440ms 47: learn: 0.1985125 test: 0.2085387 best: 0.2085245 (46) total: 3.52s remaining: 366ms 48: learn: 0.1979046 test: 0.2080937 best: 0.2080937 (48) total: 3.59s remaining: 293ms 49: learn: 0.1971850 test: 0.2077479 best: 0.2077479 (49) total: 3.65s remaining: 219ms 50: learn: 0.1964600 test: 0.2073387 best: 0.2073387 (50) total: 3.72s remaining: 146ms 51: learn: 0.1956467 test: 0.2068461 best: 0.2068461 (51) total: 3.79s remaining: 72.9ms 52: learn: 0.1951465 test: 0.2065540 best: 0.2065540 (52) total: 3.86s remaining: 0us bestTest = 0.2065540451 bestIteration = 52 Trial 64, Fold 3: Log loss = 0.20655404506131142, Average precision = 0.9726031615999413, ROC-AUC = 0.9698857097109728, Elapsed Time = 3.9769863999972586 seconds Trial 64, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 64, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 0: learn: 0.6277257 test: 0.6278012 best: 0.6278012 (0) total: 65.2ms remaining: 3.39s 1: learn: 0.5708531 test: 0.5715551 best: 0.5715551 (1) total: 131ms remaining: 3.33s 2: learn: 0.5224588 test: 0.5234789 best: 0.5234789 (2) total: 198ms remaining: 3.3s 3: learn: 0.4818117 test: 0.4830826 best: 0.4830826 (3) total: 265ms remaining: 3.24s 4: learn: 0.4461144 test: 0.4475577 best: 0.4475577 (4) total: 331ms remaining: 3.18s 5: learn: 0.4162141 test: 0.4177487 best: 0.4177487 (5) total: 398ms remaining: 3.12s 6: learn: 0.3904792 test: 0.3924021 best: 0.3924021 (6) total: 466ms remaining: 3.06s 7: learn: 0.3683291 test: 0.3703427 best: 0.3703427 (7) total: 532ms remaining: 2.99s 8: learn: 0.3502889 test: 
0.3524590 best: 0.3524590 (8) total: 600ms remaining: 2.93s 9: learn: 0.3334856 test: 0.3358245 best: 0.3358245 (9) total: 668ms remaining: 2.87s 10: learn: 0.3186772 test: 0.3213800 best: 0.3213800 (10) total: 736ms remaining: 2.81s 11: learn: 0.3060018 test: 0.3089227 best: 0.3089227 (11) total: 804ms remaining: 2.75s 12: learn: 0.2950033 test: 0.2980567 best: 0.2980567 (12) total: 873ms remaining: 2.69s 13: learn: 0.2850740 test: 0.2885263 best: 0.2885263 (13) total: 940ms remaining: 2.62s 14: learn: 0.2766197 test: 0.2804151 best: 0.2804151 (14) total: 1.01s remaining: 2.55s 15: learn: 0.2690111 test: 0.2730878 best: 0.2730878 (15) total: 1.08s remaining: 2.49s 16: learn: 0.2622947 test: 0.2668701 best: 0.2668701 (16) total: 1.14s remaining: 2.42s 17: learn: 0.2568298 test: 0.2615743 best: 0.2615743 (17) total: 1.21s remaining: 2.35s 18: learn: 0.2516507 test: 0.2567909 best: 0.2567909 (18) total: 1.28s remaining: 2.29s 19: learn: 0.2468856 test: 0.2521541 best: 0.2521541 (19) total: 1.34s remaining: 2.22s 20: learn: 0.2420928 test: 0.2479226 best: 0.2479226 (20) total: 1.41s remaining: 2.15s 21: learn: 0.2382193 test: 0.2441248 best: 0.2441248 (21) total: 1.48s remaining: 2.09s 22: learn: 0.2350005 test: 0.2412065 best: 0.2412065 (22) total: 1.55s remaining: 2.02s 23: learn: 0.2312740 test: 0.2379033 best: 0.2379033 (23) total: 1.62s remaining: 1.96s 24: learn: 0.2287846 test: 0.2358697 best: 0.2358697 (24) total: 1.69s remaining: 1.89s 25: learn: 0.2261651 test: 0.2334325 best: 0.2334325 (25) total: 1.76s remaining: 1.82s 26: learn: 0.2240774 test: 0.2317676 best: 0.2317676 (26) total: 1.82s remaining: 1.76s 27: learn: 0.2218759 test: 0.2298462 best: 0.2298462 (27) total: 1.89s remaining: 1.69s 28: learn: 0.2201082 test: 0.2283591 best: 0.2283591 (28) total: 1.96s remaining: 1.62s 29: learn: 0.2183786 test: 0.2269307 best: 0.2269307 (29) total: 2.02s remaining: 1.55s 30: learn: 0.2163999 test: 0.2253000 best: 0.2253000 (30) total: 2.09s remaining: 1.48s 31: 
learn: 0.2150620 test: 0.2242514 best: 0.2242514 (31) total: 2.15s remaining: 1.41s 32: learn: 0.2128276 test: 0.2221972 best: 0.2221972 (32) total: 2.22s remaining: 1.34s 33: learn: 0.2112866 test: 0.2209997 best: 0.2209997 (33) total: 2.29s remaining: 1.28s 34: learn: 0.2098897 test: 0.2198142 best: 0.2198142 (34) total: 2.36s remaining: 1.21s 35: learn: 0.2085186 test: 0.2189609 best: 0.2189609 (35) total: 2.43s remaining: 1.15s 36: learn: 0.2072749 test: 0.2178783 best: 0.2178783 (36) total: 2.5s remaining: 1.08s 37: learn: 0.2063888 test: 0.2172298 best: 0.2172298 (37) total: 2.56s remaining: 1.01s 38: learn: 0.2054677 test: 0.2167390 best: 0.2167390 (38) total: 2.64s remaining: 946ms 39: learn: 0.2046856 test: 0.2163173 best: 0.2163173 (39) total: 2.7s remaining: 879ms 40: learn: 0.2033709 test: 0.2151835 best: 0.2151835 (40) total: 2.77s remaining: 811ms 41: learn: 0.2026366 test: 0.2148140 best: 0.2148140 (41) total: 2.84s remaining: 744ms 42: learn: 0.2019292 test: 0.2144899 best: 0.2144899 (42) total: 2.91s remaining: 677ms 43: learn: 0.2007888 test: 0.2134945 best: 0.2134945 (43) total: 2.98s remaining: 609ms 44: learn: 0.1998084 test: 0.2128348 best: 0.2128348 (44) total: 3.04s remaining: 541ms 45: learn: 0.1989450 test: 0.2121302 best: 0.2121302 (45) total: 3.11s remaining: 473ms 46: learn: 0.1982666 test: 0.2118182 best: 0.2118182 (46) total: 3.17s remaining: 405ms 47: learn: 0.1976045 test: 0.2114925 best: 0.2114925 (47) total: 3.24s remaining: 338ms 48: learn: 0.1969545 test: 0.2109298 best: 0.2109298 (48) total: 3.31s remaining: 270ms 49: learn: 0.1961525 test: 0.2106408 best: 0.2106408 (49) total: 3.38s remaining: 203ms 50: learn: 0.1955621 test: 0.2103707 best: 0.2103707 (50) total: 3.44s remaining: 135ms 51: learn: 0.1952197 test: 0.2102183 best: 0.2102183 (51) total: 3.51s remaining: 67.5ms 52: learn: 0.1946508 test: 0.2101184 best: 0.2101184 (52) total: 3.58s remaining: 0us bestTest = 0.2101184371 bestIteration = 52 Trial 64, Fold 4: Log loss 
= 0.21011843705512998, Average precision = 0.9732822428368338, ROC-AUC = 0.968786883838709, Elapsed Time = 3.68862259999878 seconds Trial 64, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 64, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0: learn: 0.6276437 test: 0.6290933 best: 0.6290933 (0) total: 64ms remaining: 3.33s 1: learn: 0.5705337 test: 0.5734819 best: 0.5734819 (1) total: 126ms remaining: 3.21s 2: learn: 0.5213972 test: 0.5255341 best: 0.5255341 (2) total: 192ms remaining: 3.2s 3: learn: 0.4805579 test: 0.4856973 best: 0.4856973 (3) total: 255ms remaining: 3.12s 4: learn: 0.4458394 test: 0.4516936 best: 0.4516936 (4) total: 320ms remaining: 3.07s 5: learn: 0.4155465 test: 0.4221693 best: 0.4221693 (5) total: 383ms remaining: 3s 6: learn: 0.3893914 test: 0.3965569 best: 0.3965569 (6) total: 450ms remaining: 2.96s 7: learn: 0.3670904 test: 0.3749849 best: 0.3749849 (7) total: 518ms remaining: 2.91s 8: learn: 0.3480323 test: 0.3565294 best: 0.3565294 (8) total: 584ms remaining: 2.85s 9: learn: 0.3323139 test: 0.3412667 best: 0.3412667 (9) total: 652ms remaining: 2.8s 10: learn: 0.3177393 test: 0.3271190 best: 0.3271190 (10) total: 718ms remaining: 2.74s 11: learn: 0.3051082 test: 0.3149317 best: 0.3149317 (11) total: 784ms remaining: 2.68s 12: learn: 0.2946923 test: 0.3050299 best: 0.3050299 (12) total: 851ms remaining: 2.62s 13: learn: 0.2852350 test: 0.2960189 best: 0.2960189 (13) total: 918ms remaining: 2.56s 14: learn: 0.2765591 test: 0.2878481 best: 0.2878481 (14) total: 987ms remaining: 2.5s 15: learn: 0.2688474 test: 0.2804526 best: 0.2804526 (15) total: 1.05s remaining: 2.44s 16: learn: 0.2623416 test: 0.2744170 best: 0.2744170 (16) total: 1.12s remaining: 2.38s 17: learn: 0.2563924 test: 0.2688064 best: 0.2688064 (17) total: 1.19s remaining: 2.32s 18: learn: 0.2509211 test: 0.2638740 best: 0.2638740 (18) total: 1.26s remaining: 2.25s 19: learn: 0.2461857 test: 0.2595852 
best: 0.2595852 (19) total: 1.32s remaining: 2.18s 20: learn: 0.2418688 test: 0.2556044 best: 0.2556044 (20) total: 1.39s remaining: 2.12s 21: learn: 0.2378781 test: 0.2518377 best: 0.2518377 (21) total: 1.46s remaining: 2.06s 22: learn: 0.2341955 test: 0.2486193 best: 0.2486193 (22) total: 1.53s remaining: 1.99s 23: learn: 0.2312303 test: 0.2459254 best: 0.2459254 (23) total: 1.59s remaining: 1.93s 24: learn: 0.2286333 test: 0.2437066 best: 0.2437066 (24) total: 1.66s remaining: 1.86s 25: learn: 0.2257660 test: 0.2412433 best: 0.2412433 (25) total: 1.73s remaining: 1.8s 26: learn: 0.2230482 test: 0.2388774 best: 0.2388774 (26) total: 1.8s remaining: 1.73s 27: learn: 0.2209352 test: 0.2374292 best: 0.2374292 (27) total: 1.87s remaining: 1.67s 28: learn: 0.2191299 test: 0.2358582 best: 0.2358582 (28) total: 1.93s remaining: 1.6s 29: learn: 0.2172243 test: 0.2343884 best: 0.2343884 (29) total: 2s remaining: 1.53s 30: learn: 0.2153125 test: 0.2328823 best: 0.2328823 (30) total: 2.07s remaining: 1.47s 31: learn: 0.2136684 test: 0.2314334 best: 0.2314334 (31) total: 2.14s remaining: 1.4s 32: learn: 0.2124637 test: 0.2303543 best: 0.2303543 (32) total: 2.21s remaining: 1.34s 33: learn: 0.2112532 test: 0.2293303 best: 0.2293303 (33) total: 2.27s remaining: 1.27s 34: learn: 0.2100533 test: 0.2284870 best: 0.2284870 (34) total: 2.34s remaining: 1.2s 35: learn: 0.2087136 test: 0.2273255 best: 0.2273255 (35) total: 2.4s remaining: 1.14s 36: learn: 0.2069119 test: 0.2258485 best: 0.2258485 (36) total: 2.47s remaining: 1.07s 37: learn: 0.2055939 test: 0.2248879 best: 0.2248879 (37) total: 2.54s remaining: 1s 38: learn: 0.2044728 test: 0.2240679 best: 0.2240679 (38) total: 2.61s remaining: 936ms 39: learn: 0.2036946 test: 0.2235797 best: 0.2235797 (39) total: 2.67s remaining: 869ms 40: learn: 0.2029473 test: 0.2232506 best: 0.2232506 (40) total: 2.74s remaining: 802ms 41: learn: 0.2020322 test: 0.2225604 best: 0.2225604 (41) total: 2.81s remaining: 735ms 42: learn: 0.2012892 
test: 0.2220318 best: 0.2220318 (42) total: 2.87s remaining: 668ms 43: learn: 0.2004189 test: 0.2213877 best: 0.2213877 (43) total: 2.94s remaining: 601ms 44: learn: 0.1996247 test: 0.2209986 best: 0.2209986 (44) total: 3.01s remaining: 535ms 45: learn: 0.1991811 test: 0.2208099 best: 0.2208099 (45) total: 3.08s remaining: 468ms 46: learn: 0.1986177 test: 0.2204998 best: 0.2204998 (46) total: 3.14s remaining: 401ms 47: learn: 0.1980467 test: 0.2202016 best: 0.2202016 (47) total: 3.21s remaining: 335ms 48: learn: 0.1972110 test: 0.2195484 best: 0.2195484 (48) total: 3.28s remaining: 268ms 49: learn: 0.1965650 test: 0.2190391 best: 0.2190391 (49) total: 3.35s remaining: 201ms 50: learn: 0.1959991 test: 0.2187537 best: 0.2187537 (50) total: 3.41s remaining: 134ms 51: learn: 0.1952643 test: 0.2183235 best: 0.2183235 (51) total: 3.48s remaining: 66.9ms 52: learn: 0.1943478 test: 0.2176515 best: 0.2176515 (52) total: 3.55s remaining: 0us bestTest = 0.2176514974 bestIteration = 52 Trial 64, Fold 5: Log loss = 0.21765149735541112, Average precision = 0.970327625197403, ROC-AUC = 0.9678532969176746, Elapsed Time = 3.6505628999984765 seconds
Optimization Progress: 65%|######5 | 65/100 [1:48:34<36:53, 63.23s/it]
[CatBoost per-iteration training output omitted; per-fold results retained below.]

Trial 65, Fold 1: Train size = 20663 (0 = 10533, 1 = 10130, 0/1 = 1.0398); Validation size = 5175 (0 = 2592, 1 = 2583, 0/1 = 1.0035)
Trial 65, Fold 1: bestTest = 0.2284485328 (bestIteration = 93); Log loss = 0.22845, Average precision = 0.97174, ROC-AUC = 0.96828, Elapsed Time = 27.71 s
Trial 65, Fold 2: Train size = 20701 (0 = 10471, 1 = 10230, 0/1 = 1.0236); Validation size = 5137 (0 = 2654, 1 = 2483, 0/1 = 1.0689)
Trial 65, Fold 2: bestTest = 0.2289426739 (bestIteration = 93); Log loss = 0.22894, Average precision = 0.97325, ROC-AUC = 0.97038, Elapsed Time = 26.60 s
Trial 65, Fold 3: Train size = 20682 (0 = 10517, 1 = 10165, 0/1 = 1.0346); Validation size = 5156 (0 = 2608, 1 = 2548, 0/1 = 1.0235)
Trial 65, Fold 3: [per-iteration output truncated mid-stream; fold summary not available in this excerpt]
(29) total: 8.05s remaining: 17.2s 30: learn: 0.3340700 test: 0.3484431 best: 0.3484431 (30) total: 8.32s remaining: 16.9s 31: learn: 0.3287740 test: 0.3430907 best: 0.3430907 (31) total: 8.58s remaining: 16.6s 32: learn: 0.3241796 test: 0.3384911 best: 0.3384911 (32) total: 8.87s remaining: 16.4s 33: learn: 0.3192071 test: 0.3334550 best: 0.3334550 (33) total: 9.17s remaining: 16.2s 34: learn: 0.3155641 test: 0.3296636 best: 0.3296636 (34) total: 9.47s remaining: 16s 35: learn: 0.3117705 test: 0.3256638 best: 0.3256638 (35) total: 9.7s remaining: 15.6s 36: learn: 0.3077277 test: 0.3215469 best: 0.3215469 (36) total: 10s remaining: 15.5s 37: learn: 0.3045157 test: 0.3181191 best: 0.3181191 (37) total: 10.3s remaining: 15.1s 38: learn: 0.3019533 test: 0.3152700 best: 0.3152700 (38) total: 10.5s remaining: 14.8s 39: learn: 0.2982408 test: 0.3114928 best: 0.3114928 (39) total: 10.8s remaining: 14.6s 40: learn: 0.2951512 test: 0.3081717 best: 0.3081717 (40) total: 11s remaining: 14.2s 41: learn: 0.2923172 test: 0.3052186 best: 0.3052186 (41) total: 11.2s remaining: 13.9s 42: learn: 0.2886905 test: 0.3015078 best: 0.3015078 (42) total: 11.5s remaining: 13.7s 43: learn: 0.2864558 test: 0.2992117 best: 0.2992117 (43) total: 11.8s remaining: 13.4s 44: learn: 0.2842764 test: 0.2969006 best: 0.2969006 (44) total: 12s remaining: 13.1s 45: learn: 0.2820731 test: 0.2945927 best: 0.2945927 (45) total: 12.3s remaining: 12.8s 46: learn: 0.2798330 test: 0.2921749 best: 0.2921749 (46) total: 12.5s remaining: 12.5s 47: learn: 0.2775807 test: 0.2897855 best: 0.2897855 (47) total: 12.8s remaining: 12.3s 48: learn: 0.2746818 test: 0.2868817 best: 0.2868817 (48) total: 13.1s remaining: 12s 49: learn: 0.2720077 test: 0.2841200 best: 0.2841200 (49) total: 13.4s remaining: 11.8s 50: learn: 0.2697907 test: 0.2818711 best: 0.2818711 (50) total: 13.7s remaining: 11.5s 51: learn: 0.2679870 test: 0.2799643 best: 0.2799643 (51) total: 13.9s remaining: 11.2s 52: learn: 0.2655967 test: 0.2775234 
best: 0.2775234 (52) total: 14.2s remaining: 11s 53: learn: 0.2638998 test: 0.2757576 best: 0.2757576 (53) total: 14.4s remaining: 10.7s 54: learn: 0.2616446 test: 0.2734296 best: 0.2734296 (54) total: 14.7s remaining: 10.4s 55: learn: 0.2600159 test: 0.2718122 best: 0.2718122 (55) total: 14.9s remaining: 10.1s 56: learn: 0.2577800 test: 0.2696680 best: 0.2696680 (56) total: 15.3s remaining: 9.92s 57: learn: 0.2554772 test: 0.2672800 best: 0.2672800 (57) total: 15.6s remaining: 9.67s 58: learn: 0.2540243 test: 0.2657188 best: 0.2657188 (58) total: 15.9s remaining: 9.41s 59: learn: 0.2517860 test: 0.2635027 best: 0.2635027 (59) total: 16.2s remaining: 9.16s 60: learn: 0.2501647 test: 0.2618873 best: 0.2618873 (60) total: 16.4s remaining: 8.9s 61: learn: 0.2483015 test: 0.2600241 best: 0.2600241 (61) total: 16.7s remaining: 8.64s 62: learn: 0.2467515 test: 0.2583587 best: 0.2583587 (62) total: 17s remaining: 8.38s 63: learn: 0.2451923 test: 0.2567076 best: 0.2567076 (63) total: 17.3s remaining: 8.12s 64: learn: 0.2434914 test: 0.2549181 best: 0.2549181 (64) total: 17.6s remaining: 7.87s 65: learn: 0.2415569 test: 0.2529100 best: 0.2529100 (65) total: 18s remaining: 7.62s 66: learn: 0.2405168 test: 0.2517975 best: 0.2517975 (66) total: 18.2s remaining: 7.34s 67: learn: 0.2394363 test: 0.2507257 best: 0.2507257 (67) total: 18.5s remaining: 7.08s 68: learn: 0.2382282 test: 0.2495436 best: 0.2495436 (68) total: 18.8s remaining: 6.79s 69: learn: 0.2369401 test: 0.2482489 best: 0.2482489 (69) total: 19s remaining: 6.53s 70: learn: 0.2357739 test: 0.2471261 best: 0.2471261 (70) total: 19.4s remaining: 6.27s 71: learn: 0.2344635 test: 0.2458830 best: 0.2458830 (71) total: 19.7s remaining: 6.02s 72: learn: 0.2333965 test: 0.2448075 best: 0.2448075 (72) total: 20s remaining: 5.75s 73: learn: 0.2320626 test: 0.2434628 best: 0.2434628 (73) total: 20.3s remaining: 5.48s 74: learn: 0.2311207 test: 0.2424528 best: 0.2424528 (74) total: 20.5s remaining: 5.2s 75: learn: 0.2302404 
test: 0.2415938 best: 0.2415938 (75) total: 20.8s remaining: 4.93s 76: learn: 0.2293052 test: 0.2406547 best: 0.2406547 (76) total: 21.1s remaining: 4.65s 77: learn: 0.2283702 test: 0.2396443 best: 0.2396443 (77) total: 21.3s remaining: 4.37s 78: learn: 0.2274145 test: 0.2387510 best: 0.2387510 (78) total: 21.6s remaining: 4.1s 79: learn: 0.2264418 test: 0.2378053 best: 0.2378053 (79) total: 21.9s remaining: 3.83s 80: learn: 0.2251259 test: 0.2364406 best: 0.2364406 (80) total: 22.2s remaining: 3.56s 81: learn: 0.2241898 test: 0.2355310 best: 0.2355310 (81) total: 22.5s remaining: 3.29s 82: learn: 0.2233360 test: 0.2346681 best: 0.2346681 (82) total: 22.7s remaining: 3.01s 83: learn: 0.2222790 test: 0.2336064 best: 0.2336064 (83) total: 23s remaining: 2.74s 84: learn: 0.2215739 test: 0.2328891 best: 0.2328891 (84) total: 23.3s remaining: 2.47s 85: learn: 0.2204289 test: 0.2317446 best: 0.2317446 (85) total: 23.6s remaining: 2.19s 86: learn: 0.2195526 test: 0.2309087 best: 0.2309087 (86) total: 23.9s remaining: 1.92s 87: learn: 0.2185637 test: 0.2299201 best: 0.2299201 (87) total: 24.2s remaining: 1.65s 88: learn: 0.2178828 test: 0.2292098 best: 0.2292098 (88) total: 24.5s remaining: 1.37s 89: learn: 0.2169227 test: 0.2283000 best: 0.2283000 (89) total: 24.7s remaining: 1.1s 90: learn: 0.2162362 test: 0.2276935 best: 0.2276935 (90) total: 25s remaining: 823ms 91: learn: 0.2154492 test: 0.2269436 best: 0.2269436 (91) total: 25.3s remaining: 549ms 92: learn: 0.2146970 test: 0.2262395 best: 0.2262395 (92) total: 25.6s remaining: 275ms 93: learn: 0.2139478 test: 0.2254583 best: 0.2254583 (93) total: 25.8s remaining: 0us bestTest = 0.2254582995 bestIteration = 93 Trial 65, Fold 3: Log loss = 0.2254582994719636, Average precision = 0.9740730539862928, ROC-AUC = 0.9704280573720757, Elapsed Time = 25.97395140000299 seconds Trial 65, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 65, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 
2536, 0/1 = 1.0433753943217665 0: learn: 0.6713281 test: 0.6724000 best: 0.6724000 (0) total: 255ms remaining: 23.7s 1: learn: 0.6497621 test: 0.6509888 best: 0.6509888 (1) total: 497ms remaining: 22.9s 2: learn: 0.6302870 test: 0.6316057 best: 0.6316057 (2) total: 735ms remaining: 22.3s 3: learn: 0.6108031 test: 0.6122324 best: 0.6122324 (3) total: 1s remaining: 22.5s 4: learn: 0.5922143 test: 0.5936218 best: 0.5936218 (4) total: 1.27s remaining: 22.7s 5: learn: 0.5742928 test: 0.5768973 best: 0.5768973 (5) total: 1.58s remaining: 23.2s 6: learn: 0.5571167 test: 0.5598283 best: 0.5598283 (6) total: 1.85s remaining: 23s 7: learn: 0.5408378 test: 0.5435755 best: 0.5435755 (7) total: 2.11s remaining: 22.7s 8: learn: 0.5250263 test: 0.5288420 best: 0.5288420 (8) total: 2.35s remaining: 22.2s 9: learn: 0.5111753 test: 0.5150486 best: 0.5150486 (9) total: 2.58s remaining: 21.7s 10: learn: 0.4995698 test: 0.5030219 best: 0.5030219 (10) total: 2.82s remaining: 21.3s 11: learn: 0.4868144 test: 0.4916567 best: 0.4916567 (11) total: 3.08s remaining: 21s 12: learn: 0.4733683 test: 0.4787366 best: 0.4787366 (12) total: 3.36s remaining: 20.9s 13: learn: 0.4628714 test: 0.4682291 best: 0.4682291 (13) total: 3.66s remaining: 20.9s 14: learn: 0.4526142 test: 0.4586116 best: 0.4586116 (14) total: 3.89s remaining: 20.5s 15: learn: 0.4421636 test: 0.4481652 best: 0.4481652 (15) total: 4.17s remaining: 20.3s 16: learn: 0.4332576 test: 0.4398580 best: 0.4398580 (16) total: 4.44s remaining: 20.1s 17: learn: 0.4233743 test: 0.4306113 best: 0.4306113 (17) total: 4.71s remaining: 19.9s 18: learn: 0.4145764 test: 0.4225797 best: 0.4225797 (18) total: 5.02s remaining: 19.8s 19: learn: 0.4051580 test: 0.4140508 best: 0.4140508 (19) total: 5.32s remaining: 19.7s 20: learn: 0.3963509 test: 0.4058764 best: 0.4058764 (20) total: 5.6s remaining: 19.5s 21: learn: 0.3885094 test: 0.3983906 best: 0.3983906 (21) total: 5.9s remaining: 19.3s 22: learn: 0.3800894 test: 0.3907721 best: 0.3907721 (22) 
total: 6.17s remaining: 19s 23: learn: 0.3736436 test: 0.3835500 best: 0.3835500 (23) total: 6.41s remaining: 18.7s 24: learn: 0.3674081 test: 0.3774406 best: 0.3774406 (24) total: 6.66s remaining: 18.4s 25: learn: 0.3612793 test: 0.3705552 best: 0.3705552 (25) total: 6.91s remaining: 18.1s 26: learn: 0.3563402 test: 0.3662842 best: 0.3662842 (26) total: 7.2s remaining: 17.9s 27: learn: 0.3503923 test: 0.3604913 best: 0.3604913 (27) total: 7.46s remaining: 17.6s 28: learn: 0.3441708 test: 0.3545615 best: 0.3545615 (28) total: 7.8s remaining: 17.5s 29: learn: 0.3395436 test: 0.3500300 best: 0.3500300 (29) total: 8.09s remaining: 17.3s 30: learn: 0.3346198 test: 0.3451272 best: 0.3451272 (30) total: 8.39s remaining: 17s 31: learn: 0.3295554 test: 0.3401915 best: 0.3401915 (31) total: 8.69s remaining: 16.8s 32: learn: 0.3243852 test: 0.3350779 best: 0.3350779 (32) total: 8.97s remaining: 16.6s 33: learn: 0.3203988 test: 0.3307380 best: 0.3307380 (33) total: 9.21s remaining: 16.2s 34: learn: 0.3171973 test: 0.3273206 best: 0.3273206 (34) total: 9.41s remaining: 15.9s 35: learn: 0.3133715 test: 0.3236352 best: 0.3236352 (35) total: 9.63s remaining: 15.5s 36: learn: 0.3089455 test: 0.3194014 best: 0.3194014 (36) total: 9.92s remaining: 15.3s 37: learn: 0.3056408 test: 0.3161806 best: 0.3161806 (37) total: 10.2s remaining: 15s 38: learn: 0.3025579 test: 0.3131001 best: 0.3131001 (38) total: 10.5s remaining: 14.7s 39: learn: 0.2994402 test: 0.3100375 best: 0.3100375 (39) total: 10.7s remaining: 14.5s 40: learn: 0.2960819 test: 0.3067452 best: 0.3067452 (40) total: 11s remaining: 14.3s 41: learn: 0.2938394 test: 0.3044751 best: 0.3044751 (41) total: 11.3s remaining: 14s 42: learn: 0.2908541 test: 0.3014727 best: 0.3014727 (42) total: 11.6s remaining: 13.7s 43: learn: 0.2876027 test: 0.2977531 best: 0.2977531 (43) total: 11.9s remaining: 13.5s 44: learn: 0.2853429 test: 0.2955049 best: 0.2955049 (44) total: 12.1s remaining: 13.2s 45: learn: 0.2829864 test: 0.2933021 best: 
0.2933021 (45) total: 12.4s remaining: 12.9s 46: learn: 0.2802991 test: 0.2907863 best: 0.2907863 (46) total: 12.7s remaining: 12.7s 47: learn: 0.2774213 test: 0.2879503 best: 0.2879503 (47) total: 12.9s remaining: 12.4s 48: learn: 0.2750521 test: 0.2851716 best: 0.2851716 (48) total: 13.2s remaining: 12.1s 49: learn: 0.2724046 test: 0.2825387 best: 0.2825387 (49) total: 13.4s remaining: 11.8s 50: learn: 0.2699871 test: 0.2800444 best: 0.2800444 (50) total: 13.7s remaining: 11.5s 51: learn: 0.2677887 test: 0.2779865 best: 0.2779865 (51) total: 13.9s remaining: 11.3s 52: learn: 0.2661443 test: 0.2764014 best: 0.2764014 (52) total: 14.2s remaining: 11s 53: learn: 0.2641720 test: 0.2745569 best: 0.2745569 (53) total: 14.4s remaining: 10.7s 54: learn: 0.2620974 test: 0.2726860 best: 0.2726860 (54) total: 14.7s remaining: 10.4s 55: learn: 0.2604042 test: 0.2710455 best: 0.2710455 (55) total: 15s remaining: 10.2s 56: learn: 0.2584820 test: 0.2690811 best: 0.2690811 (56) total: 15.3s remaining: 9.9s 57: learn: 0.2564379 test: 0.2670383 best: 0.2670383 (57) total: 15.5s remaining: 9.63s 58: learn: 0.2545911 test: 0.2653061 best: 0.2653061 (58) total: 15.8s remaining: 9.36s 59: learn: 0.2529710 test: 0.2637321 best: 0.2637321 (59) total: 16s remaining: 9.07s 60: learn: 0.2507398 test: 0.2615127 best: 0.2615127 (60) total: 16.3s remaining: 8.81s 61: learn: 0.2488250 test: 0.2596377 best: 0.2596377 (61) total: 16.6s remaining: 8.56s 62: learn: 0.2464663 test: 0.2570394 best: 0.2570394 (62) total: 16.9s remaining: 8.3s 63: learn: 0.2444801 test: 0.2548289 best: 0.2548289 (63) total: 17.2s remaining: 8.05s 64: learn: 0.2431173 test: 0.2534890 best: 0.2534890 (64) total: 17.4s remaining: 7.78s 65: learn: 0.2416634 test: 0.2521321 best: 0.2521321 (65) total: 17.7s remaining: 7.51s 66: learn: 0.2399647 test: 0.2506010 best: 0.2506010 (66) total: 18s remaining: 7.26s 67: learn: 0.2387448 test: 0.2492951 best: 0.2492951 (67) total: 18.3s remaining: 6.99s 68: learn: 0.2375479 test: 
0.2480989 best: 0.2480989 (68) total: 18.5s remaining: 6.71s 69: learn: 0.2364480 test: 0.2469910 best: 0.2469910 (69) total: 18.8s remaining: 6.43s 70: learn: 0.2349840 test: 0.2455410 best: 0.2455410 (70) total: 19s remaining: 6.17s 71: learn: 0.2335966 test: 0.2440689 best: 0.2440689 (71) total: 19.3s remaining: 5.91s 72: learn: 0.2322091 test: 0.2425881 best: 0.2425881 (72) total: 19.6s remaining: 5.65s 73: learn: 0.2311622 test: 0.2413984 best: 0.2413984 (73) total: 19.9s remaining: 5.37s 74: learn: 0.2299781 test: 0.2402938 best: 0.2402938 (74) total: 20.1s remaining: 5.1s 75: learn: 0.2289155 test: 0.2392671 best: 0.2392671 (75) total: 20.4s remaining: 4.83s 76: learn: 0.2276729 test: 0.2380414 best: 0.2380414 (76) total: 20.7s remaining: 4.58s 77: learn: 0.2263439 test: 0.2367426 best: 0.2367426 (77) total: 21s remaining: 4.3s 78: learn: 0.2252054 test: 0.2356424 best: 0.2356424 (78) total: 21.3s remaining: 4.04s 79: learn: 0.2240501 test: 0.2345380 best: 0.2345380 (79) total: 21.5s remaining: 3.76s 80: learn: 0.2230593 test: 0.2336205 best: 0.2336205 (80) total: 21.7s remaining: 3.48s 81: learn: 0.2222791 test: 0.2330055 best: 0.2330055 (81) total: 21.9s remaining: 3.21s 82: learn: 0.2210471 test: 0.2316704 best: 0.2316704 (82) total: 22.2s remaining: 2.94s 83: learn: 0.2200154 test: 0.2305676 best: 0.2305676 (83) total: 22.5s remaining: 2.68s 84: learn: 0.2188882 test: 0.2295150 best: 0.2295150 (84) total: 22.8s remaining: 2.41s 85: learn: 0.2181440 test: 0.2288198 best: 0.2288198 (85) total: 23.1s remaining: 2.15s 86: learn: 0.2171450 test: 0.2279613 best: 0.2279613 (86) total: 23.4s remaining: 1.88s 87: learn: 0.2160694 test: 0.2269861 best: 0.2269861 (87) total: 23.7s remaining: 1.61s 88: learn: 0.2152965 test: 0.2262835 best: 0.2262835 (88) total: 23.9s remaining: 1.34s 89: learn: 0.2144653 test: 0.2255086 best: 0.2255086 (89) total: 24.2s remaining: 1.07s 90: learn: 0.2136028 test: 0.2247910 best: 0.2247910 (90) total: 24.5s remaining: 808ms 91: 
learn: 0.2128136 test: 0.2241062 best: 0.2241062 (91) total: 24.8s remaining: 539ms 92: learn: 0.2119827 test: 0.2232951 best: 0.2232951 (92) total: 25.1s remaining: 270ms 93: learn: 0.2112846 test: 0.2227094 best: 0.2227094 (93) total: 25.3s remaining: 0us bestTest = 0.2227093909 bestIteration = 93 Trial 65, Fold 4: Log loss = 0.22270939086504088, Average precision = 0.9741357559715035, ROC-AUC = 0.9701410795653697, Elapsed Time = 25.479954099999304 seconds Trial 65, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 65, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0: learn: 0.6709642 test: 0.6715991 best: 0.6715991 (0) total: 225ms remaining: 20.9s 1: learn: 0.6478293 test: 0.6488547 best: 0.6488547 (1) total: 519ms remaining: 23.9s 2: learn: 0.6289449 test: 0.6303849 best: 0.6303849 (2) total: 758ms remaining: 23s 3: learn: 0.6070362 test: 0.6111387 best: 0.6111387 (3) total: 1.06s remaining: 23.9s 4: learn: 0.5878396 test: 0.5922094 best: 0.5922094 (4) total: 1.3s remaining: 23.2s 5: learn: 0.5723419 test: 0.5768878 best: 0.5768878 (5) total: 1.58s remaining: 23.2s 6: learn: 0.5555804 test: 0.5603244 best: 0.5603244 (6) total: 1.82s remaining: 22.7s 7: learn: 0.5387197 test: 0.5440862 best: 0.5440862 (7) total: 2.09s remaining: 22.4s 8: learn: 0.5230234 test: 0.5286015 best: 0.5286015 (8) total: 2.35s remaining: 22.1s 9: learn: 0.5096927 test: 0.5154135 best: 0.5154135 (9) total: 2.58s remaining: 21.7s 10: learn: 0.4955611 test: 0.5014144 best: 0.5014144 (10) total: 2.84s remaining: 21.4s 11: learn: 0.4842877 test: 0.4915594 best: 0.4915594 (11) total: 3.07s remaining: 21s 12: learn: 0.4715146 test: 0.4791983 best: 0.4791983 (12) total: 3.38s remaining: 21.1s 13: learn: 0.4589483 test: 0.4672332 best: 0.4672332 (13) total: 3.62s remaining: 20.7s 14: learn: 0.4481110 test: 0.4568292 best: 0.4568292 (14) total: 3.88s remaining: 20.5s 15: learn: 0.4385849 test: 0.4477488 best: 0.4477488 (15) 
total: 4.11s remaining: 20s 16: learn: 0.4278549 test: 0.4375965 best: 0.4375965 (16) total: 4.34s remaining: 19.7s 17: learn: 0.4184693 test: 0.4285389 best: 0.4285389 (17) total: 4.62s remaining: 19.5s 18: learn: 0.4111899 test: 0.4215869 best: 0.4215869 (18) total: 4.89s remaining: 19.3s 19: learn: 0.4018956 test: 0.4129023 best: 0.4129023 (19) total: 5.13s remaining: 19s 20: learn: 0.3933351 test: 0.4049230 best: 0.4049230 (20) total: 5.38s remaining: 18.7s 21: learn: 0.3861715 test: 0.3980726 best: 0.3980726 (21) total: 5.64s remaining: 18.5s 22: learn: 0.3801514 test: 0.3925953 best: 0.3925953 (22) total: 5.92s remaining: 18.3s 23: learn: 0.3736471 test: 0.3862658 best: 0.3862658 (23) total: 6.19s remaining: 18s 24: learn: 0.3673023 test: 0.3805210 best: 0.3805210 (24) total: 6.46s remaining: 17.8s 25: learn: 0.3621083 test: 0.3754593 best: 0.3754593 (25) total: 6.72s remaining: 17.6s 26: learn: 0.3568888 test: 0.3706637 best: 0.3706637 (26) total: 6.97s remaining: 17.3s 27: learn: 0.3513143 test: 0.3654049 best: 0.3654049 (27) total: 7.26s remaining: 17.1s 28: learn: 0.3467611 test: 0.3611683 best: 0.3611683 (28) total: 7.46s remaining: 16.7s 29: learn: 0.3423535 test: 0.3569551 best: 0.3569551 (29) total: 7.7s remaining: 16.4s 30: learn: 0.3375973 test: 0.3524200 best: 0.3524200 (30) total: 8s remaining: 16.3s 31: learn: 0.3325534 test: 0.3477297 best: 0.3477297 (31) total: 8.27s remaining: 16s 32: learn: 0.3282827 test: 0.3435317 best: 0.3435317 (32) total: 8.47s remaining: 15.7s 33: learn: 0.3241066 test: 0.3396499 best: 0.3396499 (33) total: 8.79s remaining: 15.5s 34: learn: 0.3181051 test: 0.3339166 best: 0.3339166 (34) total: 9.05s remaining: 15.3s 35: learn: 0.3141856 test: 0.3303325 best: 0.3303325 (35) total: 9.27s remaining: 14.9s 36: learn: 0.3104077 test: 0.3266728 best: 0.3266728 (36) total: 9.58s remaining: 14.8s 37: learn: 0.3069382 test: 0.3233233 best: 0.3233233 (37) total: 9.84s remaining: 14.5s 38: learn: 0.3029359 test: 0.3196856 best: 
0.3196856 (38) total: 10.1s remaining: 14.3s 39: learn: 0.2995579 test: 0.3163787 best: 0.3163787 (39) total: 10.4s remaining: 14s 40: learn: 0.2966137 test: 0.3136887 best: 0.3136887 (40) total: 10.7s remaining: 13.8s 41: learn: 0.2926207 test: 0.3100107 best: 0.3100107 (41) total: 11s remaining: 13.6s 42: learn: 0.2891079 test: 0.3068382 best: 0.3068382 (42) total: 11.2s remaining: 13.3s 43: learn: 0.2864038 test: 0.3043269 best: 0.3043269 (43) total: 11.5s remaining: 13.1s 44: learn: 0.2834711 test: 0.3013663 best: 0.3013663 (44) total: 11.8s remaining: 12.8s 45: learn: 0.2801106 test: 0.2980027 best: 0.2980027 (45) total: 12.1s remaining: 12.6s 46: learn: 0.2776822 test: 0.2956527 best: 0.2956527 (46) total: 12.3s remaining: 12.3s 47: learn: 0.2742125 test: 0.2921691 best: 0.2921691 (47) total: 12.6s remaining: 12s 48: learn: 0.2712553 test: 0.2894196 best: 0.2894196 (48) total: 12.8s remaining: 11.8s 49: learn: 0.2692189 test: 0.2873718 best: 0.2873718 (49) total: 13.1s remaining: 11.5s 50: learn: 0.2672431 test: 0.2854949 best: 0.2854949 (50) total: 13.3s remaining: 11.3s 51: learn: 0.2642035 test: 0.2825036 best: 0.2825036 (51) total: 13.6s remaining: 11s 52: learn: 0.2623760 test: 0.2807761 best: 0.2807761 (52) total: 13.9s remaining: 10.7s 53: learn: 0.2601927 test: 0.2787323 best: 0.2787323 (53) total: 14.1s remaining: 10.5s 54: learn: 0.2586375 test: 0.2771490 best: 0.2771490 (54) total: 14.4s remaining: 10.2s 55: learn: 0.2563471 test: 0.2750155 best: 0.2750155 (55) total: 14.7s remaining: 9.98s 56: learn: 0.2539746 test: 0.2727612 best: 0.2727612 (56) total: 14.9s remaining: 9.7s 57: learn: 0.2521620 test: 0.2710841 best: 0.2710841 (57) total: 15.2s remaining: 9.44s 58: learn: 0.2503674 test: 0.2694591 best: 0.2694591 (58) total: 15.5s remaining: 9.18s 59: learn: 0.2486568 test: 0.2678690 best: 0.2678690 (59) total: 15.8s remaining: 8.93s 60: learn: 0.2468967 test: 0.2662502 best: 0.2662502 (60) total: 16s remaining: 8.68s 61: learn: 0.2452117 test: 
0.2646609 best: 0.2646609 (61) total: 16.3s remaining: 8.42s 62: learn: 0.2435935 test: 0.2632217 best: 0.2632217 (62) total: 16.6s remaining: 8.17s 63: learn: 0.2420123 test: 0.2617198 best: 0.2617198 (63) total: 16.9s remaining: 7.92s 64: learn: 0.2406689 test: 0.2604489 best: 0.2604489 (64) total: 17.1s remaining: 7.64s 65: learn: 0.2386283 test: 0.2585901 best: 0.2585901 (65) total: 17.4s remaining: 7.38s 66: learn: 0.2371027 test: 0.2572483 best: 0.2572483 (66) total: 17.7s remaining: 7.12s 67: learn: 0.2355898 test: 0.2558742 best: 0.2558742 (67) total: 18s remaining: 6.88s 68: learn: 0.2343741 test: 0.2547931 best: 0.2547931 (68) total: 18.2s remaining: 6.61s 69: learn: 0.2329550 test: 0.2534498 best: 0.2534498 (69) total: 18.5s remaining: 6.34s 70: learn: 0.2310791 test: 0.2517153 best: 0.2517153 (70) total: 18.8s remaining: 6.09s 71: learn: 0.2294669 test: 0.2500946 best: 0.2500946 (71) total: 19.1s remaining: 5.83s 72: learn: 0.2283584 test: 0.2490668 best: 0.2490668 (72) total: 19.3s remaining: 5.56s 73: learn: 0.2270634 test: 0.2479637 best: 0.2479637 (73) total: 19.6s remaining: 5.3s 74: learn: 0.2258069 test: 0.2467624 best: 0.2467624 (74) total: 19.8s remaining: 5.03s 75: learn: 0.2245999 test: 0.2456636 best: 0.2456636 (75) total: 20.1s remaining: 4.77s 76: learn: 0.2230923 test: 0.2443205 best: 0.2443205 (76) total: 20.4s remaining: 4.5s 77: learn: 0.2219663 test: 0.2432775 best: 0.2432775 (77) total: 20.7s remaining: 4.25s 78: learn: 0.2208780 test: 0.2423791 best: 0.2423791 (78) total: 21s remaining: 3.98s 79: learn: 0.2198004 test: 0.2413880 best: 0.2413880 (79) total: 21.3s remaining: 3.72s 80: learn: 0.2189321 test: 0.2405787 best: 0.2405787 (80) total: 21.5s remaining: 3.46s 81: learn: 0.2177920 test: 0.2394771 best: 0.2394771 (81) total: 21.8s remaining: 3.19s 82: learn: 0.2169273 test: 0.2387210 best: 0.2387210 (82) total: 22.1s remaining: 2.93s 83: learn: 0.2159485 test: 0.2378177 best: 0.2378177 (83) total: 22.4s remaining: 2.67s 84: 
learn: 0.2150358 test: 0.2370037 best: 0.2370037 (84) total: 22.6s remaining: 2.4s 85: learn: 0.2142381 test: 0.2363161 best: 0.2363161 (85) total: 22.9s remaining: 2.13s 86: learn: 0.2135857 test: 0.2357347 best: 0.2357347 (86) total: 23.2s remaining: 1.87s 87: learn: 0.2127268 test: 0.2349809 best: 0.2349809 (87) total: 23.5s remaining: 1.6s 88: learn: 0.2117856 test: 0.2340924 best: 0.2340924 (88) total: 23.8s remaining: 1.34s 89: learn: 0.2111193 test: 0.2335366 best: 0.2335366 (89) total: 24s remaining: 1.07s 90: learn: 0.2104673 test: 0.2329916 best: 0.2329916 (90) total: 24.3s remaining: 801ms 91: learn: 0.2097130 test: 0.2323336 best: 0.2323336 (91) total: 24.6s remaining: 534ms 92: learn: 0.2090172 test: 0.2317873 best: 0.2317873 (92) total: 24.8s remaining: 267ms 93: learn: 0.2083880 test: 0.2311678 best: 0.2311678 (93) total: 25s remaining: 0us bestTest = 0.2311678229 bestIteration = 93 Trial 65, Fold 5: Log loss = 0.23116782291661211, Average precision = 0.9720569451347603, ROC-AUC = 0.9682291213793359, Elapsed Time = 25.182768200000282 seconds
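The per-fold summary lines report three scikit-learn metrics on the fold's validation set. A minimal sketch of how such metrics can be computed, assuming `y_val` and `proba` are hypothetical stand-ins for a fold's validation labels and the model's predicted probabilities for class 1 (not the actual notebook data):

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

# Hypothetical validation labels and predicted P(class=1) for one fold.
y_val = np.array([0, 0, 1, 1, 0, 1, 1, 0])
proba = np.array([0.1, 0.3, 0.8, 0.9, 0.2, 0.7, 0.6, 0.4])

lloss = log_loss(y_val, proba)              # CatBoost's "bestTest" tracks this (Logloss)
ap = average_precision_score(y_val, proba)  # area under the precision-recall curve
auc = roc_auc_score(y_val, proba)           # area under the ROC curve
print(f"Log loss = {lloss}, Average precision = {ap}, ROC-AUC = {auc}")
```

Because log loss penalizes confident wrong probabilities, it is the natural objective here: the notebook's stated goal of minimizing institutional financial losses depends on well-calibrated default probabilities, not just correct rankings.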
Optimization Progress: 66%|######6 | 66/100 [1:50:52<48:36, 85.79s/it]
Trial 66, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 66, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[Verbose per-iteration CatBoost training log elided; Trial 66 ran 8 iterations per fold, ~45–48 s each. Per-fold summaries retained below.]
bestTest = 0.3212030184 bestIteration = 7
Trial 66, Fold 1: Log loss = 0.3212030184293957, Average precision = 0.9676886982497143, ROC-AUC = 0.9611996679380375, Elapsed Time = 48.305870099997264 seconds
Trial 66, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 66, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
bestTest = 0.3223327159 bestIteration = 7
Trial 66, Fold 2: Log loss = 0.32233271594206875, Average precision = 0.967443880222844, ROC-AUC = 0.9625607104952715, Elapsed Time = 45.320567999999184 seconds
Trial 66, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 66, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
bestTest = 0.3222995379 bestIteration = 7
Trial 66, Fold 3: Log loss = 0.32229953792725474, Average precision = 0.9654392335880897, ROC-AUC = 0.9622990123373559, Elapsed Time = 46.269129299998895 seconds
Trial 66, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 66, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
bestTest = 0.317933405 bestIteration = 7
Trial 66, Fold 4: Log loss = 0.3179334050373481, Average precision = 0.9689730529116045, ROC-AUC = 0.9634770268079191, Elapsed Time = 47.31287929999962 seconds
Trial 66, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 66, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
bestTest = 0.3245024617 bestIteration = 7
Trial 66, Fold 5: Log loss = 0.32450246170250396, Average precision = 0.964914102627159, ROC-AUC = 0.9582429073072849, Elapsed Time = 46.748971100001654 seconds
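The `bestTest = ... bestIteration = ...` lines that CatBoost appends at the end of each fold's log can be scraped from captured output with a small regex. A sketch, using one line excerpted from the Trial 66, Fold 5 log as sample input:

```python
import re

# One captured line of CatBoost output (excerpt from Trial 66, Fold 5).
log = ("7: learn: 0.2405829 test: 0.3245025 best: 0.3245025 (7) total: 46.6s "
       "remaining: 0us bestTest = 0.3245024617 bestIteration = 7")

# bestTest is the best validation-set Logloss; bestIteration is the tree count
# at which it was reached (useful for refitting with a fixed n_estimators).
m = re.search(r"bestTest = ([\d.]+) bestIteration = (\d+)", log)
best_test, best_iter = float(m.group(1)), int(m.group(2))
print(best_test, best_iter)  # → 0.3245024617 7
```

In practice the same values are also available programmatically via the fitted model (e.g. CatBoost's `get_best_score()` and `get_best_iteration()`), so parsing logs is only needed when the model object is no longer around.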
Optimization Progress: 67%|######7 | 67/100 [1:54:54<1:12:56, 132.61s/it]
[per-iteration CatBoost training output omitted]
Trial 67, Fold 1: Train size = 20663 (0 = 10533, 1 = 10130), Validation size = 5175 (0 = 2592, 1 = 2583), best iteration = 17, Log loss = 0.2109, Average precision = 0.9694, ROC-AUC = 0.9668, Elapsed Time = 1.22 seconds
Trial 67, Fold 2: Train size = 20701 (0 = 10471, 1 = 10230), Validation size = 5137 (0 = 2654, 1 = 2483), best iteration = 15 (model shrunk to 16 iterations), Log loss = 0.2090, Average precision = 0.9718, ROC-AUC = 0.9683, Elapsed Time = 1.23 seconds
Trial 67, Fold 3: Train size = 20682 (0 = 10517, 1 = 10165), Validation size = 5156 (0 = 2608, 1 = 2548), best iteration = 8 (model shrunk to 9 iterations), Log loss = 0.2295, Average precision = 0.9705, ROC-AUC = 0.9674, Elapsed Time = 1.26 seconds
Trial 67, Fold 4: Train size = 20656 (0 = 10479, 1 = 10177), Validation size = 5182 (0 = 2646, 1 = 2536), best iteration = 12 (model shrunk to 13 iterations), Log loss = 0.2176, Average precision = 0.9714, ROC-AUC = 0.9661, Elapsed Time = 1.25 seconds
Trial 67, Fold 5: Train size = 20650 (0 = 10500, 1 = 10150), Validation size = 5188 (0 = 2625, 1 = 2563), best iteration = 16 (model shrunk to 17 iterations), Log loss = 0.2179, Average precision = 0.9695, ROC-AUC = 0.9658, Elapsed Time = 1.27 seconds
Optimization Progress: 68%|######8 | 68/100 [1:55:08<51:47, 97.12s/it]
[per-iteration CatBoost training output omitted]
Trial 68, Fold 1: Train size = 20663 (0 = 10533, 1 = 10130), Validation size = 5175 (0 = 2592, 1 = 2583), best iteration = 43, Log loss = 0.2961, Average precision = 0.9586, ROC-AUC = 0.9582, Elapsed Time = 1.89 seconds
Trial 68, Fold 2: Train size = 20701 (0 = 10471, 1 = 10230), Validation size = 5137 (0 = 2654, 1 = 2483), best iteration = 43, Log loss = 0.2989, Average precision = 0.9621, ROC-AUC = 0.9586, Elapsed Time = 1.80 seconds
Trial 68, Fold 3: Train size = 20682 (0 = 10517, 1 = 10165), Validation size = 5156 (0 = 2608, 1 = 2548), best iteration = 43, Log loss = 0.2937, Average precision = 0.9607, ROC-AUC = 0.9600, Elapsed Time = 1.78 seconds
Trial 68, Fold 4: Train size = 20656 (0 = 10479, 1 = 10177), Validation size = 5182 (0 = 2646, 1 = 2536), best iteration = 43, Log loss = 0.2984, Average precision = 0.9635, ROC-AUC = 0.9595, Elapsed Time = 1.72 seconds
Trial 68, Fold 5: Train size = 20650 (0 = 10500, 1 = 10150), Validation size = 5188 (0 = 2625, 1 = 2563), best iteration = 43, Log loss = 0.3014, Average precision = 0.9614, ROC-AUC = 0.9576, Elapsed Time = 1.72 seconds
Optimization Progress: 69%|######9 | 69/100 [1:55:26<37:49, 73.22s/it]
Trial 69, Fold 1: Train size = 20663 (0 = 10533, 1 = 10130), Validation size = 5175 (0 = 2592, 1 = 2583)
[per-iteration CatBoost training output truncated]
0.2958421 best: 0.2958421 (66) total: 15.4s remaining: 230ms 67: learn: 0.2822558 test: 0.2937921 best: 0.2937921 (67) total: 15.6s remaining: 0us bestTest = 0.2937921196 bestIteration = 67 Trial 69, Fold 1: Log loss = 0.29378810395775945, Average precision = 0.9742611456593842, ROC-AUC = 0.9695210074896163, Elapsed Time = 15.787811100002727 seconds Trial 69, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 69, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986 0: learn: 0.6792873 test: 0.6795194 best: 0.6795194 (0) total: 229ms remaining: 15.4s 1: learn: 0.6659016 test: 0.6662851 best: 0.6662851 (1) total: 432ms remaining: 14.3s 2: learn: 0.6528520 test: 0.6535030 best: 0.6535030 (2) total: 684ms remaining: 14.8s 3: learn: 0.6402828 test: 0.6410826 best: 0.6410826 (3) total: 880ms remaining: 14.1s 4: learn: 0.6283874 test: 0.6293504 best: 0.6293504 (4) total: 1.08s remaining: 13.6s 5: learn: 0.6165375 test: 0.6175682 best: 0.6175682 (5) total: 1.28s remaining: 13.2s 6: learn: 0.6051796 test: 0.6063811 best: 0.6063811 (6) total: 1.49s remaining: 13s 7: learn: 0.5940481 test: 0.5954400 best: 0.5954400 (7) total: 1.7s remaining: 12.7s 8: learn: 0.5831676 test: 0.5847384 best: 0.5847384 (8) total: 1.95s remaining: 12.8s 9: learn: 0.5725227 test: 0.5742480 best: 0.5742480 (9) total: 2.18s remaining: 12.7s 10: learn: 0.5625045 test: 0.5643180 best: 0.5643180 (10) total: 2.4s remaining: 12.5s 11: learn: 0.5525241 test: 0.5544883 best: 0.5544883 (11) total: 2.61s remaining: 12.2s 12: learn: 0.5428160 test: 0.5450034 best: 0.5450034 (12) total: 2.86s remaining: 12.1s 13: learn: 0.5335788 test: 0.5358322 best: 0.5358322 (13) total: 3.08s remaining: 11.9s 14: learn: 0.5244222 test: 0.5269071 best: 0.5269071 (14) total: 3.33s remaining: 11.8s 15: learn: 0.5155796 test: 0.5182443 best: 0.5182443 (15) total: 3.57s remaining: 11.6s 16: learn: 0.5069403 test: 0.5097330 best: 0.5097330 (16) total: 3.84s remaining: 
11.5s 17: learn: 0.4987068 test: 0.5016571 best: 0.5016571 (17) total: 4.07s remaining: 11.3s 18: learn: 0.4904828 test: 0.4935950 best: 0.4935950 (18) total: 4.29s remaining: 11.1s 19: learn: 0.4825982 test: 0.4858588 best: 0.4858588 (19) total: 4.56s remaining: 10.9s 20: learn: 0.4751974 test: 0.4785475 best: 0.4785475 (20) total: 4.83s remaining: 10.8s 21: learn: 0.4679416 test: 0.4713449 best: 0.4713449 (21) total: 5.07s remaining: 10.6s 22: learn: 0.4609914 test: 0.4644865 best: 0.4644865 (22) total: 5.3s remaining: 10.4s 23: learn: 0.4541012 test: 0.4577427 best: 0.4577427 (23) total: 5.51s remaining: 10.1s 24: learn: 0.4474134 test: 0.4512061 best: 0.4512061 (24) total: 5.74s remaining: 9.88s 25: learn: 0.4407902 test: 0.4447807 best: 0.4447807 (25) total: 6.02s remaining: 9.72s 26: learn: 0.4345427 test: 0.4385875 best: 0.4385875 (26) total: 6.26s remaining: 9.51s 27: learn: 0.4284300 test: 0.4325816 best: 0.4325816 (27) total: 6.5s remaining: 9.28s 28: learn: 0.4224905 test: 0.4268110 best: 0.4268110 (28) total: 6.78s remaining: 9.11s 29: learn: 0.4166226 test: 0.4210486 best: 0.4210486 (29) total: 7.01s remaining: 8.88s 30: learn: 0.4111346 test: 0.4156579 best: 0.4156579 (30) total: 7.25s remaining: 8.65s 31: learn: 0.4059531 test: 0.4106148 best: 0.4106148 (31) total: 7.48s remaining: 8.41s 32: learn: 0.4008448 test: 0.4055514 best: 0.4055514 (32) total: 7.71s remaining: 8.18s 33: learn: 0.3955804 test: 0.4004356 best: 0.4004356 (33) total: 7.92s remaining: 7.92s 34: learn: 0.3907238 test: 0.3957035 best: 0.3957035 (34) total: 8.14s remaining: 7.68s 35: learn: 0.3859783 test: 0.3910849 best: 0.3910849 (35) total: 8.36s remaining: 7.43s 36: learn: 0.3812969 test: 0.3865801 best: 0.3865801 (36) total: 8.59s remaining: 7.19s 37: learn: 0.3767655 test: 0.3822087 best: 0.3822087 (37) total: 8.81s remaining: 6.95s 38: learn: 0.3721299 test: 0.3776657 best: 0.3776657 (38) total: 9.05s remaining: 6.73s 39: learn: 0.3677628 test: 0.3734288 best: 0.3734288 (39) 
total: 9.27s remaining: 6.49s 40: learn: 0.3636188 test: 0.3693452 best: 0.3693452 (40) total: 9.5s remaining: 6.26s 41: learn: 0.3595039 test: 0.3654137 best: 0.3654137 (41) total: 9.76s remaining: 6.04s 42: learn: 0.3556073 test: 0.3615699 best: 0.3615699 (42) total: 10s remaining: 5.83s 43: learn: 0.3518887 test: 0.3579675 best: 0.3579675 (43) total: 10.3s remaining: 5.6s 44: learn: 0.3481205 test: 0.3543049 best: 0.3543049 (44) total: 10.5s remaining: 5.36s 45: learn: 0.3445655 test: 0.3508987 best: 0.3508987 (45) total: 10.7s remaining: 5.12s 46: learn: 0.3411349 test: 0.3476397 best: 0.3476397 (46) total: 10.9s remaining: 4.88s 47: learn: 0.3378498 test: 0.3444601 best: 0.3444601 (47) total: 11.2s remaining: 4.65s 48: learn: 0.3345414 test: 0.3412512 best: 0.3412512 (48) total: 11.4s remaining: 4.41s 49: learn: 0.3310394 test: 0.3378909 best: 0.3378909 (49) total: 11.6s remaining: 4.19s 50: learn: 0.3279396 test: 0.3349119 best: 0.3349119 (50) total: 11.9s remaining: 3.96s 51: learn: 0.3249077 test: 0.3319151 best: 0.3319151 (51) total: 12.1s remaining: 3.73s 52: learn: 0.3217567 test: 0.3289533 best: 0.3289533 (52) total: 12.4s remaining: 3.5s 53: learn: 0.3188429 test: 0.3261880 best: 0.3261880 (53) total: 12.6s remaining: 3.26s 54: learn: 0.3160765 test: 0.3234737 best: 0.3234737 (54) total: 12.8s remaining: 3.02s 55: learn: 0.3132175 test: 0.3207291 best: 0.3207291 (55) total: 13s remaining: 2.79s 56: learn: 0.3105284 test: 0.3181585 best: 0.3181585 (56) total: 13.3s remaining: 2.56s 57: learn: 0.3078993 test: 0.3156398 best: 0.3156398 (57) total: 13.5s remaining: 2.32s 58: learn: 0.3052284 test: 0.3131123 best: 0.3131123 (58) total: 13.7s remaining: 2.09s 59: learn: 0.3026531 test: 0.3106991 best: 0.3106991 (59) total: 14s remaining: 1.86s 60: learn: 0.3000318 test: 0.3082571 best: 0.3082571 (60) total: 14.2s remaining: 1.63s 61: learn: 0.2975494 test: 0.3059032 best: 0.3059032 (61) total: 14.4s remaining: 1.4s 62: learn: 0.2952668 test: 0.3037140 best: 
0.3037140 (62) total: 14.6s remaining: 1.16s 63: learn: 0.2929489 test: 0.3014485 best: 0.3014485 (63) total: 14.8s remaining: 925ms 64: learn: 0.2905424 test: 0.2992334 best: 0.2992334 (64) total: 15.1s remaining: 695ms 65: learn: 0.2883400 test: 0.2971658 best: 0.2971658 (65) total: 15.3s remaining: 463ms 66: learn: 0.2860398 test: 0.2949507 best: 0.2949507 (66) total: 15.5s remaining: 232ms 67: learn: 0.2839300 test: 0.2929573 best: 0.2929573 (67) total: 15.7s remaining: 0us bestTest = 0.2929572957 bestIteration = 67 Trial 69, Fold 2: Log loss = 0.2929318457145807, Average precision = 0.9736490952490129, ROC-AUC = 0.9713565432582858, Elapsed Time = 15.902804399996967 seconds Trial 69, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 69, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 0: learn: 0.6793662 test: 0.6794053 best: 0.6794053 (0) total: 230ms remaining: 15.4s 1: learn: 0.6659984 test: 0.6660620 best: 0.6660620 (1) total: 427ms remaining: 14.1s 2: learn: 0.6534136 test: 0.6534755 best: 0.6534755 (2) total: 647ms remaining: 14s 3: learn: 0.6408811 test: 0.6410049 best: 0.6410049 (3) total: 884ms remaining: 14.1s 4: learn: 0.6287163 test: 0.6289134 best: 0.6289134 (4) total: 1.1s remaining: 13.9s 5: learn: 0.6169227 test: 0.6171804 best: 0.6171804 (5) total: 1.33s remaining: 13.7s 6: learn: 0.6054060 test: 0.6057187 best: 0.6057187 (6) total: 1.53s remaining: 13.3s 7: learn: 0.5942877 test: 0.5945577 best: 0.5945577 (7) total: 1.73s remaining: 13s 8: learn: 0.5834776 test: 0.5837249 best: 0.5837249 (8) total: 1.94s remaining: 12.7s 9: learn: 0.5729121 test: 0.5733147 best: 0.5733147 (9) total: 2.14s remaining: 12.4s 10: learn: 0.5628851 test: 0.5632646 best: 0.5632646 (10) total: 2.36s remaining: 12.2s 11: learn: 0.5528995 test: 0.5533005 best: 0.5533005 (11) total: 2.59s remaining: 12.1s 12: learn: 0.5435157 test: 0.5440597 best: 0.5440597 (12) total: 2.84s remaining: 12s 13: learn: 
0.5342085 test: 0.5347515 best: 0.5347515 (13) total: 3.03s remaining: 11.7s 14: learn: 0.5252481 test: 0.5258640 best: 0.5258640 (14) total: 3.26s remaining: 11.5s 15: learn: 0.5165466 test: 0.5172953 best: 0.5172953 (15) total: 3.53s remaining: 11.5s 16: learn: 0.5080319 test: 0.5088996 best: 0.5088996 (16) total: 3.78s remaining: 11.3s 17: learn: 0.4999657 test: 0.5009041 best: 0.5009041 (17) total: 4.02s remaining: 11.2s 18: learn: 0.4919892 test: 0.4930809 best: 0.4930809 (18) total: 4.26s remaining: 11s 19: learn: 0.4843184 test: 0.4854743 best: 0.4854743 (19) total: 4.51s remaining: 10.8s 20: learn: 0.4766146 test: 0.4779046 best: 0.4779046 (20) total: 4.75s remaining: 10.6s 21: learn: 0.4691771 test: 0.4705666 best: 0.4705666 (21) total: 4.98s remaining: 10.4s 22: learn: 0.4620345 test: 0.4635570 best: 0.4635570 (22) total: 5.23s remaining: 10.2s 23: learn: 0.4553509 test: 0.4569043 best: 0.4569043 (23) total: 5.48s remaining: 10.1s 24: learn: 0.4485416 test: 0.4501747 best: 0.4501747 (24) total: 5.73s remaining: 9.86s 25: learn: 0.4418924 test: 0.4435625 best: 0.4435625 (25) total: 5.98s remaining: 9.65s 26: learn: 0.4354448 test: 0.4371005 best: 0.4371005 (26) total: 6.16s remaining: 9.36s 27: learn: 0.4293239 test: 0.4310922 best: 0.4310922 (27) total: 6.41s remaining: 9.16s 28: learn: 0.4233718 test: 0.4252731 best: 0.4252731 (28) total: 6.64s remaining: 8.92s 29: learn: 0.4178469 test: 0.4198235 best: 0.4198235 (29) total: 6.86s remaining: 8.69s 30: learn: 0.4122539 test: 0.4142697 best: 0.4142697 (30) total: 7.08s remaining: 8.45s 31: learn: 0.4068389 test: 0.4089881 best: 0.4089881 (31) total: 7.31s remaining: 8.23s 32: learn: 0.4017770 test: 0.4040305 best: 0.4040305 (32) total: 7.56s remaining: 8.01s 33: learn: 0.3967746 test: 0.3990929 best: 0.3990929 (33) total: 7.77s remaining: 7.77s 34: learn: 0.3916589 test: 0.3940969 best: 0.3940969 (34) total: 8s remaining: 7.55s 35: learn: 0.3866769 test: 0.3892250 best: 0.3892250 (35) total: 8.22s 
remaining: 7.31s 36: learn: 0.3818056 test: 0.3844033 best: 0.3844033 (36) total: 8.43s remaining: 7.06s 37: learn: 0.3771950 test: 0.3798899 best: 0.3798899 (37) total: 8.65s remaining: 6.83s 38: learn: 0.3727677 test: 0.3755699 best: 0.3755699 (38) total: 8.9s remaining: 6.62s 39: learn: 0.3684254 test: 0.3713259 best: 0.3713259 (39) total: 9.13s remaining: 6.39s 40: learn: 0.3641088 test: 0.3670355 best: 0.3670355 (40) total: 9.35s remaining: 6.16s 41: learn: 0.3601589 test: 0.3631889 best: 0.3631889 (41) total: 9.55s remaining: 5.91s 42: learn: 0.3561481 test: 0.3593526 best: 0.3593526 (42) total: 9.78s remaining: 5.68s 43: learn: 0.3522576 test: 0.3555469 best: 0.3555469 (43) total: 10s remaining: 5.46s 44: learn: 0.3484427 test: 0.3518199 best: 0.3518199 (44) total: 10.2s remaining: 5.23s 45: learn: 0.3449705 test: 0.3483756 best: 0.3483756 (45) total: 10.5s remaining: 5s 46: learn: 0.3413211 test: 0.3448438 best: 0.3448438 (46) total: 10.7s remaining: 4.77s 47: learn: 0.3378501 test: 0.3414874 best: 0.3414874 (47) total: 10.9s remaining: 4.54s 48: learn: 0.3343946 test: 0.3381159 best: 0.3381159 (48) total: 11.2s remaining: 4.33s 49: learn: 0.3310479 test: 0.3349358 best: 0.3349358 (49) total: 11.4s remaining: 4.11s 50: learn: 0.3277249 test: 0.3318200 best: 0.3318200 (50) total: 11.6s remaining: 3.88s 51: learn: 0.3246477 test: 0.3288630 best: 0.3288630 (51) total: 11.9s remaining: 3.65s 52: learn: 0.3217196 test: 0.3260427 best: 0.3260427 (52) total: 12.1s remaining: 3.42s 53: learn: 0.3187304 test: 0.3231687 best: 0.3231687 (53) total: 12.3s remaining: 3.19s 54: learn: 0.3158450 test: 0.3203318 best: 0.3203318 (54) total: 12.5s remaining: 2.96s 55: learn: 0.3129389 test: 0.3175699 best: 0.3175699 (55) total: 12.8s remaining: 2.74s 56: learn: 0.3103281 test: 0.3150519 best: 0.3150519 (56) total: 13s remaining: 2.51s 57: learn: 0.3076811 test: 0.3124904 best: 0.3124904 (57) total: 13.2s remaining: 2.28s 58: learn: 0.3049768 test: 0.3098896 best: 0.3098896 
(58) total: 13.5s remaining: 2.05s 59: learn: 0.3025065 test: 0.3075696 best: 0.3075696 (59) total: 13.7s remaining: 1.82s 60: learn: 0.2998095 test: 0.3050152 best: 0.3050152 (60) total: 13.9s remaining: 1.6s 61: learn: 0.2974216 test: 0.3027739 best: 0.3027739 (61) total: 14.2s remaining: 1.37s 62: learn: 0.2952444 test: 0.3006627 best: 0.3006627 (62) total: 14.4s remaining: 1.14s 63: learn: 0.2928217 test: 0.2983377 best: 0.2983377 (63) total: 14.6s remaining: 914ms 64: learn: 0.2906497 test: 0.2962776 best: 0.2962776 (64) total: 14.9s remaining: 686ms 65: learn: 0.2884194 test: 0.2941583 best: 0.2941583 (65) total: 15.1s remaining: 458ms 66: learn: 0.2861784 test: 0.2920314 best: 0.2920314 (66) total: 15.4s remaining: 230ms 67: learn: 0.2840460 test: 0.2900096 best: 0.2900096 (67) total: 15.6s remaining: 0us bestTest = 0.2900096478 bestIteration = 67 Trial 69, Fold 3: Log loss = 0.2901548698436933, Average precision = 0.9735075759623979, ROC-AUC = 0.9721246243896331, Elapsed Time = 15.7711761999999 seconds Trial 69, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 69, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 0: learn: 0.6795417 test: 0.6796607 best: 0.6796607 (0) total: 178ms remaining: 11.9s 1: learn: 0.6666037 test: 0.6668800 best: 0.6668800 (1) total: 398ms remaining: 13.1s 2: learn: 0.6538778 test: 0.6543081 best: 0.6543081 (2) total: 610ms remaining: 13.2s 3: learn: 0.6413017 test: 0.6417800 best: 0.6417800 (3) total: 801ms remaining: 12.8s 4: learn: 0.6293495 test: 0.6299596 best: 0.6299596 (4) total: 1.03s remaining: 13s 5: learn: 0.6175099 test: 0.6183337 best: 0.6183337 (5) total: 1.26s remaining: 13s 6: learn: 0.6061631 test: 0.6070588 best: 0.6070588 (6) total: 1.47s remaining: 12.8s 7: learn: 0.5949267 test: 0.5958638 best: 0.5958638 (7) total: 1.65s remaining: 12.4s 8: learn: 0.5840554 test: 0.5851029 best: 0.5851029 (8) total: 1.87s remaining: 12.3s 9: learn: 0.5735002 
test: 0.5746616 best: 0.5746616 (9) total: 2.09s remaining: 12.1s 10: learn: 0.5634610 test: 0.5647073 best: 0.5647073 (10) total: 2.32s remaining: 12s 11: learn: 0.5536411 test: 0.5549863 best: 0.5549863 (11) total: 2.56s remaining: 12s 12: learn: 0.5439347 test: 0.5452743 best: 0.5452743 (12) total: 2.75s remaining: 11.6s 13: learn: 0.5346751 test: 0.5361018 best: 0.5361018 (13) total: 2.96s remaining: 11.4s 14: learn: 0.5257681 test: 0.5273281 best: 0.5273281 (14) total: 3.18s remaining: 11.2s 15: learn: 0.5169668 test: 0.5185078 best: 0.5185078 (15) total: 3.38s remaining: 11s 16: learn: 0.5085064 test: 0.5102608 best: 0.5102608 (16) total: 3.63s remaining: 10.9s 17: learn: 0.5002309 test: 0.5021654 best: 0.5021654 (17) total: 3.86s remaining: 10.7s 18: learn: 0.4922487 test: 0.4943647 best: 0.4943647 (18) total: 4.08s remaining: 10.5s 19: learn: 0.4843259 test: 0.4865286 best: 0.4865286 (19) total: 4.29s remaining: 10.3s 20: learn: 0.4767449 test: 0.4791019 best: 0.4791019 (20) total: 4.53s remaining: 10.1s 21: learn: 0.4693280 test: 0.4719884 best: 0.4719884 (21) total: 4.77s remaining: 9.98s 22: learn: 0.4621469 test: 0.4650261 best: 0.4650261 (22) total: 5s remaining: 9.77s 23: learn: 0.4551784 test: 0.4581579 best: 0.4581579 (23) total: 5.27s remaining: 9.66s 24: learn: 0.4485978 test: 0.4517446 best: 0.4517446 (24) total: 5.51s remaining: 9.47s 25: learn: 0.4421749 test: 0.4454291 best: 0.4454291 (25) total: 5.73s remaining: 9.26s 26: learn: 0.4357691 test: 0.4390402 best: 0.4390402 (26) total: 5.95s remaining: 9.03s 27: learn: 0.4295508 test: 0.4329153 best: 0.4329153 (27) total: 6.16s remaining: 8.8s 28: learn: 0.4234047 test: 0.4267902 best: 0.4267902 (28) total: 6.39s remaining: 8.59s 29: learn: 0.4175095 test: 0.4210007 best: 0.4210007 (29) total: 6.61s remaining: 8.37s 30: learn: 0.4118656 test: 0.4154440 best: 0.4154440 (30) total: 6.82s remaining: 8.13s 31: learn: 0.4065633 test: 0.4101963 best: 0.4101963 (31) total: 7.04s remaining: 7.92s 32: 
learn: 0.4012527 test: 0.4049700 best: 0.4049700 (32) total: 7.22s remaining: 7.66s 33: learn: 0.3960432 test: 0.3999228 best: 0.3999228 (33) total: 7.47s remaining: 7.47s 34: learn: 0.3910549 test: 0.3951523 best: 0.3951523 (34) total: 7.69s remaining: 7.25s 35: learn: 0.3861718 test: 0.3904470 best: 0.3904470 (35) total: 7.92s remaining: 7.04s 36: learn: 0.3814541 test: 0.3858615 best: 0.3858615 (36) total: 8.14s remaining: 6.82s 37: learn: 0.3767722 test: 0.3812742 best: 0.3812742 (37) total: 8.38s remaining: 6.62s 38: learn: 0.3723641 test: 0.3768924 best: 0.3768924 (38) total: 8.63s remaining: 6.42s 39: learn: 0.3681009 test: 0.3727491 best: 0.3727491 (39) total: 8.86s remaining: 6.21s 40: learn: 0.3640803 test: 0.3688628 best: 0.3688628 (40) total: 9.1s remaining: 5.99s 41: learn: 0.3600699 test: 0.3649387 best: 0.3649387 (41) total: 9.35s remaining: 5.79s 42: learn: 0.3560999 test: 0.3610503 best: 0.3610503 (42) total: 9.56s remaining: 5.56s 43: learn: 0.3524617 test: 0.3575418 best: 0.3575418 (43) total: 9.77s remaining: 5.33s 44: learn: 0.3486285 test: 0.3538096 best: 0.3538096 (44) total: 9.98s remaining: 5.1s 45: learn: 0.3448570 test: 0.3501915 best: 0.3501915 (45) total: 10.2s remaining: 4.89s 46: learn: 0.3411454 test: 0.3466293 best: 0.3466293 (46) total: 10.4s remaining: 4.66s 47: learn: 0.3377523 test: 0.3433434 best: 0.3433434 (47) total: 10.7s remaining: 4.44s 48: learn: 0.3343449 test: 0.3401081 best: 0.3401081 (48) total: 10.9s remaining: 4.22s 49: learn: 0.3311163 test: 0.3370380 best: 0.3370380 (49) total: 11.1s remaining: 4.01s 50: learn: 0.3278705 test: 0.3339666 best: 0.3339666 (50) total: 11.4s remaining: 3.8s 51: learn: 0.3247704 test: 0.3309476 best: 0.3309476 (51) total: 11.6s remaining: 3.57s 52: learn: 0.3217373 test: 0.3281030 best: 0.3281030 (52) total: 11.8s remaining: 3.35s 53: learn: 0.3188581 test: 0.3253463 best: 0.3253463 (53) total: 12.1s remaining: 3.13s 54: learn: 0.3158665 test: 0.3224809 best: 0.3224809 (54) total: 12.3s 
remaining: 2.9s 55: learn: 0.3130536 test: 0.3197864 best: 0.3197864 (55) total: 12.5s remaining: 2.68s 56: learn: 0.3101666 test: 0.3172009 best: 0.3172009 (56) total: 12.7s remaining: 2.45s 57: learn: 0.3075023 test: 0.3147161 best: 0.3147161 (57) total: 13s remaining: 2.23s 58: learn: 0.3049028 test: 0.3122861 best: 0.3122861 (58) total: 13.2s remaining: 2.01s 59: learn: 0.3024802 test: 0.3100449 best: 0.3100449 (59) total: 13.4s remaining: 1.79s 60: learn: 0.3000835 test: 0.3078608 best: 0.3078608 (60) total: 13.6s remaining: 1.56s 61: learn: 0.2975887 test: 0.3054319 best: 0.3054319 (61) total: 13.8s remaining: 1.34s 62: learn: 0.2953694 test: 0.3034140 best: 0.3034140 (62) total: 14.1s remaining: 1.12s 63: learn: 0.2929571 test: 0.3011256 best: 0.3011256 (63) total: 14.3s remaining: 895ms 64: learn: 0.2905817 test: 0.2988896 best: 0.2988896 (64) total: 14.6s remaining: 672ms 65: learn: 0.2883839 test: 0.2968154 best: 0.2968154 (65) total: 14.8s remaining: 448ms 66: learn: 0.2862473 test: 0.2948096 best: 0.2948096 (66) total: 15s remaining: 224ms 67: learn: 0.2841460 test: 0.2929045 best: 0.2929045 (67) total: 15.2s remaining: 0us bestTest = 0.2929044525 bestIteration = 67 Trial 69, Fold 4: Log loss = 0.2929207009664367, Average precision = 0.9752771384417679, ROC-AUC = 0.971326205736413, Elapsed Time = 15.37222980000297 seconds Trial 69, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 69, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0: learn: 0.6794961 test: 0.6797770 best: 0.6797770 (0) total: 197ms remaining: 13.2s 1: learn: 0.6661761 test: 0.6667592 best: 0.6667592 (1) total: 395ms remaining: 13s 2: learn: 0.6530422 test: 0.6539136 best: 0.6539136 (2) total: 628ms remaining: 13.6s 3: learn: 0.6404886 test: 0.6415343 best: 0.6415343 (3) total: 832ms remaining: 13.3s 4: learn: 0.6282572 test: 0.6294412 best: 0.6294412 (4) total: 1.04s remaining: 13.2s 5: learn: 0.6162585 test: 
0.6177555 best: 0.6177555 (5) total: 1.25s remaining: 12.9s 6: learn: 0.6047987 test: 0.6065332 best: 0.6065332 (6) total: 1.48s remaining: 12.9s 7: learn: 0.5935350 test: 0.5955475 best: 0.5955475 (7) total: 1.7s remaining: 12.8s 8: learn: 0.5827582 test: 0.5850992 best: 0.5850992 (8) total: 1.92s remaining: 12.6s 9: learn: 0.5721150 test: 0.5746276 best: 0.5746276 (9) total: 2.13s remaining: 12.3s 10: learn: 0.5618216 test: 0.5645751 best: 0.5645751 (10) total: 2.33s remaining: 12.1s 11: learn: 0.5520583 test: 0.5549654 best: 0.5549654 (11) total: 2.56s remaining: 12s 12: learn: 0.5424975 test: 0.5456323 best: 0.5456323 (12) total: 2.73s remaining: 11.6s 13: learn: 0.5331425 test: 0.5364913 best: 0.5364913 (13) total: 2.95s remaining: 11.4s 14: learn: 0.5242647 test: 0.5277787 best: 0.5277787 (14) total: 3.17s remaining: 11.2s 15: learn: 0.5154033 test: 0.5191585 best: 0.5191585 (15) total: 3.4s remaining: 11.1s 16: learn: 0.5068542 test: 0.5108756 best: 0.5108756 (16) total: 3.62s remaining: 10.9s 17: learn: 0.4985364 test: 0.5027703 best: 0.5027703 (17) total: 3.82s remaining: 10.6s 18: learn: 0.4903452 test: 0.4947443 best: 0.4947443 (18) total: 4.01s remaining: 10.3s 19: learn: 0.4825082 test: 0.4871609 best: 0.4871609 (19) total: 4.24s remaining: 10.2s 20: learn: 0.4747854 test: 0.4797329 best: 0.4797329 (20) total: 4.56s remaining: 10.2s 21: learn: 0.4673905 test: 0.4727131 best: 0.4727131 (21) total: 4.85s remaining: 10.1s 22: learn: 0.4605244 test: 0.4662210 best: 0.4662210 (22) total: 5.13s remaining: 10s 23: learn: 0.4536565 test: 0.4596426 best: 0.4596426 (23) total: 5.37s remaining: 9.85s 24: learn: 0.4468609 test: 0.4530314 best: 0.4530314 (24) total: 5.58s remaining: 9.59s 25: learn: 0.4403182 test: 0.4467296 best: 0.4467296 (25) total: 5.83s remaining: 9.42s 26: learn: 0.4338483 test: 0.4404514 best: 0.4404514 (26) total: 6.05s remaining: 9.18s 27: learn: 0.4276609 test: 0.4344214 best: 0.4344214 (27) total: 6.28s remaining: 8.98s 28: learn: 
0.4216462 test: 0.4286092 best: 0.4286092 (28) total: 6.47s remaining: 8.71s 29: learn: 0.4157851 test: 0.4229668 best: 0.4229668 (29) total: 6.7s remaining: 8.48s 30: learn: 0.4100995 test: 0.4174923 best: 0.4174923 (30) total: 6.93s remaining: 8.28s 31: learn: 0.4046658 test: 0.4123212 best: 0.4123212 (31) total: 7.16s remaining: 8.05s 32: learn: 0.3992299 test: 0.4071025 best: 0.4071025 (32) total: 7.38s remaining: 7.82s 33: learn: 0.3940851 test: 0.4021008 best: 0.4021008 (33) total: 7.62s remaining: 7.62s 34: learn: 0.3889862 test: 0.3972486 best: 0.3972486 (34) total: 7.83s remaining: 7.38s 35: learn: 0.3840369 test: 0.3924956 best: 0.3924956 (35) total: 8.05s remaining: 7.15s 36: learn: 0.3790095 test: 0.3877073 best: 0.3877073 (36) total: 8.29s remaining: 6.94s 37: learn: 0.3744634 test: 0.3832634 best: 0.3832634 (37) total: 8.52s remaining: 6.73s 38: learn: 0.3700519 test: 0.3789949 best: 0.3789949 (38) total: 8.81s remaining: 6.55s 39: learn: 0.3656994 test: 0.3748361 best: 0.3748361 (39) total: 9.07s remaining: 6.35s 40: learn: 0.3615398 test: 0.3708297 best: 0.3708297 (40) total: 9.27s remaining: 6.1s 41: learn: 0.3576802 test: 0.3670690 best: 0.3670690 (41) total: 9.46s remaining: 5.86s 42: learn: 0.3535924 test: 0.3632077 best: 0.3632077 (42) total: 9.71s remaining: 5.65s 43: learn: 0.3497486 test: 0.3595572 best: 0.3595572 (43) total: 9.93s remaining: 5.42s 44: learn: 0.3457842 test: 0.3558416 best: 0.3558416 (44) total: 10.2s remaining: 5.21s 45: learn: 0.3424278 test: 0.3526206 best: 0.3526206 (45) total: 10.4s remaining: 4.96s 46: learn: 0.3388500 test: 0.3492391 best: 0.3492391 (46) total: 10.6s remaining: 4.74s 47: learn: 0.3354498 test: 0.3459996 best: 0.3459996 (47) total: 10.8s remaining: 4.52s 48: learn: 0.3319106 test: 0.3426878 best: 0.3426878 (48) total: 11.1s remaining: 4.3s 49: learn: 0.3285794 test: 0.3395303 best: 0.3395303 (49) total: 11.3s remaining: 4.08s 50: learn: 0.3253373 test: 0.3364570 best: 0.3364570 (50) total: 11.5s 
remaining: 3.85s 51: learn: 0.3222277 test: 0.3335471 best: 0.3335471 (51) total: 11.8s remaining: 3.63s 52: learn: 0.3191771 test: 0.3306987 best: 0.3306987 (52) total: 12s remaining: 3.41s 53: learn: 0.3163666 test: 0.3280694 best: 0.3280694 (53) total: 12.3s remaining: 3.18s 54: learn: 0.3135398 test: 0.3254615 best: 0.3254615 (54) total: 12.5s remaining: 2.96s 55: learn: 0.3107864 test: 0.3229560 best: 0.3229560 (55) total: 12.7s remaining: 2.73s 56: learn: 0.3082396 test: 0.3205446 best: 0.3205446 (56) total: 13s remaining: 2.5s 57: learn: 0.3055397 test: 0.3180445 best: 0.3180445 (57) total: 13.2s remaining: 2.27s 58: learn: 0.3030356 test: 0.3157779 best: 0.3157779 (58) total: 13.4s remaining: 2.05s 59: learn: 0.3003334 test: 0.3132981 best: 0.3132981 (59) total: 13.7s remaining: 1.82s 60: learn: 0.2979157 test: 0.3110672 best: 0.3110672 (60) total: 13.9s remaining: 1.59s 61: learn: 0.2954875 test: 0.3088490 best: 0.3088490 (61) total: 14.2s remaining: 1.37s 62: learn: 0.2930101 test: 0.3065283 best: 0.3065283 (62) total: 14.4s remaining: 1.14s 63: learn: 0.2908301 test: 0.3045025 best: 0.3045025 (63) total: 14.6s remaining: 915ms 64: learn: 0.2883864 test: 0.3023865 best: 0.3023865 (64) total: 14.9s remaining: 689ms 65: learn: 0.2861969 test: 0.3005031 best: 0.3005031 (65) total: 15.2s remaining: 460ms 66: learn: 0.2840559 test: 0.2985693 best: 0.2985693 (66) total: 15.4s remaining: 230ms 67: learn: 0.2819347 test: 0.2966230 best: 0.2966230 (67) total: 15.7s remaining: 0us bestTest = 0.2966229601 bestIteration = 67 Trial 69, Fold 5: Log loss = 0.29650773010314624, Average precision = 0.9724871864318616, ROC-AUC = 0.9693863961503445, Elapsed Time = 15.81532199999856 seconds
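The five fold-level log losses of Trial 69 are tightly clustered. Averaging them (a sketch assuming the study's trial-level objective is the mean fold log loss, which the per-fold summaries suggest but the log does not state explicitly) gives the score this trial would report:

```python
from statistics import mean, pstdev

# Per-fold validation log losses reported for Trial 69 (folds 1-5).
fold_logloss = [
    0.29378810395775945,  # Fold 1
    0.2929318457145807,   # Fold 2
    0.2901548698436933,   # Fold 3
    0.2929207009664367,   # Fold 4
    0.29650773010314624,  # Fold 5
]

trial_score = mean(fold_logloss)  # candidate trial-level objective
spread = pstdev(fold_logloss)     # fold-to-fold variability

print(f"Trial 69 mean log loss = {trial_score:.4f} (+/- {spread:.4f})")
# -> Trial 69 mean log loss = 0.2933 (+/- 0.0020)
```

The small spread across folds indicates the StratifiedGroupKFold splits yield stable validation behavior for this parameter set.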
Optimization Progress: 70%|####### | 70/100 [1:56:52<38:36, 77.23s/it]
Trial 70, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 70, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[per-iteration CatBoost lines elided; best test log loss 0.1992568 at iteration 57; log truncated at iteration 89]
test: 0.2001814 best: 0.1992568 (57) total: 43.2s remaining: 3.84s 90: learn: 0.1091195 test: 0.2002284 best: 0.1992568 (57) total: 43.5s remaining: 3.35s 91: learn: 0.1085984 test: 0.2000942 best: 0.1992568 (57) total: 44s remaining: 2.87s 92: learn: 0.1081529 test: 0.2004042 best: 0.1992568 (57) total: 44.4s remaining: 2.38s 93: learn: 0.1077820 test: 0.2005607 best: 0.1992568 (57) total: 44.7s remaining: 1.9s 94: learn: 0.1071577 test: 0.2008356 best: 0.1992568 (57) total: 45.4s remaining: 1.43s 95: learn: 0.1067390 test: 0.2008031 best: 0.1992568 (57) total: 45.7s remaining: 952ms 96: learn: 0.1059280 test: 0.2008478 best: 0.1992568 (57) total: 46.3s remaining: 477ms 97: learn: 0.1053433 test: 0.2010897 best: 0.1992568 (57) total: 46.8s remaining: 0us bestTest = 0.1992568471 bestIteration = 57 Shrink model to first 58 iterations. Trial 70, Fold 1: Log loss = 0.19856434204522747, Average precision = 0.9738502437692736, ROC-AUC = 0.971148158902224, Elapsed Time = 46.91665510000166 seconds Trial 70, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 70, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986 0: learn: 0.5903495 test: 0.5937671 best: 0.5937671 (0) total: 733ms remaining: 1m 11s 1: learn: 0.5109283 test: 0.5157377 best: 0.5157377 (1) total: 1.26s remaining: 1m 2: learn: 0.4486895 test: 0.4564853 best: 0.4564853 (2) total: 2.08s remaining: 1m 5s 3: learn: 0.3991440 test: 0.4083987 best: 0.4083987 (3) total: 2.67s remaining: 1m 2s 4: learn: 0.3602470 test: 0.3712030 best: 0.3712030 (4) total: 3.33s remaining: 1m 2s 5: learn: 0.3288458 test: 0.3409858 best: 0.3409858 (5) total: 3.98s remaining: 1m 1s 6: learn: 0.3030557 test: 0.3162873 best: 0.3162873 (6) total: 4.67s remaining: 1m 7: learn: 0.2820161 test: 0.2969770 best: 0.2969770 (7) total: 5.46s remaining: 1m 1s 8: learn: 0.2650595 test: 0.2808997 best: 0.2808997 (8) total: 6s remaining: 59.3s 9: learn: 0.2512379 test: 0.2678163 best: 
0.2678163 (9) total: 6.54s remaining: 57.5s 10: learn: 0.2406356 test: 0.2582500 best: 0.2582500 (10) total: 7.06s remaining: 55.8s 11: learn: 0.2306208 test: 0.2495441 best: 0.2495441 (11) total: 7.66s remaining: 54.9s 12: learn: 0.2228246 test: 0.2423782 best: 0.2423782 (12) total: 8.12s remaining: 53.1s 13: learn: 0.2159826 test: 0.2363771 best: 0.2363771 (13) total: 8.71s remaining: 52.2s 14: learn: 0.2100466 test: 0.2309979 best: 0.2309979 (14) total: 9.24s remaining: 51.1s 15: learn: 0.2042586 test: 0.2259809 best: 0.2259809 (15) total: 9.72s remaining: 49.8s 16: learn: 0.1988514 test: 0.2217163 best: 0.2217163 (16) total: 10.2s remaining: 48.8s 17: learn: 0.1938219 test: 0.2175412 best: 0.2175412 (17) total: 10.7s remaining: 47.6s 18: learn: 0.1893671 test: 0.2136764 best: 0.2136764 (18) total: 11.3s remaining: 46.8s 19: learn: 0.1854990 test: 0.2105509 best: 0.2105509 (19) total: 11.9s remaining: 46.4s 20: learn: 0.1822671 test: 0.2082736 best: 0.2082736 (20) total: 12.4s remaining: 45.5s 21: learn: 0.1793576 test: 0.2061325 best: 0.2061325 (21) total: 12.9s remaining: 44.5s 22: learn: 0.1766524 test: 0.2039809 best: 0.2039809 (22) total: 13.4s remaining: 43.5s 23: learn: 0.1740531 test: 0.2022919 best: 0.2022919 (23) total: 13.9s remaining: 42.8s 24: learn: 0.1717333 test: 0.2005049 best: 0.2005049 (24) total: 14.3s remaining: 41.8s 25: learn: 0.1693537 test: 0.1992942 best: 0.1992942 (25) total: 14.9s remaining: 41.2s 26: learn: 0.1675793 test: 0.1982418 best: 0.1982418 (26) total: 15.4s remaining: 40.4s 27: learn: 0.1656878 test: 0.1967932 best: 0.1967932 (27) total: 15.8s remaining: 39.4s 28: learn: 0.1637316 test: 0.1956539 best: 0.1956539 (28) total: 16.3s remaining: 38.8s 29: learn: 0.1622235 test: 0.1945202 best: 0.1945202 (29) total: 16.7s remaining: 37.8s 30: learn: 0.1603196 test: 0.1940123 best: 0.1940123 (30) total: 17.3s remaining: 37.4s 31: learn: 0.1587989 test: 0.1933214 best: 0.1933214 (31) total: 17.8s remaining: 36.7s 32: learn: 
0.1572392 test: 0.1922856 best: 0.1922856 (32) total: 18.2s remaining: 35.9s 33: learn: 0.1555415 test: 0.1915029 best: 0.1915029 (33) total: 18.7s remaining: 35.2s 34: learn: 0.1542581 test: 0.1913877 best: 0.1913877 (34) total: 19.2s remaining: 34.5s 35: learn: 0.1527974 test: 0.1909475 best: 0.1909475 (35) total: 19.7s remaining: 34s 36: learn: 0.1514191 test: 0.1904755 best: 0.1904755 (36) total: 20.2s remaining: 33.3s 37: learn: 0.1502270 test: 0.1902423 best: 0.1902423 (37) total: 20.7s remaining: 32.7s 38: learn: 0.1490257 test: 0.1898841 best: 0.1898841 (38) total: 21.2s remaining: 32s 39: learn: 0.1480089 test: 0.1895869 best: 0.1895869 (39) total: 21.5s remaining: 31.2s 40: learn: 0.1468255 test: 0.1892219 best: 0.1892219 (40) total: 22s remaining: 30.6s 41: learn: 0.1457890 test: 0.1889170 best: 0.1889170 (41) total: 22.5s remaining: 30s 42: learn: 0.1445309 test: 0.1885946 best: 0.1885946 (42) total: 23s remaining: 29.4s 43: learn: 0.1432302 test: 0.1882746 best: 0.1882746 (43) total: 23.6s remaining: 28.9s 44: learn: 0.1422338 test: 0.1879910 best: 0.1879910 (44) total: 24s remaining: 28.3s 45: learn: 0.1410800 test: 0.1876842 best: 0.1876842 (45) total: 24.5s remaining: 27.7s 46: learn: 0.1403643 test: 0.1873134 best: 0.1873134 (46) total: 24.9s remaining: 27s 47: learn: 0.1395635 test: 0.1869573 best: 0.1869573 (47) total: 25.2s remaining: 26.3s 48: learn: 0.1389249 test: 0.1868165 best: 0.1868165 (48) total: 25.6s remaining: 25.6s 49: learn: 0.1383097 test: 0.1866388 best: 0.1866388 (49) total: 25.9s remaining: 24.9s 50: learn: 0.1371398 test: 0.1859977 best: 0.1859977 (50) total: 26.4s remaining: 24.4s 51: learn: 0.1362723 test: 0.1858866 best: 0.1858866 (51) total: 26.9s remaining: 23.8s 52: learn: 0.1353078 test: 0.1856730 best: 0.1856730 (52) total: 27.3s remaining: 23.2s 53: learn: 0.1342302 test: 0.1854412 best: 0.1854412 (53) total: 27.9s remaining: 22.7s 54: learn: 0.1334857 test: 0.1853934 best: 0.1853934 (54) total: 28.3s remaining: 22.2s 
55: learn: 0.1325612 test: 0.1852854 best: 0.1852854 (55) total: 28.9s remaining: 21.7s 56: learn: 0.1314313 test: 0.1847653 best: 0.1847653 (56) total: 29.4s remaining: 21.2s 57: learn: 0.1306806 test: 0.1847680 best: 0.1847653 (56) total: 29.8s remaining: 20.6s 58: learn: 0.1297154 test: 0.1845564 best: 0.1845564 (58) total: 30.3s remaining: 20.1s 59: learn: 0.1287907 test: 0.1842792 best: 0.1842792 (59) total: 30.8s remaining: 19.5s 60: learn: 0.1280853 test: 0.1843330 best: 0.1842792 (59) total: 31.2s remaining: 18.9s 61: learn: 0.1275072 test: 0.1840961 best: 0.1840961 (61) total: 31.6s remaining: 18.3s 62: learn: 0.1268617 test: 0.1841083 best: 0.1840961 (61) total: 32s remaining: 17.8s 63: learn: 0.1259326 test: 0.1841705 best: 0.1840961 (61) total: 32.5s remaining: 17.3s 64: learn: 0.1252697 test: 0.1840031 best: 0.1840031 (64) total: 32.9s remaining: 16.7s 65: learn: 0.1249639 test: 0.1839370 best: 0.1839370 (65) total: 33.1s remaining: 16.1s 66: learn: 0.1242512 test: 0.1838234 best: 0.1838234 (66) total: 33.6s remaining: 15.6s 67: learn: 0.1237280 test: 0.1837524 best: 0.1837524 (67) total: 34s remaining: 15s 68: learn: 0.1228790 test: 0.1835007 best: 0.1835007 (68) total: 34.4s remaining: 14.5s 69: learn: 0.1223081 test: 0.1834296 best: 0.1834296 (69) total: 34.8s remaining: 13.9s 70: learn: 0.1217906 test: 0.1833127 best: 0.1833127 (70) total: 35.2s remaining: 13.4s 71: learn: 0.1212838 test: 0.1833760 best: 0.1833127 (70) total: 35.5s remaining: 12.8s 72: learn: 0.1204896 test: 0.1832722 best: 0.1832722 (72) total: 36s remaining: 12.3s 73: learn: 0.1201458 test: 0.1830648 best: 0.1830648 (73) total: 36.3s remaining: 11.8s 74: learn: 0.1194429 test: 0.1830603 best: 0.1830603 (74) total: 36.8s remaining: 11.3s 75: learn: 0.1187049 test: 0.1827500 best: 0.1827500 (75) total: 37.4s remaining: 10.8s 76: learn: 0.1178918 test: 0.1826149 best: 0.1826149 (76) total: 37.8s remaining: 10.3s 77: learn: 0.1174233 test: 0.1824768 best: 0.1824768 (77) total: 38.2s 
remaining: 9.79s 78: learn: 0.1167596 test: 0.1821638 best: 0.1821638 (78) total: 38.6s remaining: 9.29s 79: learn: 0.1159332 test: 0.1822310 best: 0.1821638 (78) total: 39.3s remaining: 8.84s 80: learn: 0.1154232 test: 0.1823413 best: 0.1821638 (78) total: 39.7s remaining: 8.33s 81: learn: 0.1147235 test: 0.1825515 best: 0.1821638 (78) total: 40.1s remaining: 7.83s 82: learn: 0.1143806 test: 0.1825361 best: 0.1821638 (78) total: 40.4s remaining: 7.29s 83: learn: 0.1137468 test: 0.1825702 best: 0.1821638 (78) total: 40.8s remaining: 6.79s 84: learn: 0.1134625 test: 0.1824195 best: 0.1821638 (78) total: 41.1s remaining: 6.28s 85: learn: 0.1130208 test: 0.1823941 best: 0.1821638 (78) total: 41.5s remaining: 5.79s 86: learn: 0.1123604 test: 0.1823931 best: 0.1821638 (78) total: 42s remaining: 5.3s 87: learn: 0.1120570 test: 0.1824323 best: 0.1821638 (78) total: 42.2s remaining: 4.8s 88: learn: 0.1116114 test: 0.1824444 best: 0.1821638 (78) total: 42.7s remaining: 4.31s 89: learn: 0.1113400 test: 0.1823912 best: 0.1821638 (78) total: 43s remaining: 3.82s 90: learn: 0.1109395 test: 0.1822926 best: 0.1821638 (78) total: 43.4s remaining: 3.34s 91: learn: 0.1106336 test: 0.1823708 best: 0.1821638 (78) total: 43.6s remaining: 2.85s 92: learn: 0.1099393 test: 0.1824049 best: 0.1821638 (78) total: 44.1s remaining: 2.37s 93: learn: 0.1096727 test: 0.1824819 best: 0.1821638 (78) total: 44.4s remaining: 1.89s 94: learn: 0.1089879 test: 0.1824099 best: 0.1821638 (78) total: 44.9s remaining: 1.42s 95: learn: 0.1081779 test: 0.1823502 best: 0.1821638 (78) total: 45.5s remaining: 948ms 96: learn: 0.1076583 test: 0.1820390 best: 0.1820390 (96) total: 45.9s remaining: 474ms 97: learn: 0.1070059 test: 0.1822019 best: 0.1820390 (96) total: 46.4s remaining: 0us bestTest = 0.1820390205 bestIteration = 96 Shrink model to first 97 iterations. 
Trial 70, Fold 2: Log loss = 0.18155097510167162, Average precision = 0.9771623947613914, ROC-AUC = 0.9744726840328856, Elapsed Time = 46.57567019999988 seconds Trial 70, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 70, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 0: learn: 0.5896952 test: 0.5909205 best: 0.5909205 (0) total: 742ms remaining: 1m 11s 1: learn: 0.5099365 test: 0.5122185 best: 0.5122185 (1) total: 1.41s remaining: 1m 7s 2: learn: 0.4474920 test: 0.4514671 best: 0.4514671 (2) total: 2.14s remaining: 1m 7s 3: learn: 0.3974730 test: 0.4033041 best: 0.4033041 (3) total: 2.71s remaining: 1m 3s 4: learn: 0.3587799 test: 0.3665277 best: 0.3665277 (4) total: 3.3s remaining: 1m 1s 5: learn: 0.3282815 test: 0.3365930 best: 0.3365930 (5) total: 3.85s remaining: 59.1s 6: learn: 0.3034459 test: 0.3123067 best: 0.3123067 (6) total: 4.33s remaining: 56.3s 7: learn: 0.2832563 test: 0.2936972 best: 0.2936972 (7) total: 4.91s remaining: 55.2s 8: learn: 0.2667189 test: 0.2790983 best: 0.2790983 (8) total: 5.46s remaining: 54s 9: learn: 0.2540349 test: 0.2674719 best: 0.2674719 (9) total: 6.03s remaining: 53.1s 10: learn: 0.2427419 test: 0.2572960 best: 0.2572960 (10) total: 6.6s remaining: 52.2s 11: learn: 0.2334824 test: 0.2491497 best: 0.2491497 (11) total: 7.12s remaining: 51s 12: learn: 0.2248907 test: 0.2426009 best: 0.2426009 (12) total: 7.7s remaining: 50.3s 13: learn: 0.2176156 test: 0.2364581 best: 0.2364581 (13) total: 8.32s remaining: 49.9s 14: learn: 0.2105319 test: 0.2309006 best: 0.2309006 (14) total: 8.97s remaining: 49.6s 15: learn: 0.2056585 test: 0.2264427 best: 0.2264427 (15) total: 9.46s remaining: 48.5s 16: learn: 0.2002490 test: 0.2221459 best: 0.2221459 (16) total: 9.98s remaining: 47.6s 17: learn: 0.1958554 test: 0.2186346 best: 0.2186346 (17) total: 10.5s remaining: 46.5s 18: learn: 0.1914289 test: 0.2153419 best: 0.2153419 (18) total: 11s remaining: 45.8s 19: learn: 
0.1876021 test: 0.2122994 best: 0.2122994 (19) total: 11.5s remaining: 44.8s 20: learn: 0.1841942 test: 0.2098508 best: 0.2098508 (20) total: 12s remaining: 44.1s 21: learn: 0.1809568 test: 0.2074226 best: 0.2074226 (21) total: 12.5s remaining: 43.1s 22: learn: 0.1778594 test: 0.2054688 best: 0.2054688 (22) total: 13.1s remaining: 42.8s 23: learn: 0.1753442 test: 0.2036256 best: 0.2036256 (23) total: 13.6s remaining: 41.8s 24: learn: 0.1727680 test: 0.2018714 best: 0.2018714 (24) total: 14.1s remaining: 41.1s 25: learn: 0.1706630 test: 0.2010972 best: 0.2010972 (25) total: 14.6s remaining: 40.4s 26: learn: 0.1685118 test: 0.2000259 best: 0.2000259 (26) total: 15.1s remaining: 39.8s 27: learn: 0.1664659 test: 0.1992184 best: 0.1992184 (27) total: 15.7s remaining: 39.2s 28: learn: 0.1646393 test: 0.1982494 best: 0.1982494 (28) total: 16.1s remaining: 38.4s 29: learn: 0.1636244 test: 0.1975162 best: 0.1975162 (29) total: 16.4s remaining: 37.2s 30: learn: 0.1621698 test: 0.1965954 best: 0.1965954 (30) total: 16.8s remaining: 36.3s 31: learn: 0.1606273 test: 0.1955543 best: 0.1955543 (31) total: 17.2s remaining: 35.5s 32: learn: 0.1589563 test: 0.1947279 best: 0.1947279 (32) total: 17.7s remaining: 34.8s 33: learn: 0.1575280 test: 0.1937659 best: 0.1937659 (33) total: 18.1s remaining: 34s 34: learn: 0.1559821 test: 0.1929502 best: 0.1929502 (34) total: 18.7s remaining: 33.6s 35: learn: 0.1553017 test: 0.1926735 best: 0.1926735 (35) total: 19s remaining: 32.7s 36: learn: 0.1541629 test: 0.1921422 best: 0.1921422 (36) total: 19.4s remaining: 32s 37: learn: 0.1525169 test: 0.1919166 best: 0.1919166 (37) total: 19.9s remaining: 31.5s 38: learn: 0.1511259 test: 0.1916008 best: 0.1916008 (38) total: 20.5s remaining: 31s 39: learn: 0.1497258 test: 0.1911351 best: 0.1911351 (39) total: 21s remaining: 30.4s 40: learn: 0.1490838 test: 0.1909034 best: 0.1909034 (40) total: 21.2s remaining: 29.5s 41: learn: 0.1475696 test: 0.1905469 best: 0.1905469 (41) total: 21.9s remaining: 
29.1s 42: learn: 0.1462386 test: 0.1900922 best: 0.1900922 (42) total: 22.3s remaining: 28.5s 43: learn: 0.1452854 test: 0.1898982 best: 0.1898982 (43) total: 22.7s remaining: 27.9s 44: learn: 0.1441090 test: 0.1895707 best: 0.1895707 (44) total: 23.3s remaining: 27.4s 45: learn: 0.1432505 test: 0.1894023 best: 0.1894023 (45) total: 23.6s remaining: 26.7s 46: learn: 0.1421531 test: 0.1891987 best: 0.1891987 (46) total: 24.1s remaining: 26.1s 47: learn: 0.1407088 test: 0.1890067 best: 0.1890067 (47) total: 24.7s remaining: 25.8s 48: learn: 0.1395841 test: 0.1885338 best: 0.1885338 (48) total: 25.3s remaining: 25.3s 49: learn: 0.1385148 test: 0.1885945 best: 0.1885338 (48) total: 25.7s remaining: 24.7s 50: learn: 0.1375899 test: 0.1883414 best: 0.1883414 (50) total: 26.2s remaining: 24.2s 51: learn: 0.1366472 test: 0.1882363 best: 0.1882363 (51) total: 26.7s remaining: 23.6s 52: learn: 0.1360266 test: 0.1881417 best: 0.1881417 (52) total: 27s remaining: 22.9s 53: learn: 0.1350170 test: 0.1880539 best: 0.1880539 (53) total: 27.5s remaining: 22.4s 54: learn: 0.1342010 test: 0.1880384 best: 0.1880384 (54) total: 28s remaining: 21.9s 55: learn: 0.1338215 test: 0.1879896 best: 0.1879896 (55) total: 28.2s remaining: 21.2s 56: learn: 0.1333924 test: 0.1881194 best: 0.1879896 (55) total: 28.5s remaining: 20.5s 57: learn: 0.1328504 test: 0.1880521 best: 0.1879896 (55) total: 28.8s remaining: 19.9s 58: learn: 0.1320202 test: 0.1877901 best: 0.1877901 (58) total: 29.3s remaining: 19.3s 59: learn: 0.1313107 test: 0.1875958 best: 0.1875958 (59) total: 29.7s remaining: 18.8s 60: learn: 0.1303285 test: 0.1875156 best: 0.1875156 (60) total: 30.3s remaining: 18.4s 61: learn: 0.1296986 test: 0.1873856 best: 0.1873856 (61) total: 30.7s remaining: 17.8s 62: learn: 0.1291353 test: 0.1872040 best: 0.1872040 (62) total: 31s remaining: 17.2s 63: learn: 0.1286295 test: 0.1872165 best: 0.1872040 (62) total: 31.4s remaining: 16.7s 64: learn: 0.1276206 test: 0.1873623 best: 0.1872040 (62) 
total: 31.9s remaining: 16.2s 65: learn: 0.1264591 test: 0.1871621 best: 0.1871621 (65) total: 32.5s remaining: 15.8s 66: learn: 0.1259634 test: 0.1870695 best: 0.1870695 (66) total: 32.9s remaining: 15.2s 67: learn: 0.1248936 test: 0.1869412 best: 0.1869412 (67) total: 33.5s remaining: 14.8s 68: learn: 0.1242801 test: 0.1869481 best: 0.1869412 (67) total: 33.9s remaining: 14.2s 69: learn: 0.1234649 test: 0.1868858 best: 0.1868858 (69) total: 34.4s remaining: 13.8s 70: learn: 0.1228557 test: 0.1868086 best: 0.1868086 (70) total: 34.8s remaining: 13.2s 71: learn: 0.1222460 test: 0.1866279 best: 0.1866279 (71) total: 35.1s remaining: 12.7s 72: learn: 0.1217392 test: 0.1866609 best: 0.1866279 (71) total: 35.4s remaining: 12.1s 73: learn: 0.1212061 test: 0.1865855 best: 0.1865855 (73) total: 35.8s remaining: 11.6s 74: learn: 0.1205830 test: 0.1867629 best: 0.1865855 (73) total: 36.2s remaining: 11.1s 75: learn: 0.1197731 test: 0.1866525 best: 0.1865855 (73) total: 36.7s remaining: 10.6s 76: learn: 0.1188814 test: 0.1866393 best: 0.1865855 (73) total: 37.2s remaining: 10.1s 77: learn: 0.1182859 test: 0.1866702 best: 0.1865855 (73) total: 37.6s remaining: 9.65s 78: learn: 0.1177748 test: 0.1866364 best: 0.1865855 (73) total: 38.1s remaining: 9.16s 79: learn: 0.1172924 test: 0.1867552 best: 0.1865855 (73) total: 38.5s remaining: 8.65s 80: learn: 0.1164260 test: 0.1865921 best: 0.1865855 (73) total: 39s remaining: 8.2s 81: learn: 0.1158539 test: 0.1867031 best: 0.1865855 (73) total: 39.4s remaining: 7.69s 82: learn: 0.1153921 test: 0.1866044 best: 0.1865855 (73) total: 39.8s remaining: 7.19s 83: learn: 0.1147294 test: 0.1862893 best: 0.1862893 (83) total: 40.3s remaining: 6.72s 84: learn: 0.1140109 test: 0.1863474 best: 0.1862893 (83) total: 40.9s remaining: 6.25s 85: learn: 0.1134467 test: 0.1863306 best: 0.1862893 (83) total: 41.3s remaining: 5.76s 86: learn: 0.1132785 test: 0.1863418 best: 0.1862893 (83) total: 41.4s remaining: 5.24s 87: learn: 0.1126590 test: 0.1863500 
best: 0.1862893 (83) total: 41.8s remaining: 4.75s 88: learn: 0.1121741 test: 0.1862952 best: 0.1862893 (83) total: 42.3s remaining: 4.27s 89: learn: 0.1118659 test: 0.1864118 best: 0.1862893 (83) total: 42.5s remaining: 3.78s 90: learn: 0.1115306 test: 0.1863569 best: 0.1862893 (83) total: 42.8s remaining: 3.29s 91: learn: 0.1111340 test: 0.1863664 best: 0.1862893 (83) total: 43.1s remaining: 2.81s 92: learn: 0.1103868 test: 0.1861989 best: 0.1861989 (92) total: 43.6s remaining: 2.35s 93: learn: 0.1097452 test: 0.1864650 best: 0.1861989 (92) total: 44.2s remaining: 1.88s 94: learn: 0.1094328 test: 0.1862831 best: 0.1861989 (92) total: 44.5s remaining: 1.4s 95: learn: 0.1089817 test: 0.1863722 best: 0.1861989 (92) total: 44.8s remaining: 933ms 96: learn: 0.1083339 test: 0.1862068 best: 0.1861989 (92) total: 45.3s remaining: 467ms 97: learn: 0.1077885 test: 0.1861192 best: 0.1861192 (97) total: 45.8s remaining: 0us bestTest = 0.186119214 bestIteration = 97 Trial 70, Fold 3: Log loss = 0.18577943179712123, Average precision = 0.9753117041345816, ROC-AUC = 0.9742210298465777, Elapsed Time = 45.941452199997 seconds Trial 70, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 70, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 0: learn: 0.5903866 test: 0.5926740 best: 0.5926740 (0) total: 647ms remaining: 1m 2s 1: learn: 0.5116286 test: 0.5162784 best: 0.5162784 (1) total: 1.18s remaining: 56.9s 2: learn: 0.4494189 test: 0.4555256 best: 0.4555256 (2) total: 1.74s remaining: 55.2s 3: learn: 0.3985661 test: 0.4068757 best: 0.4068757 (3) total: 2.43s remaining: 57s 4: learn: 0.3599436 test: 0.3705242 best: 0.3705242 (4) total: 2.95s remaining: 55s 5: learn: 0.3291947 test: 0.3413378 best: 0.3413378 (5) total: 3.54s remaining: 54.2s 6: learn: 0.3056764 test: 0.3193571 best: 0.3193571 (6) total: 4.03s remaining: 52.3s 7: learn: 0.2848018 test: 0.2993082 best: 0.2993082 (7) total: 4.47s remaining: 50.3s 8: 
learn: 0.2683200 test: 0.2836972 best: 0.2836972 (8) total: 5.03s remaining: 49.8s 9: learn: 0.2544236 test: 0.2708741 best: 0.2708741 (9) total: 5.55s remaining: 48.9s 10: learn: 0.2421673 test: 0.2598218 best: 0.2598218 (10) total: 6.08s remaining: 48.1s 11: learn: 0.2321446 test: 0.2512839 best: 0.2512839 (11) total: 6.51s remaining: 46.7s 12: learn: 0.2247804 test: 0.2449503 best: 0.2449503 (12) total: 6.97s remaining: 45.6s 13: learn: 0.2167446 test: 0.2385240 best: 0.2385240 (13) total: 7.59s remaining: 45.6s 14: learn: 0.2104622 test: 0.2331092 best: 0.2331092 (14) total: 8.09s remaining: 44.8s 15: learn: 0.2049084 test: 0.2288381 best: 0.2288381 (15) total: 8.7s remaining: 44.6s 16: learn: 0.1996204 test: 0.2248549 best: 0.2248549 (16) total: 9.22s remaining: 43.9s 17: learn: 0.1944856 test: 0.2213899 best: 0.2213899 (17) total: 9.94s remaining: 44.2s 18: learn: 0.1904442 test: 0.2183846 best: 0.2183846 (18) total: 10.5s remaining: 43.6s 19: learn: 0.1869269 test: 0.2154487 best: 0.2154487 (19) total: 10.9s remaining: 42.6s 20: learn: 0.1841956 test: 0.2135939 best: 0.2135939 (20) total: 11.3s remaining: 41.4s 21: learn: 0.1809192 test: 0.2116390 best: 0.2116390 (21) total: 11.7s remaining: 40.6s 22: learn: 0.1780880 test: 0.2101559 best: 0.2101559 (22) total: 12.2s remaining: 39.8s 23: learn: 0.1752879 test: 0.2084411 best: 0.2084411 (23) total: 12.6s remaining: 38.8s 24: learn: 0.1722349 test: 0.2067880 best: 0.2067880 (24) total: 13.3s remaining: 38.7s 25: learn: 0.1697879 test: 0.2052590 best: 0.2052590 (25) total: 13.8s remaining: 38.1s 26: learn: 0.1680131 test: 0.2040930 best: 0.2040930 (26) total: 14.2s remaining: 37.3s 27: learn: 0.1661250 test: 0.2029495 best: 0.2029495 (27) total: 14.6s remaining: 36.6s 28: learn: 0.1643004 test: 0.2020483 best: 0.2020483 (28) total: 15.2s remaining: 36.2s 29: learn: 0.1622007 test: 0.2021418 best: 0.2020483 (28) total: 16s remaining: 36.3s 30: learn: 0.1605851 test: 0.2014845 best: 0.2014845 (30) total: 16.7s 
remaining: 36.2s 31: learn: 0.1593385 test: 0.2009049 best: 0.2009049 (31) total: 17.3s remaining: 35.6s 32: learn: 0.1573883 test: 0.2003655 best: 0.2003655 (32) total: 18s remaining: 35.4s 33: learn: 0.1561716 test: 0.2000978 best: 0.2000978 (33) total: 18.4s remaining: 34.5s 34: learn: 0.1547655 test: 0.1996476 best: 0.1996476 (34) total: 18.9s remaining: 34s 35: learn: 0.1537214 test: 0.1989982 best: 0.1989982 (35) total: 19.2s remaining: 33.1s 36: learn: 0.1524592 test: 0.1986614 best: 0.1986614 (36) total: 19.7s remaining: 32.4s 37: learn: 0.1512639 test: 0.1982159 best: 0.1982159 (37) total: 20.2s remaining: 31.9s 38: learn: 0.1500103 test: 0.1978623 best: 0.1978623 (38) total: 20.7s remaining: 31.4s 39: learn: 0.1489568 test: 0.1976017 best: 0.1976017 (39) total: 21.2s remaining: 30.7s 40: learn: 0.1482261 test: 0.1970594 best: 0.1970594 (40) total: 21.4s remaining: 29.8s 41: learn: 0.1472388 test: 0.1969252 best: 0.1969252 (41) total: 21.8s remaining: 29.1s 42: learn: 0.1459125 test: 0.1965207 best: 0.1965207 (42) total: 22.3s remaining: 28.5s 43: learn: 0.1450426 test: 0.1960739 best: 0.1960739 (43) total: 22.6s remaining: 27.8s 44: learn: 0.1440874 test: 0.1959868 best: 0.1959868 (44) total: 23s remaining: 27.1s 45: learn: 0.1434531 test: 0.1960275 best: 0.1959868 (44) total: 23.3s remaining: 26.3s 46: learn: 0.1426864 test: 0.1959422 best: 0.1959422 (46) total: 23.6s remaining: 25.7s 47: learn: 0.1417727 test: 0.1955950 best: 0.1955950 (47) total: 24.1s remaining: 25.1s 48: learn: 0.1406708 test: 0.1952917 best: 0.1952917 (48) total: 24.6s remaining: 24.6s 49: learn: 0.1398507 test: 0.1952145 best: 0.1952145 (49) total: 25s remaining: 24s 50: learn: 0.1389719 test: 0.1949487 best: 0.1949487 (50) total: 25.5s remaining: 23.5s 51: learn: 0.1380472 test: 0.1949459 best: 0.1949459 (51) total: 25.9s remaining: 22.9s 52: learn: 0.1372903 test: 0.1949670 best: 0.1949459 (51) total: 26.3s remaining: 22.3s 53: learn: 0.1366290 test: 0.1948453 best: 0.1948453 
(53) total: 26.6s remaining: 21.7s 54: learn: 0.1356674 test: 0.1947245 best: 0.1947245 (54) total: 27.1s remaining: 21.2s 55: learn: 0.1343231 test: 0.1943676 best: 0.1943676 (55) total: 27.8s remaining: 20.8s 56: learn: 0.1334436 test: 0.1941313 best: 0.1941313 (56) total: 28.2s remaining: 20.3s 57: learn: 0.1326704 test: 0.1939875 best: 0.1939875 (57) total: 28.6s remaining: 19.8s 58: learn: 0.1322473 test: 0.1937925 best: 0.1937925 (58) total: 28.9s remaining: 19.1s 59: learn: 0.1316569 test: 0.1938481 best: 0.1937925 (58) total: 29.2s remaining: 18.5s 60: learn: 0.1308835 test: 0.1937013 best: 0.1937013 (60) total: 29.7s remaining: 18s 61: learn: 0.1301896 test: 0.1936174 best: 0.1936174 (61) total: 30.1s remaining: 17.5s 62: learn: 0.1294747 test: 0.1936500 best: 0.1936174 (61) total: 30.5s remaining: 16.9s 63: learn: 0.1291463 test: 0.1937705 best: 0.1936174 (61) total: 30.7s remaining: 16.3s 64: learn: 0.1282698 test: 0.1938988 best: 0.1936174 (61) total: 31.2s remaining: 15.8s 65: learn: 0.1275179 test: 0.1939173 best: 0.1936174 (61) total: 31.6s remaining: 15.3s 66: learn: 0.1264292 test: 0.1938085 best: 0.1936174 (61) total: 32.3s remaining: 15s 67: learn: 0.1255920 test: 0.1936310 best: 0.1936174 (61) total: 32.8s remaining: 14.5s 68: learn: 0.1247783 test: 0.1933971 best: 0.1933971 (68) total: 33.3s remaining: 14s 69: learn: 0.1239627 test: 0.1934268 best: 0.1933971 (68) total: 33.9s remaining: 13.5s 70: learn: 0.1229439 test: 0.1931467 best: 0.1931467 (70) total: 34.6s remaining: 13.2s 71: learn: 0.1225462 test: 0.1931909 best: 0.1931467 (70) total: 35s remaining: 12.6s 72: learn: 0.1217475 test: 0.1929360 best: 0.1929360 (72) total: 35.5s remaining: 12.1s 73: learn: 0.1212338 test: 0.1928165 best: 0.1928165 (73) total: 35.8s remaining: 11.6s 74: learn: 0.1204093 test: 0.1929877 best: 0.1928165 (73) total: 36.4s remaining: 11.1s 75: learn: 0.1199375 test: 0.1929339 best: 0.1928165 (73) total: 36.7s remaining: 10.6s 76: learn: 0.1192027 test: 0.1928159 
best: 0.1928159 (76) total: 37.2s remaining: 10.1s 77: learn: 0.1183243 test: 0.1926144 best: 0.1926144 (77) total: 37.8s remaining: 9.7s 78: learn: 0.1178015 test: 0.1926250 best: 0.1926144 (77) total: 38.2s remaining: 9.19s 79: learn: 0.1172633 test: 0.1926208 best: 0.1926144 (77) total: 38.6s remaining: 8.69s 80: learn: 0.1168945 test: 0.1925953 best: 0.1925953 (80) total: 39s remaining: 8.17s 81: learn: 0.1160975 test: 0.1925826 best: 0.1925826 (81) total: 39.5s remaining: 7.7s 82: learn: 0.1155785 test: 0.1927122 best: 0.1925826 (81) total: 39.9s remaining: 7.2s 83: learn: 0.1147983 test: 0.1927527 best: 0.1925826 (81) total: 40.4s remaining: 6.73s 84: learn: 0.1143240 test: 0.1927585 best: 0.1925826 (81) total: 40.8s remaining: 6.24s 85: learn: 0.1133622 test: 0.1930191 best: 0.1925826 (81) total: 41.4s remaining: 5.78s 86: learn: 0.1125493 test: 0.1929498 best: 0.1925826 (81) total: 41.9s remaining: 5.3s 87: learn: 0.1118920 test: 0.1926952 best: 0.1925826 (81) total: 42.4s remaining: 4.82s 88: learn: 0.1114217 test: 0.1926927 best: 0.1925826 (81) total: 42.8s remaining: 4.33s 89: learn: 0.1110389 test: 0.1926702 best: 0.1925826 (81) total: 43.1s remaining: 3.83s 90: learn: 0.1106194 test: 0.1926314 best: 0.1925826 (81) total: 43.4s remaining: 3.34s 91: learn: 0.1101499 test: 0.1924486 best: 0.1924486 (91) total: 43.8s remaining: 2.86s 92: learn: 0.1095291 test: 0.1924287 best: 0.1924287 (92) total: 44.4s remaining: 2.38s 93: learn: 0.1090287 test: 0.1925581 best: 0.1924287 (92) total: 44.8s remaining: 1.91s 94: learn: 0.1084822 test: 0.1925460 best: 0.1924287 (92) total: 45.3s remaining: 1.43s 95: learn: 0.1079710 test: 0.1927072 best: 0.1924287 (92) total: 45.9s remaining: 956ms 96: learn: 0.1073525 test: 0.1929110 best: 0.1924287 (92) total: 46.3s remaining: 478ms 97: learn: 0.1069757 test: 0.1929770 best: 0.1924287 (92) total: 46.7s remaining: 0us bestTest = 0.1924287435 bestIteration = 92 Shrink model to first 93 iterations. 
Trial 70, Fold 4: Log loss = 0.19186932567833512, Average precision = 0.9759671294837424, ROC-AUC = 0.9719536780712985, Elapsed Time = 46.824653199997556 seconds
Trial 70, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 70, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[per-iteration CatBoost log (98 iterations) omitted; test log loss fell from 0.5958298 at iteration 0 to bestTest = 0.1971900379 at bestIteration = 80. Shrink model to first 81 iterations.]
Trial 70, Fold 5: Log loss = 0.1964205843343422, Average precision = 0.9743607278685225, ROC-AUC = 0.972080931943593, Elapsed Time = 44.91034370000125 seconds
Optimization Progress: 71%|#######1 | 71/100 [2:00:51<1:00:42, 125.62s/it]
Trial 71, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 71, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[per-iteration CatBoost log (62 iterations) omitted; test log loss fell from 0.5937905 at iteration 0 to bestTest = 0.1984373284 at bestIteration = 60. Shrink model to first 61 iterations.]
Trial 71, Fold 1: Log loss = 0.19781661846645085, Average precision = 0.9755612533028836, ROC-AUC = 0.9712075303623406, Elapsed Time = 6.687996600001497 seconds
Trial 71, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 71, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[per-iteration CatBoost log (62 iterations) omitted; test log loss fell from 0.5951529 at iteration 0 to bestTest = 0.1879338785 at bestIteration = 61.]
Trial 71, Fold 2: Log loss = 0.18752264196186716, Average precision = 0.9766229820117249, ROC-AUC = 0.9739342828900428, Elapsed Time = 6.798018499997852 seconds
Trial 71, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 71, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[per-iteration CatBoost log (62 iterations) omitted; test log loss fell from 0.5938701 at iteration 0 to bestTest = 0.1898490596 at bestIteration = 56. Shrink model to first 57 iterations.]
Trial 71, Fold 3: Log loss = 0.18957033622173933, Average precision = 0.976479376844621, ROC-AUC = 0.9735502433040228, Elapsed Time = 6.571405700000469 seconds
Trial 71, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 71, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[per-iteration CatBoost log (62 iterations) omitted; test log loss fell from 0.5940946 at iteration 0 to bestTest = 0.1934568578 at bestIteration = 61.]
Trial 71, Fold 4: Log loss = 0.19298445961855104, Average precision = 0.9759976408213453, ROC-AUC = 0.9718607606028742, Elapsed Time = 7.578010999997787 seconds
Trial 71, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 71, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[per-iteration CatBoost log truncated mid-run at iteration 43; no final fold metrics appear in this excerpt.]
2.07s 44: learn: 0.1741118 test: 0.2056131 best: 0.2056131 (44) total: 5.19s remaining: 1.96s 45: learn: 0.1735127 test: 0.2053607 best: 0.2053607 (45) total: 5.3s remaining: 1.84s 46: learn: 0.1728638 test: 0.2052420 best: 0.2052420 (46) total: 5.42s remaining: 1.73s 47: learn: 0.1720468 test: 0.2048140 best: 0.2048140 (47) total: 5.54s remaining: 1.62s 48: learn: 0.1711904 test: 0.2042881 best: 0.2042881 (48) total: 5.66s remaining: 1.5s 49: learn: 0.1704531 test: 0.2036620 best: 0.2036620 (49) total: 5.77s remaining: 1.39s 50: learn: 0.1697828 test: 0.2032262 best: 0.2032262 (50) total: 5.89s remaining: 1.27s 51: learn: 0.1688116 test: 0.2032713 best: 0.2032262 (50) total: 6.01s remaining: 1.16s 52: learn: 0.1681736 test: 0.2030066 best: 0.2030066 (52) total: 6.14s remaining: 1.04s 53: learn: 0.1676730 test: 0.2026546 best: 0.2026546 (53) total: 6.27s remaining: 928ms 54: learn: 0.1671018 test: 0.2024825 best: 0.2024825 (54) total: 6.39s remaining: 813ms 55: learn: 0.1664903 test: 0.2020076 best: 0.2020076 (55) total: 6.51s remaining: 698ms 56: learn: 0.1658311 test: 0.2021094 best: 0.2020076 (55) total: 6.63s remaining: 581ms 57: learn: 0.1653382 test: 0.2019074 best: 0.2019074 (57) total: 6.74s remaining: 465ms 58: learn: 0.1644788 test: 0.2019069 best: 0.2019069 (58) total: 6.86s remaining: 349ms 59: learn: 0.1639615 test: 0.2018521 best: 0.2018521 (59) total: 6.97s remaining: 232ms 60: learn: 0.1633990 test: 0.2015790 best: 0.2015790 (60) total: 7.09s remaining: 116ms 61: learn: 0.1629623 test: 0.2011565 best: 0.2011565 (61) total: 7.2s remaining: 0us bestTest = 0.2011565328 bestIteration = 61 Trial 71, Fold 5: Log loss = 0.20047779827987935, Average precision = 0.9737315361605978, ROC-AUC = 0.9710568344388087, Elapsed Time = 7.334567800000514 seconds
Optimization Progress: 72%|#######2 | 72/100 [2:01:34<47:01, 100.78s/it]
Trial 72, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 72, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[CatBoost verbose per-iteration output (iterations 0–44) omitted]
bestTest = 0.2045313427, bestIteration = 44
Trial 72, Fold 1: Log loss = 0.20453134266117223, Average precision = 0.975197454205744, ROC-AUC = 0.9709214271375518, Elapsed Time = 33.35583069999848 seconds
Trial 72, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 72, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[CatBoost verbose per-iteration output (iterations 0–44) omitted]
bestTest = 0.1895541267, bestIteration = 44
Trial 72, Fold 2: Log loss = 0.18955412666489446, Average precision = 0.9770493220330668, ROC-AUC = 0.9740655447244732, Elapsed Time = 34.72651369999949 seconds
Trial 72, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 72, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[CatBoost verbose per-iteration output (iterations 0–44) omitted]
bestTest = 0.1928733353, bestIteration = 44
Trial 72, Fold 3: Log loss = 0.19287333528003864, Average precision = 0.975518451660432, ROC-AUC = 0.9732680088316591, Elapsed Time = 33.61581719999958 seconds
Trial 72, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 72, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[CatBoost verbose per-iteration output (iterations 0–44) omitted]
bestTest = 0.1936646567, bestIteration = 44
Trial 72, Fold 4: Log loss = 0.1936646567102457, Average precision = 0.9771454589891937, ROC-AUC = 0.9734168115195605, Elapsed Time = 33.27767400000084 seconds
Trial 72, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 72, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[CatBoost verbose per-iteration output (iterations 0–44) omitted]
bestTest = 0.2016387522, bestIteration = 44
Trial 72, Fold 5: Log loss = 0.20163875215161584, Average precision = 0.974648495045539, ROC-AUC = 0.97209832227858, Elapsed Time = 34.260632699999405 seconds
Optimization Progress: 73%|#######3 | 73/100 [2:04:32<55:48, 124.02s/it]
Trial 73, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 73, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[CatBoost verbose per-iteration output (iterations 0–66) omitted]
bestTest = 0.2883726847, bestIteration = 66
Trial 73, Fold 1: Log loss = 0.2883726847206898, Average precision = 0.9673291485710853, ROC-AUC = 0.9624328766435812, Elapsed Time = 12.948463799999445 seconds
Trial 73, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 73, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[CatBoost verbose per-iteration output continues]
remaining: 4.96s 41: learn: 0.3489518 test: 0.3516303 best: 0.3516303 (41) total: 8.03s remaining: 4.78s 42: learn: 0.3454947 test: 0.3482287 best: 0.3482287 (42) total: 8.23s remaining: 4.59s 43: learn: 0.3420081 test: 0.3446899 best: 0.3446899 (43) total: 8.44s remaining: 4.41s 44: learn: 0.3381870 test: 0.3408601 best: 0.3408601 (44) total: 8.65s remaining: 4.23s 45: learn: 0.3350611 test: 0.3377072 best: 0.3377072 (45) total: 8.84s remaining: 4.03s 46: learn: 0.3316818 test: 0.3343657 best: 0.3343657 (46) total: 8.99s remaining: 3.83s 47: learn: 0.3285650 test: 0.3312733 best: 0.3312733 (47) total: 9.15s remaining: 3.62s 48: learn: 0.3258085 test: 0.3284954 best: 0.3284954 (48) total: 9.35s remaining: 3.44s 49: learn: 0.3234829 test: 0.3261847 best: 0.3261847 (49) total: 9.51s remaining: 3.23s 50: learn: 0.3205227 test: 0.3232058 best: 0.3232058 (50) total: 9.7s remaining: 3.04s 51: learn: 0.3177687 test: 0.3204836 best: 0.3204836 (51) total: 9.93s remaining: 2.86s 52: learn: 0.3150750 test: 0.3177133 best: 0.3177133 (52) total: 10.1s remaining: 2.67s 53: learn: 0.3120738 test: 0.3146961 best: 0.3146961 (53) total: 10.3s remaining: 2.49s 54: learn: 0.3096977 test: 0.3123537 best: 0.3123537 (54) total: 10.5s remaining: 2.29s 55: learn: 0.3072342 test: 0.3099119 best: 0.3099119 (55) total: 10.7s remaining: 2.11s 56: learn: 0.3048118 test: 0.3074644 best: 0.3074644 (56) total: 11s remaining: 1.93s 57: learn: 0.3023736 test: 0.3050178 best: 0.3050178 (57) total: 11.2s remaining: 1.73s 58: learn: 0.3006680 test: 0.3032856 best: 0.3032856 (58) total: 11.4s remaining: 1.54s 59: learn: 0.2987122 test: 0.3013699 best: 0.3013699 (59) total: 11.5s remaining: 1.35s 60: learn: 0.2970545 test: 0.2997365 best: 0.2997365 (60) total: 11.7s remaining: 1.15s 61: learn: 0.2949208 test: 0.2976692 best: 0.2976692 (61) total: 12s remaining: 965ms 62: learn: 0.2931654 test: 0.2958932 best: 0.2958932 (62) total: 12.2s remaining: 772ms 63: learn: 0.2909332 test: 0.2936883 best: 
0.2936883 (63) total: 12.3s remaining: 578ms 64: learn: 0.2891088 test: 0.2918196 best: 0.2918196 (64) total: 12.5s remaining: 385ms 65: learn: 0.2875618 test: 0.2902548 best: 0.2902548 (65) total: 12.7s remaining: 192ms 66: learn: 0.2854805 test: 0.2881788 best: 0.2881788 (66) total: 12.9s remaining: 0us bestTest = 0.2881788354 bestIteration = 66 Trial 73, Fold 2: Log loss = 0.28817883537160904, Average precision = 0.9696547592949077, ROC-AUC = 0.9662350554987176, Elapsed Time = 12.999011100000644 seconds Trial 73, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 73, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 0: learn: 0.6769426 test: 0.6769432 best: 0.6769432 (0) total: 178ms remaining: 11.8s 1: learn: 0.6629823 test: 0.6629289 best: 0.6629289 (1) total: 359ms remaining: 11.7s 2: learn: 0.6477664 test: 0.6476971 best: 0.6476971 (2) total: 563ms remaining: 12s 3: learn: 0.6336203 test: 0.6334883 best: 0.6334883 (3) total: 770ms remaining: 12.1s 4: learn: 0.6201463 test: 0.6198678 best: 0.6198678 (4) total: 978ms remaining: 12.1s 5: learn: 0.6066512 test: 0.6063254 best: 0.6063254 (5) total: 1.14s remaining: 11.5s 6: learn: 0.5935805 test: 0.5932670 best: 0.5932670 (6) total: 1.35s remaining: 11.6s 7: learn: 0.5804371 test: 0.5800240 best: 0.5800240 (7) total: 1.55s remaining: 11.4s 8: learn: 0.5681256 test: 0.5676512 best: 0.5676512 (8) total: 1.76s remaining: 11.4s 9: learn: 0.5562250 test: 0.5557347 best: 0.5557347 (9) total: 1.92s remaining: 11s 10: learn: 0.5460863 test: 0.5454960 best: 0.5454960 (10) total: 2.13s remaining: 10.8s 11: learn: 0.5360584 test: 0.5353245 best: 0.5353245 (11) total: 2.31s remaining: 10.6s 12: learn: 0.5252183 test: 0.5243361 best: 0.5243361 (12) total: 2.49s remaining: 10.3s 13: learn: 0.5154077 test: 0.5144370 best: 0.5144370 (13) total: 2.64s remaining: 9.99s 14: learn: 0.5061076 test: 0.5051598 best: 0.5051598 (14) total: 2.84s remaining: 9.84s 15: learn: 
0.4975293 test: 0.4965052 best: 0.4965052 (15) total: 3.04s remaining: 9.69s 16: learn: 0.4887468 test: 0.4877235 best: 0.4877235 (16) total: 3.23s remaining: 9.49s 17: learn: 0.4797113 test: 0.4786417 best: 0.4786417 (17) total: 3.44s remaining: 9.36s 18: learn: 0.4719726 test: 0.4708636 best: 0.4708636 (18) total: 3.6s remaining: 9.11s 19: learn: 0.4640830 test: 0.4629471 best: 0.4629471 (19) total: 3.77s remaining: 8.85s 20: learn: 0.4568310 test: 0.4556263 best: 0.4556263 (20) total: 3.94s remaining: 8.64s 21: learn: 0.4488890 test: 0.4476548 best: 0.4476548 (21) total: 4.15s remaining: 8.49s 22: learn: 0.4417978 test: 0.4405844 best: 0.4405844 (22) total: 4.34s remaining: 8.3s 23: learn: 0.4348951 test: 0.4336684 best: 0.4336684 (23) total: 4.54s remaining: 8.14s 24: learn: 0.4286031 test: 0.4273249 best: 0.4273249 (24) total: 4.74s remaining: 7.96s 25: learn: 0.4223397 test: 0.4210178 best: 0.4210178 (25) total: 4.91s remaining: 7.75s 26: learn: 0.4161549 test: 0.4148680 best: 0.4148680 (26) total: 5.13s remaining: 7.6s 27: learn: 0.4106278 test: 0.4091961 best: 0.4091961 (27) total: 5.31s remaining: 7.4s 28: learn: 0.4048371 test: 0.4033808 best: 0.4033808 (28) total: 5.57s remaining: 7.3s 29: learn: 0.3994607 test: 0.3980168 best: 0.3980168 (29) total: 5.74s remaining: 7.08s 30: learn: 0.3944833 test: 0.3930157 best: 0.3930157 (30) total: 5.95s remaining: 6.9s 31: learn: 0.3895220 test: 0.3880361 best: 0.3880361 (31) total: 6.14s remaining: 6.72s 32: learn: 0.3841704 test: 0.3826938 best: 0.3826938 (32) total: 6.36s remaining: 6.55s 33: learn: 0.3800568 test: 0.3783951 best: 0.3783951 (33) total: 6.5s remaining: 6.31s 34: learn: 0.3756223 test: 0.3738918 best: 0.3738918 (34) total: 6.67s remaining: 6.1s 35: learn: 0.3714375 test: 0.3696507 best: 0.3696507 (35) total: 6.87s remaining: 5.91s 36: learn: 0.3678165 test: 0.3660298 best: 0.3660298 (36) total: 7.03s remaining: 5.7s 37: learn: 0.3636209 test: 0.3618026 best: 0.3618026 (37) total: 7.23s remaining: 
5.52s 38: learn: 0.3593910 test: 0.3575677 best: 0.3575677 (38) total: 7.44s remaining: 5.34s 39: learn: 0.3560855 test: 0.3542510 best: 0.3542510 (39) total: 7.64s remaining: 5.16s 40: learn: 0.3524909 test: 0.3505987 best: 0.3505987 (40) total: 7.81s remaining: 4.96s 41: learn: 0.3492129 test: 0.3472923 best: 0.3472923 (41) total: 7.97s remaining: 4.75s 42: learn: 0.3453550 test: 0.3434582 best: 0.3434582 (42) total: 8.19s remaining: 4.57s 43: learn: 0.3420006 test: 0.3401374 best: 0.3401374 (43) total: 8.4s remaining: 4.39s 44: learn: 0.3386589 test: 0.3367860 best: 0.3367860 (44) total: 8.6s remaining: 4.21s 45: learn: 0.3355534 test: 0.3336696 best: 0.3336696 (45) total: 8.77s remaining: 4s 46: learn: 0.3326345 test: 0.3306788 best: 0.3306788 (46) total: 8.94s remaining: 3.81s 47: learn: 0.3294721 test: 0.3275017 best: 0.3275017 (47) total: 9.15s remaining: 3.62s 48: learn: 0.3261091 test: 0.3241564 best: 0.3241564 (48) total: 9.34s remaining: 3.43s 49: learn: 0.3229964 test: 0.3210766 best: 0.3210766 (49) total: 9.55s remaining: 3.25s 50: learn: 0.3199385 test: 0.3181016 best: 0.3181016 (50) total: 9.79s remaining: 3.07s 51: learn: 0.3174054 test: 0.3156180 best: 0.3156180 (51) total: 10s remaining: 2.88s 52: learn: 0.3145455 test: 0.3128024 best: 0.3128024 (52) total: 10.2s remaining: 2.69s 53: learn: 0.3119686 test: 0.3102895 best: 0.3102895 (53) total: 10.4s remaining: 2.5s 54: learn: 0.3092924 test: 0.3076226 best: 0.3076226 (54) total: 10.6s remaining: 2.31s 55: learn: 0.3064196 test: 0.3047462 best: 0.3047462 (55) total: 10.8s remaining: 2.12s 56: learn: 0.3040986 test: 0.3024852 best: 0.3024852 (56) total: 10.9s remaining: 1.92s 57: learn: 0.3020755 test: 0.3004631 best: 0.3004631 (57) total: 11.1s remaining: 1.73s 58: learn: 0.2996399 test: 0.2980974 best: 0.2980974 (58) total: 11.3s remaining: 1.53s 59: learn: 0.2974485 test: 0.2958928 best: 0.2958928 (59) total: 11.5s remaining: 1.34s 60: learn: 0.2952686 test: 0.2936857 best: 0.2936857 (60) total: 
11.7s remaining: 1.15s 61: learn: 0.2930483 test: 0.2915285 best: 0.2915285 (61) total: 11.8s remaining: 954ms 62: learn: 0.2913060 test: 0.2897843 best: 0.2897843 (62) total: 12s remaining: 762ms 63: learn: 0.2894612 test: 0.2879684 best: 0.2879684 (63) total: 12.2s remaining: 573ms 64: learn: 0.2872966 test: 0.2858261 best: 0.2858261 (64) total: 12.4s remaining: 383ms 65: learn: 0.2852603 test: 0.2838687 best: 0.2838687 (65) total: 12.7s remaining: 192ms 66: learn: 0.2833924 test: 0.2820576 best: 0.2820576 (66) total: 12.8s remaining: 0us bestTest = 0.2820576015 bestIteration = 66 Trial 73, Fold 3: Log loss = 0.2820576015164774, Average precision = 0.9683603365207796, ROC-AUC = 0.9661705379414626, Elapsed Time = 12.962134000001242 seconds Trial 73, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 73, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 0: learn: 0.6759164 test: 0.6759003 best: 0.6759003 (0) total: 210ms remaining: 13.9s 1: learn: 0.6604966 test: 0.6604526 best: 0.6604526 (1) total: 399ms remaining: 13s 2: learn: 0.6452101 test: 0.6451413 best: 0.6451413 (2) total: 588ms remaining: 12.5s 3: learn: 0.6313688 test: 0.6313238 best: 0.6313238 (3) total: 766ms remaining: 12.1s 4: learn: 0.6180571 test: 0.6179943 best: 0.6179943 (4) total: 944ms remaining: 11.7s 5: learn: 0.6048558 test: 0.6047886 best: 0.6047886 (5) total: 1.14s remaining: 11.6s 6: learn: 0.5925386 test: 0.5924586 best: 0.5924586 (6) total: 1.35s remaining: 11.6s 7: learn: 0.5801331 test: 0.5801024 best: 0.5801024 (7) total: 1.54s remaining: 11.4s 8: learn: 0.5680309 test: 0.5680283 best: 0.5680283 (8) total: 1.72s remaining: 11.1s 9: learn: 0.5578486 test: 0.5578487 best: 0.5578487 (9) total: 1.85s remaining: 10.6s 10: learn: 0.5468398 test: 0.5469000 best: 0.5469000 (10) total: 2.05s remaining: 10.4s 11: learn: 0.5361922 test: 0.5362724 best: 0.5362724 (11) total: 2.25s remaining: 10.3s 12: learn: 0.5252793 test: 
0.5254141 best: 0.5254141 (12) total: 2.45s remaining: 10.2s 13: learn: 0.5155401 test: 0.5157526 best: 0.5157526 (13) total: 2.67s remaining: 10.1s 14: learn: 0.5059149 test: 0.5061254 best: 0.5061254 (14) total: 2.83s remaining: 9.83s 15: learn: 0.4960741 test: 0.4963488 best: 0.4963488 (15) total: 3.07s remaining: 9.78s 16: learn: 0.4877747 test: 0.4880572 best: 0.4880572 (16) total: 3.22s remaining: 9.46s 17: learn: 0.4802648 test: 0.4806283 best: 0.4806283 (17) total: 3.38s remaining: 9.21s 18: learn: 0.4728297 test: 0.4732401 best: 0.4732401 (18) total: 3.57s remaining: 9.01s 19: learn: 0.4655559 test: 0.4659551 best: 0.4659551 (19) total: 3.77s remaining: 8.86s 20: learn: 0.4581338 test: 0.4585591 best: 0.4585591 (20) total: 3.94s remaining: 8.62s 21: learn: 0.4515876 test: 0.4519924 best: 0.4519924 (21) total: 4.14s remaining: 8.47s 22: learn: 0.4445237 test: 0.4449656 best: 0.4449656 (22) total: 4.3s remaining: 8.24s 23: learn: 0.4375951 test: 0.4380049 best: 0.4380049 (23) total: 4.46s remaining: 8s 24: learn: 0.4303908 test: 0.4307765 best: 0.4307765 (24) total: 4.66s remaining: 7.83s 25: learn: 0.4244455 test: 0.4248265 best: 0.4248265 (25) total: 4.83s remaining: 7.63s 26: learn: 0.4186782 test: 0.4191195 best: 0.4191195 (26) total: 5.02s remaining: 7.43s 27: learn: 0.4129965 test: 0.4134564 best: 0.4134564 (27) total: 5.19s remaining: 7.23s 28: learn: 0.4078076 test: 0.4082919 best: 0.4082919 (28) total: 5.36s remaining: 7.03s 29: learn: 0.4017830 test: 0.4022063 best: 0.4022063 (29) total: 5.56s remaining: 6.86s 30: learn: 0.3963472 test: 0.3967793 best: 0.3967793 (30) total: 5.72s remaining: 6.65s 31: learn: 0.3910114 test: 0.3914101 best: 0.3914101 (31) total: 5.96s remaining: 6.52s 32: learn: 0.3863747 test: 0.3866692 best: 0.3866692 (32) total: 6.14s remaining: 6.33s 33: learn: 0.3818718 test: 0.3821820 best: 0.3821820 (33) total: 6.34s remaining: 6.15s 34: learn: 0.3765951 test: 0.3769409 best: 0.3769409 (34) total: 6.5s remaining: 5.94s 35: 
learn: 0.3717587 test: 0.3721718 best: 0.3721718 (35) total: 6.67s remaining: 5.74s 36: learn: 0.3677949 test: 0.3681892 best: 0.3681892 (36) total: 6.87s remaining: 5.57s 37: learn: 0.3631920 test: 0.3635912 best: 0.3635912 (37) total: 7.03s remaining: 5.36s 38: learn: 0.3590558 test: 0.3594583 best: 0.3594583 (38) total: 7.22s remaining: 5.18s 39: learn: 0.3552734 test: 0.3556856 best: 0.3556856 (39) total: 7.42s remaining: 5.01s 40: learn: 0.3514092 test: 0.3518196 best: 0.3518196 (40) total: 7.59s remaining: 4.82s 41: learn: 0.3479215 test: 0.3483522 best: 0.3483522 (41) total: 7.78s remaining: 4.63s 42: learn: 0.3446431 test: 0.3450911 best: 0.3450911 (42) total: 7.96s remaining: 4.44s 43: learn: 0.3415459 test: 0.3419790 best: 0.3419790 (43) total: 8.19s remaining: 4.28s 44: learn: 0.3385521 test: 0.3390398 best: 0.3390398 (44) total: 8.39s remaining: 4.1s 45: learn: 0.3351310 test: 0.3356945 best: 0.3356945 (45) total: 8.58s remaining: 3.92s 46: learn: 0.3320895 test: 0.3327533 best: 0.3327533 (46) total: 8.74s remaining: 3.72s 47: learn: 0.3284830 test: 0.3292028 best: 0.3292028 (47) total: 8.93s remaining: 3.53s 48: learn: 0.3255186 test: 0.3262709 best: 0.3262709 (48) total: 9.13s remaining: 3.35s 49: learn: 0.3227906 test: 0.3235175 best: 0.3235175 (49) total: 9.33s remaining: 3.17s 50: learn: 0.3195268 test: 0.3203127 best: 0.3203127 (50) total: 9.54s remaining: 2.99s 51: learn: 0.3170968 test: 0.3178745 best: 0.3178745 (51) total: 9.75s remaining: 2.81s 52: learn: 0.3148039 test: 0.3156958 best: 0.3156958 (52) total: 9.92s remaining: 2.62s 53: learn: 0.3124667 test: 0.3134936 best: 0.3134936 (53) total: 10.2s remaining: 2.45s 54: learn: 0.3100170 test: 0.3110886 best: 0.3110886 (54) total: 10.4s remaining: 2.27s 55: learn: 0.3078059 test: 0.3089374 best: 0.3089374 (55) total: 10.6s remaining: 2.08s 56: learn: 0.3054165 test: 0.3065258 best: 0.3065258 (56) total: 10.8s remaining: 1.89s 57: learn: 0.3028615 test: 0.3039296 best: 0.3039296 (57) total: 11s 
remaining: 1.7s 58: learn: 0.3007139 test: 0.3018401 best: 0.3018401 (58) total: 11.2s remaining: 1.52s 59: learn: 0.2984385 test: 0.2995734 best: 0.2995734 (59) total: 11.4s remaining: 1.33s 60: learn: 0.2960961 test: 0.2972640 best: 0.2972640 (60) total: 11.7s remaining: 1.15s 61: learn: 0.2942306 test: 0.2954276 best: 0.2954276 (61) total: 11.8s remaining: 952ms 62: learn: 0.2923092 test: 0.2935401 best: 0.2935401 (62) total: 12s remaining: 763ms 63: learn: 0.2908237 test: 0.2920348 best: 0.2920348 (63) total: 12.2s remaining: 571ms 64: learn: 0.2886555 test: 0.2899157 best: 0.2899157 (64) total: 12.4s remaining: 382ms 65: learn: 0.2871719 test: 0.2884501 best: 0.2884501 (65) total: 12.6s remaining: 191ms 66: learn: 0.2854511 test: 0.2867326 best: 0.2867326 (66) total: 12.8s remaining: 0us bestTest = 0.286732647 bestIteration = 66 Trial 73, Fold 4: Log loss = 0.2867326470217839, Average precision = 0.9672012912634944, ROC-AUC = 0.9637673286980407, Elapsed Time = 12.946753800002625 seconds Trial 73, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 73, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0: learn: 0.6767427 test: 0.6769649 best: 0.6769649 (0) total: 219ms remaining: 14.5s 1: learn: 0.6608828 test: 0.6614982 best: 0.6614982 (1) total: 433ms remaining: 14.1s 2: learn: 0.6461699 test: 0.6469678 best: 0.6469678 (2) total: 670ms remaining: 14.3s 3: learn: 0.6321839 test: 0.6331292 best: 0.6331292 (3) total: 1.01s remaining: 16s 4: learn: 0.6196014 test: 0.6207927 best: 0.6207927 (4) total: 1.23s remaining: 15.3s 5: learn: 0.6077827 test: 0.6091075 best: 0.6091075 (5) total: 1.46s remaining: 14.8s 6: learn: 0.5942335 test: 0.5957168 best: 0.5957168 (6) total: 1.63s remaining: 14s 7: learn: 0.5823704 test: 0.5839966 best: 0.5839966 (7) total: 1.84s remaining: 13.6s 8: learn: 0.5699542 test: 0.5717031 best: 0.5717031 (8) total: 2.03s remaining: 13.1s 9: learn: 0.5583726 test: 0.5602708 
best: 0.5602708 (9) total: 2.26s remaining: 12.9s 10: learn: 0.5480843 test: 0.5501990 best: 0.5501990 (10) total: 2.48s remaining: 12.6s 11: learn: 0.5375460 test: 0.5397990 best: 0.5397990 (11) total: 2.69s remaining: 12.3s 12: learn: 0.5277213 test: 0.5300436 best: 0.5300436 (12) total: 2.87s remaining: 11.9s 13: learn: 0.5170673 test: 0.5196444 best: 0.5196444 (13) total: 3.06s remaining: 11.6s 14: learn: 0.5078859 test: 0.5106475 best: 0.5106475 (14) total: 3.27s remaining: 11.3s 15: learn: 0.4986707 test: 0.5015259 best: 0.5015259 (15) total: 3.48s remaining: 11.1s 16: learn: 0.4902085 test: 0.4932476 best: 0.4932476 (16) total: 3.67s remaining: 10.8s 17: learn: 0.4819371 test: 0.4850234 best: 0.4850234 (17) total: 3.89s remaining: 10.6s 18: learn: 0.4730309 test: 0.4762727 best: 0.4762727 (18) total: 4.1s remaining: 10.4s 19: learn: 0.4655548 test: 0.4688948 best: 0.4688948 (19) total: 4.32s remaining: 10.2s 20: learn: 0.4573219 test: 0.4608248 best: 0.4608248 (20) total: 4.52s remaining: 9.9s 21: learn: 0.4496281 test: 0.4532979 best: 0.4532979 (21) total: 4.73s remaining: 9.68s 22: learn: 0.4423183 test: 0.4461265 best: 0.4461265 (22) total: 4.89s remaining: 9.35s 23: learn: 0.4360318 test: 0.4398994 best: 0.4398994 (23) total: 5.1s remaining: 9.13s 24: learn: 0.4297914 test: 0.4338165 best: 0.4338165 (24) total: 5.32s remaining: 8.93s 25: learn: 0.4233974 test: 0.4275222 best: 0.4275222 (25) total: 5.55s remaining: 8.75s 26: learn: 0.4172939 test: 0.4215590 best: 0.4215590 (26) total: 5.72s remaining: 8.47s 27: learn: 0.4118704 test: 0.4162347 best: 0.4162347 (27) total: 5.92s remaining: 8.25s 28: learn: 0.4063168 test: 0.4107402 best: 0.4107402 (28) total: 6.1s remaining: 8s 29: learn: 0.4009420 test: 0.4054203 best: 0.4054203 (29) total: 6.31s remaining: 7.78s 30: learn: 0.3953394 test: 0.3998948 best: 0.3998948 (30) total: 6.48s remaining: 7.53s 31: learn: 0.3907309 test: 0.3954166 best: 0.3954166 (31) total: 6.63s remaining: 7.25s 32: learn: 0.3862666 
test: 0.3910482 best: 0.3910482 (32) total: 6.81s remaining: 7.01s 33: learn: 0.3816706 test: 0.3865808 best: 0.3865808 (33) total: 6.96s remaining: 6.76s 34: learn: 0.3774111 test: 0.3824079 best: 0.3824079 (34) total: 7.14s remaining: 6.53s 35: learn: 0.3733237 test: 0.3783872 best: 0.3783872 (35) total: 7.36s remaining: 6.33s 36: learn: 0.3695175 test: 0.3746487 best: 0.3746487 (36) total: 7.49s remaining: 6.07s 37: learn: 0.3651567 test: 0.3703485 best: 0.3703485 (37) total: 7.67s remaining: 5.85s 38: learn: 0.3610439 test: 0.3664217 best: 0.3664217 (38) total: 7.86s remaining: 5.64s 39: learn: 0.3563493 test: 0.3618735 best: 0.3618735 (39) total: 8.05s remaining: 5.43s 40: learn: 0.3522771 test: 0.3578891 best: 0.3578891 (40) total: 8.22s remaining: 5.22s 41: learn: 0.3487609 test: 0.3544896 best: 0.3544896 (41) total: 8.42s remaining: 5.01s 42: learn: 0.3450183 test: 0.3508370 best: 0.3508370 (42) total: 8.65s remaining: 4.83s 43: learn: 0.3417335 test: 0.3475778 best: 0.3475778 (43) total: 8.85s remaining: 4.62s 44: learn: 0.3380707 test: 0.3440532 best: 0.3440532 (44) total: 9.04s remaining: 4.42s 45: learn: 0.3345741 test: 0.3406528 best: 0.3406528 (45) total: 9.19s remaining: 4.2s 46: learn: 0.3313036 test: 0.3374793 best: 0.3374793 (46) total: 9.4s remaining: 4s 47: learn: 0.3284327 test: 0.3346990 best: 0.3346990 (47) total: 9.58s remaining: 3.79s 48: learn: 0.3250334 test: 0.3314346 best: 0.3314346 (48) total: 9.77s remaining: 3.59s 49: learn: 0.3219309 test: 0.3284399 best: 0.3284399 (49) total: 10s remaining: 3.4s 50: learn: 0.3188101 test: 0.3254345 best: 0.3254345 (50) total: 10.2s remaining: 3.19s 51: learn: 0.3166328 test: 0.3232746 best: 0.3232746 (51) total: 10.3s remaining: 2.97s 52: learn: 0.3134585 test: 0.3201774 best: 0.3201774 (52) total: 10.5s remaining: 2.77s 53: learn: 0.3104568 test: 0.3173019 best: 0.3173019 (53) total: 10.6s remaining: 2.56s 54: learn: 0.3080053 test: 0.3149645 best: 0.3149645 (54) total: 10.8s remaining: 2.36s 55: 
learn: 0.3053353 test: 0.3124120 best: 0.3124120 (55) total: 11s remaining: 2.16s 56: learn: 0.3028565 test: 0.3100545 best: 0.3100545 (56) total: 11.2s remaining: 1.96s 57: learn: 0.3005348 test: 0.3078788 best: 0.3078788 (57) total: 11.4s remaining: 1.77s 58: learn: 0.2986460 test: 0.3060583 best: 0.3060583 (58) total: 11.6s remaining: 1.57s 59: learn: 0.2970414 test: 0.3045406 best: 0.3045406 (59) total: 11.8s remaining: 1.37s 60: learn: 0.2950264 test: 0.3025558 best: 0.3025558 (60) total: 11.9s remaining: 1.17s 61: learn: 0.2930767 test: 0.3006777 best: 0.3006777 (61) total: 12.1s remaining: 975ms 62: learn: 0.2908897 test: 0.2985919 best: 0.2985919 (62) total: 12.2s remaining: 777ms 63: learn: 0.2892684 test: 0.2970477 best: 0.2970477 (63) total: 12.4s remaining: 581ms 64: learn: 0.2878738 test: 0.2957441 best: 0.2957441 (64) total: 12.6s remaining: 386ms 65: learn: 0.2863658 test: 0.2943141 best: 0.2943141 (65) total: 12.8s remaining: 194ms 66: learn: 0.2843684 test: 0.2924208 best: 0.2924208 (66) total: 13s remaining: 0us bestTest = 0.2924208183 bestIteration = 66 Trial 73, Fold 5: Log loss = 0.292420818332159, Average precision = 0.9663694513644887, ROC-AUC = 0.9618022035189417, Elapsed Time = 13.122513300000719 seconds
Optimization Progress: 74%|#######4 | 74/100 [2:05:45<47:07, 108.74s/it]
Trial 74, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371 Trial 74, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[per-iteration CatBoost log (98 iterations, ~175 s) elided]
bestTest = 0.2047473662  bestIteration = 91
Shrink model to first 92 iterations.
Trial 74, Fold 1: Log loss = 0.20372707646786092, Average precision = 0.9743246850650773, ROC-AUC = 0.9691592523288549, Elapsed Time = 174.74297730000035 seconds
Trial 74, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 74, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[per-iteration CatBoost log elided; output truncated mid-training at iteration 52]
remaining: 1m 16s 53: learn: 0.1119285 test: 0.2064945 best: 0.2064945 (53) total: 1m 32s remaining: 1m 15s 54: learn: 0.1105451 test: 0.2061508 best: 0.2061508 (54) total: 1m 33s remaining: 1m 13s 55: learn: 0.1094919 test: 0.2058459 best: 0.2058459 (55) total: 1m 35s remaining: 1m 11s 56: learn: 0.1084041 test: 0.2055599 best: 0.2055599 (56) total: 1m 37s remaining: 1m 9s 57: learn: 0.1071495 test: 0.2052040 best: 0.2052040 (57) total: 1m 39s remaining: 1m 8s 58: learn: 0.1064792 test: 0.2048529 best: 0.2048529 (58) total: 1m 40s remaining: 1m 6s 59: learn: 0.1059477 test: 0.2045015 best: 0.2045015 (59) total: 1m 42s remaining: 1m 4s 60: learn: 0.1046622 test: 0.2037797 best: 0.2037797 (60) total: 1m 44s remaining: 1m 3s 61: learn: 0.1029739 test: 0.2033449 best: 0.2033449 (61) total: 1m 45s remaining: 1m 1s 62: learn: 0.1019796 test: 0.2027993 best: 0.2027993 (62) total: 1m 47s remaining: 59.9s 63: learn: 0.1007970 test: 0.2025079 best: 0.2025079 (63) total: 1m 49s remaining: 58.2s 64: learn: 0.0998373 test: 0.2019174 best: 0.2019174 (64) total: 1m 51s remaining: 56.5s 65: learn: 0.0992331 test: 0.2016063 best: 0.2016063 (65) total: 1m 53s remaining: 54.9s 66: learn: 0.0985461 test: 0.2015555 best: 0.2015555 (66) total: 1m 54s remaining: 53.2s 67: learn: 0.0976560 test: 0.2012185 best: 0.2012185 (67) total: 1m 56s remaining: 51.5s 68: learn: 0.0968407 test: 0.2010501 best: 0.2010501 (68) total: 1m 58s remaining: 49.8s 69: learn: 0.0960192 test: 0.2008036 best: 0.2008036 (69) total: 2m remaining: 48.1s 70: learn: 0.0949924 test: 0.2004909 best: 0.2004909 (70) total: 2m 1s remaining: 46.4s 71: learn: 0.0942375 test: 0.2004435 best: 0.2004435 (71) total: 2m 3s remaining: 44.6s 72: learn: 0.0931843 test: 0.2004091 best: 0.2004091 (72) total: 2m 5s remaining: 42.9s 73: learn: 0.0917431 test: 0.2002313 best: 0.2002313 (73) total: 2m 6s remaining: 41.2s 74: learn: 0.0906313 test: 0.2001467 best: 0.2001467 (74) total: 2m 8s remaining: 39.5s 75: learn: 0.0897867 test: 
0.2000473 best: 0.2000473 (75) total: 2m 10s remaining: 37.7s 76: learn: 0.0884457 test: 0.2000255 best: 0.2000255 (76) total: 2m 12s remaining: 36s 77: learn: 0.0880653 test: 0.1998533 best: 0.1998533 (77) total: 2m 13s remaining: 34.3s 78: learn: 0.0874212 test: 0.1995023 best: 0.1995023 (78) total: 2m 15s remaining: 32.6s 79: learn: 0.0861406 test: 0.1994161 best: 0.1994161 (79) total: 2m 17s remaining: 30.9s 80: learn: 0.0854980 test: 0.1992772 best: 0.1992772 (80) total: 2m 19s remaining: 29.2s 81: learn: 0.0841662 test: 0.1992123 best: 0.1992123 (81) total: 2m 20s remaining: 27.5s 82: learn: 0.0833858 test: 0.1991788 best: 0.1991788 (82) total: 2m 22s remaining: 25.8s 83: learn: 0.0829385 test: 0.1989594 best: 0.1989594 (83) total: 2m 24s remaining: 24s 84: learn: 0.0821038 test: 0.1987318 best: 0.1987318 (84) total: 2m 25s remaining: 22.3s 85: learn: 0.0802810 test: 0.1986021 best: 0.1986021 (85) total: 2m 27s remaining: 20.6s 86: learn: 0.0793922 test: 0.1985762 best: 0.1985762 (86) total: 2m 29s remaining: 18.9s 87: learn: 0.0782675 test: 0.1985173 best: 0.1985173 (87) total: 2m 31s remaining: 17.2s 88: learn: 0.0777800 test: 0.1982293 best: 0.1982293 (88) total: 2m 32s remaining: 15.5s 89: learn: 0.0773811 test: 0.1982658 best: 0.1982293 (88) total: 2m 34s remaining: 13.7s 90: learn: 0.0769835 test: 0.1982233 best: 0.1982233 (90) total: 2m 36s remaining: 12s 91: learn: 0.0764587 test: 0.1980261 best: 0.1980261 (91) total: 2m 37s remaining: 10.3s 92: learn: 0.0748222 test: 0.1979350 best: 0.1979350 (92) total: 2m 39s remaining: 8.58s 93: learn: 0.0734735 test: 0.1979706 best: 0.1979350 (92) total: 2m 41s remaining: 6.87s 94: learn: 0.0720688 test: 0.1981031 best: 0.1979350 (92) total: 2m 43s remaining: 5.15s 95: learn: 0.0717073 test: 0.1979768 best: 0.1979350 (92) total: 2m 44s remaining: 3.43s 96: learn: 0.0708292 test: 0.1978697 best: 0.1978697 (96) total: 2m 46s remaining: 1.72s 97: learn: 0.0705497 test: 0.1977061 best: 0.1977061 (97) total: 2m 48s 
remaining: 0us bestTest = 0.1977060533 bestIteration = 97 Trial 74, Fold 2: Log loss = 0.1970840858190826, Average precision = 0.9740080719974162, ROC-AUC = 0.9705176815002151, Elapsed Time = 168.68866730000082 seconds Trial 74, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 74, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 0: learn: 0.6314968 test: 0.6350299 best: 0.6350299 (0) total: 1.76s remaining: 2m 50s 1: learn: 0.5709230 test: 0.5762778 best: 0.5762778 (1) total: 3.33s remaining: 2m 39s 2: learn: 0.5183208 test: 0.5262936 best: 0.5262936 (2) total: 4.96s remaining: 2m 36s 3: learn: 0.4784091 test: 0.4858203 best: 0.4858203 (3) total: 5.07s remaining: 1m 59s 4: learn: 0.4401958 test: 0.4492670 best: 0.4492670 (4) total: 6.79s remaining: 2m 6s 5: learn: 0.4076178 test: 0.4185860 best: 0.4185860 (5) total: 8.39s remaining: 2m 8s 6: learn: 0.3796070 test: 0.3919651 best: 0.3919651 (6) total: 10.1s remaining: 2m 11s 7: learn: 0.3546936 test: 0.3696932 best: 0.3696932 (7) total: 11.8s remaining: 2m 13s 8: learn: 0.3331388 test: 0.3496461 best: 0.3496461 (8) total: 13.5s remaining: 2m 13s 9: learn: 0.3138458 test: 0.3328608 best: 0.3328608 (9) total: 15.3s remaining: 2m 14s 10: learn: 0.2984838 test: 0.3187920 best: 0.3187920 (10) total: 17s remaining: 2m 14s 11: learn: 0.2837605 test: 0.3055201 best: 0.3055201 (11) total: 18.5s remaining: 2m 12s 12: learn: 0.2706807 test: 0.2940970 best: 0.2940970 (12) total: 20.2s remaining: 2m 12s 13: learn: 0.2595649 test: 0.2842039 best: 0.2842039 (13) total: 21.9s remaining: 2m 11s 14: learn: 0.2487032 test: 0.2755649 best: 0.2755649 (14) total: 23.7s remaining: 2m 10s 15: learn: 0.2393684 test: 0.2676685 best: 0.2676685 (15) total: 25.4s remaining: 2m 10s 16: learn: 0.2312739 test: 0.2614293 best: 0.2614293 (16) total: 27.1s remaining: 2m 9s 17: learn: 0.2238749 test: 0.2558872 best: 0.2558872 (17) total: 28.9s remaining: 2m 8s 18: learn: 0.2169696 
test: 0.2506347 best: 0.2506347 (18) total: 30.6s remaining: 2m 7s 19: learn: 0.2102183 test: 0.2470141 best: 0.2470141 (19) total: 32.3s remaining: 2m 5s 20: learn: 0.2051215 test: 0.2430681 best: 0.2430681 (20) total: 34s remaining: 2m 4s 21: learn: 0.1997709 test: 0.2392549 best: 0.2392549 (21) total: 35.8s remaining: 2m 3s 22: learn: 0.1952166 test: 0.2366436 best: 0.2366436 (22) total: 37.5s remaining: 2m 2s 23: learn: 0.1912902 test: 0.2332884 best: 0.2332884 (23) total: 39.2s remaining: 2m 24: learn: 0.1873529 test: 0.2308010 best: 0.2308010 (24) total: 40.9s remaining: 1m 59s 25: learn: 0.1821483 test: 0.2282977 best: 0.2282977 (25) total: 42.7s remaining: 1m 58s 26: learn: 0.1789806 test: 0.2262332 best: 0.2262332 (26) total: 44.3s remaining: 1m 56s 27: learn: 0.1745502 test: 0.2242188 best: 0.2242188 (27) total: 46s remaining: 1m 54s 28: learn: 0.1719080 test: 0.2222006 best: 0.2222006 (28) total: 47.8s remaining: 1m 53s 29: learn: 0.1687956 test: 0.2207426 best: 0.2207426 (29) total: 49.6s remaining: 1m 52s 30: learn: 0.1650627 test: 0.2193181 best: 0.2193181 (30) total: 51.3s remaining: 1m 50s 31: learn: 0.1618847 test: 0.2181083 best: 0.2181083 (31) total: 53.1s remaining: 1m 49s 32: learn: 0.1585808 test: 0.2168021 best: 0.2168021 (32) total: 54.9s remaining: 1m 48s 33: learn: 0.1567968 test: 0.2156099 best: 0.2156099 (33) total: 56.5s remaining: 1m 46s 34: learn: 0.1546465 test: 0.2145796 best: 0.2145796 (34) total: 58.2s remaining: 1m 44s 35: learn: 0.1527917 test: 0.2131379 best: 0.2131379 (35) total: 59.9s remaining: 1m 43s 36: learn: 0.1512102 test: 0.2123411 best: 0.2123411 (36) total: 1m 1s remaining: 1m 41s 37: learn: 0.1490990 test: 0.2115560 best: 0.2115560 (37) total: 1m 3s remaining: 1m 39s 38: learn: 0.1463996 test: 0.2111460 best: 0.2111460 (38) total: 1m 4s remaining: 1m 38s 39: learn: 0.1433624 test: 0.2107562 best: 0.2107562 (39) total: 1m 6s remaining: 1m 36s 40: learn: 0.1400500 test: 0.2100793 best: 0.2100793 (40) total: 1m 8s 
remaining: 1m 35s 41: learn: 0.1389721 test: 0.2092858 best: 0.2092858 (41) total: 1m 10s remaining: 1m 33s 42: learn: 0.1359281 test: 0.2091345 best: 0.2091345 (42) total: 1m 11s remaining: 1m 31s 43: learn: 0.1340851 test: 0.2087780 best: 0.2087780 (43) total: 1m 13s remaining: 1m 30s 44: learn: 0.1313898 test: 0.2083155 best: 0.2083155 (44) total: 1m 15s remaining: 1m 28s 45: learn: 0.1294558 test: 0.2077748 best: 0.2077748 (45) total: 1m 17s remaining: 1m 27s 46: learn: 0.1277180 test: 0.2073355 best: 0.2073355 (46) total: 1m 18s remaining: 1m 25s 47: learn: 0.1258730 test: 0.2069286 best: 0.2069286 (47) total: 1m 20s remaining: 1m 23s 48: learn: 0.1246598 test: 0.2062653 best: 0.2062653 (48) total: 1m 22s remaining: 1m 22s 49: learn: 0.1230276 test: 0.2060040 best: 0.2060040 (49) total: 1m 24s remaining: 1m 20s 50: learn: 0.1213354 test: 0.2056561 best: 0.2056561 (50) total: 1m 25s remaining: 1m 19s 51: learn: 0.1193424 test: 0.2053559 best: 0.2053559 (51) total: 1m 27s remaining: 1m 17s 52: learn: 0.1185860 test: 0.2047844 best: 0.2047844 (52) total: 1m 29s remaining: 1m 15s 53: learn: 0.1169202 test: 0.2042798 best: 0.2042798 (53) total: 1m 31s remaining: 1m 14s 54: learn: 0.1147180 test: 0.2040932 best: 0.2040932 (54) total: 1m 32s remaining: 1m 12s 55: learn: 0.1138593 test: 0.2036273 best: 0.2036273 (55) total: 1m 34s remaining: 1m 10s 56: learn: 0.1128243 test: 0.2031567 best: 0.2031567 (56) total: 1m 36s remaining: 1m 9s 57: learn: 0.1111693 test: 0.2028993 best: 0.2028993 (57) total: 1m 38s remaining: 1m 7s 58: learn: 0.1104624 test: 0.2027151 best: 0.2027151 (58) total: 1m 39s remaining: 1m 5s 59: learn: 0.1095603 test: 0.2020399 best: 0.2020399 (59) total: 1m 41s remaining: 1m 4s 60: learn: 0.1085305 test: 0.2018394 best: 0.2018394 (60) total: 1m 43s remaining: 1m 2s 61: learn: 0.1069983 test: 0.2015018 best: 0.2015018 (61) total: 1m 44s remaining: 1m 62: learn: 0.1055908 test: 0.2013836 best: 0.2013836 (62) total: 1m 46s remaining: 59.1s 63: learn: 
0.1048523 test: 0.2010414 best: 0.2010414 (63) total: 1m 48s remaining: 57.4s 64: learn: 0.1036434 test: 0.2008294 best: 0.2008294 (64) total: 1m 49s remaining: 55.7s 65: learn: 0.1029425 test: 0.2007643 best: 0.2007643 (65) total: 1m 51s remaining: 54s 66: learn: 0.1020570 test: 0.2004738 best: 0.2004738 (66) total: 1m 53s remaining: 52.3s 67: learn: 0.1003595 test: 0.2004409 best: 0.2004409 (67) total: 1m 54s remaining: 50.7s 68: learn: 0.0990874 test: 0.2004398 best: 0.2004398 (68) total: 1m 56s remaining: 49s 69: learn: 0.0983589 test: 0.2000566 best: 0.2000566 (69) total: 1m 58s remaining: 47.3s 70: learn: 0.0978476 test: 0.1997507 best: 0.1997507 (70) total: 1m 59s remaining: 45.6s 71: learn: 0.0973169 test: 0.1994381 best: 0.1994381 (71) total: 2m 1s remaining: 43.9s 72: learn: 0.0963378 test: 0.1992513 best: 0.1992513 (72) total: 2m 3s remaining: 42.3s 73: learn: 0.0941234 test: 0.1992854 best: 0.1992513 (72) total: 2m 5s remaining: 40.6s 74: learn: 0.0933325 test: 0.1990794 best: 0.1990794 (74) total: 2m 6s remaining: 38.9s 75: learn: 0.0926002 test: 0.1989175 best: 0.1989175 (75) total: 2m 8s remaining: 37.2s 76: learn: 0.0919891 test: 0.1987588 best: 0.1987588 (76) total: 2m 10s remaining: 35.5s 77: learn: 0.0904942 test: 0.1985962 best: 0.1985962 (77) total: 2m 12s remaining: 33.8s 78: learn: 0.0888409 test: 0.1985953 best: 0.1985953 (78) total: 2m 13s remaining: 32.2s 79: learn: 0.0880228 test: 0.1984158 best: 0.1984158 (79) total: 2m 15s remaining: 30.5s 80: learn: 0.0874915 test: 0.1982447 best: 0.1982447 (80) total: 2m 17s remaining: 28.8s 81: learn: 0.0864143 test: 0.1980272 best: 0.1980272 (81) total: 2m 18s remaining: 27.1s 82: learn: 0.0855369 test: 0.1979411 best: 0.1979411 (82) total: 2m 20s remaining: 25.4s 83: learn: 0.0847847 test: 0.1978895 best: 0.1978895 (83) total: 2m 22s remaining: 23.7s 84: learn: 0.0838158 test: 0.1977735 best: 0.1977735 (84) total: 2m 23s remaining: 22s 85: learn: 0.0828468 test: 0.1977216 best: 0.1977216 (85) 
total: 2m 25s remaining: 20.3s 86: learn: 0.0820803 test: 0.1976729 best: 0.1976729 (86) total: 2m 27s remaining: 18.6s 87: learn: 0.0803414 test: 0.1977791 best: 0.1976729 (86) total: 2m 29s remaining: 16.9s 88: learn: 0.0778979 test: 0.1977432 best: 0.1976729 (86) total: 2m 30s remaining: 15.2s 89: learn: 0.0771884 test: 0.1975639 best: 0.1975639 (89) total: 2m 32s remaining: 13.6s 90: learn: 0.0767450 test: 0.1972905 best: 0.1972905 (90) total: 2m 34s remaining: 11.9s 91: learn: 0.0762294 test: 0.1971980 best: 0.1971980 (91) total: 2m 36s remaining: 10.2s 92: learn: 0.0753576 test: 0.1971812 best: 0.1971812 (92) total: 2m 37s remaining: 8.48s 93: learn: 0.0742536 test: 0.1970407 best: 0.1970407 (93) total: 2m 39s remaining: 6.79s 94: learn: 0.0737671 test: 0.1970393 best: 0.1970393 (94) total: 2m 41s remaining: 5.09s 95: learn: 0.0730550 test: 0.1970078 best: 0.1970078 (95) total: 2m 42s remaining: 3.39s 96: learn: 0.0721950 test: 0.1971381 best: 0.1970078 (95) total: 2m 44s remaining: 1.7s 97: learn: 0.0712071 test: 0.1972132 best: 0.1970078 (95) total: 2m 46s remaining: 0us bestTest = 0.1970077741 bestIteration = 95 Shrink model to first 96 iterations. 
Trial 74, Fold 3: Log loss = 0.19643855032112573, Average precision = 0.9725789577283187, ROC-AUC = 0.9709949340755651, Elapsed Time = 166.7922209999997 seconds Trial 74, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 74, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 0: learn: 0.6233121 test: 0.6261686 best: 0.6261686 (0) total: 1.58s remaining: 2m 32s 1: learn: 0.5620001 test: 0.5685956 best: 0.5685956 (1) total: 3.13s remaining: 2m 30s 2: learn: 0.5108070 test: 0.5196249 best: 0.5196249 (2) total: 4.7s remaining: 2m 28s 3: learn: 0.4676212 test: 0.4785555 best: 0.4785555 (3) total: 6.33s remaining: 2m 28s 4: learn: 0.4305388 test: 0.4437370 best: 0.4437370 (4) total: 7.96s remaining: 2m 27s 5: learn: 0.3983257 test: 0.4134129 best: 0.4134129 (5) total: 9.64s remaining: 2m 27s 6: learn: 0.3700675 test: 0.3881180 best: 0.3881180 (6) total: 11.4s remaining: 2m 28s 7: learn: 0.3463396 test: 0.3664051 best: 0.3664051 (7) total: 13.2s remaining: 2m 28s 8: learn: 0.3253771 test: 0.3472186 best: 0.3472186 (8) total: 14.9s remaining: 2m 27s 9: learn: 0.3070548 test: 0.3305121 best: 0.3305121 (9) total: 16.8s remaining: 2m 27s 10: learn: 0.2911812 test: 0.3164166 best: 0.3164166 (10) total: 18.5s remaining: 2m 26s 11: learn: 0.2771547 test: 0.3041283 best: 0.3041283 (11) total: 20.2s remaining: 2m 24s 12: learn: 0.2644678 test: 0.2923060 best: 0.2923060 (12) total: 22s remaining: 2m 23s 13: learn: 0.2528891 test: 0.2834818 best: 0.2834818 (13) total: 23.8s remaining: 2m 22s 14: learn: 0.2437088 test: 0.2758140 best: 0.2758140 (14) total: 25.5s remaining: 2m 21s 15: learn: 0.2351982 test: 0.2679816 best: 0.2679816 (15) total: 27.2s remaining: 2m 19s 16: learn: 0.2269520 test: 0.2616733 best: 0.2616733 (16) total: 28.8s remaining: 2m 17s 17: learn: 0.2189886 test: 0.2563810 best: 0.2563810 (17) total: 30.4s remaining: 2m 15s 18: learn: 0.2121767 test: 0.2511327 best: 0.2511327 (18) total: 32.1s 
remaining: 2m 13s 19: learn: 0.2059364 test: 0.2468459 best: 0.2468459 (19) total: 33.8s remaining: 2m 11s 20: learn: 0.2005769 test: 0.2431924 best: 0.2431924 (20) total: 35.5s remaining: 2m 10s 21: learn: 0.1937588 test: 0.2399249 best: 0.2399249 (21) total: 37.3s remaining: 2m 8s 22: learn: 0.1880809 test: 0.2364989 best: 0.2364989 (22) total: 39s remaining: 2m 7s 23: learn: 0.1829791 test: 0.2335322 best: 0.2335322 (23) total: 40.6s remaining: 2m 5s 24: learn: 0.1786932 test: 0.2308873 best: 0.2308873 (24) total: 42.3s remaining: 2m 3s 25: learn: 0.1750996 test: 0.2287006 best: 0.2287006 (25) total: 44s remaining: 2m 1s 26: learn: 0.1724733 test: 0.2270890 best: 0.2270890 (26) total: 45.7s remaining: 2m 27: learn: 0.1693847 test: 0.2251256 best: 0.2251256 (27) total: 47.4s remaining: 1m 58s 28: learn: 0.1665990 test: 0.2238107 best: 0.2238107 (28) total: 49.1s remaining: 1m 56s 29: learn: 0.1641574 test: 0.2223931 best: 0.2223931 (29) total: 50.9s remaining: 1m 55s 30: learn: 0.1603719 test: 0.2212647 best: 0.2212647 (30) total: 52.6s remaining: 1m 53s 31: learn: 0.1574392 test: 0.2203245 best: 0.2203245 (31) total: 54.4s remaining: 1m 52s 32: learn: 0.1544843 test: 0.2193297 best: 0.2193297 (32) total: 56.2s remaining: 1m 50s 33: learn: 0.1523642 test: 0.2180179 best: 0.2180179 (33) total: 57.9s remaining: 1m 48s 34: learn: 0.1489380 test: 0.2168741 best: 0.2168741 (34) total: 59.6s remaining: 1m 47s 35: learn: 0.1467808 test: 0.2157567 best: 0.2157567 (35) total: 1m 1s remaining: 1m 45s 36: learn: 0.1449938 test: 0.2144930 best: 0.2144930 (36) total: 1m 2s remaining: 1m 43s 37: learn: 0.1433955 test: 0.2135570 best: 0.2135570 (37) total: 1m 4s remaining: 1m 42s 38: learn: 0.1413920 test: 0.2126463 best: 0.2126463 (38) total: 1m 6s remaining: 1m 40s 39: learn: 0.1397731 test: 0.2118377 best: 0.2118377 (39) total: 1m 8s remaining: 1m 38s 40: learn: 0.1355338 test: 0.2114496 best: 0.2114496 (40) total: 1m 9s remaining: 1m 36s 41: learn: 0.1324425 test: 0.2113972 
best: 0.2113972 (41) total: 1m 11s remaining: 1m 35s 42: learn: 0.1312079 test: 0.2106239 best: 0.2106239 (42) total: 1m 13s remaining: 1m 33s 43: learn: 0.1293444 test: 0.2102052 best: 0.2102052 (43) total: 1m 14s remaining: 1m 31s 44: learn: 0.1277160 test: 0.2096286 best: 0.2096286 (44) total: 1m 16s remaining: 1m 30s 45: learn: 0.1255476 test: 0.2091827 best: 0.2091827 (45) total: 1m 18s remaining: 1m 28s 46: learn: 0.1236911 test: 0.2091114 best: 0.2091114 (46) total: 1m 19s remaining: 1m 26s 47: learn: 0.1214980 test: 0.2086244 best: 0.2086244 (47) total: 1m 21s remaining: 1m 24s 48: learn: 0.1205538 test: 0.2080843 best: 0.2080843 (48) total: 1m 23s remaining: 1m 23s 49: learn: 0.1194027 test: 0.2076589 best: 0.2076589 (49) total: 1m 24s remaining: 1m 21s 50: learn: 0.1183678 test: 0.2071071 best: 0.2071071 (50) total: 1m 26s remaining: 1m 19s 51: learn: 0.1177495 test: 0.2065429 best: 0.2065429 (51) total: 1m 28s remaining: 1m 18s 52: learn: 0.1155922 test: 0.2060804 best: 0.2060804 (52) total: 1m 30s remaining: 1m 16s 53: learn: 0.1140559 test: 0.2056754 best: 0.2056754 (53) total: 1m 31s remaining: 1m 14s 54: learn: 0.1130394 test: 0.2052483 best: 0.2052483 (54) total: 1m 33s remaining: 1m 13s 55: learn: 0.1112517 test: 0.2049916 best: 0.2049916 (55) total: 1m 35s remaining: 1m 11s 56: learn: 0.1092297 test: 0.2048545 best: 0.2048545 (56) total: 1m 36s remaining: 1m 9s 57: learn: 0.1079090 test: 0.2045270 best: 0.2045270 (57) total: 1m 38s remaining: 1m 8s 58: learn: 0.1065459 test: 0.2042507 best: 0.2042507 (58) total: 1m 40s remaining: 1m 6s 59: learn: 0.1056580 test: 0.2038225 best: 0.2038225 (59) total: 1m 42s remaining: 1m 4s 60: learn: 0.1046071 test: 0.2038012 best: 0.2038012 (60) total: 1m 43s remaining: 1m 2s 61: learn: 0.1039441 test: 0.2034323 best: 0.2034323 (61) total: 1m 45s remaining: 1m 1s 62: learn: 0.1031614 test: 0.2031178 best: 0.2031178 (62) total: 1m 47s remaining: 59.6s 63: learn: 0.1020375 test: 0.2027943 best: 0.2027943 (63) 
total: 1m 49s remaining: 58s 64: learn: 0.1006474 test: 0.2026049 best: 0.2026049 (64) total: 1m 50s remaining: 56.2s 65: learn: 0.0995097 test: 0.2024234 best: 0.2024234 (65) total: 1m 52s remaining: 54.6s 66: learn: 0.0990005 test: 0.2022181 best: 0.2022181 (66) total: 1m 54s remaining: 52.9s 67: learn: 0.0978009 test: 0.2020653 best: 0.2020653 (67) total: 1m 56s remaining: 51.2s 68: learn: 0.0964343 test: 0.2020447 best: 0.2020447 (68) total: 1m 57s remaining: 49.5s 69: learn: 0.0951296 test: 0.2020527 best: 0.2020447 (68) total: 1m 59s remaining: 47.8s 70: learn: 0.0941058 test: 0.2020281 best: 0.2020281 (70) total: 2m 1s remaining: 46.1s 71: learn: 0.0931910 test: 0.2018797 best: 0.2018797 (71) total: 2m 2s remaining: 44.4s 72: learn: 0.0923266 test: 0.2018316 best: 0.2018316 (72) total: 2m 4s remaining: 42.7s 73: learn: 0.0905134 test: 0.2017651 best: 0.2017651 (73) total: 2m 6s remaining: 41s 74: learn: 0.0898224 test: 0.2014375 best: 0.2014375 (74) total: 2m 8s remaining: 39.3s 75: learn: 0.0892318 test: 0.2010123 best: 0.2010123 (75) total: 2m 10s remaining: 37.7s 76: learn: 0.0887047 test: 0.2008404 best: 0.2008404 (76) total: 2m 12s remaining: 36s 77: learn: 0.0880062 test: 0.2006235 best: 0.2006235 (77) total: 2m 13s remaining: 34.3s 78: learn: 0.0875249 test: 0.2004301 best: 0.2004301 (78) total: 2m 15s remaining: 32.6s 79: learn: 0.0867684 test: 0.2004069 best: 0.2004069 (79) total: 2m 17s remaining: 30.9s 80: learn: 0.0857148 test: 0.2004665 best: 0.2004069 (79) total: 2m 19s remaining: 29.2s 81: learn: 0.0852709 test: 0.2003482 best: 0.2003482 (81) total: 2m 20s remaining: 27.5s 82: learn: 0.0849478 test: 0.2002940 best: 0.2002940 (82) total: 2m 22s remaining: 25.8s 83: learn: 0.0846921 test: 0.2001292 best: 0.2001292 (83) total: 2m 24s remaining: 24s 84: learn: 0.0836338 test: 0.1999401 best: 0.1999401 (84) total: 2m 25s remaining: 22.3s 85: learn: 0.0820915 test: 0.1999892 best: 0.1999401 (84) total: 2m 27s remaining: 20.6s 86: learn: 0.0803187 
test: 0.2000436 best: 0.1999401 (84) total: 2m 29s remaining: 18.9s 87: learn: 0.0796612 test: 0.2001032 best: 0.1999401 (84) total: 2m 31s remaining: 17.2s 88: learn: 0.0789598 test: 0.2000365 best: 0.1999401 (84) total: 2m 32s remaining: 15.4s 89: learn: 0.0773750 test: 0.1998569 best: 0.1998569 (89) total: 2m 34s remaining: 13.7s 90: learn: 0.0768555 test: 0.1997263 best: 0.1997263 (90) total: 2m 36s remaining: 12s 91: learn: 0.0765776 test: 0.1996282 best: 0.1996282 (91) total: 2m 37s remaining: 10.3s 92: learn: 0.0755715 test: 0.1997974 best: 0.1996282 (91) total: 2m 39s remaining: 8.57s 93: learn: 0.0746660 test: 0.1998339 best: 0.1996282 (91) total: 2m 41s remaining: 6.85s 94: learn: 0.0734693 test: 0.2000120 best: 0.1996282 (91) total: 2m 42s remaining: 5.14s 95: learn: 0.0723868 test: 0.2001724 best: 0.1996282 (91) total: 2m 44s remaining: 3.42s 96: learn: 0.0719337 test: 0.2001340 best: 0.1996282 (91) total: 2m 46s remaining: 1.71s 97: learn: 0.0715309 test: 0.2000712 best: 0.1996282 (91) total: 2m 47s remaining: 0us bestTest = 0.1996282366 bestIteration = 91 Shrink model to first 92 iterations. 
Trial 74, Fold 4: Log loss = 0.19890881742148034, Average precision = 0.9744471072122258, ROC-AUC = 0.9696595778164052, Elapsed Time = 168.1239002000002 seconds Trial 74, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 74, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0: learn: 0.6308905 test: 0.6354902 best: 0.6354902 (0) total: 1.69s remaining: 2m 44s 1: learn: 0.5708977 test: 0.5783548 best: 0.5783548 (1) total: 2.1s remaining: 1m 40s 2: learn: 0.5168869 test: 0.5299628 best: 0.5299628 (2) total: 3.8s remaining: 2m 3: learn: 0.4721512 test: 0.4877199 best: 0.4877199 (3) total: 5.55s remaining: 2m 10s 4: learn: 0.4344857 test: 0.4520350 best: 0.4520350 (4) total: 7.31s remaining: 2m 16s 5: learn: 0.4012680 test: 0.4209405 best: 0.4209405 (5) total: 8.98s remaining: 2m 17s 6: learn: 0.3768902 test: 0.3967692 best: 0.3967692 (6) total: 9.06s remaining: 1m 57s 7: learn: 0.3509952 test: 0.3739018 best: 0.3739018 (7) total: 10.8s remaining: 2m 2s 8: learn: 0.3302289 test: 0.3544784 best: 0.3544784 (8) total: 12.5s remaining: 2m 3s 9: learn: 0.3110280 test: 0.3377808 best: 0.3377808 (9) total: 14.2s remaining: 2m 5s 10: learn: 0.2952648 test: 0.3226714 best: 0.3226714 (10) total: 16.1s remaining: 2m 7s 11: learn: 0.2806385 test: 0.3105674 best: 0.3105674 (11) total: 17.8s remaining: 2m 7s 12: learn: 0.2676585 test: 0.2998964 best: 0.2998964 (12) total: 19.6s remaining: 2m 7s 13: learn: 0.2563801 test: 0.2900135 best: 0.2900135 (13) total: 21.3s remaining: 2m 7s 14: learn: 0.2467377 test: 0.2818983 best: 0.2818983 (14) total: 23s remaining: 2m 7s 15: learn: 0.2380399 test: 0.2743534 best: 0.2743534 (15) total: 24.8s remaining: 2m 7s 16: learn: 0.2284140 test: 0.2681902 best: 0.2681902 (16) total: 26.6s remaining: 2m 6s 17: learn: 0.2216271 test: 0.2625644 best: 0.2625644 (17) total: 28.3s remaining: 2m 5s 18: learn: 0.2153416 test: 0.2574413 best: 0.2574413 (18) total: 30s remaining: 2m 4s 19: 
Trial 74, Fold 5: [per-iteration CatBoost log condensed: test log loss improved from 0.2530583 at iteration 19 to its best at iteration 95 over 98 iterations, total ~2m 47s]
bestTest = 0.208227748 bestIteration = 95 Shrink model to first 96 iterations.
Trial 74, Fold 5: Log loss = 0.20718779155537703, Average precision = 0.9731871268430515, ROC-AUC = 0.9684625531835833, Elapsed Time = 168.26519910000206 seconds
Optimization Progress: 75%|#######5 | 75/100 [2:19:59<2:18:31, 332.46s/it]
Trial 75, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371; Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Trial 75, Fold 1: [8-iteration CatBoost log condensed] bestTest = 0.2301766315, bestIteration = 7
Trial 75, Fold 1: Log loss = 0.23017663145427283, Average precision = 0.9720650521748062, ROC-AUC = 0.9672628009348876, Elapsed Time = 1.8035525000013877 seconds
Trial 75, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396; Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Trial 75, Fold 2: [8-iteration CatBoost log condensed] bestTest = 0.2184730204, bestIteration = 7
Trial 75, Fold 2: Log loss = 0.2184730203782967, Average precision = 0.9736359492697437, ROC-AUC = 0.9703997734709058, Elapsed Time = 1.880307499999617 seconds
Trial 75, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876; Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Trial 75, Fold 3: [8-iteration CatBoost log condensed] bestTest = 0.2204955656, bestIteration = 7
Trial 75, Fold 3: Log loss = 0.22049556556625982, Average precision = 0.9723505203681976, ROC-AUC = 0.9691137220579595, Elapsed Time = 2.214204299998528 seconds
Trial 75, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592; Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
Trial 75, Fold 4: [8-iteration CatBoost log condensed] bestTest = 0.2243626571, bestIteration = 7
Trial 75, Fold 4: Log loss = 0.22436265714959572, Average precision = 0.9714429749314285, ROC-AUC = 0.9686229556666692, Elapsed Time = 1.9285004000012123 seconds
Trial 75, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897; Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
Trial 75, Fold 5: [8-iteration CatBoost log condensed] bestTest = 0.2324860652, bestIteration = 7
Trial 75, Fold 5: Log loss = 0.23248606523157098, Average precision = 0.9710552546928959, ROC-AUC = 0.9676153317355034, Elapsed Time = 1.773934900000313 seconds
Optimization Progress: 76%|#######6 | 76/100 [2:20:17<1:35:14, 238.09s/it]
Trial 76, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371; Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Trial 76, Fold 1: [56-iteration CatBoost log condensed] bestTest = 0.2393465114, bestIteration = 55
Trial 76, Fold 1: Log loss = 0.23934651142423916, Average precision = 0.969694644871701, ROC-AUC = 0.9642598746313742, Elapsed Time = 5.158317500001431 seconds
Trial 76, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396; Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Trial 76, Fold 2: [56-iteration CatBoost log condensed] bestTest = 0.2378747225, bestIteration = 55
Trial 76, Fold 2: Log loss = 0.23787472252935388, Average precision = 0.9701072622343252, ROC-AUC = 0.9667991778911973, Elapsed Time = 5.23895039999843 seconds
Trial 76, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876; Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Trial 76, Fold 3: [56-iteration CatBoost log condensed] bestTest = 0.232247298, bestIteration = 55
Trial 76, Fold 3: Log loss = 0.2322472979786853, Average precision = 0.9708223233526677, ROC-AUC = 0.9666556260895108, Elapsed Time = 5.81767519999994 seconds
Trial 76, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592; Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
Trial 76, Fold 4: [56-iteration CatBoost log condensed] bestTest = 0.2344224768, bestIteration = 55
Trial 76, Fold 4: Log loss = 0.2344224767786823, Average precision = 0.9702264133128352, ROC-AUC = 0.9650462664911742, Elapsed Time = 5.200046199999633 seconds
Trial 76, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897; Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
Trial 76, Fold 5: [per-iteration CatBoost log truncated at iteration 53]
test: 0.2442437 best: 0.2442437 (53) total: 4.95s remaining: 183ms 54: learn: 0.2337662 test: 0.2430852 best: 0.2430852 (54) total: 5.04s remaining: 91.6ms 55: learn: 0.2331999 test: 0.2425203 best: 0.2425203 (55) total: 5.13s remaining: 0us bestTest = 0.2425203036 bestIteration = 55 Trial 76, Fold 5: Log loss = 0.24252030364660943, Average precision = 0.9669052919651657, ROC-AUC = 0.9631877821749066, Elapsed Time = 5.259091599997191 seconds
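The per-fold summary lines above report log loss, average precision, and ROC-AUC on each validation fold. A minimal sketch of how these three metrics are computed with scikit-learn, using illustrative toy arrays (not values from the notebook) in place of CatBoost's out-of-fold predicted probabilities:

```python
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

# Toy labels and positive-class probabilities (illustrative only --
# in the notebook these come from the CatBoost model on each fold).
y_true = [0, 0, 1, 1]
y_prob = [0.1, 0.4, 0.35, 0.8]

ll = log_loss(y_true, y_prob)                 # lower is better
ap = average_precision_score(y_true, y_prob)  # area under the PR curve
auc = roc_auc_score(y_true, y_prob)           # area under the ROC curve

print(f"Log loss = {ll:.4f}, Average precision = {ap:.4f}, ROC-AUC = {auc:.4f}")
```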
Optimization Progress: 77%|#######7 | 77/100 [2:20:52<1:07:53, 177.11s/it]
Trial 77, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 77, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[CatBoost per-iteration training log truncated]
bestTest = 0.2398939532 bestIteration = 10 Shrink model to first 11 iterations.
Training has stopped (degenerate solution on iteration 51, probably too small l2-regularization, try to increase it)
Trial 77, Fold 1: Log loss = 0.23989395324427604, Average precision = 0.9677350413395026, ROC-AUC = 0.9640070791691162, Elapsed Time = 7.274260000001959 seconds
Trial 77, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 77, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[CatBoost per-iteration training log truncated]
Training has stopped (degenerate solution on iteration 36, probably too small l2-regularization, try to increase it)
bestTest = 0.2429889915 bestIteration = 10 Shrink model to first 11 iterations.
Trial 77, Fold 2: Log loss = 0.2429889914899703, Average precision = 0.9712504661585221, ROC-AUC = 0.967077786825318, Elapsed Time = 5.351572499999747 seconds
Trial 77, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 77, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[CatBoost per-iteration training log truncated]
bestTest = 0.2775164099 bestIteration = 7 Shrink model to first 8 iterations.
Trial 77, Fold 3: Log loss = 0.2775164098647098, Average precision = 0.9677509471789273, ROC-AUC = 0.9655515182122874, Elapsed Time = 12.092096499996842 seconds
Trial 77, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 77, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[CatBoost per-iteration training log truncated]
Training has stopped (degenerate solution on iteration 42, probably too small l2-regularization, try to increase it)
bestTest = 0.234211359 bestIteration = 10 Shrink model to first 11 iterations.
Trial 77, Fold 4: Log loss = 0.23421135897576462, Average precision = 0.9724300880487378, ROC-AUC = 0.9673531829486088, Elapsed Time = 6.118050200002472 seconds
Trial 77, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 77, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[CatBoost per-iteration training log truncated]
bestTest = 0.2202242128 bestIteration = 16 Shrink model to first 17 iterations.
Training has stopped (degenerate solution on iteration 29, probably too small l2-regularization, try to increase it)
Trial 77, Fold 5: Log loss = 0.22022421278444518, Average precision = 0.970420295649105, ROC-AUC = 0.9660487895509355, Elapsed Time = 4.705272299997887 seconds
Optimization Progress: 78%|#######8 | 78/100 [2:21:36<50:16, 137.12s/it]
Trial 78, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 78, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[CatBoost per-iteration training log truncated]
bestTest = 0.2521268089 bestIteration = 97
Trial 78, Fold 1: Log loss = 0.25212680890439687, Average precision = 0.9735816065453561, ROC-AUC = 0.9701956763835715, Elapsed Time = 27.27744139999777 seconds
Trial 78, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 78, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[CatBoost per-iteration training log truncated]
remaining: 21.3s 10: learn: 0.5634411 test: 0.5646521 best: 0.5646521 (10) total: 2.65s remaining: 21s 11: learn: 0.5535422 test: 0.5549426 best: 0.5549426 (11) total: 2.92s remaining: 20.9s 12: learn: 0.5438790 test: 0.5454372 best: 0.5454372 (12) total: 3.19s remaining: 20.8s 13: learn: 0.5346785 test: 0.5363762 best: 0.5363762 (13) total: 3.49s remaining: 20.9s 14: learn: 0.5255897 test: 0.5275017 best: 0.5275017 (14) total: 3.8s remaining: 21s 15: learn: 0.5167333 test: 0.5188241 best: 0.5188241 (15) total: 4.1s remaining: 21s 16: learn: 0.5082027 test: 0.5103856 best: 0.5103856 (16) total: 4.36s remaining: 20.8s 17: learn: 0.4998917 test: 0.5022527 best: 0.5022527 (17) total: 4.63s remaining: 20.6s 18: learn: 0.4920316 test: 0.4944593 best: 0.4944593 (18) total: 4.88s remaining: 20.3s 19: learn: 0.4842145 test: 0.4867547 best: 0.4867547 (19) total: 5.14s remaining: 20s 20: learn: 0.4766069 test: 0.4792162 best: 0.4792162 (20) total: 5.41s remaining: 19.8s 21: learn: 0.4694476 test: 0.4721835 best: 0.4721835 (21) total: 5.67s remaining: 19.6s 22: learn: 0.4623292 test: 0.4651727 best: 0.4651727 (22) total: 5.92s remaining: 19.3s 23: learn: 0.4553385 test: 0.4583429 best: 0.4583429 (23) total: 6.19s remaining: 19.1s 24: learn: 0.4485234 test: 0.4516740 best: 0.4516740 (24) total: 6.46s remaining: 18.9s 25: learn: 0.4420390 test: 0.4453404 best: 0.4453404 (25) total: 6.71s remaining: 18.6s 26: learn: 0.4357367 test: 0.4391505 best: 0.4391505 (26) total: 7.01s remaining: 18.4s 27: learn: 0.4297827 test: 0.4333184 best: 0.4333184 (27) total: 7.26s remaining: 18.1s 28: learn: 0.4240117 test: 0.4277055 best: 0.4277055 (28) total: 7.53s remaining: 17.9s 29: learn: 0.4185591 test: 0.4223218 best: 0.4223218 (29) total: 7.78s remaining: 17.6s 30: learn: 0.4128838 test: 0.4167592 best: 0.4167592 (30) total: 8.08s remaining: 17.5s 31: learn: 0.4074402 test: 0.4113766 best: 0.4113766 (31) total: 8.32s remaining: 17.2s 32: learn: 0.4022457 test: 0.4062718 best: 0.4062718 
(32) total: 8.57s remaining: 16.9s 33: learn: 0.3972175 test: 0.4013100 best: 0.4013100 (33) total: 8.81s remaining: 16.6s 34: learn: 0.3923021 test: 0.3965410 best: 0.3965410 (34) total: 9.1s remaining: 16.4s 35: learn: 0.3876127 test: 0.3919417 best: 0.3919417 (35) total: 9.38s remaining: 16.2s 36: learn: 0.3828124 test: 0.3872337 best: 0.3872337 (36) total: 9.63s remaining: 15.9s 37: learn: 0.3782915 test: 0.3828447 best: 0.3828447 (37) total: 9.88s remaining: 15.6s 38: learn: 0.3736062 test: 0.3782943 best: 0.3782943 (38) total: 10.2s remaining: 15.4s 39: learn: 0.3691258 test: 0.3739198 best: 0.3739198 (39) total: 10.5s remaining: 15.2s 40: learn: 0.3648651 test: 0.3697562 best: 0.3697562 (40) total: 10.8s remaining: 14.9s 41: learn: 0.3607500 test: 0.3657100 best: 0.3657100 (41) total: 11s remaining: 14.6s 42: learn: 0.3566178 test: 0.3617127 best: 0.3617127 (42) total: 11.3s remaining: 14.4s 43: learn: 0.3527382 test: 0.3579430 best: 0.3579430 (43) total: 11.5s remaining: 14.1s 44: learn: 0.3488484 test: 0.3542651 best: 0.3542651 (44) total: 11.8s remaining: 13.9s 45: learn: 0.3451567 test: 0.3506580 best: 0.3506580 (45) total: 12.1s remaining: 13.6s 46: learn: 0.3414732 test: 0.3470745 best: 0.3470745 (46) total: 12.4s remaining: 13.4s 47: learn: 0.3380451 test: 0.3437499 best: 0.3437499 (47) total: 12.6s remaining: 13.2s 48: learn: 0.3348237 test: 0.3407163 best: 0.3407163 (48) total: 12.9s remaining: 12.9s 49: learn: 0.3316871 test: 0.3376156 best: 0.3376156 (49) total: 13.1s remaining: 12.6s 50: learn: 0.3285882 test: 0.3346082 best: 0.3346082 (50) total: 13.4s remaining: 12.4s 51: learn: 0.3254717 test: 0.3315435 best: 0.3315435 (51) total: 13.7s remaining: 12.1s 52: learn: 0.3225544 test: 0.3286917 best: 0.3286917 (52) total: 14s remaining: 11.9s 53: learn: 0.3198502 test: 0.3260170 best: 0.3260170 (53) total: 14.2s remaining: 11.6s 54: learn: 0.3170299 test: 0.3232523 best: 0.3232523 (54) total: 14.5s remaining: 11.3s 55: learn: 0.3140763 test: 
0.3203935 best: 0.3203935 (55) total: 14.8s remaining: 11.1s 56: learn: 0.3111926 test: 0.3176269 best: 0.3176269 (56) total: 15s remaining: 10.8s 57: learn: 0.3084632 test: 0.3149880 best: 0.3149880 (57) total: 15.3s remaining: 10.6s 58: learn: 0.3059049 test: 0.3124708 best: 0.3124708 (58) total: 15.6s remaining: 10.3s 59: learn: 0.3032803 test: 0.3099287 best: 0.3099287 (59) total: 15.9s remaining: 10.1s 60: learn: 0.3007848 test: 0.3075392 best: 0.3075392 (60) total: 16.1s remaining: 9.79s 61: learn: 0.2984745 test: 0.3053560 best: 0.3053560 (61) total: 16.4s remaining: 9.53s 62: learn: 0.2959993 test: 0.3030570 best: 0.3030570 (62) total: 16.7s remaining: 9.29s 63: learn: 0.2937621 test: 0.3009643 best: 0.3009643 (63) total: 17s remaining: 9.02s 64: learn: 0.2914088 test: 0.2987292 best: 0.2987292 (64) total: 17.2s remaining: 8.75s 65: learn: 0.2892484 test: 0.2966368 best: 0.2966368 (65) total: 17.5s remaining: 8.48s 66: learn: 0.2872100 test: 0.2946916 best: 0.2946916 (66) total: 17.7s remaining: 8.21s 67: learn: 0.2851164 test: 0.2926665 best: 0.2926665 (67) total: 18s remaining: 7.95s 68: learn: 0.2830155 test: 0.2906097 best: 0.2906097 (68) total: 18.3s remaining: 7.7s 69: learn: 0.2809695 test: 0.2886277 best: 0.2886277 (69) total: 18.6s remaining: 7.43s 70: learn: 0.2789632 test: 0.2866822 best: 0.2866822 (70) total: 18.8s remaining: 7.16s 71: learn: 0.2770139 test: 0.2847906 best: 0.2847906 (71) total: 19.1s remaining: 6.88s 72: learn: 0.2749624 test: 0.2828779 best: 0.2828779 (72) total: 19.3s remaining: 6.62s 73: learn: 0.2730814 test: 0.2811344 best: 0.2811344 (73) total: 19.6s remaining: 6.37s 74: learn: 0.2711827 test: 0.2793729 best: 0.2793729 (74) total: 19.9s remaining: 6.1s 75: learn: 0.2694171 test: 0.2776515 best: 0.2776515 (75) total: 20.1s remaining: 5.83s 76: learn: 0.2676306 test: 0.2759955 best: 0.2759955 (76) total: 20.4s remaining: 5.57s 77: learn: 0.2659194 test: 0.2743457 best: 0.2743457 (77) total: 20.7s remaining: 5.3s 78: learn: 
0.2643651 test: 0.2728354 best: 0.2728354 (78) total: 20.9s remaining: 5.03s 79: learn: 0.2627426 test: 0.2712956 best: 0.2712956 (79) total: 21.2s remaining: 4.76s 80: learn: 0.2612409 test: 0.2698639 best: 0.2698639 (80) total: 21.4s remaining: 4.49s 81: learn: 0.2596942 test: 0.2683876 best: 0.2683876 (81) total: 21.7s remaining: 4.23s 82: learn: 0.2581955 test: 0.2670162 best: 0.2670162 (82) total: 21.9s remaining: 3.96s 83: learn: 0.2568134 test: 0.2656746 best: 0.2656746 (83) total: 22.2s remaining: 3.69s 84: learn: 0.2555068 test: 0.2644404 best: 0.2644404 (84) total: 22.4s remaining: 3.43s 85: learn: 0.2540783 test: 0.2631278 best: 0.2631278 (85) total: 22.7s remaining: 3.16s 86: learn: 0.2525158 test: 0.2616442 best: 0.2616442 (86) total: 22.9s remaining: 2.9s 87: learn: 0.2510789 test: 0.2603839 best: 0.2603839 (87) total: 23.2s remaining: 2.63s 88: learn: 0.2498034 test: 0.2592190 best: 0.2592190 (88) total: 23.5s remaining: 2.37s 89: learn: 0.2486007 test: 0.2581185 best: 0.2581185 (89) total: 23.7s remaining: 2.11s 90: learn: 0.2472291 test: 0.2568593 best: 0.2568593 (90) total: 24s remaining: 1.85s 91: learn: 0.2459094 test: 0.2556359 best: 0.2556359 (91) total: 24.3s remaining: 1.58s 92: learn: 0.2446818 test: 0.2546171 best: 0.2546171 (92) total: 24.6s remaining: 1.32s 93: learn: 0.2435345 test: 0.2535362 best: 0.2535362 (93) total: 24.8s remaining: 1.06s 94: learn: 0.2423217 test: 0.2524073 best: 0.2524073 (94) total: 25.1s remaining: 792ms 95: learn: 0.2411296 test: 0.2513469 best: 0.2513469 (95) total: 25.4s remaining: 529ms 96: learn: 0.2399374 test: 0.2502789 best: 0.2502789 (96) total: 25.6s remaining: 264ms 97: learn: 0.2388192 test: 0.2492520 best: 0.2492520 (97) total: 25.9s remaining: 0us bestTest = 0.2492519866 bestIteration = 97 Trial 78, Fold 2: Log loss = 0.24925198663235323, Average precision = 0.9749105582168802, ROC-AUC = 0.9726450337047008, Elapsed Time = 26.056784600001265 seconds Trial 78, Fold 3: Train size = 20682 where 0 = 
10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 78, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 0: learn: 0.6793016 test: 0.6793914 best: 0.6793914 (0) total: 252ms remaining: 24.4s 1: learn: 0.6661013 test: 0.6661843 best: 0.6661843 (1) total: 492ms remaining: 23.6s 2: learn: 0.6533154 test: 0.6533806 best: 0.6533806 (2) total: 733ms remaining: 23.2s 3: learn: 0.6408015 test: 0.6409396 best: 0.6409396 (3) total: 949ms remaining: 22.3s 4: learn: 0.6289224 test: 0.6289724 best: 0.6289724 (4) total: 1.19s remaining: 22.2s 5: learn: 0.6171191 test: 0.6171758 best: 0.6171758 (5) total: 1.43s remaining: 21.9s 6: learn: 0.6056025 test: 0.6056916 best: 0.6056916 (6) total: 1.68s remaining: 21.9s 7: learn: 0.5944350 test: 0.5944937 best: 0.5944937 (7) total: 1.95s remaining: 22s 8: learn: 0.5836016 test: 0.5835916 best: 0.5835916 (8) total: 2.22s remaining: 22s 9: learn: 0.5730620 test: 0.5729992 best: 0.5729992 (9) total: 2.46s remaining: 21.7s 10: learn: 0.5630179 test: 0.5628363 best: 0.5628363 (10) total: 2.74s remaining: 21.7s 11: learn: 0.5533330 test: 0.5531413 best: 0.5531413 (11) total: 3.01s remaining: 21.6s 12: learn: 0.5438733 test: 0.5436947 best: 0.5436947 (12) total: 3.25s remaining: 21.2s 13: learn: 0.5347092 test: 0.5344903 best: 0.5344903 (13) total: 3.51s remaining: 21.1s 14: learn: 0.5257551 test: 0.5255708 best: 0.5255708 (14) total: 3.77s remaining: 20.8s 15: learn: 0.5170911 test: 0.5169044 best: 0.5169044 (15) total: 4.04s remaining: 20.7s 16: learn: 0.5086150 test: 0.5084904 best: 0.5084904 (16) total: 4.31s remaining: 20.5s 17: learn: 0.5006197 test: 0.5005705 best: 0.5005705 (17) total: 4.59s remaining: 20.4s 18: learn: 0.4925208 test: 0.4924768 best: 0.4924768 (18) total: 4.88s remaining: 20.3s 19: learn: 0.4847030 test: 0.4847449 best: 0.4847449 (19) total: 5.15s remaining: 20.1s 20: learn: 0.4771166 test: 0.4771826 best: 0.4771826 (20) total: 5.41s remaining: 19.8s 21: learn: 0.4698439 test: 0.4699029 
best: 0.4699029 (21) total: 5.66s remaining: 19.6s 22: learn: 0.4626581 test: 0.4628309 best: 0.4628309 (22) total: 5.94s remaining: 19.4s 23: learn: 0.4557374 test: 0.4559744 best: 0.4559744 (23) total: 6.16s remaining: 19s 24: learn: 0.4491136 test: 0.4493570 best: 0.4493570 (24) total: 6.4s remaining: 18.7s 25: learn: 0.4426188 test: 0.4428589 best: 0.4428589 (25) total: 6.66s remaining: 18.4s 26: learn: 0.4363638 test: 0.4365692 best: 0.4365692 (26) total: 6.92s remaining: 18.2s 27: learn: 0.4299492 test: 0.4302556 best: 0.4302556 (27) total: 7.22s remaining: 18s 28: learn: 0.4241017 test: 0.4245200 best: 0.4245200 (28) total: 7.46s remaining: 17.8s 29: learn: 0.4183970 test: 0.4188437 best: 0.4188437 (29) total: 7.73s remaining: 17.5s 30: learn: 0.4126593 test: 0.4131127 best: 0.4131127 (30) total: 8.02s remaining: 17.3s 31: learn: 0.4073432 test: 0.4077772 best: 0.4077772 (31) total: 8.26s remaining: 17s 32: learn: 0.4019746 test: 0.4025003 best: 0.4025003 (32) total: 8.56s remaining: 16.9s 33: learn: 0.3969389 test: 0.3976013 best: 0.3976013 (33) total: 8.83s remaining: 16.6s 34: learn: 0.3917656 test: 0.3924539 best: 0.3924539 (34) total: 9.11s remaining: 16.4s 35: learn: 0.3867550 test: 0.3875707 best: 0.3875707 (35) total: 9.38s remaining: 16.2s 36: learn: 0.3819288 test: 0.3828400 best: 0.3828400 (36) total: 9.65s remaining: 15.9s 37: learn: 0.3772156 test: 0.3782402 best: 0.3782402 (37) total: 9.93s remaining: 15.7s 38: learn: 0.3727420 test: 0.3739979 best: 0.3739979 (38) total: 10.2s remaining: 15.5s 39: learn: 0.3685459 test: 0.3698922 best: 0.3698922 (39) total: 10.5s remaining: 15.2s 40: learn: 0.3643382 test: 0.3657776 best: 0.3657776 (40) total: 10.8s remaining: 15s 41: learn: 0.3602013 test: 0.3616882 best: 0.3616882 (41) total: 11s remaining: 14.7s 42: learn: 0.3562670 test: 0.3578188 best: 0.3578188 (42) total: 11.4s remaining: 14.6s 43: learn: 0.3524154 test: 0.3540498 best: 0.3540498 (43) total: 11.7s remaining: 14.4s 44: learn: 0.3488468 
test: 0.3505111 best: 0.3505111 (44) total: 12s remaining: 14.1s 45: learn: 0.3453468 test: 0.3471152 best: 0.3471152 (45) total: 12.3s remaining: 13.9s 46: learn: 0.3419045 test: 0.3437572 best: 0.3437572 (46) total: 12.5s remaining: 13.6s 47: learn: 0.3384741 test: 0.3404408 best: 0.3404408 (47) total: 12.8s remaining: 13.3s 48: learn: 0.3352445 test: 0.3372680 best: 0.3372680 (48) total: 13s remaining: 13s 49: learn: 0.3320212 test: 0.3341549 best: 0.3341549 (49) total: 13.3s remaining: 12.8s 50: learn: 0.3288889 test: 0.3311273 best: 0.3311273 (50) total: 13.6s remaining: 12.5s 51: learn: 0.3259012 test: 0.3282475 best: 0.3282475 (51) total: 13.8s remaining: 12.2s 52: learn: 0.3228066 test: 0.3251962 best: 0.3251962 (52) total: 14.1s remaining: 12s 53: learn: 0.3199607 test: 0.3224616 best: 0.3224616 (53) total: 14.4s remaining: 11.7s 54: learn: 0.3170367 test: 0.3197093 best: 0.3197093 (54) total: 14.6s remaining: 11.4s 55: learn: 0.3141741 test: 0.3170050 best: 0.3170050 (55) total: 14.9s remaining: 11.2s 56: learn: 0.3114058 test: 0.3142830 best: 0.3142830 (56) total: 15.2s remaining: 10.9s 57: learn: 0.3086348 test: 0.3116767 best: 0.3116767 (57) total: 15.5s remaining: 10.7s 58: learn: 0.3060826 test: 0.3091599 best: 0.3091599 (58) total: 15.8s remaining: 10.4s 59: learn: 0.3034088 test: 0.3066294 best: 0.3066294 (59) total: 16s remaining: 10.1s 60: learn: 0.3008119 test: 0.3041678 best: 0.3041678 (60) total: 16.3s remaining: 9.9s 61: learn: 0.2982090 test: 0.3016544 best: 0.3016544 (61) total: 16.6s remaining: 9.61s 62: learn: 0.2957895 test: 0.2992922 best: 0.2992922 (62) total: 16.9s remaining: 9.37s 63: learn: 0.2934971 test: 0.2970973 best: 0.2970973 (63) total: 17.1s remaining: 9.11s 64: learn: 0.2912622 test: 0.2949561 best: 0.2949561 (64) total: 17.4s remaining: 8.82s 65: learn: 0.2890291 test: 0.2928613 best: 0.2928613 (65) total: 17.6s remaining: 8.56s 66: learn: 0.2869262 test: 0.2908634 best: 0.2908634 (66) total: 17.9s remaining: 8.28s 67: 
learn: 0.2849605 test: 0.2890215 best: 0.2890215 (67) total: 18.2s remaining: 8.02s 68: learn: 0.2828073 test: 0.2870361 best: 0.2870361 (68) total: 18.5s remaining: 7.76s 69: learn: 0.2807694 test: 0.2850765 best: 0.2850765 (69) total: 18.7s remaining: 7.49s 70: learn: 0.2787897 test: 0.2831921 best: 0.2831921 (70) total: 19s remaining: 7.22s 71: learn: 0.2768644 test: 0.2814252 best: 0.2814252 (71) total: 19.3s remaining: 6.96s 72: learn: 0.2750534 test: 0.2796692 best: 0.2796692 (72) total: 19.5s remaining: 6.67s 73: learn: 0.2731935 test: 0.2778674 best: 0.2778674 (73) total: 19.7s remaining: 6.4s 74: learn: 0.2713989 test: 0.2762024 best: 0.2762024 (74) total: 20s remaining: 6.13s 75: learn: 0.2695087 test: 0.2745195 best: 0.2745195 (75) total: 20.3s remaining: 5.87s 76: learn: 0.2679113 test: 0.2730398 best: 0.2730398 (76) total: 20.5s remaining: 5.6s 77: learn: 0.2662808 test: 0.2715246 best: 0.2715246 (77) total: 20.8s remaining: 5.34s 78: learn: 0.2644375 test: 0.2698519 best: 0.2698519 (78) total: 21.1s remaining: 5.08s 79: learn: 0.2627715 test: 0.2683372 best: 0.2683372 (79) total: 21.4s remaining: 4.81s 80: learn: 0.2611997 test: 0.2668824 best: 0.2668824 (80) total: 21.6s remaining: 4.54s 81: learn: 0.2596341 test: 0.2654074 best: 0.2654074 (81) total: 21.9s remaining: 4.27s 82: learn: 0.2580449 test: 0.2640197 best: 0.2640197 (82) total: 22.2s remaining: 4s 83: learn: 0.2563840 test: 0.2625523 best: 0.2625523 (83) total: 22.5s remaining: 3.75s 84: learn: 0.2547328 test: 0.2610681 best: 0.2610681 (84) total: 22.8s remaining: 3.48s 85: learn: 0.2532323 test: 0.2597271 best: 0.2597271 (85) total: 23s remaining: 3.22s 86: learn: 0.2516902 test: 0.2583874 best: 0.2583874 (86) total: 23.3s remaining: 2.95s 87: learn: 0.2504512 test: 0.2572549 best: 0.2572549 (87) total: 23.6s remaining: 2.68s 88: learn: 0.2491223 test: 0.2560423 best: 0.2560423 (88) total: 23.8s remaining: 2.41s 89: learn: 0.2478733 test: 0.2548875 best: 0.2548875 (89) total: 24.1s 
remaining: 2.14s 90: learn: 0.2465552 test: 0.2537489 best: 0.2537489 (90) total: 24.3s remaining: 1.87s 91: learn: 0.2451940 test: 0.2525941 best: 0.2525941 (91) total: 24.6s remaining: 1.6s 92: learn: 0.2440819 test: 0.2516148 best: 0.2516148 (92) total: 24.9s remaining: 1.34s 93: learn: 0.2428333 test: 0.2505027 best: 0.2505027 (93) total: 25.1s remaining: 1.07s 94: learn: 0.2414651 test: 0.2492980 best: 0.2492980 (94) total: 25.4s remaining: 802ms 95: learn: 0.2402142 test: 0.2481032 best: 0.2481032 (95) total: 25.7s remaining: 535ms 96: learn: 0.2390408 test: 0.2470374 best: 0.2470374 (96) total: 25.9s remaining: 267ms 97: learn: 0.2378414 test: 0.2459940 best: 0.2459940 (97) total: 26.2s remaining: 0us bestTest = 0.2459940375 bestIteration = 97 Trial 78, Fold 3: Log loss = 0.24599403752822974, Average precision = 0.9733445425441831, ROC-AUC = 0.9720903138272771, Elapsed Time = 26.411656200001744 seconds Trial 78, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 78, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 0: learn: 0.6792605 test: 0.6792180 best: 0.6792180 (0) total: 228ms remaining: 22.1s 1: learn: 0.6661058 test: 0.6660731 best: 0.6660731 (1) total: 459ms remaining: 22s 2: learn: 0.6532680 test: 0.6532278 best: 0.6532278 (2) total: 687ms remaining: 21.8s 3: learn: 0.6407977 test: 0.6407878 best: 0.6407878 (3) total: 922ms remaining: 21.7s 4: learn: 0.6285741 test: 0.6285796 best: 0.6285796 (4) total: 1.17s remaining: 21.8s 5: learn: 0.6167276 test: 0.6168290 best: 0.6168290 (5) total: 1.43s remaining: 21.9s 6: learn: 0.6053985 test: 0.6056379 best: 0.6056379 (6) total: 1.71s remaining: 22.2s 7: learn: 0.5943410 test: 0.5946154 best: 0.5946154 (7) total: 1.98s remaining: 22.3s 8: learn: 0.5836254 test: 0.5839891 best: 0.5839891 (8) total: 2.28s remaining: 22.5s 9: learn: 0.5732345 test: 0.5736412 best: 0.5736412 (9) total: 2.52s remaining: 22.2s 10: learn: 0.5631031 test: 0.5635660 
best: 0.5635660 (10) total: 2.79s remaining: 22.1s 11: learn: 0.5533721 test: 0.5539504 best: 0.5539504 (11) total: 3.07s remaining: 22s 12: learn: 0.5439423 test: 0.5445627 best: 0.5445627 (12) total: 3.3s remaining: 21.6s 13: learn: 0.5346122 test: 0.5352555 best: 0.5352555 (13) total: 3.57s remaining: 21.4s 14: learn: 0.5255903 test: 0.5262640 best: 0.5262640 (14) total: 3.83s remaining: 21.2s 15: learn: 0.5169517 test: 0.5177090 best: 0.5177090 (15) total: 4.11s remaining: 21.1s 16: learn: 0.5084478 test: 0.5093022 best: 0.5093022 (16) total: 4.38s remaining: 20.9s 17: learn: 0.5003218 test: 0.5012604 best: 0.5012604 (17) total: 4.62s remaining: 20.6s 18: learn: 0.4922340 test: 0.4932127 best: 0.4932127 (18) total: 4.92s remaining: 20.4s 19: learn: 0.4844290 test: 0.4855253 best: 0.4855253 (19) total: 5.16s remaining: 20.1s 20: learn: 0.4768880 test: 0.4781030 best: 0.4781030 (20) total: 5.42s remaining: 19.9s 21: learn: 0.4694504 test: 0.4708398 best: 0.4708398 (21) total: 5.7s remaining: 19.7s 22: learn: 0.4625208 test: 0.4640431 best: 0.4640431 (22) total: 5.95s remaining: 19.4s 23: learn: 0.4556889 test: 0.4572959 best: 0.4572959 (23) total: 6.2s remaining: 19.1s 24: learn: 0.4488958 test: 0.4505809 best: 0.4505809 (24) total: 6.46s remaining: 18.9s 25: learn: 0.4424501 test: 0.4442334 best: 0.4442334 (25) total: 6.72s remaining: 18.6s 26: learn: 0.4361395 test: 0.4380880 best: 0.4380880 (26) total: 7.02s remaining: 18.5s 27: learn: 0.4299073 test: 0.4319754 best: 0.4319754 (27) total: 7.3s remaining: 18.2s 28: learn: 0.4240469 test: 0.4262028 best: 0.4262028 (28) total: 7.56s remaining: 18s 29: learn: 0.4185703 test: 0.4208994 best: 0.4208994 (29) total: 7.85s remaining: 17.8s 30: learn: 0.4129343 test: 0.4152373 best: 0.4152373 (30) total: 8.09s remaining: 17.5s 31: learn: 0.4073408 test: 0.4098149 best: 0.4098149 (31) total: 8.36s remaining: 17.2s 32: learn: 0.4020933 test: 0.4047407 best: 0.4047407 (32) total: 8.65s remaining: 17s 33: learn: 0.3970223 
test: 0.3997465 best: 0.3997465 (33) total: 8.89s remaining: 16.7s 34: learn: 0.3921114 test: 0.3949524 best: 0.3949524 (34) total: 9.15s remaining: 16.5s 35: learn: 0.3870921 test: 0.3900020 best: 0.3900020 (35) total: 9.42s remaining: 16.2s 36: learn: 0.3824452 test: 0.3855071 best: 0.3855071 (36) total: 9.7s remaining: 16s 37: learn: 0.3778046 test: 0.3809319 best: 0.3809319 (37) total: 9.96s remaining: 15.7s 38: learn: 0.3733330 test: 0.3765249 best: 0.3765249 (38) total: 10.2s remaining: 15.5s 39: learn: 0.3691242 test: 0.3724124 best: 0.3724124 (39) total: 10.5s remaining: 15.2s 40: learn: 0.3649905 test: 0.3683487 best: 0.3683487 (40) total: 10.7s remaining: 14.9s 41: learn: 0.3608341 test: 0.3643049 best: 0.3643049 (41) total: 11s remaining: 14.7s 42: learn: 0.3569152 test: 0.3605890 best: 0.3605890 (42) total: 11.3s remaining: 14.4s 43: learn: 0.3532496 test: 0.3569799 best: 0.3569799 (43) total: 11.5s remaining: 14.1s 44: learn: 0.3494621 test: 0.3532850 best: 0.3532850 (44) total: 11.8s remaining: 13.9s 45: learn: 0.3455902 test: 0.3496151 best: 0.3496151 (45) total: 12.1s remaining: 13.6s 46: learn: 0.3420360 test: 0.3461260 best: 0.3461260 (46) total: 12.3s remaining: 13.3s 47: learn: 0.3387720 test: 0.3429718 best: 0.3429718 (47) total: 12.5s remaining: 13.1s 48: learn: 0.3353660 test: 0.3397168 best: 0.3397168 (48) total: 12.8s remaining: 12.8s 49: learn: 0.3321361 test: 0.3366058 best: 0.3366058 (49) total: 13.1s remaining: 12.6s 50: learn: 0.3288560 test: 0.3334910 best: 0.3334910 (50) total: 13.3s remaining: 12.3s 51: learn: 0.3258374 test: 0.3305208 best: 0.3305208 (51) total: 13.6s remaining: 12s 52: learn: 0.3227246 test: 0.3276236 best: 0.3276236 (52) total: 13.9s remaining: 11.8s 53: learn: 0.3198275 test: 0.3248364 best: 0.3248364 (53) total: 14.1s remaining: 11.5s 54: learn: 0.3169936 test: 0.3222213 best: 0.3222213 (54) total: 14.4s remaining: 11.2s 55: learn: 0.3141343 test: 0.3195348 best: 0.3195348 (55) total: 14.6s remaining: 11s 56: 
learn: 0.3114209 test: 0.3170133 best: 0.3170133 (56) total: 14.9s remaining: 10.7s 57: learn: 0.3086986 test: 0.3144292 best: 0.3144292 (57) total: 15.2s remaining: 10.4s 58: learn: 0.3060267 test: 0.3118677 best: 0.3118677 (58) total: 15.4s remaining: 10.2s 59: learn: 0.3035262 test: 0.3094429 best: 0.3094429 (59) total: 15.7s remaining: 9.92s 60: learn: 0.3009834 test: 0.3069937 best: 0.3069937 (60) total: 15.9s remaining: 9.66s 61: learn: 0.2986460 test: 0.3048498 best: 0.3048498 (61) total: 16.2s remaining: 9.42s 62: learn: 0.2961647 test: 0.3024654 best: 0.3024654 (62) total: 16.5s remaining: 9.15s 63: learn: 0.2938554 test: 0.3003466 best: 0.3003466 (63) total: 16.8s remaining: 8.91s 64: learn: 0.2915324 test: 0.2981632 best: 0.2981632 (64) total: 17s remaining: 8.64s 65: learn: 0.2891991 test: 0.2959898 best: 0.2959898 (65) total: 17.3s remaining: 8.39s 66: learn: 0.2870904 test: 0.2940294 best: 0.2940294 (66) total: 17.6s remaining: 8.12s 67: learn: 0.2848994 test: 0.2919812 best: 0.2919812 (67) total: 17.9s remaining: 7.88s 68: learn: 0.2827991 test: 0.2900379 best: 0.2900379 (68) total: 18.1s remaining: 7.62s 69: learn: 0.2806768 test: 0.2880440 best: 0.2880440 (69) total: 18.4s remaining: 7.36s 70: learn: 0.2787059 test: 0.2861655 best: 0.2861655 (70) total: 18.6s remaining: 7.08s 71: learn: 0.2767267 test: 0.2844001 best: 0.2844001 (71) total: 18.9s remaining: 6.83s 72: learn: 0.2749007 test: 0.2826972 best: 0.2826972 (72) total: 19.2s remaining: 6.56s 73: learn: 0.2729949 test: 0.2809300 best: 0.2809300 (73) total: 19.4s remaining: 6.3s 74: learn: 0.2712705 test: 0.2793238 best: 0.2793238 (74) total: 19.7s remaining: 6.04s 75: learn: 0.2694237 test: 0.2776560 best: 0.2776560 (75) total: 20s remaining: 5.78s 76: learn: 0.2678136 test: 0.2761311 best: 0.2761311 (76) total: 20.2s remaining: 5.51s 77: learn: 0.2659907 test: 0.2744807 best: 0.2744807 (77) total: 20.5s remaining: 5.26s 78: learn: 0.2643138 test: 0.2728991 best: 0.2728991 (78) total: 20.8s 
remaining: 5s 79: learn: 0.2627952 test: 0.2715157 best: 0.2715157 (79) total: 21s remaining: 4.73s 80: learn: 0.2610806 test: 0.2700042 best: 0.2700042 (80) total: 21.3s remaining: 4.47s 81: learn: 0.2593963 test: 0.2684060 best: 0.2684060 (81) total: 21.6s remaining: 4.21s 82: learn: 0.2578191 test: 0.2669627 best: 0.2669627 (82) total: 21.8s remaining: 3.95s 83: learn: 0.2564757 test: 0.2657193 best: 0.2657193 (83) total: 22.1s remaining: 3.68s 84: learn: 0.2551557 test: 0.2644284 best: 0.2644284 (84) total: 22.3s remaining: 3.41s 85: learn: 0.2538121 test: 0.2631569 best: 0.2631569 (85) total: 22.5s remaining: 3.14s 86: learn: 0.2523722 test: 0.2619020 best: 0.2619020 (86) total: 22.8s remaining: 2.88s 87: learn: 0.2510009 test: 0.2606408 best: 0.2606408 (87) total: 23s remaining: 2.62s 88: learn: 0.2498018 test: 0.2595293 best: 0.2595293 (88) total: 23.3s remaining: 2.35s 89: learn: 0.2485496 test: 0.2584305 best: 0.2584305 (89) total: 23.5s remaining: 2.09s 90: learn: 0.2471980 test: 0.2572562 best: 0.2572562 (90) total: 23.8s remaining: 1.83s 91: learn: 0.2459814 test: 0.2561446 best: 0.2561446 (91) total: 24.1s remaining: 1.57s 92: learn: 0.2447309 test: 0.2550282 best: 0.2550282 (92) total: 24.3s remaining: 1.31s 93: learn: 0.2433757 test: 0.2538114 best: 0.2538114 (93) total: 24.6s remaining: 1.05s 94: learn: 0.2421302 test: 0.2527379 best: 0.2527379 (94) total: 24.9s remaining: 786ms 95: learn: 0.2409867 test: 0.2517283 best: 0.2517283 (95) total: 25.1s remaining: 524ms 96: learn: 0.2398074 test: 0.2506955 best: 0.2506955 (96) total: 25.4s remaining: 262ms 97: learn: 0.2385434 test: 0.2496040 best: 0.2496040 (97) total: 25.7s remaining: 0us bestTest = 0.2496040322 bestIteration = 97 Trial 78, Fold 4: Log loss = 0.2496040321948194, Average precision = 0.9748202675363168, ROC-AUC = 0.9709528965809948, Elapsed Time = 25.881493100001535 seconds Trial 78, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 78, Fold 5: 
Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[Per-iteration CatBoost output omitted: 98 iterations, learn loss 0.6794 -> 0.2364, test loss 0.6798 -> 0.2542]
bestTest = 0.2542366469 bestIteration = 97
Trial 78, Fold 5: Log loss = 0.25423664690806513, Average precision = 0.9730233495599783, ROC-AUC = 0.9702950039945749, Elapsed Time = 28.444352600003185 seconds
Optimization Progress: 79%|#######9 | 79/100 [2:24:00<48:46, 139.34s/it]
Trial 79, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 79, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[Per-iteration output omitted: 24 iterations; bestTest = 0.4831543544, bestIteration = 23]
Trial 79, Fold 1: Log loss = 0.4831543544418097, Average precision = 0.9595032524219984, ROC-AUC = 0.9549524460742846, Elapsed Time = 1.003319499999634 seconds
Trial 79, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 79, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[Per-iteration output omitted: 24 iterations; bestTest = 0.4821144775, bestIteration = 23]
Trial 79, Fold 2: Log loss = 0.48211447753772685, Average precision = 0.9586106387625188, ROC-AUC = 0.9574851264408072, Elapsed Time = 1.0759388000005856 seconds
Trial 79, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 79, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[Per-iteration output omitted: 24 iterations; bestTest = 0.4826243037, bestIteration = 23]
Trial 79, Fold 3: Log loss = 0.48262430371767545, Average precision = 0.9563391169999513, ROC-AUC = 0.9576537685036262, Elapsed Time = 1.0467338000016753 seconds
Trial 79, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 79, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[Per-iteration output omitted: 24 iterations; bestTest = 0.4867493757, bestIteration = 23]
Trial 79, Fold 4: Log loss = 0.48674937572540555, Average precision = 0.9639261039203338, ROC-AUC = 0.9609027584044484, Elapsed Time = 1.0273558999979286 seconds
Trial 79, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 79, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[Per-iteration output omitted: 24 iterations; bestTest = 0.4836138542, bestIteration = 23]
Trial 79, Fold 5: Log loss = 0.483613854201858, Average precision = 0.9560993198853893, ROC-AUC = 0.9541198372443007, Elapsed Time = 1.0542547999939416 seconds
Optimization Progress: 80%|######## | 80/100 [2:24:14<33:49, 101.49s/it]
Trial 80, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 80, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[Per-iteration output omitted: learn loss diverged from iteration 8 onward while test loss kept improving]
Training has stopped (degenerate solution on iteration 17, probably too small l2-regularization, try to increase it)
bestTest = 0.2196266537 bestIteration = 12 Shrink model to first 13 iterations.
Trial 80, Fold 1: Log loss = 0.21962665373815726, Average precision = 0.9736935458727383, ROC-AUC = 0.969040882814031, Elapsed Time = 1.926980200005346 seconds
Trial 80, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 80, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[Per-iteration output omitted]
Training has stopped (degenerate solution on iteration 12, probably too small l2-regularization, try to increase it)
bestTest = 0.2125029044 bestIteration = 11 Shrink model to first 12 iterations.
Trial 80, Fold 2: Log loss = 0.2125029043547971, Average precision = 0.9746115179581751, ROC-AUC = 0.9721019283805082, Elapsed Time = 1.4735628999987966 seconds
Trial 80, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 80, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[Per-iteration output omitted: ran all 90 iterations; learn loss diverged from iteration 19 onward while test loss continued to improve]
bestTest = 0.1878119485 bestIteration = 69 Shrink model to first 70 iterations.
Trial 80, Fold 3: Log loss = 0.18781194846724483, Average precision = 0.9755745016444857, ROC-AUC = 0.9733321154086929, Elapsed Time = 8.1260306999975 seconds
Trial 80, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 80, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[Per-iteration output omitted]
Training has stopped (degenerate solution on iteration 14, probably too small l2-regularization, try to increase it)
bestTest = 0.2062639631 bestIteration = 13 Shrink model to first 14 iterations.
Trial 80, Fold 4: Log loss = 0.20626396314467765, Average precision = 0.97516142641915, ROC-AUC = 0.971032848821267, Elapsed Time = 1.6831536000026972 seconds
Trial 80, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 80, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[Per-iteration output omitted: learn loss diverged from iteration 16 and test loss from iteration 20; best test = 0.2069018 at iteration 19. Log truncated mid-iteration 62; fold summary not recoverable from this excerpt.]
6.61s remaining: 2.83s 63: learn: 58.3708860 test: 67.1404785 best: 0.2069018 (19) total: 6.7s remaining: 2.72s 64: learn: 63.8010658 test: 67.1401011 best: 0.2069018 (19) total: 6.83s remaining: 2.63s 65: learn: 63.8004946 test: 67.1398456 best: 0.2069018 (19) total: 6.93s remaining: 2.52s 66: learn: 63.8000379 test: 67.1397384 best: 0.2069018 (19) total: 7.03s remaining: 2.41s 67: learn: 63.7997489 test: 67.1395531 best: 0.2069018 (19) total: 7.12s remaining: 2.3s 68: learn: 63.7990372 test: 67.1393060 best: 0.2069018 (19) total: 7.22s remaining: 2.2s 69: learn: 63.7984832 test: 67.1390422 best: 0.2069018 (19) total: 7.31s remaining: 2.09s 70: learn: 63.7977424 test: 67.1390436 best: 0.2069018 (19) total: 7.41s remaining: 1.98s 71: learn: 63.7971986 test: 67.1392105 best: 0.2069018 (19) total: 7.51s remaining: 1.88s 72: learn: 63.7967630 test: 67.1392106 best: 0.2069018 (19) total: 7.6s remaining: 1.77s 73: learn: 63.7960892 test: 67.1393547 best: 0.2069018 (19) total: 7.69s remaining: 1.66s 74: learn: 63.7956206 test: 67.1391793 best: 0.2069018 (19) total: 7.79s remaining: 1.56s 75: learn: 63.7950591 test: 67.1390992 best: 0.2069018 (19) total: 7.88s remaining: 1.45s 76: learn: 63.7943382 test: 67.1388664 best: 0.2069018 (19) total: 7.97s remaining: 1.34s 77: learn: 63.7939395 test: 67.1387397 best: 0.2069018 (19) total: 8.05s remaining: 1.24s 78: learn: 63.7931790 test: 67.1385142 best: 0.2069018 (19) total: 8.14s remaining: 1.13s 79: learn: 63.7926303 test: 67.1384362 best: 0.2069018 (19) total: 8.31s remaining: 1.04s 80: learn: 63.7919103 test: 67.1384239 best: 0.2069018 (19) total: 8.41s remaining: 935ms 81: learn: 63.7913967 test: 67.1382702 best: 0.2069018 (19) total: 8.51s remaining: 830ms 82: learn: 63.7907820 test: 67.1378586 best: 0.2069018 (19) total: 8.61s remaining: 726ms 83: learn: 63.7903928 test: 67.1377647 best: 0.2069018 (19) total: 8.69s remaining: 621ms 84: learn: 63.7898486 test: 67.1376154 best: 0.2069018 (19) total: 8.79s remaining: 517ms 
85: learn: 63.7895786 test: 67.1374396 best: 0.2069018 (19) total: 8.88s remaining: 413ms 86: learn: 63.7890654 test: 67.1372510 best: 0.2069018 (19) total: 8.97s remaining: 309ms 87: learn: 63.7886513 test: 67.1369377 best: 0.2069018 (19) total: 9.06s remaining: 206ms 88: learn: 63.7880875 test: 67.1370621 best: 0.2069018 (19) total: 9.16s remaining: 103ms 89: learn: 63.7875184 test: 67.1367441 best: 0.2069018 (19) total: 9.25s remaining: 0us bestTest = 0.2069018419 bestIteration = 19 Shrink model to first 20 iterations. Trial 80, Fold 5: Log loss = 0.20690184187035557, Average precision = 0.9726087518814819, ROC-AUC = 0.9694474852758115, Elapsed Time = 9.387293200001295 seconds
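The "Shrink model to first N iterations" messages above reflect CatBoost keeping only the trees up to the iteration with the best eval-set score. The selection rule itself is simple enough to sketch in plain Python; the loss values below are illustrative, shaped like the diverging Fold 5 trajectory, not taken verbatim from the log:

```python
def best_iteration(eval_losses):
    """Index of the lowest eval-set loss; CatBoost keeps trees 0..best inclusive."""
    return min(range(len(eval_losses)), key=eval_losses.__getitem__)

# Illustrative eval-loss trajectory: improves, then diverges (as in Trial 80, Fold 5).
losses = [0.546, 0.446, 0.388, 0.341, 0.307, 0.285, 0.267, 7.409, 31.253]
best = best_iteration(losses)
print(f"bestIteration = {best}; shrink model to first {best + 1} iterations")
# → bestIteration = 6; shrink model to first 7 iterations
```

This is why the reported fold metrics are computed from the shrunk model rather than the last boosting round.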
Optimization Progress: 81%|########1 | 81/100 [2:24:44<25:24, 80.22s/it]
Trial 81, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371; Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[15-iteration CatBoost log omitted; test loss still improving at the final iteration.] bestTest = 0.3464088985, bestIteration = 14.
Trial 81, Fold 1: Log loss = 0.34652195021508936, Average precision = 0.9723340245495337, ROC-AUC = 0.9666713267661777, Elapsed Time = 28.784268800001882 seconds
Trial 81, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396; Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[15-iteration log omitted.] bestTest = 0.3452411566, bestIteration = 14.
Trial 81, Fold 2: Log loss = 0.34531134344144665, Average precision = 0.9727726788725815, ROC-AUC = 0.9692984639178668, Elapsed Time = 30.26727309999842 seconds
Trial 81, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876; Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[15-iteration log omitted.] bestTest = 0.3459297189, bestIteration = 14.
Trial 81, Fold 3: Log loss = 0.3461829771105294, Average precision = 0.9726650811982974, ROC-AUC = 0.9686962016401652, Elapsed Time = 26.005644000004395 seconds
Trial 81, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592; Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[15-iteration log omitted.] bestTest = 0.3451803561, bestIteration = 14.
Trial 81, Fold 4: Log loss = 0.34528470439268066, Average precision = 0.9725478250183603, ROC-AUC = 0.9676028753597479, Elapsed Time = 23.048615799998515 seconds
Trial 81, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897; Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[15-iteration log omitted.] bestTest = 0.3502241195, bestIteration = 14.
Trial 81, Fold 5: Log loss = 0.35025012848988823, Average precision = 0.9698996793264779, ROC-AUC = 0.9654267506456349, Elapsed Time = 24.826989799999865 seconds
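A trial's fold-level scores are typically collapsed to a single cross-validation objective before being handed back to Optuna. A minimal stdlib sketch using Trial 81's log-loss and ROC-AUC values from the log above (rounded to five decimal places):

```python
from statistics import mean, stdev

# Per-fold validation metrics for Trial 81 (values from the log above, rounded).
log_loss_folds = [0.34652, 0.34531, 0.34618, 0.34528, 0.35025]
roc_auc_folds = [0.96667, 0.96930, 0.96870, 0.96760, 0.96543]

# Mean is the CV objective; the standard deviation flags unstable hyperparameters.
print(f"CV log loss: {mean(log_loss_folds):.5f} +/- {stdev(log_loss_folds):.5f}")
print(f"CV ROC-AUC:  {mean(roc_auc_folds):.5f} +/- {stdev(roc_auc_folds):.5f}")
```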
Optimization Progress: 82%|########2 | 82/100 [2:27:05<29:33, 98.51s/it]
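When CatBoost's stdout is captured (for example, redirected to a log file), the per-fold summaries can be scraped back out with a small stdlib helper. A sketch, assuming CatBoost's default `bestTest = … bestIteration = …` summary-line format:

```python
import re

# Matches CatBoost's end-of-training summary line, e.g.
# "bestTest = 0.2069018419 bestIteration = 19"
BEST_RE = re.compile(r"bestTest = ([\d.]+)\s+bestIteration = (\d+)")

def parse_best(log_text):
    """Return (best_test_loss, best_iteration) for every fold found in the log."""
    return [(float(loss), int(it)) for loss, it in BEST_RE.findall(log_text)]

log = "bestTest = 0.2062639631 bestIteration = 13\nbestTest = 0.2069018419 bestIteration = 19"
print(parse_best(log))
# → [(0.2062639631, 13), (0.2069018419, 19)]
```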
Trial 82, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371; Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[73-iteration CatBoost log omitted; test loss still improving at the final iteration.] bestTest = 0.2860769624, bestIteration = 72.
Trial 82, Fold 1: Log loss = 0.28607187175267457, Average precision = 0.9733830444439191, ROC-AUC = 0.9695839636416645, Elapsed Time = 13.178475799999433 seconds
Trial 82, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396; Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[73-iteration log omitted.] bestTest = 0.2820841335, bestIteration = 72.
Trial 82, Fold 2: Log loss = 0.28204453817430003, Average precision = 0.9761361480874985, ROC-AUC = 0.9731854834426472, Elapsed Time = 12.764068100004806 seconds
Trial 82, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876; Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[Per-iteration log omitted; test loss down to 0.2814113 by iteration 71, still improving. Log truncated here.]
12.7s remaining: 176ms 72: learn: 0.2702409 test: 0.2795353 best: 0.2795353 (72) total: 12.9s remaining: 0us bestTest = 0.279535281 bestIteration = 72 Trial 82, Fold 3: Log loss = 0.2796964981801753, Average precision = 0.9760290100970991, ROC-AUC = 0.97292445175333, Elapsed Time = 13.040741799995885 seconds Trial 82, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 82, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 0: learn: 0.6792708 test: 0.6794827 best: 0.6794827 (0) total: 216ms remaining: 15.5s 1: learn: 0.6656748 test: 0.6660459 best: 0.6660459 (1) total: 420ms remaining: 14.9s 2: learn: 0.6525691 test: 0.6531326 best: 0.6531326 (2) total: 632ms remaining: 14.7s 3: learn: 0.6397520 test: 0.6406259 best: 0.6406259 (3) total: 836ms remaining: 14.4s 4: learn: 0.6274767 test: 0.6285455 best: 0.6285455 (4) total: 1.04s remaining: 14.1s 5: learn: 0.6153665 test: 0.6166435 best: 0.6166435 (5) total: 1.28s remaining: 14.3s 6: learn: 0.6035887 test: 0.6050694 best: 0.6050694 (6) total: 1.48s remaining: 14s 7: learn: 0.5922490 test: 0.5938893 best: 0.5938893 (7) total: 1.68s remaining: 13.6s 8: learn: 0.5812727 test: 0.5831257 best: 0.5831257 (8) total: 1.86s remaining: 13.2s 9: learn: 0.5705959 test: 0.5726711 best: 0.5726711 (9) total: 2.06s remaining: 13s 10: learn: 0.5601757 test: 0.5624441 best: 0.5624441 (10) total: 2.25s remaining: 12.7s 11: learn: 0.5500819 test: 0.5526763 best: 0.5526763 (11) total: 2.44s remaining: 12.4s 12: learn: 0.5404148 test: 0.5431906 best: 0.5431906 (12) total: 2.63s remaining: 12.1s 13: learn: 0.5308408 test: 0.5337398 best: 0.5337398 (13) total: 2.82s remaining: 11.9s 14: learn: 0.5216573 test: 0.5246863 best: 0.5246863 (14) total: 2.98s remaining: 11.5s 15: learn: 0.5128890 test: 0.5162394 best: 0.5162394 (15) total: 3.2s remaining: 11.4s 16: learn: 0.5042543 test: 0.5079686 best: 0.5079686 (16) total: 3.39s remaining: 11.2s 17: learn: 0.4958918 test: 0.4998870 
best: 0.4998870 (17) total: 3.57s remaining: 10.9s 18: learn: 0.4878015 test: 0.4919924 best: 0.4919924 (18) total: 3.75s remaining: 10.7s 19: learn: 0.4797881 test: 0.4841985 best: 0.4841985 (19) total: 3.98s remaining: 10.5s 20: learn: 0.4721392 test: 0.4767365 best: 0.4767365 (20) total: 4.19s remaining: 10.4s 21: learn: 0.4646399 test: 0.4694891 best: 0.4694891 (21) total: 4.42s remaining: 10.3s 22: learn: 0.4574204 test: 0.4624438 best: 0.4624438 (22) total: 4.64s remaining: 10.1s 23: learn: 0.4507872 test: 0.4559687 best: 0.4559687 (23) total: 4.85s remaining: 9.9s 24: learn: 0.4440615 test: 0.4493602 best: 0.4493602 (24) total: 5.03s remaining: 9.65s 25: learn: 0.4373396 test: 0.4428660 best: 0.4428660 (25) total: 5.24s remaining: 9.47s 26: learn: 0.4311713 test: 0.4369579 best: 0.4369579 (26) total: 5.44s remaining: 9.28s 27: learn: 0.4250015 test: 0.4309250 best: 0.4309250 (27) total: 5.63s remaining: 9.05s 28: learn: 0.4188963 test: 0.4248947 best: 0.4248947 (28) total: 5.81s remaining: 8.81s 29: learn: 0.4130075 test: 0.4191375 best: 0.4191375 (29) total: 5.99s remaining: 8.58s 30: learn: 0.4073401 test: 0.4135885 best: 0.4135885 (30) total: 6.18s remaining: 8.37s 31: learn: 0.4018029 test: 0.4082069 best: 0.4082069 (31) total: 6.37s remaining: 8.16s 32: learn: 0.3962950 test: 0.4029103 best: 0.4029103 (32) total: 6.57s remaining: 7.96s 33: learn: 0.3911271 test: 0.3978510 best: 0.3978510 (33) total: 6.76s remaining: 7.75s 34: learn: 0.3861024 test: 0.3929456 best: 0.3929456 (34) total: 6.94s remaining: 7.53s 35: learn: 0.3811505 test: 0.3881201 best: 0.3881201 (35) total: 7.12s remaining: 7.31s 36: learn: 0.3766706 test: 0.3838185 best: 0.3838185 (36) total: 7.29s remaining: 7.1s 37: learn: 0.3721684 test: 0.3794667 best: 0.3794667 (37) total: 7.47s remaining: 6.88s 38: learn: 0.3677214 test: 0.3751509 best: 0.3751509 (38) total: 7.62s remaining: 6.65s 39: learn: 0.3633909 test: 0.3709922 best: 0.3709922 (39) total: 7.78s remaining: 6.42s 40: learn: 
0.3590862 test: 0.3667924 best: 0.3667924 (40) total: 7.94s remaining: 6.2s 41: learn: 0.3548426 test: 0.3627682 best: 0.3627682 (41) total: 8.13s remaining: 6s 42: learn: 0.3508451 test: 0.3588116 best: 0.3588116 (42) total: 8.3s remaining: 5.79s 43: learn: 0.3470624 test: 0.3551140 best: 0.3551140 (43) total: 8.52s remaining: 5.61s 44: learn: 0.3432194 test: 0.3513723 best: 0.3513723 (44) total: 8.69s remaining: 5.4s 45: learn: 0.3394310 test: 0.3476928 best: 0.3476928 (45) total: 8.86s remaining: 5.2s 46: learn: 0.3359368 test: 0.3444719 best: 0.3444719 (46) total: 9.04s remaining: 5s 47: learn: 0.3325859 test: 0.3412842 best: 0.3412842 (47) total: 9.23s remaining: 4.81s 48: learn: 0.3289977 test: 0.3382069 best: 0.3382069 (48) total: 9.41s remaining: 4.61s 49: learn: 0.3257668 test: 0.3351171 best: 0.3351171 (49) total: 9.57s remaining: 4.4s 50: learn: 0.3223834 test: 0.3318738 best: 0.3318738 (50) total: 9.73s remaining: 4.2s 51: learn: 0.3194189 test: 0.3289796 best: 0.3289796 (51) total: 9.89s remaining: 3.99s 52: learn: 0.3163743 test: 0.3261043 best: 0.3261043 (52) total: 10.1s remaining: 3.8s 53: learn: 0.3133132 test: 0.3231575 best: 0.3231575 (53) total: 10.2s remaining: 3.6s 54: learn: 0.3103157 test: 0.3203148 best: 0.3203148 (54) total: 10.4s remaining: 3.41s 55: learn: 0.3076439 test: 0.3177488 best: 0.3177488 (55) total: 10.6s remaining: 3.21s 56: learn: 0.3049434 test: 0.3151636 best: 0.3151636 (56) total: 10.7s remaining: 3.02s 57: learn: 0.3022121 test: 0.3125587 best: 0.3125587 (57) total: 10.9s remaining: 2.82s 58: learn: 0.2996208 test: 0.3100975 best: 0.3100975 (58) total: 11.1s remaining: 2.63s 59: learn: 0.2970013 test: 0.3076171 best: 0.3076171 (59) total: 11.2s remaining: 2.43s 60: learn: 0.2945455 test: 0.3052904 best: 0.3052904 (60) total: 11.4s remaining: 2.25s 61: learn: 0.2921698 test: 0.3030339 best: 0.3030339 (61) total: 11.6s remaining: 2.06s 62: learn: 0.2898457 test: 0.3008244 best: 0.3008244 (62) total: 11.8s remaining: 1.87s 
63: learn: 0.2876371 test: 0.2986937 best: 0.2986937 (63) total: 11.9s remaining: 1.68s 64: learn: 0.2853937 test: 0.2965709 best: 0.2965709 (64) total: 12.1s remaining: 1.49s 65: learn: 0.2832035 test: 0.2946093 best: 0.2946093 (65) total: 12.3s remaining: 1.3s 66: learn: 0.2809842 test: 0.2924394 best: 0.2924394 (66) total: 12.4s remaining: 1.11s 67: learn: 0.2789306 test: 0.2905178 best: 0.2905178 (67) total: 12.6s remaining: 928ms 68: learn: 0.2768288 test: 0.2884988 best: 0.2884988 (68) total: 12.8s remaining: 740ms 69: learn: 0.2749537 test: 0.2867382 best: 0.2867382 (69) total: 12.9s remaining: 554ms 70: learn: 0.2730189 test: 0.2848890 best: 0.2848890 (70) total: 13.1s remaining: 369ms 71: learn: 0.2712314 test: 0.2832119 best: 0.2832119 (71) total: 13.3s remaining: 185ms 72: learn: 0.2693663 test: 0.2815881 best: 0.2815881 (72) total: 13.5s remaining: 0us bestTest = 0.2815881418 bestIteration = 72 Trial 82, Fold 4: Log loss = 0.28160085343236496, Average precision = 0.9745131789422415, ROC-AUC = 0.9722249046832191, Elapsed Time = 13.665348000002268 seconds Trial 82, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 82, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0: learn: 0.6793029 test: 0.6796546 best: 0.6796546 (0) total: 150ms remaining: 10.8s 1: learn: 0.6656368 test: 0.6663415 best: 0.6663415 (1) total: 298ms remaining: 10.6s 2: learn: 0.6524875 test: 0.6535071 best: 0.6535071 (2) total: 451ms remaining: 10.5s 3: learn: 0.6396677 test: 0.6410379 best: 0.6410379 (3) total: 595ms remaining: 10.3s 4: learn: 0.6272738 test: 0.6289904 best: 0.6289904 (4) total: 740ms remaining: 10.1s 5: learn: 0.6151613 test: 0.6172990 best: 0.6172990 (5) total: 889ms remaining: 9.93s 6: learn: 0.6034605 test: 0.6059509 best: 0.6059509 (6) total: 1.04s remaining: 9.83s 7: learn: 0.5921417 test: 0.5951841 best: 0.5951841 (7) total: 1.19s remaining: 9.66s 8: learn: 0.5811260 test: 0.5847193 best: 
0.5847193 (8) total: 1.35s remaining: 9.59s 9: learn: 0.5704193 test: 0.5743256 best: 0.5743256 (9) total: 1.5s remaining: 9.44s 10: learn: 0.5600345 test: 0.5644483 best: 0.5644483 (10) total: 1.66s remaining: 9.34s 11: learn: 0.5500129 test: 0.5547236 best: 0.5547236 (11) total: 1.81s remaining: 9.22s 12: learn: 0.5401346 test: 0.5451722 best: 0.5451722 (12) total: 2s remaining: 9.21s 13: learn: 0.5306132 test: 0.5361880 best: 0.5361880 (13) total: 2.16s remaining: 9.12s 14: learn: 0.5214392 test: 0.5274817 best: 0.5274817 (14) total: 2.33s remaining: 9.01s 15: learn: 0.5125338 test: 0.5188401 best: 0.5188401 (15) total: 2.48s remaining: 8.85s 16: learn: 0.5038215 test: 0.5103943 best: 0.5103943 (16) total: 2.65s remaining: 8.73s 17: learn: 0.4952518 test: 0.5023351 best: 0.5023351 (17) total: 2.82s remaining: 8.61s 18: learn: 0.4870423 test: 0.4944069 best: 0.4944069 (18) total: 2.98s remaining: 8.48s 19: learn: 0.4791283 test: 0.4868497 best: 0.4868497 (19) total: 3.16s remaining: 8.37s 20: learn: 0.4713743 test: 0.4793482 best: 0.4793482 (20) total: 3.32s remaining: 8.22s 21: learn: 0.4639513 test: 0.4721974 best: 0.4721974 (21) total: 3.47s remaining: 8.05s 22: learn: 0.4566475 test: 0.4651903 best: 0.4651903 (22) total: 3.65s remaining: 7.93s 23: learn: 0.4495660 test: 0.4583662 best: 0.4583662 (23) total: 3.81s remaining: 7.78s 24: learn: 0.4426564 test: 0.4517633 best: 0.4517633 (24) total: 4s remaining: 7.68s 25: learn: 0.4360982 test: 0.4454574 best: 0.4454574 (25) total: 4.17s remaining: 7.55s 26: learn: 0.4296655 test: 0.4393361 best: 0.4393361 (26) total: 4.35s remaining: 7.41s 27: learn: 0.4234115 test: 0.4333735 best: 0.4333735 (27) total: 4.51s remaining: 7.24s 28: learn: 0.4174466 test: 0.4277273 best: 0.4277273 (28) total: 4.66s remaining: 7.07s 29: learn: 0.4115751 test: 0.4222674 best: 0.4222674 (29) total: 4.84s remaining: 6.94s 30: learn: 0.4058042 test: 0.4169442 best: 0.4169442 (30) total: 5.03s remaining: 6.81s 31: learn: 0.4002035 test: 
0.4116077 best: 0.4116077 (31) total: 5.19s remaining: 6.64s 32: learn: 0.3951370 test: 0.4067064 best: 0.4067064 (32) total: 5.35s remaining: 6.49s 33: learn: 0.3898954 test: 0.4017790 best: 0.4017790 (33) total: 5.5s remaining: 6.32s 34: learn: 0.3846972 test: 0.3967663 best: 0.3967663 (34) total: 5.66s remaining: 6.14s 35: learn: 0.3796655 test: 0.3919786 best: 0.3919786 (35) total: 5.82s remaining: 5.98s 36: learn: 0.3750042 test: 0.3874531 best: 0.3874531 (36) total: 6.02s remaining: 5.86s 37: learn: 0.3703590 test: 0.3831051 best: 0.3831051 (37) total: 6.17s remaining: 5.68s 38: learn: 0.3661134 test: 0.3789694 best: 0.3789694 (38) total: 6.33s remaining: 5.52s 39: learn: 0.3619651 test: 0.3750103 best: 0.3750103 (39) total: 6.49s remaining: 5.36s 40: learn: 0.3577628 test: 0.3709643 best: 0.3709643 (40) total: 6.66s remaining: 5.2s 41: learn: 0.3536856 test: 0.3670187 best: 0.3670187 (41) total: 6.83s remaining: 5.04s 42: learn: 0.3499586 test: 0.3635745 best: 0.3635745 (42) total: 7s remaining: 4.88s 43: learn: 0.3459378 test: 0.3597348 best: 0.3597348 (43) total: 7.16s remaining: 4.72s 44: learn: 0.3420995 test: 0.3560382 best: 0.3560382 (44) total: 7.33s remaining: 4.56s 45: learn: 0.3382198 test: 0.3524014 best: 0.3524014 (45) total: 7.5s remaining: 4.4s 46: learn: 0.3345797 test: 0.3489433 best: 0.3489433 (46) total: 7.66s remaining: 4.24s 47: learn: 0.3310835 test: 0.3455868 best: 0.3455868 (47) total: 7.82s remaining: 4.07s 48: learn: 0.3274896 test: 0.3422627 best: 0.3422627 (48) total: 8s remaining: 3.92s 49: learn: 0.3241462 test: 0.3390670 best: 0.3390670 (49) total: 8.16s remaining: 3.75s 50: learn: 0.3208740 test: 0.3359144 best: 0.3359144 (50) total: 8.32s remaining: 3.59s 51: learn: 0.3178407 test: 0.3330566 best: 0.3330566 (51) total: 8.49s remaining: 3.43s 52: learn: 0.3147119 test: 0.3301624 best: 0.3301624 (52) total: 8.66s remaining: 3.27s 53: learn: 0.3117597 test: 0.3273476 best: 0.3273476 (53) total: 8.8s remaining: 3.1s 54: learn: 
0.3089578 test: 0.3247476 best: 0.3247476 (54) total: 8.99s remaining: 2.94s 55: learn: 0.3060990 test: 0.3220756 best: 0.3220756 (55) total: 9.14s remaining: 2.77s 56: learn: 0.3034607 test: 0.3196098 best: 0.3196098 (56) total: 9.32s remaining: 2.62s 57: learn: 0.3009322 test: 0.3172460 best: 0.3172460 (57) total: 9.47s remaining: 2.45s 58: learn: 0.2983179 test: 0.3147767 best: 0.3147767 (58) total: 9.64s remaining: 2.29s 59: learn: 0.2958518 test: 0.3124939 best: 0.3124939 (59) total: 9.81s remaining: 2.13s 60: learn: 0.2932866 test: 0.3100817 best: 0.3100817 (60) total: 9.99s remaining: 1.97s 61: learn: 0.2908571 test: 0.3077705 best: 0.3077705 (61) total: 10.2s remaining: 1.8s 62: learn: 0.2885575 test: 0.3055697 best: 0.3055697 (62) total: 10.3s remaining: 1.64s 63: learn: 0.2863485 test: 0.3035275 best: 0.3035275 (63) total: 10.5s remaining: 1.48s 64: learn: 0.2839312 test: 0.3012939 best: 0.3012939 (64) total: 10.7s remaining: 1.31s 65: learn: 0.2818124 test: 0.2993307 best: 0.2993307 (65) total: 10.8s remaining: 1.15s 66: learn: 0.2795611 test: 0.2972217 best: 0.2972217 (66) total: 11s remaining: 986ms 67: learn: 0.2773799 test: 0.2952124 best: 0.2952124 (67) total: 11.2s remaining: 822ms 68: learn: 0.2751962 test: 0.2931936 best: 0.2931936 (68) total: 11.3s remaining: 658ms 69: learn: 0.2732823 test: 0.2914509 best: 0.2914509 (69) total: 11.5s remaining: 493ms 70: learn: 0.2713081 test: 0.2896025 best: 0.2896025 (70) total: 11.7s remaining: 328ms 71: learn: 0.2693840 test: 0.2878429 best: 0.2878429 (71) total: 11.8s remaining: 164ms 72: learn: 0.2675926 test: 0.2861980 best: 0.2861980 (72) total: 12s remaining: 0us bestTest = 0.2861979562 bestIteration = 72 Trial 82, Fold 5: Log loss = 0.2860766131001255, Average precision = 0.973495634018563, ROC-AUC = 0.9706371625513256, Elapsed Time = 12.147943600000872 seconds
Optimization Progress: 83%|########2 | 83/100 [2:28:18<25:44, 90.87s/it]
[Trial 83: CatBoost per-iteration training log (folds 1–5, 5 iterations each) omitted. Per-fold summaries retained below.]
Trial 83, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 83, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Trial 83, Fold 1: bestTest = 0.6083349893, bestIteration = 4
Trial 83, Fold 1: Log loss = 0.6085548097717014, Average precision = 0.9677374077073312, ROC-AUC = 0.9623888148052555, Elapsed Time = 0.86 seconds
Trial 83, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 83, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Trial 83, Fold 2: bestTest = 0.6068531554, bestIteration = 4
Trial 83, Fold 2: Log loss = 0.6069581930502699, Average precision = 0.9667052353253812, ROC-AUC = 0.9635686951602471, Elapsed Time = 0.75 seconds
Trial 83, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 83, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Trial 83, Fold 3: bestTest = 0.6084059102, bestIteration = 4
Trial 83, Fold 3: Log loss = 0.6085853531793376, Average precision = 0.9679774228780744, ROC-AUC = 0.9647212928942221, Elapsed Time = 0.80 seconds
Trial 83, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 83, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
Trial 83, Fold 4: bestTest = 0.6046299578, bestIteration = 4
Trial 83, Fold 4: Log loss = 0.6048031026637479, Average precision = 0.9675249858750178, ROC-AUC = 0.9630636297631566, Elapsed Time = 0.85 seconds
Trial 83, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 83, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
Trial 83, Fold 5: bestTest = 0.6120033708, bestIteration = 4
Trial 83, Fold 5: Log loss = 0.6121398935774273, Average precision = 0.9638235104347864, ROC-AUC = 0.9585350500715307, Elapsed Time = 0.80 seconds
Optimization Progress: 84%|########4 | 84/100 [2:28:30<17:55, 67.19s/it]
[Trial 84: CatBoost per-iteration training log (79 iterations per fold, ~2 s per iteration) omitted. Per-fold summaries retained below; the Fold 3 log is cut off mid-training in this excerpt.]
Trial 84, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 84, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Trial 84, Fold 1: bestTest = 0.2036680954, bestIteration = 72 (model shrunk to first 73 iterations)
Trial 84, Fold 1: Log loss = 0.2036680954383175, Average precision = 0.9731174438057646, ROC-AUC = 0.9678373224980046, Elapsed Time = 162.88 seconds
Trial 84, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 84, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Trial 84, Fold 2: bestTest = 0.1997731316, bestIteration = 77 (model shrunk to first 78 iterations)
Trial 84, Fold 2: Log loss = 0.19977313163179264, Average precision = 0.9717598719261711, ROC-AUC = 0.9678683169137171, Elapsed Time = 162.49 seconds
Trial 84, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 84, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[Fold 3 training log truncated at iteration 18; no summary metrics are available in this excerpt.]
remaining: 2m 19: learn: 0.2114296 test: 0.2513937 best: 0.2513937 (19) total: 40.2s remaining: 1m 58s 20: learn: 0.2051235 test: 0.2466399 best: 0.2466399 (20) total: 42.4s remaining: 1m 57s 21: learn: 0.1995451 test: 0.2429132 best: 0.2429132 (21) total: 44.4s remaining: 1m 54s 22: learn: 0.1942834 test: 0.2388114 best: 0.2388114 (22) total: 46.4s remaining: 1m 53s 23: learn: 0.1900636 test: 0.2356473 best: 0.2356473 (23) total: 48.4s remaining: 1m 50s 24: learn: 0.1858432 test: 0.2330741 best: 0.2330741 (24) total: 50.3s remaining: 1m 48s 25: learn: 0.1817736 test: 0.2300005 best: 0.2300005 (25) total: 52.3s remaining: 1m 46s 26: learn: 0.1777605 test: 0.2274081 best: 0.2274081 (26) total: 54.2s remaining: 1m 44s 27: learn: 0.1736662 test: 0.2254167 best: 0.2254167 (27) total: 56.2s remaining: 1m 42s 28: learn: 0.1707345 test: 0.2239048 best: 0.2239048 (28) total: 58.1s remaining: 1m 40s 29: learn: 0.1670759 test: 0.2215976 best: 0.2215976 (29) total: 1m remaining: 1m 38s 30: learn: 0.1644079 test: 0.2197873 best: 0.2197873 (30) total: 1m 2s remaining: 1m 36s 31: learn: 0.1612983 test: 0.2185378 best: 0.2185378 (31) total: 1m 4s remaining: 1m 34s 32: learn: 0.1582386 test: 0.2175671 best: 0.2175671 (32) total: 1m 6s remaining: 1m 32s 33: learn: 0.1552298 test: 0.2165350 best: 0.2165350 (33) total: 1m 8s remaining: 1m 30s 34: learn: 0.1519473 test: 0.2153287 best: 0.2153287 (34) total: 1m 10s remaining: 1m 28s 35: learn: 0.1501391 test: 0.2144953 best: 0.2144953 (35) total: 1m 12s remaining: 1m 26s 36: learn: 0.1480955 test: 0.2137836 best: 0.2137836 (36) total: 1m 14s remaining: 1m 24s 37: learn: 0.1456866 test: 0.2126073 best: 0.2126073 (37) total: 1m 16s remaining: 1m 22s 38: learn: 0.1422253 test: 0.2118507 best: 0.2118507 (38) total: 1m 18s remaining: 1m 20s 39: learn: 0.1394746 test: 0.2111120 best: 0.2111120 (39) total: 1m 20s remaining: 1m 18s 40: learn: 0.1375943 test: 0.2106922 best: 0.2106922 (40) total: 1m 22s remaining: 1m 16s 41: learn: 0.1346589 
test: 0.2102089 best: 0.2102089 (41) total: 1m 24s remaining: 1m 14s 42: learn: 0.1323694 test: 0.2098197 best: 0.2098197 (42) total: 1m 26s remaining: 1m 12s 43: learn: 0.1309384 test: 0.2091641 best: 0.2091641 (43) total: 1m 28s remaining: 1m 10s 44: learn: 0.1297214 test: 0.2086362 best: 0.2086362 (44) total: 1m 30s remaining: 1m 8s 45: learn: 0.1273562 test: 0.2084116 best: 0.2084116 (45) total: 1m 32s remaining: 1m 6s 46: learn: 0.1257841 test: 0.2075261 best: 0.2075261 (46) total: 1m 34s remaining: 1m 4s 47: learn: 0.1242147 test: 0.2070340 best: 0.2070340 (47) total: 1m 36s remaining: 1m 2s 48: learn: 0.1227558 test: 0.2067295 best: 0.2067295 (48) total: 1m 38s remaining: 1m 49: learn: 0.1214094 test: 0.2066609 best: 0.2066609 (49) total: 1m 40s remaining: 58.4s 50: learn: 0.1199833 test: 0.2064068 best: 0.2064068 (50) total: 1m 42s remaining: 56.5s 51: learn: 0.1187691 test: 0.2057982 best: 0.2057982 (51) total: 1m 44s remaining: 54.5s 52: learn: 0.1174182 test: 0.2054507 best: 0.2054507 (52) total: 1m 47s remaining: 52.5s 53: learn: 0.1163432 test: 0.2050222 best: 0.2050222 (53) total: 1m 49s remaining: 50.6s 54: learn: 0.1148142 test: 0.2047771 best: 0.2047771 (54) total: 1m 51s remaining: 48.6s 55: learn: 0.1119270 test: 0.2044993 best: 0.2044993 (55) total: 1m 53s remaining: 46.6s 56: learn: 0.1103679 test: 0.2045105 best: 0.2044993 (55) total: 1m 55s remaining: 44.6s 57: learn: 0.1091417 test: 0.2043546 best: 0.2043546 (57) total: 1m 57s remaining: 42.6s 58: learn: 0.1078324 test: 0.2042929 best: 0.2042929 (58) total: 1m 59s remaining: 40.5s 59: learn: 0.1067569 test: 0.2043365 best: 0.2042929 (58) total: 2m 1s remaining: 38.5s 60: learn: 0.1049472 test: 0.2041866 best: 0.2041866 (60) total: 2m 3s remaining: 36.4s 61: learn: 0.1042540 test: 0.2041211 best: 0.2041211 (61) total: 2m 5s remaining: 34.4s 62: learn: 0.1033056 test: 0.2040525 best: 0.2040525 (62) total: 2m 7s remaining: 32.4s 63: learn: 0.1008250 test: 0.2038363 best: 0.2038363 (63) total: 
2m 9s remaining: 30.4s 64: learn: 0.0994091 test: 0.2036777 best: 0.2036777 (64) total: 2m 11s remaining: 28.4s 65: learn: 0.0981036 test: 0.2036376 best: 0.2036376 (65) total: 2m 13s remaining: 26.4s 66: learn: 0.0970010 test: 0.2034594 best: 0.2034594 (66) total: 2m 15s remaining: 24.3s 67: learn: 0.0963768 test: 0.2036432 best: 0.2034594 (66) total: 2m 17s remaining: 22.3s 68: learn: 0.0946225 test: 0.2035237 best: 0.2034594 (66) total: 2m 19s remaining: 20.3s 69: learn: 0.0935246 test: 0.2034977 best: 0.2034594 (66) total: 2m 22s remaining: 18.3s 70: learn: 0.0918679 test: 0.2032971 best: 0.2032971 (70) total: 2m 23s remaining: 16.2s 71: learn: 0.0906390 test: 0.2033108 best: 0.2032971 (70) total: 2m 25s remaining: 14.2s 72: learn: 0.0895888 test: 0.2027877 best: 0.2027877 (72) total: 2m 28s remaining: 12.2s 73: learn: 0.0886667 test: 0.2025830 best: 0.2025830 (73) total: 2m 30s remaining: 10.1s 74: learn: 0.0878814 test: 0.2022322 best: 0.2022322 (74) total: 2m 32s remaining: 8.12s 75: learn: 0.0867054 test: 0.2022369 best: 0.2022322 (74) total: 2m 34s remaining: 6.09s 76: learn: 0.0855802 test: 0.2021956 best: 0.2021956 (76) total: 2m 36s remaining: 4.06s 77: learn: 0.0845170 test: 0.2023166 best: 0.2021956 (76) total: 2m 38s remaining: 2.03s 78: learn: 0.0838546 test: 0.2023017 best: 0.2021956 (76) total: 2m 40s remaining: 0us bestTest = 0.2021955532 bestIteration = 76 Shrink model to first 77 iterations. 
Trial 84, Fold 3: Log loss = 0.2021955531654966, Average precision = 0.9718344579736193, ROC-AUC = 0.9678893767275669, Elapsed Time = 160.59512720000203 seconds Trial 84, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 84, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 0: learn: 0.6368613 test: 0.6388689 best: 0.6388689 (0) total: 1.85s remaining: 2m 24s 1: learn: 0.5813839 test: 0.5881084 best: 0.5881084 (1) total: 3.88s remaining: 2m 29s 2: learn: 0.5379309 test: 0.5464319 best: 0.5464319 (2) total: 5.88s remaining: 2m 28s 3: learn: 0.4972070 test: 0.5075352 best: 0.5075352 (3) total: 7.88s remaining: 2m 27s 4: learn: 0.4598051 test: 0.4736560 best: 0.4736560 (4) total: 9.88s remaining: 2m 26s 5: learn: 0.4276172 test: 0.4435499 best: 0.4435499 (5) total: 11.9s remaining: 2m 24s 6: learn: 0.4004253 test: 0.4178298 best: 0.4178298 (6) total: 13.9s remaining: 2m 23s 7: learn: 0.3781427 test: 0.3973535 best: 0.3973535 (7) total: 16s remaining: 2m 21s 8: learn: 0.3539753 test: 0.3754475 best: 0.3754475 (8) total: 18s remaining: 2m 20s 9: learn: 0.3339149 test: 0.3577509 best: 0.3577509 (9) total: 20.1s remaining: 2m 18s 10: learn: 0.3162754 test: 0.3413812 best: 0.3413812 (10) total: 22.2s remaining: 2m 17s 11: learn: 0.2998713 test: 0.3262452 best: 0.3262452 (11) total: 24.4s remaining: 2m 16s 12: learn: 0.2882322 test: 0.3153457 best: 0.3153457 (12) total: 26.4s remaining: 2m 14s 13: learn: 0.2749756 test: 0.3045783 best: 0.3045783 (13) total: 28.4s remaining: 2m 11s 14: learn: 0.2631375 test: 0.2943419 best: 0.2943419 (14) total: 30.3s remaining: 2m 9s 15: learn: 0.2515402 test: 0.2847480 best: 0.2847480 (15) total: 32.3s remaining: 2m 7s 16: learn: 0.2418617 test: 0.2759356 best: 0.2759356 (16) total: 34.2s remaining: 2m 4s 17: learn: 0.2322495 test: 0.2678851 best: 0.2678851 (17) total: 36.2s remaining: 2m 2s 18: learn: 0.2249377 test: 0.2616279 best: 0.2616279 (18) total: 38.3s remaining: 
2m 1s 19: learn: 0.2181128 test: 0.2562985 best: 0.2562985 (19) total: 40.4s remaining: 1m 59s 20: learn: 0.2108264 test: 0.2510095 best: 0.2510095 (20) total: 42.4s remaining: 1m 57s 21: learn: 0.2046255 test: 0.2467661 best: 0.2467661 (21) total: 44.4s remaining: 1m 54s 22: learn: 0.1991391 test: 0.2427957 best: 0.2427957 (22) total: 46.3s remaining: 1m 52s 23: learn: 0.1951413 test: 0.2396681 best: 0.2396681 (23) total: 48.3s remaining: 1m 50s 24: learn: 0.1912279 test: 0.2363883 best: 0.2363883 (24) total: 50.3s remaining: 1m 48s 25: learn: 0.1873843 test: 0.2338921 best: 0.2338921 (25) total: 52.2s remaining: 1m 46s 26: learn: 0.1826116 test: 0.2315364 best: 0.2315364 (26) total: 54.4s remaining: 1m 44s 27: learn: 0.1785783 test: 0.2289835 best: 0.2289835 (27) total: 56.7s remaining: 1m 43s 28: learn: 0.1747727 test: 0.2273351 best: 0.2273351 (28) total: 58.8s remaining: 1m 41s 29: learn: 0.1714397 test: 0.2252111 best: 0.2252111 (29) total: 1m remaining: 1m 39s 30: learn: 0.1675152 test: 0.2232613 best: 0.2232613 (30) total: 1m 2s remaining: 1m 37s 31: learn: 0.1650481 test: 0.2217015 best: 0.2217015 (31) total: 1m 4s remaining: 1m 35s 32: learn: 0.1619412 test: 0.2199039 best: 0.2199039 (32) total: 1m 6s remaining: 1m 33s 33: learn: 0.1588273 test: 0.2185789 best: 0.2185789 (33) total: 1m 8s remaining: 1m 31s 34: learn: 0.1561839 test: 0.2170866 best: 0.2170866 (34) total: 1m 10s remaining: 1m 29s 35: learn: 0.1533737 test: 0.2155285 best: 0.2155285 (35) total: 1m 12s remaining: 1m 27s 36: learn: 0.1510415 test: 0.2142308 best: 0.2142308 (36) total: 1m 14s remaining: 1m 24s 37: learn: 0.1486716 test: 0.2127847 best: 0.2127847 (37) total: 1m 16s remaining: 1m 23s 38: learn: 0.1457335 test: 0.2118180 best: 0.2118180 (38) total: 1m 19s remaining: 1m 21s 39: learn: 0.1437753 test: 0.2111192 best: 0.2111192 (39) total: 1m 21s remaining: 1m 19s 40: learn: 0.1422050 test: 0.2103482 best: 0.2103482 (40) total: 1m 23s remaining: 1m 17s 41: learn: 0.1399717 test: 
0.2097252 best: 0.2097252 (41) total: 1m 25s remaining: 1m 15s 42: learn: 0.1374782 test: 0.2089562 best: 0.2089562 (42) total: 1m 27s remaining: 1m 13s 43: learn: 0.1351003 test: 0.2082478 best: 0.2082478 (43) total: 1m 29s remaining: 1m 11s 44: learn: 0.1332725 test: 0.2079839 best: 0.2079839 (44) total: 1m 31s remaining: 1m 9s 45: learn: 0.1297684 test: 0.2077764 best: 0.2077764 (45) total: 1m 33s remaining: 1m 7s 46: learn: 0.1267219 test: 0.2074013 best: 0.2074013 (46) total: 1m 35s remaining: 1m 5s 47: learn: 0.1257902 test: 0.2069794 best: 0.2069794 (47) total: 1m 38s remaining: 1m 3s 48: learn: 0.1231748 test: 0.2069488 best: 0.2069488 (48) total: 1m 40s remaining: 1m 1s 49: learn: 0.1218587 test: 0.2067426 best: 0.2067426 (49) total: 1m 42s remaining: 59.3s 50: learn: 0.1201829 test: 0.2065528 best: 0.2065528 (50) total: 1m 44s remaining: 57.2s 51: learn: 0.1187807 test: 0.2062151 best: 0.2062151 (51) total: 1m 46s remaining: 55.2s 52: learn: 0.1170650 test: 0.2059332 best: 0.2059332 (52) total: 1m 48s remaining: 53.2s 53: learn: 0.1151867 test: 0.2057748 best: 0.2057748 (53) total: 1m 50s remaining: 51.1s 54: learn: 0.1136923 test: 0.2054371 best: 0.2054371 (54) total: 1m 52s remaining: 49s 55: learn: 0.1122888 test: 0.2055289 best: 0.2054371 (54) total: 1m 54s remaining: 47s 56: learn: 0.1110505 test: 0.2052545 best: 0.2052545 (56) total: 1m 56s remaining: 44.9s 57: learn: 0.1102793 test: 0.2052236 best: 0.2052236 (57) total: 1m 58s remaining: 42.9s 58: learn: 0.1079999 test: 0.2053490 best: 0.2052236 (57) total: 2m remaining: 40.8s 59: learn: 0.1067137 test: 0.2048913 best: 0.2048913 (59) total: 2m 2s remaining: 38.8s 60: learn: 0.1058648 test: 0.2048184 best: 0.2048184 (60) total: 2m 4s remaining: 36.8s 61: learn: 0.1040737 test: 0.2047293 best: 0.2047293 (61) total: 2m 6s remaining: 34.8s 62: learn: 0.1020093 test: 0.2046767 best: 0.2046767 (62) total: 2m 8s remaining: 32.8s 63: learn: 0.1008272 test: 0.2043146 best: 0.2043146 (63) total: 2m 11s 
remaining: 30.7s 64: learn: 0.0997973 test: 0.2038967 best: 0.2038967 (64) total: 2m 13s remaining: 28.7s 65: learn: 0.0992689 test: 0.2036688 best: 0.2036688 (65) total: 2m 15s remaining: 26.6s 66: learn: 0.0981360 test: 0.2036178 best: 0.2036178 (66) total: 2m 17s remaining: 24.6s 67: learn: 0.0973920 test: 0.2036369 best: 0.2036178 (66) total: 2m 19s remaining: 22.5s 68: learn: 0.0966626 test: 0.2035521 best: 0.2035521 (68) total: 2m 21s remaining: 20.5s 69: learn: 0.0948123 test: 0.2034362 best: 0.2034362 (69) total: 2m 23s remaining: 18.5s 70: learn: 0.0939253 test: 0.2033088 best: 0.2033088 (70) total: 2m 25s remaining: 16.4s 71: learn: 0.0933529 test: 0.2032825 best: 0.2032825 (71) total: 2m 27s remaining: 14.3s 72: learn: 0.0914871 test: 0.2033811 best: 0.2032825 (71) total: 2m 29s remaining: 12.3s 73: learn: 0.0906449 test: 0.2032462 best: 0.2032462 (73) total: 2m 31s remaining: 10.3s 74: learn: 0.0895147 test: 0.2035366 best: 0.2032462 (73) total: 2m 33s remaining: 8.21s 75: learn: 0.0882197 test: 0.2036146 best: 0.2032462 (73) total: 2m 35s remaining: 6.15s 76: learn: 0.0870073 test: 0.2033073 best: 0.2032462 (73) total: 2m 38s remaining: 4.1s 77: learn: 0.0865690 test: 0.2033617 best: 0.2032462 (73) total: 2m 40s remaining: 2.05s 78: learn: 0.0855324 test: 0.2034026 best: 0.2032462 (73) total: 2m 42s remaining: 0us bestTest = 0.2032461992 bestIteration = 73 Shrink model to first 74 iterations. 
Trial 84, Fold 4: Log loss = 0.20324619917569225, Average precision = 0.9722407572545252, ROC-AUC = 0.9675492261398075, Elapsed Time = 162.48230349999358 seconds Trial 84, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 84, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0: learn: 0.6386066 test: 0.6451767 best: 0.6451767 (0) total: 2.01s remaining: 2m 37s 1: learn: 0.5901165 test: 0.6007657 best: 0.6007657 (1) total: 4.04s remaining: 2m 35s 2: learn: 0.5434056 test: 0.5564862 best: 0.5564862 (2) total: 6.1s remaining: 2m 34s 3: learn: 0.5040828 test: 0.5198070 best: 0.5198070 (3) total: 8.17s remaining: 2m 33s 4: learn: 0.4653934 test: 0.4840638 best: 0.4840638 (4) total: 10.3s remaining: 2m 33s 5: learn: 0.4318003 test: 0.4529650 best: 0.4529650 (5) total: 12.4s remaining: 2m 31s 6: learn: 0.4023466 test: 0.4258236 best: 0.4258236 (6) total: 14.4s remaining: 2m 28s 7: learn: 0.3758371 test: 0.4021800 best: 0.4021800 (7) total: 16.5s remaining: 2m 26s 8: learn: 0.3517399 test: 0.3804228 best: 0.3804228 (8) total: 18.6s remaining: 2m 24s 9: learn: 0.3310841 test: 0.3617111 best: 0.3617111 (9) total: 20.6s remaining: 2m 21s 10: learn: 0.3113875 test: 0.3447409 best: 0.3447409 (10) total: 22.6s remaining: 2m 19s 11: learn: 0.2948346 test: 0.3299564 best: 0.3299564 (11) total: 24.5s remaining: 2m 16s 12: learn: 0.2798435 test: 0.3167105 best: 0.3167105 (12) total: 26.5s remaining: 2m 14s 13: learn: 0.2657834 test: 0.3045831 best: 0.3045831 (13) total: 28.7s remaining: 2m 13s 14: learn: 0.2541929 test: 0.2949214 best: 0.2949214 (14) total: 30.7s remaining: 2m 11s 15: learn: 0.2439816 test: 0.2858273 best: 0.2858273 (15) total: 32.8s remaining: 2m 9s 16: learn: 0.2343418 test: 0.2781777 best: 0.2781777 (16) total: 34.7s remaining: 2m 6s 17: learn: 0.2250946 test: 0.2707016 best: 0.2707016 (17) total: 36.7s remaining: 2m 4s 18: learn: 0.2162938 test: 0.2644047 best: 0.2644047 (18) total: 38.7s 
remaining: 2m 2s 19: learn: 0.2094470 test: 0.2593564 best: 0.2593564 (19) total: 40.7s remaining: 2m 20: learn: 0.2042127 test: 0.2551068 best: 0.2551068 (20) total: 42.9s remaining: 1m 58s 21: learn: 0.1992865 test: 0.2508358 best: 0.2508358 (21) total: 44.9s remaining: 1m 56s 22: learn: 0.1946140 test: 0.2472902 best: 0.2472902 (22) total: 47.1s remaining: 1m 54s 23: learn: 0.1894559 test: 0.2437779 best: 0.2437779 (23) total: 49.2s remaining: 1m 52s 24: learn: 0.1856076 test: 0.2411098 best: 0.2411098 (24) total: 51.4s remaining: 1m 50s 25: learn: 0.1822252 test: 0.2384919 best: 0.2384919 (25) total: 53.5s remaining: 1m 48s 26: learn: 0.1785153 test: 0.2354000 best: 0.2354000 (26) total: 55.7s remaining: 1m 47s 27: learn: 0.1745789 test: 0.2332103 best: 0.2332103 (27) total: 57.7s remaining: 1m 45s 28: learn: 0.1708887 test: 0.2312182 best: 0.2312182 (28) total: 59.8s remaining: 1m 43s 29: learn: 0.1671179 test: 0.2294177 best: 0.2294177 (29) total: 1m 1s remaining: 1m 40s 30: learn: 0.1635217 test: 0.2282437 best: 0.2282437 (30) total: 1m 3s remaining: 1m 38s 31: learn: 0.1606541 test: 0.2271309 best: 0.2271309 (31) total: 1m 5s remaining: 1m 36s 32: learn: 0.1583108 test: 0.2261459 best: 0.2261459 (32) total: 1m 7s remaining: 1m 34s 33: learn: 0.1554298 test: 0.2252402 best: 0.2252402 (33) total: 1m 9s remaining: 1m 32s 34: learn: 0.1539216 test: 0.2242230 best: 0.2242230 (34) total: 1m 11s remaining: 1m 29s 35: learn: 0.1524260 test: 0.2234927 best: 0.2234927 (35) total: 1m 13s remaining: 1m 27s 36: learn: 0.1499915 test: 0.2224033 best: 0.2224033 (36) total: 1m 15s remaining: 1m 25s 37: learn: 0.1476665 test: 0.2214732 best: 0.2214732 (37) total: 1m 17s remaining: 1m 23s 38: learn: 0.1447748 test: 0.2203847 best: 0.2203847 (38) total: 1m 19s remaining: 1m 21s 39: learn: 0.1427093 test: 0.2194541 best: 0.2194541 (39) total: 1m 21s remaining: 1m 19s 40: learn: 0.1402924 test: 0.2189561 best: 0.2189561 (40) total: 1m 24s remaining: 1m 17s 41: learn: 0.1379307 
test: 0.2184211 best: 0.2184211 (41) total: 1m 26s remaining: 1m 15s 42: learn: 0.1355560 test: 0.2177780 best: 0.2177780 (42) total: 1m 28s remaining: 1m 13s 43: learn: 0.1338950 test: 0.2167026 best: 0.2167026 (43) total: 1m 30s remaining: 1m 11s 44: learn: 0.1321204 test: 0.2162712 best: 0.2162712 (44) total: 1m 32s remaining: 1m 9s 45: learn: 0.1311555 test: 0.2158080 best: 0.2158080 (45) total: 1m 34s remaining: 1m 7s 46: learn: 0.1285674 test: 0.2155693 best: 0.2155693 (46) total: 1m 36s remaining: 1m 5s 47: learn: 0.1268272 test: 0.2152930 best: 0.2152930 (47) total: 1m 38s remaining: 1m 3s 48: learn: 0.1249125 test: 0.2148897 best: 0.2148897 (48) total: 1m 40s remaining: 1m 1s 49: learn: 0.1235196 test: 0.2143705 best: 0.2143705 (49) total: 1m 42s remaining: 59.7s 50: learn: 0.1224151 test: 0.2141321 best: 0.2141321 (50) total: 1m 44s remaining: 57.6s 51: learn: 0.1209490 test: 0.2138230 best: 0.2138230 (51) total: 1m 47s remaining: 55.6s 52: learn: 0.1196457 test: 0.2137767 best: 0.2137767 (52) total: 1m 49s remaining: 53.5s 53: learn: 0.1182717 test: 0.2135647 best: 0.2135647 (53) total: 1m 51s remaining: 51.5s 54: learn: 0.1168005 test: 0.2136508 best: 0.2135647 (53) total: 1m 53s remaining: 49.4s 55: learn: 0.1142073 test: 0.2138312 best: 0.2135647 (53) total: 1m 55s remaining: 47.3s 56: learn: 0.1120977 test: 0.2137404 best: 0.2135647 (53) total: 1m 57s remaining: 45.3s 57: learn: 0.1106565 test: 0.2136628 best: 0.2135647 (53) total: 1m 59s remaining: 43.2s 58: learn: 0.1101010 test: 0.2135768 best: 0.2135647 (53) total: 2m 1s remaining: 41.1s 59: learn: 0.1081363 test: 0.2135752 best: 0.2135647 (53) total: 2m 3s remaining: 39.1s 60: learn: 0.1076982 test: 0.2134306 best: 0.2134306 (60) total: 2m 5s remaining: 37s 61: learn: 0.1077220 test: 0.2131941 best: 0.2131941 (61) total: 2m 7s remaining: 35s 62: learn: 0.1064940 test: 0.2132067 best: 0.2131941 (61) total: 2m 9s remaining: 32.9s 63: learn: 0.1056288 test: 0.2126309 best: 0.2126309 (63) total: 2m 
11s remaining: 30.8s 64: learn: 0.1038828 test: 0.2128675 best: 0.2126309 (63) total: 2m 13s remaining: 28.8s 65: learn: 0.1025286 test: 0.2130771 best: 0.2126309 (63) total: 2m 15s remaining: 26.7s 66: learn: 0.1007979 test: 0.2130048 best: 0.2126309 (63) total: 2m 17s remaining: 24.6s 67: learn: 0.0994120 test: 0.2129897 best: 0.2126309 (63) total: 2m 19s remaining: 22.6s 68: learn: 0.0979883 test: 0.2130735 best: 0.2126309 (63) total: 2m 21s remaining: 20.5s 69: learn: 0.0963644 test: 0.2131812 best: 0.2126309 (63) total: 2m 23s remaining: 18.5s 70: learn: 0.0953885 test: 0.2133328 best: 0.2126309 (63) total: 2m 25s remaining: 16.4s 71: learn: 0.0948946 test: 0.2133831 best: 0.2126309 (63) total: 2m 27s remaining: 14.4s 72: learn: 0.0938279 test: 0.2134766 best: 0.2126309 (63) total: 2m 30s remaining: 12.3s 73: learn: 0.0930973 test: 0.2130488 best: 0.2126309 (63) total: 2m 32s remaining: 10.3s 74: learn: 0.0922761 test: 0.2127744 best: 0.2126309 (63) total: 2m 34s remaining: 8.21s 75: learn: 0.0897579 test: 0.2128126 best: 0.2126309 (63) total: 2m 36s remaining: 6.16s 76: learn: 0.0891907 test: 0.2126399 best: 0.2126309 (63) total: 2m 38s remaining: 4.1s 77: learn: 0.0875503 test: 0.2127995 best: 0.2126309 (63) total: 2m 40s remaining: 2.05s 78: learn: 0.0864590 test: 0.2126447 best: 0.2126309 (63) total: 2m 42s remaining: 0us bestTest = 0.2126308641 bestIteration = 63 Shrink model to first 64 iterations. Trial 84, Fold 5: Log loss = 0.21263086407727635, Average precision = 0.9705225371020497, ROC-AUC = 0.9647914088772458, Elapsed Time = 162.46784879999905 seconds
Optimization Progress: 85%|########5 | 85/100 [2:42:09<1:13:10, 292.69s/it]
[Verbose per-iteration CatBoost output elided; best iterations and fold summaries retained below.]

Trial 85, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 85, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Trial 85, Fold 1: bestTest = 0.1991208792, bestIteration = 45
Trial 85, Fold 1: Log loss = 0.19912087921877314, Average precision = 0.9740483458617527, ROC-AUC = 0.970261843822142, Elapsed Time = 18.72625550000521 seconds
Trial 85, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 85, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Trial 85, Fold 2: bestTest = 0.1815602377, bestIteration = 45
Trial 85, Fold 2: Log loss = 0.1815602376785173, Average precision = 0.9763477449441452, ROC-AUC = 0.9742676727747173, Elapsed Time = 18.20334570000705 seconds
Trial 85, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 85, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
Trial 85, Fold 3: bestTest = 0.185577321, bestIteration = 45
Trial 85, Fold 3: Log loss = 0.18557732102419647, Average precision = 0.9761387659789403, ROC-AUC = 0.9733354260769905, Elapsed Time = 19.36717679999856 seconds
Trial 85, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 85, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
0.2567382 test: 0.2746448 best: 0.2746448 (7) total: 3.45s remaining: 16.4s 8: learn: 0.2391821 test: 0.2583977 best: 0.2583977 (8) total: 3.92s remaining: 16.1s 9: learn: 0.2257575 test: 0.2471073 best: 0.2471073 (9) total: 4.41s remaining: 15.9s 10: learn: 0.2156928 test: 0.2383388 best: 0.2383388 (10) total: 4.87s remaining: 15.5s 11: learn: 0.2068069 test: 0.2313343 best: 0.2313343 (11) total: 5.33s remaining: 15.1s 12: learn: 0.1985395 test: 0.2251326 best: 0.2251326 (12) total: 5.8s remaining: 14.7s 13: learn: 0.1929489 test: 0.2208506 best: 0.2208506 (13) total: 6.24s remaining: 14.3s 14: learn: 0.1869911 test: 0.2173072 best: 0.2173072 (14) total: 6.8s remaining: 14.1s 15: learn: 0.1826484 test: 0.2140320 best: 0.2140320 (15) total: 7.23s remaining: 13.6s 16: learn: 0.1775336 test: 0.2104209 best: 0.2104209 (16) total: 7.71s remaining: 13.2s 17: learn: 0.1734553 test: 0.2079733 best: 0.2079733 (17) total: 8.23s remaining: 12.8s 18: learn: 0.1706737 test: 0.2060764 best: 0.2060764 (18) total: 8.6s remaining: 12.2s 19: learn: 0.1675512 test: 0.2041490 best: 0.2041490 (19) total: 9.01s remaining: 11.7s 20: learn: 0.1642329 test: 0.2017244 best: 0.2017244 (20) total: 9.46s remaining: 11.3s 21: learn: 0.1608248 test: 0.1994378 best: 0.1994378 (21) total: 9.88s remaining: 10.8s 22: learn: 0.1574977 test: 0.1978541 best: 0.1978541 (22) total: 10.4s remaining: 10.4s 23: learn: 0.1546531 test: 0.1966435 best: 0.1966435 (23) total: 10.9s remaining: 9.99s 24: learn: 0.1528822 test: 0.1957417 best: 0.1957417 (24) total: 11.3s remaining: 9.48s 25: learn: 0.1507129 test: 0.1951570 best: 0.1951570 (25) total: 11.7s remaining: 9.02s 26: learn: 0.1488166 test: 0.1942985 best: 0.1942985 (26) total: 12.2s remaining: 8.56s 27: learn: 0.1468684 test: 0.1933442 best: 0.1933442 (27) total: 12.6s remaining: 8.12s 28: learn: 0.1455241 test: 0.1924128 best: 0.1924128 (28) total: 13s remaining: 7.62s 29: learn: 0.1439876 test: 0.1917695 best: 0.1917695 (29) total: 13.3s remaining: 
7.09s 30: learn: 0.1429105 test: 0.1913820 best: 0.1913820 (30) total: 13.6s remaining: 6.59s 31: learn: 0.1416866 test: 0.1909474 best: 0.1909474 (31) total: 14s remaining: 6.14s 32: learn: 0.1400510 test: 0.1907718 best: 0.1907718 (32) total: 14.5s remaining: 5.7s 33: learn: 0.1387834 test: 0.1902692 best: 0.1902692 (33) total: 14.8s remaining: 5.22s 34: learn: 0.1372407 test: 0.1903692 best: 0.1902692 (33) total: 15.2s remaining: 4.77s 35: learn: 0.1363554 test: 0.1901581 best: 0.1901581 (35) total: 15.5s remaining: 4.31s 36: learn: 0.1348952 test: 0.1898436 best: 0.1898436 (36) total: 16s remaining: 3.88s 37: learn: 0.1341444 test: 0.1898435 best: 0.1898435 (37) total: 16.2s remaining: 3.42s 38: learn: 0.1330026 test: 0.1898139 best: 0.1898139 (38) total: 16.6s remaining: 2.98s 39: learn: 0.1321643 test: 0.1895867 best: 0.1895867 (39) total: 17s remaining: 2.55s 40: learn: 0.1313591 test: 0.1897707 best: 0.1895867 (39) total: 17.3s remaining: 2.11s 41: learn: 0.1304489 test: 0.1897396 best: 0.1895867 (39) total: 17.6s remaining: 1.68s 42: learn: 0.1292855 test: 0.1897507 best: 0.1895867 (39) total: 18s remaining: 1.25s 43: learn: 0.1285475 test: 0.1898824 best: 0.1895867 (39) total: 18.3s remaining: 832ms 44: learn: 0.1281189 test: 0.1899241 best: 0.1895867 (39) total: 18.6s remaining: 413ms 45: learn: 0.1275156 test: 0.1898012 best: 0.1895867 (39) total: 18.9s remaining: 0us bestTest = 0.1895867295 bestIteration = 39 Shrink model to first 40 iterations. 
Trial 85, Fold 4: Log loss = 0.18958672972794569, Average precision = 0.974982758563082, ROC-AUC = 0.9725237010331647, Elapsed Time = 19.01251290000073 seconds Trial 85, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 85, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0: learn: 0.5518133 test: 0.5576849 best: 0.5576849 (0) total: 417ms remaining: 18.8s 1: learn: 0.4551307 test: 0.4642812 best: 0.4642812 (1) total: 795ms remaining: 17.5s 2: learn: 0.3921763 test: 0.4041626 best: 0.4041626 (2) total: 1.22s remaining: 17.4s 3: learn: 0.3478186 test: 0.3624957 best: 0.3624957 (3) total: 1.66s remaining: 17.4s 4: learn: 0.3120564 test: 0.3308596 best: 0.3308596 (4) total: 2.13s remaining: 17.5s 5: learn: 0.2858774 test: 0.3078343 best: 0.3078343 (5) total: 2.6s remaining: 17.3s 6: learn: 0.2627277 test: 0.2873952 best: 0.2873952 (6) total: 3.12s remaining: 17.4s 7: learn: 0.2439212 test: 0.2708964 best: 0.2708964 (7) total: 3.65s remaining: 17.4s 8: learn: 0.2288328 test: 0.2588669 best: 0.2588669 (8) total: 4.17s remaining: 17.2s 9: learn: 0.2155693 test: 0.2479589 best: 0.2479589 (9) total: 4.7s remaining: 16.9s 10: learn: 0.2041771 test: 0.2388870 best: 0.2388870 (10) total: 5.16s remaining: 16.4s 11: learn: 0.1963735 test: 0.2331124 best: 0.2331124 (11) total: 5.57s remaining: 15.8s 12: learn: 0.1878242 test: 0.2266824 best: 0.2266824 (12) total: 6.07s remaining: 15.4s 13: learn: 0.1825588 test: 0.2231828 best: 0.2231828 (13) total: 6.53s remaining: 14.9s 14: learn: 0.1776641 test: 0.2197130 best: 0.2197130 (14) total: 6.95s remaining: 14.4s 15: learn: 0.1725178 test: 0.2160360 best: 0.2160360 (15) total: 7.4s remaining: 13.9s 16: learn: 0.1672796 test: 0.2129330 best: 0.2129330 (16) total: 7.87s remaining: 13.4s 17: learn: 0.1639880 test: 0.2110082 best: 0.2110082 (17) total: 8.28s remaining: 12.9s 18: learn: 0.1604465 test: 0.2091596 best: 0.2091596 (18) total: 8.71s remaining: 12.4s 19: 
learn: 0.1570678 test: 0.2077197 best: 0.2077197 (19) total: 9.22s remaining: 12s 20: learn: 0.1540402 test: 0.2059518 best: 0.2059518 (20) total: 9.66s remaining: 11.5s 21: learn: 0.1516019 test: 0.2048061 best: 0.2048061 (21) total: 10.1s remaining: 11s 22: learn: 0.1492291 test: 0.2041865 best: 0.2041865 (22) total: 10.5s remaining: 10.5s 23: learn: 0.1476021 test: 0.2038457 best: 0.2038457 (23) total: 10.9s remaining: 9.99s 24: learn: 0.1459095 test: 0.2031258 best: 0.2031258 (24) total: 11.3s remaining: 9.48s 25: learn: 0.1439671 test: 0.2027061 best: 0.2027061 (25) total: 11.7s remaining: 8.99s 26: learn: 0.1419628 test: 0.2019717 best: 0.2019717 (26) total: 12.1s remaining: 8.52s 27: learn: 0.1403767 test: 0.2013008 best: 0.2013008 (27) total: 12.5s remaining: 8.02s 28: learn: 0.1394026 test: 0.2008895 best: 0.2008895 (28) total: 12.8s remaining: 7.5s 29: learn: 0.1377052 test: 0.2002541 best: 0.2002541 (29) total: 13.2s remaining: 7.04s 30: learn: 0.1364902 test: 0.2000528 best: 0.2000528 (30) total: 13.6s remaining: 6.58s 31: learn: 0.1359664 test: 0.1999880 best: 0.1999880 (31) total: 13.9s remaining: 6.06s 32: learn: 0.1346481 test: 0.1999981 best: 0.1999880 (31) total: 14.2s remaining: 5.6s 33: learn: 0.1334661 test: 0.1997061 best: 0.1997061 (33) total: 14.6s remaining: 5.15s 34: learn: 0.1323059 test: 0.1994286 best: 0.1994286 (34) total: 14.9s remaining: 4.7s 35: learn: 0.1316215 test: 0.1994192 best: 0.1994192 (35) total: 15.2s remaining: 4.22s 36: learn: 0.1307832 test: 0.1993241 best: 0.1993241 (36) total: 15.5s remaining: 3.77s 37: learn: 0.1299739 test: 0.1991084 best: 0.1991084 (37) total: 15.8s remaining: 3.32s 38: learn: 0.1290823 test: 0.1989465 best: 0.1989465 (38) total: 16.1s remaining: 2.89s 39: learn: 0.1281790 test: 0.1990339 best: 0.1989465 (38) total: 16.4s remaining: 2.46s 40: learn: 0.1273732 test: 0.1990427 best: 0.1989465 (38) total: 16.7s remaining: 2.04s 41: learn: 0.1268070 test: 0.1991239 best: 0.1989465 (38) total: 17s 
remaining: 1.62s 42: learn: 0.1261087 test: 0.1990270 best: 0.1989465 (38) total: 17.3s remaining: 1.21s 43: learn: 0.1249252 test: 0.1986211 best: 0.1986211 (43) total: 17.6s remaining: 802ms 44: learn: 0.1243504 test: 0.1987310 best: 0.1986211 (43) total: 18s remaining: 399ms 45: learn: 0.1235734 test: 0.1987615 best: 0.1986211 (43) total: 18.3s remaining: 0us bestTest = 0.1986211047 bestIteration = 43 Shrink model to first 44 iterations. Trial 85, Fold 5: Log loss = 0.19862110415844503, Average precision = 0.9733036329049334, ROC-AUC = 0.970154915184958, Elapsed Time = 18.505486000001838 seconds
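The per-fold summary lines above each report three metrics on the held-out validation fold. As a minimal sketch (using the `log_loss`, `average_precision_score`, and `roc_auc_score` functions already imported in this notebook; the labels and probabilities below are synthetic stand-ins, not the notebook's data), these metrics are computed from validation labels and predicted probabilities like so:

```python
# Sketch of the per-fold metric computation: log loss, average precision,
# and ROC-AUC from validation labels and predicted class-1 probabilities.
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

rng = np.random.default_rng(42)
y_val = rng.integers(0, 2, size=1000)               # stand-in validation labels
# stand-in predicted probabilities, loosely correlated with the labels
y_prob = np.clip(y_val * 0.7 + rng.normal(0.15, 0.2, size=1000), 0.001, 0.999)

lloss = log_loss(y_val, y_prob)                     # lower is better
ap = average_precision_score(y_val, y_prob)         # area under PR curve
auc = roc_auc_score(y_val, y_prob)                  # area under ROC curve
print(f"Log loss = {lloss}, Average precision = {ap}, ROC-AUC = {auc}")
```

Note that the logged `bestTest` value (CatBoost's internal Logloss on the eval set at the best iteration) can differ slightly from the reported `Log loss`, since the latter is recomputed on the returned probabilities.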
Optimization Progress: 86% (86/100 trials) [2:43:51 elapsed, 54:56 remaining, 235.45 s/trial]
[CatBoost per-iteration training log truncated; fold-level summaries retained below.]

Trial 86, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130; Validation size = 5175 where 0 = 2592, 1 = 2583. bestTest = 0.2114270719 at iteration 91. Log loss = 0.21126570723582958, Average precision = 0.9710056253674182, ROC-AUC = 0.9667329386587515, Elapsed Time = 15.71 seconds
Trial 86, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230; Validation size = 5137 where 0 = 2654, 1 = 2483. bestTest = 0.206093221 at iteration 91. Log loss = 0.20596878467745464, Average precision = 0.9718932341074019, ROC-AUC = 0.968086530229221, Elapsed Time = 14.60 seconds
Trial 86, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165; Validation size = 5156 where 0 = 2608, 1 = 2548. (Per-iteration log for this fold truncated.)
test: 0.2213600 best: 0.2213600 (56) total: 8.77s remaining: 5.38s 57: learn: 0.1829533 test: 0.2204224 best: 0.2204224 (57) total: 8.92s remaining: 5.23s 58: learn: 0.1819449 test: 0.2197117 best: 0.2197117 (58) total: 9.08s remaining: 5.08s 59: learn: 0.1806657 test: 0.2188902 best: 0.2188902 (59) total: 9.23s remaining: 4.92s 60: learn: 0.1794799 test: 0.2180994 best: 0.2180994 (60) total: 9.4s remaining: 4.78s 61: learn: 0.1784081 test: 0.2174966 best: 0.2174966 (61) total: 9.56s remaining: 4.63s 62: learn: 0.1774326 test: 0.2168437 best: 0.2168437 (62) total: 9.74s remaining: 4.49s 63: learn: 0.1763573 test: 0.2165906 best: 0.2165906 (63) total: 9.89s remaining: 4.33s 64: learn: 0.1753151 test: 0.2157472 best: 0.2157472 (64) total: 10s remaining: 4.17s 65: learn: 0.1739353 test: 0.2153010 best: 0.2153010 (65) total: 10.2s remaining: 4.01s 66: learn: 0.1724947 test: 0.2144231 best: 0.2144231 (66) total: 10.3s remaining: 3.86s 67: learn: 0.1711384 test: 0.2138438 best: 0.2138438 (67) total: 10.5s remaining: 3.7s 68: learn: 0.1702933 test: 0.2133132 best: 0.2133132 (68) total: 10.6s remaining: 3.55s 69: learn: 0.1689591 test: 0.2126683 best: 0.2126683 (69) total: 10.8s remaining: 3.4s 70: learn: 0.1680356 test: 0.2119927 best: 0.2119927 (70) total: 11s remaining: 3.25s 71: learn: 0.1667704 test: 0.2117064 best: 0.2117064 (71) total: 11.1s remaining: 3.09s 72: learn: 0.1657367 test: 0.2109297 best: 0.2109297 (72) total: 11.3s remaining: 2.94s 73: learn: 0.1646489 test: 0.2106520 best: 0.2106520 (73) total: 11.5s remaining: 2.79s 74: learn: 0.1638772 test: 0.2101924 best: 0.2101924 (74) total: 11.6s remaining: 2.63s 75: learn: 0.1631950 test: 0.2097990 best: 0.2097990 (75) total: 11.7s remaining: 2.47s 76: learn: 0.1625977 test: 0.2093835 best: 0.2093835 (76) total: 11.9s remaining: 2.32s 77: learn: 0.1619341 test: 0.2091093 best: 0.2091093 (77) total: 12.1s remaining: 2.16s 78: learn: 0.1609516 test: 0.2087604 best: 0.2087604 (78) total: 12.2s remaining: 2.01s 79: 
learn: 0.1598429 test: 0.2084430 best: 0.2084430 (79) total: 12.4s remaining: 1.86s 80: learn: 0.1582701 test: 0.2085205 best: 0.2084430 (79) total: 12.5s remaining: 1.7s 81: learn: 0.1575562 test: 0.2082307 best: 0.2082307 (81) total: 12.7s remaining: 1.55s 82: learn: 0.1567970 test: 0.2077841 best: 0.2077841 (82) total: 12.8s remaining: 1.39s 83: learn: 0.1558776 test: 0.2073277 best: 0.2073277 (83) total: 13s remaining: 1.24s 84: learn: 0.1552766 test: 0.2071503 best: 0.2071503 (84) total: 13.2s remaining: 1.08s 85: learn: 0.1546103 test: 0.2069356 best: 0.2069356 (85) total: 13.3s remaining: 928ms 86: learn: 0.1540670 test: 0.2067228 best: 0.2067228 (86) total: 13.5s remaining: 773ms 87: learn: 0.1535814 test: 0.2064586 best: 0.2064586 (87) total: 13.6s remaining: 619ms 88: learn: 0.1529628 test: 0.2061218 best: 0.2061218 (88) total: 13.7s remaining: 463ms 89: learn: 0.1511427 test: 0.2063112 best: 0.2061218 (88) total: 13.9s remaining: 309ms 90: learn: 0.1497265 test: 0.2060038 best: 0.2060038 (90) total: 14.1s remaining: 154ms 91: learn: 0.1492452 test: 0.2058521 best: 0.2058521 (91) total: 14.2s remaining: 0us bestTest = 0.2058520868 bestIteration = 91 Trial 86, Fold 3: Log loss = 0.2059723953464966, Average precision = 0.9721881416040364, ROC-AUC = 0.9689537565852202, Elapsed Time = 14.365914299996803 seconds Trial 86, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 86, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 0: learn: 0.6550474 test: 0.6563609 best: 0.6563609 (0) total: 146ms remaining: 13.3s 1: learn: 0.6243143 test: 0.6266009 best: 0.6266009 (1) total: 316ms remaining: 14.2s 2: learn: 0.5902780 test: 0.5938431 best: 0.5938431 (2) total: 464ms remaining: 13.8s 3: learn: 0.5678409 test: 0.5720637 best: 0.5720637 (3) total: 617ms remaining: 13.6s 4: learn: 0.5407645 test: 0.5457972 best: 0.5457972 (4) total: 769ms remaining: 13.4s 5: learn: 0.5171073 test: 0.5236157 best: 
0.5236157 (5) total: 933ms remaining: 13.4s 6: learn: 0.4939296 test: 0.5013800 best: 0.5013800 (6) total: 1.1s remaining: 13.4s 7: learn: 0.4754514 test: 0.4833454 best: 0.4833454 (7) total: 1.26s remaining: 13.2s 8: learn: 0.4668612 test: 0.4747431 best: 0.4747431 (8) total: 1.31s remaining: 12.1s 9: learn: 0.4488133 test: 0.4572543 best: 0.4572543 (9) total: 1.46s remaining: 11.9s 10: learn: 0.4337541 test: 0.4427346 best: 0.4427346 (10) total: 1.59s remaining: 11.7s 11: learn: 0.4154324 test: 0.4254173 best: 0.4254173 (11) total: 1.76s remaining: 11.7s 12: learn: 0.3998702 test: 0.4104874 best: 0.4104874 (12) total: 1.92s remaining: 11.7s 13: learn: 0.3860450 test: 0.3970949 best: 0.3970949 (13) total: 2.08s remaining: 11.6s 14: learn: 0.3704819 test: 0.3818071 best: 0.3818071 (14) total: 2.23s remaining: 11.5s 15: learn: 0.3602095 test: 0.3719021 best: 0.3719021 (15) total: 2.38s remaining: 11.3s 16: learn: 0.3489452 test: 0.3610481 best: 0.3610481 (16) total: 2.44s remaining: 10.8s 17: learn: 0.3409033 test: 0.3539593 best: 0.3539593 (17) total: 2.59s remaining: 10.7s 18: learn: 0.3332100 test: 0.3466337 best: 0.3466337 (18) total: 2.74s remaining: 10.5s 19: learn: 0.3230274 test: 0.3370594 best: 0.3370594 (19) total: 2.9s remaining: 10.4s 20: learn: 0.3142506 test: 0.3285032 best: 0.3285032 (20) total: 3.04s remaining: 10.3s 21: learn: 0.3033476 test: 0.3188533 best: 0.3188533 (21) total: 3.2s remaining: 10.2s 22: learn: 0.2947968 test: 0.3109144 best: 0.3109144 (22) total: 3.35s remaining: 10.1s 23: learn: 0.2899136 test: 0.3065591 best: 0.3065591 (23) total: 3.52s remaining: 9.96s 24: learn: 0.2839403 test: 0.3012821 best: 0.3012821 (24) total: 3.68s remaining: 9.86s 25: learn: 0.2761773 test: 0.2944397 best: 0.2944397 (25) total: 3.84s remaining: 9.75s 26: learn: 0.2706644 test: 0.2894452 best: 0.2894452 (26) total: 3.99s remaining: 9.61s 27: learn: 0.2630475 test: 0.2824812 best: 0.2824812 (27) total: 4.16s remaining: 9.51s 28: learn: 0.2583542 test: 
0.2786631 best: 0.2786631 (28) total: 4.3s remaining: 9.34s 29: learn: 0.2532818 test: 0.2742183 best: 0.2742183 (29) total: 4.42s remaining: 9.14s 30: learn: 0.2478178 test: 0.2700199 best: 0.2700199 (30) total: 4.58s remaining: 9s 31: learn: 0.2436803 test: 0.2666715 best: 0.2666715 (31) total: 4.73s remaining: 8.88s 32: learn: 0.2390148 test: 0.2623910 best: 0.2623910 (32) total: 4.88s remaining: 8.73s 33: learn: 0.2352504 test: 0.2596503 best: 0.2596503 (33) total: 5.04s remaining: 8.61s 34: learn: 0.2318182 test: 0.2573096 best: 0.2573096 (34) total: 5.2s remaining: 8.47s 35: learn: 0.2287472 test: 0.2544956 best: 0.2544956 (35) total: 5.36s remaining: 8.33s 36: learn: 0.2253935 test: 0.2518721 best: 0.2518721 (36) total: 5.5s remaining: 8.17s 37: learn: 0.2214879 test: 0.2487932 best: 0.2487932 (37) total: 5.66s remaining: 8.05s 38: learn: 0.2179998 test: 0.2460788 best: 0.2460788 (38) total: 5.81s remaining: 7.89s 39: learn: 0.2162597 test: 0.2449316 best: 0.2449316 (39) total: 5.96s remaining: 7.75s 40: learn: 0.2134312 test: 0.2425321 best: 0.2425321 (40) total: 6.11s remaining: 7.59s 41: learn: 0.2110271 test: 0.2403851 best: 0.2403851 (41) total: 6.27s remaining: 7.46s 42: learn: 0.2088186 test: 0.2395094 best: 0.2395094 (42) total: 6.43s remaining: 7.33s 43: learn: 0.2054065 test: 0.2372666 best: 0.2372666 (43) total: 6.6s remaining: 7.2s 44: learn: 0.2032608 test: 0.2359446 best: 0.2359446 (44) total: 6.76s remaining: 7.06s 45: learn: 0.2016175 test: 0.2349074 best: 0.2349074 (45) total: 6.89s remaining: 6.89s 46: learn: 0.1996001 test: 0.2341086 best: 0.2341086 (46) total: 7.07s remaining: 6.77s 47: learn: 0.1979843 test: 0.2327283 best: 0.2327283 (47) total: 7.22s remaining: 6.62s 48: learn: 0.1960601 test: 0.2311680 best: 0.2311680 (48) total: 7.37s remaining: 6.47s 49: learn: 0.1941083 test: 0.2296822 best: 0.2296822 (49) total: 7.54s remaining: 6.33s 50: learn: 0.1922827 test: 0.2290825 best: 0.2290825 (50) total: 7.68s remaining: 6.18s 51: learn: 
0.1907470 test: 0.2279881 best: 0.2279881 (51) total: 7.84s remaining: 6.03s 52: learn: 0.1893788 test: 0.2270752 best: 0.2270752 (52) total: 7.97s remaining: 5.87s 53: learn: 0.1873614 test: 0.2255029 best: 0.2255029 (53) total: 8.13s remaining: 5.72s 54: learn: 0.1861833 test: 0.2243787 best: 0.2243787 (54) total: 8.18s remaining: 5.5s 55: learn: 0.1848534 test: 0.2234710 best: 0.2234710 (55) total: 8.35s remaining: 5.37s 56: learn: 0.1833322 test: 0.2225384 best: 0.2225384 (56) total: 8.51s remaining: 5.23s 57: learn: 0.1822221 test: 0.2220132 best: 0.2220132 (57) total: 8.68s remaining: 5.09s 58: learn: 0.1805846 test: 0.2214564 best: 0.2214564 (58) total: 8.84s remaining: 4.95s 59: learn: 0.1798214 test: 0.2212440 best: 0.2212440 (59) total: 8.99s remaining: 4.8s 60: learn: 0.1789509 test: 0.2208996 best: 0.2208996 (60) total: 9.14s remaining: 4.64s 61: learn: 0.1776007 test: 0.2205570 best: 0.2205570 (61) total: 9.3s remaining: 4.5s 62: learn: 0.1766025 test: 0.2205012 best: 0.2205012 (62) total: 9.46s remaining: 4.36s 63: learn: 0.1749508 test: 0.2195318 best: 0.2195318 (63) total: 9.62s remaining: 4.21s 64: learn: 0.1733500 test: 0.2185052 best: 0.2185052 (64) total: 9.77s remaining: 4.06s 65: learn: 0.1723884 test: 0.2177940 best: 0.2177940 (65) total: 9.93s remaining: 3.91s 66: learn: 0.1709483 test: 0.2173547 best: 0.2173547 (66) total: 10.1s remaining: 3.76s 67: learn: 0.1690872 test: 0.2164089 best: 0.2164089 (67) total: 10.3s remaining: 3.62s 68: learn: 0.1674362 test: 0.2161431 best: 0.2161431 (68) total: 10.4s remaining: 3.47s 69: learn: 0.1665136 test: 0.2156363 best: 0.2156363 (69) total: 10.6s remaining: 3.33s 70: learn: 0.1655869 test: 0.2151401 best: 0.2151401 (70) total: 10.7s remaining: 3.17s 71: learn: 0.1645891 test: 0.2147829 best: 0.2147829 (71) total: 10.9s remaining: 3.02s 72: learn: 0.1634087 test: 0.2139217 best: 0.2139217 (72) total: 11s remaining: 2.87s 73: learn: 0.1628040 test: 0.2135303 best: 0.2135303 (73) total: 11.2s 
remaining: 2.72s 74: learn: 0.1614274 test: 0.2129345 best: 0.2129345 (74) total: 11.3s remaining: 2.56s 75: learn: 0.1600554 test: 0.2123446 best: 0.2123446 (75) total: 11.5s remaining: 2.42s 76: learn: 0.1589926 test: 0.2123186 best: 0.2123186 (76) total: 11.6s remaining: 2.26s 77: learn: 0.1581713 test: 0.2119324 best: 0.2119324 (77) total: 11.8s remaining: 2.11s 78: learn: 0.1574005 test: 0.2114545 best: 0.2114545 (78) total: 11.9s remaining: 1.96s 79: learn: 0.1569194 test: 0.2111279 best: 0.2111279 (79) total: 12.1s remaining: 1.81s 80: learn: 0.1556924 test: 0.2108450 best: 0.2108450 (80) total: 12.2s remaining: 1.66s 81: learn: 0.1550270 test: 0.2105443 best: 0.2105443 (81) total: 12.4s remaining: 1.51s 82: learn: 0.1544147 test: 0.2100204 best: 0.2100204 (82) total: 12.5s remaining: 1.36s 83: learn: 0.1536415 test: 0.2098188 best: 0.2098188 (83) total: 12.7s remaining: 1.21s 84: learn: 0.1526040 test: 0.2094545 best: 0.2094545 (84) total: 12.9s remaining: 1.06s 85: learn: 0.1516786 test: 0.2089999 best: 0.2089999 (85) total: 13s remaining: 908ms 86: learn: 0.1511466 test: 0.2086870 best: 0.2086870 (86) total: 13.2s remaining: 757ms 87: learn: 0.1501858 test: 0.2082033 best: 0.2082033 (87) total: 13.3s remaining: 605ms 88: learn: 0.1496122 test: 0.2079119 best: 0.2079119 (88) total: 13.5s remaining: 454ms 89: learn: 0.1481769 test: 0.2076853 best: 0.2076853 (89) total: 13.6s remaining: 303ms 90: learn: 0.1468058 test: 0.2073717 best: 0.2073717 (90) total: 13.8s remaining: 151ms 91: learn: 0.1462810 test: 0.2072602 best: 0.2072602 (91) total: 13.9s remaining: 0us bestTest = 0.20726021 bestIteration = 91 Trial 86, Fold 4: Log loss = 0.20703913607806468, Average precision = 0.9713014618504603, ROC-AUC = 0.9657980261855883, Elapsed Time = 14.118714299998828 seconds Trial 86, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 86, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0: learn: 
0.6596485 test: 0.6613787 best: 0.6613787 (0) total: 150ms remaining: 13.7s 1: learn: 0.6233461 test: 0.6274864 best: 0.6274864 (1) total: 290ms remaining: 13.1s 2: learn: 0.5969794 test: 0.6028800 best: 0.6028800 (2) total: 449ms remaining: 13.3s 3: learn: 0.5714650 test: 0.5789385 best: 0.5789385 (3) total: 613ms remaining: 13.5s 4: learn: 0.5458831 test: 0.5552100 best: 0.5552100 (4) total: 777ms remaining: 13.5s 5: learn: 0.5325506 test: 0.5420525 best: 0.5420525 (5) total: 812ms remaining: 11.6s 6: learn: 0.5097499 test: 0.5205085 best: 0.5205085 (6) total: 969ms remaining: 11.8s 7: learn: 0.4848410 test: 0.4975150 best: 0.4975150 (7) total: 1.12s remaining: 11.8s 8: learn: 0.4661453 test: 0.4802539 best: 0.4802539 (8) total: 1.28s remaining: 11.8s 9: learn: 0.4469116 test: 0.4619612 best: 0.4619612 (9) total: 1.44s remaining: 11.8s 10: learn: 0.4290628 test: 0.4447469 best: 0.4447469 (10) total: 1.59s remaining: 11.7s 11: learn: 0.4118497 test: 0.4289772 best: 0.4289772 (11) total: 1.74s remaining: 11.6s 12: learn: 0.3927800 test: 0.4113502 best: 0.4113502 (12) total: 1.91s remaining: 11.6s 13: learn: 0.3876740 test: 0.4062652 best: 0.4062652 (13) total: 1.94s remaining: 10.8s 14: learn: 0.3751135 test: 0.3947307 best: 0.3947307 (14) total: 2.1s remaining: 10.8s 15: learn: 0.3598152 test: 0.3801584 best: 0.3801584 (15) total: 2.25s remaining: 10.7s 16: learn: 0.3471220 test: 0.3681830 best: 0.3681830 (16) total: 2.42s remaining: 10.7s 17: learn: 0.3375215 test: 0.3593234 best: 0.3593234 (17) total: 2.58s remaining: 10.6s 18: learn: 0.3278083 test: 0.3505128 best: 0.3505128 (18) total: 2.72s remaining: 10.5s 19: learn: 0.3174869 test: 0.3407364 best: 0.3407364 (19) total: 2.86s remaining: 10.3s 20: learn: 0.3101129 test: 0.3337195 best: 0.3337195 (20) total: 3.02s remaining: 10.2s 21: learn: 0.3000643 test: 0.3244597 best: 0.3244597 (21) total: 3.16s remaining: 10.1s 22: learn: 0.2932162 test: 0.3181251 best: 0.3181251 (22) total: 3.32s remaining: 9.95s 23: 
learn: 0.2846905 test: 0.3106815 best: 0.3106815 (23) total: 3.47s remaining: 9.83s 24: learn: 0.2774844 test: 0.3045350 best: 0.3045350 (24) total: 3.64s remaining: 9.76s 25: learn: 0.2731555 test: 0.3007012 best: 0.3007012 (25) total: 3.81s remaining: 9.66s 26: learn: 0.2672496 test: 0.2957118 best: 0.2957118 (26) total: 3.96s remaining: 9.53s 27: learn: 0.2632984 test: 0.2925177 best: 0.2925177 (27) total: 4.12s remaining: 9.43s 28: learn: 0.2576054 test: 0.2875265 best: 0.2875265 (28) total: 4.28s remaining: 9.31s 29: learn: 0.2549473 test: 0.2854353 best: 0.2854353 (29) total: 4.43s remaining: 9.17s 30: learn: 0.2513376 test: 0.2823697 best: 0.2823697 (30) total: 4.6s remaining: 9.05s 31: learn: 0.2480465 test: 0.2799212 best: 0.2799212 (31) total: 4.76s remaining: 8.93s 32: learn: 0.2434934 test: 0.2767423 best: 0.2767423 (32) total: 4.92s remaining: 8.8s 33: learn: 0.2397733 test: 0.2732695 best: 0.2732695 (33) total: 5.07s remaining: 8.65s 34: learn: 0.2370494 test: 0.2713153 best: 0.2713153 (34) total: 5.23s remaining: 8.52s 35: learn: 0.2338356 test: 0.2689894 best: 0.2689894 (35) total: 5.38s remaining: 8.37s 36: learn: 0.2293080 test: 0.2654048 best: 0.2654048 (36) total: 5.54s remaining: 8.23s 37: learn: 0.2273940 test: 0.2634829 best: 0.2634829 (37) total: 5.57s remaining: 7.92s 38: learn: 0.2234492 test: 0.2615641 best: 0.2615641 (38) total: 5.71s remaining: 7.76s 39: learn: 0.2217104 test: 0.2604787 best: 0.2604787 (39) total: 5.87s remaining: 7.64s 40: learn: 0.2181881 test: 0.2575952 best: 0.2575952 (40) total: 6.02s remaining: 7.49s 41: learn: 0.2132692 test: 0.2537031 best: 0.2537031 (41) total: 6.18s remaining: 7.36s 42: learn: 0.2113720 test: 0.2529466 best: 0.2529466 (42) total: 6.35s remaining: 7.23s 43: learn: 0.2078499 test: 0.2500669 best: 0.2500669 (43) total: 6.5s remaining: 7.08s 44: learn: 0.2055750 test: 0.2486422 best: 0.2486422 (44) total: 6.66s remaining: 6.95s 45: learn: 0.2022021 test: 0.2465492 best: 0.2465492 (45) total: 6.83s 
remaining: 6.83s 46: learn: 0.2012127 test: 0.2461545 best: 0.2461545 (46) total: 6.98s remaining: 6.69s 47: learn: 0.1992307 test: 0.2442917 best: 0.2442917 (47) total: 7.13s remaining: 6.54s 48: learn: 0.1972077 test: 0.2424699 best: 0.2424699 (48) total: 7.3s remaining: 6.41s 49: learn: 0.1950887 test: 0.2412597 best: 0.2412597 (49) total: 7.47s remaining: 6.27s 50: learn: 0.1933807 test: 0.2397466 best: 0.2397466 (50) total: 7.6s remaining: 6.11s 51: learn: 0.1915524 test: 0.2384890 best: 0.2384890 (51) total: 7.78s remaining: 5.98s 52: learn: 0.1903416 test: 0.2378242 best: 0.2378242 (52) total: 7.91s remaining: 5.82s 53: learn: 0.1888692 test: 0.2366560 best: 0.2366560 (53) total: 8.07s remaining: 5.68s 54: learn: 0.1870378 test: 0.2351311 best: 0.2351311 (54) total: 8.21s remaining: 5.53s 55: learn: 0.1856034 test: 0.2346241 best: 0.2346241 (55) total: 8.38s remaining: 5.38s 56: learn: 0.1842396 test: 0.2336861 best: 0.2336861 (56) total: 8.54s remaining: 5.25s 57: learn: 0.1832619 test: 0.2328611 best: 0.2328611 (57) total: 8.68s remaining: 5.09s 58: learn: 0.1817694 test: 0.2319498 best: 0.2319498 (58) total: 8.85s remaining: 4.95s 59: learn: 0.1804858 test: 0.2313253 best: 0.2313253 (59) total: 9.01s remaining: 4.8s 60: learn: 0.1795228 test: 0.2304243 best: 0.2304243 (60) total: 9.16s remaining: 4.65s 61: learn: 0.1780864 test: 0.2298804 best: 0.2298804 (61) total: 9.32s remaining: 4.51s 62: learn: 0.1762390 test: 0.2289186 best: 0.2289186 (62) total: 9.47s remaining: 4.36s 63: learn: 0.1751030 test: 0.2283039 best: 0.2283039 (63) total: 9.61s remaining: 4.2s 64: learn: 0.1738694 test: 0.2273834 best: 0.2273834 (64) total: 9.75s remaining: 4.05s 65: learn: 0.1727382 test: 0.2271050 best: 0.2271050 (65) total: 9.9s remaining: 3.9s 66: learn: 0.1714591 test: 0.2261729 best: 0.2261729 (66) total: 10.1s remaining: 3.75s 67: learn: 0.1702565 test: 0.2255288 best: 0.2255288 (67) total: 10.2s remaining: 3.61s 68: learn: 0.1689345 test: 0.2252884 best: 0.2252884 
(68) total: 10.4s remaining: 3.46s 69: learn: 0.1677100 test: 0.2245012 best: 0.2245012 (69) total: 10.5s remaining: 3.31s 70: learn: 0.1664261 test: 0.2237910 best: 0.2237910 (70) total: 10.7s remaining: 3.16s 71: learn: 0.1653865 test: 0.2232155 best: 0.2232155 (71) total: 10.8s remaining: 3.01s 72: learn: 0.1639919 test: 0.2227732 best: 0.2227732 (72) total: 11s remaining: 2.86s 73: learn: 0.1629918 test: 0.2222651 best: 0.2222651 (73) total: 11.1s remaining: 2.71s 74: learn: 0.1621923 test: 0.2218314 best: 0.2218314 (74) total: 11.3s remaining: 2.55s 75: learn: 0.1609009 test: 0.2211252 best: 0.2211252 (75) total: 11.4s remaining: 2.4s 76: learn: 0.1599620 test: 0.2204836 best: 0.2204836 (76) total: 11.6s remaining: 2.25s 77: learn: 0.1589072 test: 0.2200198 best: 0.2200198 (77) total: 11.7s remaining: 2.1s 78: learn: 0.1574420 test: 0.2191762 best: 0.2191762 (78) total: 11.8s remaining: 1.95s 79: learn: 0.1566186 test: 0.2188651 best: 0.2188651 (79) total: 12s remaining: 1.8s 80: learn: 0.1557906 test: 0.2186513 best: 0.2186513 (80) total: 12.1s remaining: 1.65s 81: learn: 0.1549670 test: 0.2184997 best: 0.2184997 (81) total: 12.3s remaining: 1.5s 82: learn: 0.1543115 test: 0.2183347 best: 0.2183347 (82) total: 12.5s remaining: 1.35s 83: learn: 0.1531804 test: 0.2185442 best: 0.2183347 (82) total: 12.6s remaining: 1.2s 84: learn: 0.1524632 test: 0.2179127 best: 0.2179127 (84) total: 12.8s remaining: 1.05s 85: learn: 0.1515704 test: 0.2177434 best: 0.2177434 (85) total: 13s remaining: 905ms 86: learn: 0.1506836 test: 0.2174917 best: 0.2174917 (86) total: 13.1s remaining: 754ms 87: learn: 0.1498233 test: 0.2174006 best: 0.2174006 (87) total: 13.3s remaining: 603ms 88: learn: 0.1487814 test: 0.2174194 best: 0.2174006 (87) total: 13.4s remaining: 453ms 89: learn: 0.1477605 test: 0.2170277 best: 0.2170277 (89) total: 13.6s remaining: 302ms 90: learn: 0.1466891 test: 0.2167914 best: 0.2167914 (90) total: 13.7s remaining: 151ms 91: learn: 0.1460294 test: 0.2167179 
best: 0.2167179 (91) total: 13.9s remaining: 0us bestTest = 0.2167178935 bestIteration = 91 Trial 86, Fold 5: Log loss = 0.21634899604568314, Average precision = 0.9692303375647867, ROC-AUC = 0.9646564479869202, Elapsed Time = 14.080775299997185 seconds
Optimization Progress: 87%|########7 | 87/100 [2:45:12<40:57, 189.07s/it]
Trial 87, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 87, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[CatBoost per-iteration output omitted]
bestTest = 0.2095260472 bestIteration = 32
Trial 87, Fold 1: Log loss = 0.2095260472118056, Average precision = 0.9732601739127758, ROC-AUC = 0.969033564068004, Elapsed Time = 15.50528530000156 seconds
Trial 87, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 87, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[CatBoost per-iteration output omitted]
bestTest = 0.2023267731 bestIteration = 32
Trial 87, Fold 2: Log loss = 0.20232677307511587, Average precision = 0.9745892673875997, ROC-AUC = 0.9726937447438361, Elapsed Time = 15.20940100000007 seconds
Trial 87, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 87, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[CatBoost per-iteration output omitted]
bestTest = 0.2017099228 bestIteration = 32
Trial 87, Fold 3: Log loss = 0.2017099228008445, Average precision = 0.9733782745427833, ROC-AUC = 0.9717642882424323, Elapsed Time = 15.610752100001264 seconds
Trial 87, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 87, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[CatBoost per-iteration output omitted; log truncated mid-fold]
0.2016740 test: 0.2171626 best: 0.2171626 (23) total: 11.4s remaining: 4.28s 24: learn: 0.1993031 test: 0.2154478 best: 0.2154478 (24) total: 11.8s remaining: 3.79s 25: learn: 0.1967950 test: 0.2137650 best: 0.2137650 (25) total: 12.3s remaining: 3.31s 26: learn: 0.1945628 test: 0.2124074 best: 0.2124074 (26) total: 12.7s remaining: 2.83s 27: learn: 0.1921014 test: 0.2105601 best: 0.2105601 (27) total: 13.2s remaining: 2.35s 28: learn: 0.1900517 test: 0.2092146 best: 0.2092146 (28) total: 13.7s remaining: 1.88s 29: learn: 0.1880741 test: 0.2082274 best: 0.2082274 (29) total: 14.1s remaining: 1.41s 30: learn: 0.1865056 test: 0.2075731 best: 0.2075731 (30) total: 14.6s remaining: 942ms 31: learn: 0.1843959 test: 0.2065516 best: 0.2065516 (31) total: 15.1s remaining: 471ms 32: learn: 0.1827090 test: 0.2055915 best: 0.2055915 (32) total: 15.5s remaining: 0us bestTest = 0.2055914631 bestIteration = 32 Trial 87, Fold 4: Log loss = 0.2055914630869302, Average precision = 0.9750701798218468, ROC-AUC = 0.9708940314646713, Elapsed Time = 15.679794000003312 seconds Trial 87, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 87, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0: learn: 0.5994466 test: 0.6013817 best: 0.6013817 (0) total: 483ms remaining: 15.5s 1: learn: 0.5268747 test: 0.5300558 best: 0.5300558 (1) total: 928ms remaining: 14.4s 2: learn: 0.4727546 test: 0.4772197 best: 0.4772197 (2) total: 1.37s remaining: 13.7s 3: learn: 0.4220637 test: 0.4278634 best: 0.4278634 (3) total: 1.83s remaining: 13.3s 4: learn: 0.3829023 test: 0.3900753 best: 0.3900753 (4) total: 2.3s remaining: 12.9s 5: learn: 0.3508385 test: 0.3587660 best: 0.3587660 (5) total: 2.75s remaining: 12.4s 6: learn: 0.3251154 test: 0.3345719 best: 0.3345719 (6) total: 3.23s remaining: 12s 7: learn: 0.3074022 test: 0.3181526 best: 0.3181526 (7) total: 3.68s remaining: 11.5s 8: learn: 0.2911988 test: 0.3023950 best: 0.3023950 (8) 
total: 4.13s remaining: 11s 9: learn: 0.2784838 test: 0.2905837 best: 0.2905837 (9) total: 4.57s remaining: 10.5s 10: learn: 0.2651303 test: 0.2779505 best: 0.2779505 (10) total: 5.05s remaining: 10.1s 11: learn: 0.2544452 test: 0.2683732 best: 0.2683732 (11) total: 5.48s remaining: 9.59s 12: learn: 0.2465667 test: 0.2607981 best: 0.2607981 (12) total: 5.94s remaining: 9.14s 13: learn: 0.2394794 test: 0.2541920 best: 0.2541920 (13) total: 6.4s remaining: 8.69s 14: learn: 0.2325538 test: 0.2483284 best: 0.2483284 (14) total: 6.85s remaining: 8.22s 15: learn: 0.2266476 test: 0.2437346 best: 0.2437346 (15) total: 7.3s remaining: 7.75s 16: learn: 0.2216562 test: 0.2397115 best: 0.2397115 (16) total: 7.77s remaining: 7.32s 17: learn: 0.2174181 test: 0.2362816 best: 0.2362816 (17) total: 8.22s remaining: 6.85s 18: learn: 0.2137981 test: 0.2338264 best: 0.2338264 (18) total: 8.67s remaining: 6.39s 19: learn: 0.2104137 test: 0.2313832 best: 0.2313832 (19) total: 9.14s remaining: 5.94s 20: learn: 0.2066001 test: 0.2287038 best: 0.2287038 (20) total: 9.62s remaining: 5.5s 21: learn: 0.2028381 test: 0.2261015 best: 0.2261015 (21) total: 10.1s remaining: 5.04s 22: learn: 0.2000682 test: 0.2239463 best: 0.2239463 (22) total: 10.5s remaining: 4.58s 23: learn: 0.1976455 test: 0.2224265 best: 0.2224265 (23) total: 11s remaining: 4.12s 24: learn: 0.1945067 test: 0.2201386 best: 0.2201386 (24) total: 11.4s remaining: 3.66s 25: learn: 0.1923430 test: 0.2187905 best: 0.2187905 (25) total: 11.9s remaining: 3.2s 26: learn: 0.1900933 test: 0.2175582 best: 0.2175582 (26) total: 12.3s remaining: 2.74s 27: learn: 0.1882242 test: 0.2165777 best: 0.2165777 (27) total: 12.8s remaining: 2.28s 28: learn: 0.1859723 test: 0.2150782 best: 0.2150782 (28) total: 13.2s remaining: 1.83s 29: learn: 0.1841249 test: 0.2135696 best: 0.2135696 (29) total: 13.7s remaining: 1.37s 30: learn: 0.1827077 test: 0.2131972 best: 0.2131972 (30) total: 14.1s remaining: 913ms 31: learn: 0.1809727 test: 0.2119271 best: 
0.2119271 (31) total: 14.6s remaining: 456ms 32: learn: 0.1793632 test: 0.2113421 best: 0.2113421 (32) total: 15s remaining: 0us bestTest = 0.211342051 bestIteration = 32 Trial 87, Fold 5: Log loss = 0.21134205104358925, Average precision = 0.9732365419463417, ROC-AUC = 0.9697751518867398, Elapsed Time = 15.177862399999867 seconds
Optimization Progress: 88%|########8 | 88/100 [2:46:37<31:34, 157.84s/it]
[Per-iteration CatBoost training log condensed; per-fold results below. Each fold trained 10 iterations, with the best validation loss at iteration 9.]
Trial 88, Fold 1: Train size = 20663 (0 = 10533, 1 = 10130, 0/1 = 1.0398); Validation size = 5175 (0 = 2592, 1 = 2583, 0/1 = 1.0035). bestTest = 0.3265294360. Log loss = 0.3265525, Average precision = 0.9740318, ROC-AUC = 0.9688814, Elapsed Time = 1.35 s
Trial 88, Fold 2: Train size = 20701 (0 = 10471, 1 = 10230, 0/1 = 1.0236); Validation size = 5137 (0 = 2654, 1 = 2483, 0/1 = 1.0689). bestTest = 0.3272289582. Log loss = 0.3272347, Average precision = 0.9724646, ROC-AUC = 0.9693479, Elapsed Time = 1.45 s
Trial 88, Fold 3: Train size = 20682 (0 = 10517, 1 = 10165, 0/1 = 1.0346); Validation size = 5156 (0 = 2608, 1 = 2548, 0/1 = 1.0235). bestTest = 0.3248496803. Log loss = 0.3249267, Average precision = 0.9734237, ROC-AUC = 0.9705343, Elapsed Time = 1.47 s
Trial 88, Fold 4: Train size = 20656 (0 = 10479, 1 = 10177, 0/1 = 1.0297); Validation size = 5182 (0 = 2646, 1 = 2536, 0/1 = 1.0434). bestTest = 0.3262888851. Log loss = 0.3263180, Average precision = 0.9741334, ROC-AUC = 0.9694683, Elapsed Time = 1.42 s
Trial 88, Fold 5: Train size = 20650 (0 = 10500, 1 = 10150, 0/1 = 1.0345); Validation size = 5188 (0 = 2625, 1 = 2563, 0/1 = 1.0242). bestTest = 0.3319786655. Log loss = 0.3319334, Average precision = 0.9699002, ROC-AUC = 0.9654956, Elapsed Time = 1.43 s
Optimization Progress: 89%|########9 | 89/100 [2:46:52<21:05, 115.01s/it]
[Per-iteration CatBoost training log condensed; per-fold results below. In Folds 1-3, CatBoost stopped early with "Training has stopped (degenerate solution on iteration N, probably too small l2-regularization, try to increase it)" and shrank the model to the best iteration.]
Trial 89, Fold 1: Train size = 20663 (0 = 10533, 1 = 10130, 0/1 = 1.0398); Validation size = 5175 (0 = 2592, 1 = 2583, 0/1 = 1.0035). Training stopped with a degenerate solution on iteration 23; bestTest = 0.2241722929 (iteration 22), model shrunk to the first 23 iterations. Log loss = 0.2240362, Average precision = 0.9728024, ROC-AUC = 0.9680575, Elapsed Time = 6.92 s
Trial 89, Fold 2: Train size = 20701 (0 = 10471, 1 = 10230, 0/1 = 1.0236); Validation size = 5137 (0 = 2654, 1 = 2483, 0/1 = 1.0689). Training stopped with a degenerate solution on iteration 41; bestTest = 0.2031638743 (iteration 30), model shrunk to the first 31 iterations. Log loss = 0.2029261, Average precision = 0.9740519, ROC-AUC = 0.9719740, Elapsed Time = 11.55 s
Trial 89, Fold 3: Train size = 20682 (0 = 10517, 1 = 10165, 0/1 = 1.0346); Validation size = 5156 (0 = 2608, 1 = 2548, 0/1 = 1.0235). Training stopped with a degenerate solution on iteration 52; bestTest = 0.1989359254 (iteration 35), model shrunk to the first 36 iterations. Log loss = 0.1987894, Average precision = 0.9754225, ROC-AUC = 0.9718194, Elapsed Time = 13.42 s
Trial 89, Fold 4: Train size = 20656 (0 = 10479, 1 = 10177, 0/1 = 1.0297); Validation size = 5182 (0 = 2646, 1 = 2536, 0/1 = 1.0434). Completed all 99 iterations; bestTest = 0.1956138421 (iteration 54), model shrunk to the first 55 iterations. Log loss = 0.1951847, Average precision = 0.9756459, ROC-AUC = 0.9715126, Elapsed Time = 20.80 s
Trial 89, Fold 5: Train size = 20650 (0 = 10500, 1 = 10150, 0/1 = 1.0345); Validation size = 5188 (0 = 2625, 1 = 2563, 0/1 = 1.0242). [Log truncated mid-training at iteration 19; final metrics for this fold are not present in this excerpt.]
0.2144503 test: 0.2335994 best: 0.2335994 (19) total: 5.96s remaining: 23.5s 20: learn: 0.2109990 test: 0.2309731 best: 0.2309731 (20) total: 6.28s remaining: 23.3s 21: learn: 0.2083068 test: 0.2286985 best: 0.2286985 (21) total: 6.6s remaining: 23.1s 22: learn: 0.2054121 test: 0.2261880 best: 0.2261880 (22) total: 6.92s remaining: 22.9s 23: learn: 0.2023394 test: 0.2237233 best: 0.2237233 (23) total: 7.19s remaining: 22.5s 24: learn: 0.2003048 test: 0.2219170 best: 0.2219170 (24) total: 7.51s remaining: 22.2s 25: learn: 0.1983555 test: 0.2206364 best: 0.2206364 (25) total: 7.83s remaining: 22s 26: learn: 0.1966708 test: 0.2192204 best: 0.2192204 (26) total: 8.1s remaining: 21.6s 27: learn: 0.1948645 test: 0.2177811 best: 0.2177811 (27) total: 8.36s remaining: 21.2s 28: learn: 0.2271900 test: 0.2167847 best: 0.2167847 (28) total: 8.63s remaining: 20.8s 29: learn: 0.2254784 test: 0.2155901 best: 0.2155901 (29) total: 8.94s remaining: 20.6s 30: learn: 0.2248371 test: 0.2151575 best: 0.2151575 (30) total: 9.1s remaining: 20s 31: learn: 0.2235672 test: 0.2143830 best: 0.2143830 (31) total: 9.37s remaining: 19.6s 32: learn: 0.2223876 test: 0.2136295 best: 0.2136295 (32) total: 9.63s remaining: 19.3s 33: learn: 0.2216124 test: 0.2130367 best: 0.2130367 (33) total: 9.82s remaining: 18.8s 34: learn: 0.2205052 test: 0.2122055 best: 0.2122055 (34) total: 10.1s remaining: 18.5s 35: learn: 0.2196702 test: 0.2118263 best: 0.2118263 (35) total: 10.3s remaining: 18s 36: learn: 0.2186433 test: 0.2112037 best: 0.2112037 (36) total: 10.6s remaining: 17.8s 37: learn: 0.2171808 test: 0.2101847 best: 0.2101847 (37) total: 10.9s remaining: 17.5s 38: learn: 0.2163911 test: 0.2096779 best: 0.2096779 (38) total: 11.1s remaining: 17.1s 39: learn: 0.2154223 test: 0.2088417 best: 0.2088417 (39) total: 11.3s remaining: 16.7s 40: learn: 0.2148438 test: 0.2087290 best: 0.2087290 (40) total: 11.5s remaining: 16.3s 41: learn: 0.2141911 test: 0.2082817 best: 0.2082817 (41) total: 11.7s remaining: 
15.9s 42: learn: 0.2136971 test: 0.2080897 best: 0.2080897 (42) total: 11.9s remaining: 15.5s 43: learn: 0.2133934 test: 0.2077668 best: 0.2077668 (43) total: 12.1s remaining: 15.2s 44: learn: 0.2126158 test: 0.2074428 best: 0.2074428 (44) total: 12.4s remaining: 14.9s 45: learn: 0.2123109 test: 0.2072287 best: 0.2072287 (45) total: 12.6s remaining: 14.5s 46: learn: 0.2111206 test: 0.2066683 best: 0.2066683 (46) total: 12.9s remaining: 14.3s 47: learn: 0.2107580 test: 0.2068083 best: 0.2066683 (46) total: 13.1s remaining: 13.9s 48: learn: 0.2103033 test: 0.2065245 best: 0.2065245 (48) total: 13.3s remaining: 13.5s 49: learn: 0.2097308 test: 0.2061447 best: 0.2061447 (49) total: 13.5s remaining: 13.2s 50: learn: 0.2092151 test: 0.2058151 best: 0.2058151 (50) total: 13.7s remaining: 12.9s 51: learn: 0.2088535 test: 0.2056955 best: 0.2056955 (51) total: 13.9s remaining: 12.5s 52: learn: 0.2081502 test: 0.2054711 best: 0.2054711 (52) total: 14.1s remaining: 12.2s 53: learn: 0.2080889 test: 0.2054539 best: 0.2054539 (53) total: 14.2s remaining: 11.8s 54: learn: 0.2079216 test: 0.2054664 best: 0.2054539 (53) total: 14.3s remaining: 11.5s 55: learn: 0.2076064 test: 0.2054130 best: 0.2054130 (55) total: 14.6s remaining: 11.2s 56: learn: 0.2072785 test: 0.2052159 best: 0.2052159 (56) total: 14.8s remaining: 10.9s 57: learn: 0.2069366 test: 0.2051050 best: 0.2051050 (57) total: 14.9s remaining: 10.5s 58: learn: 0.2066458 test: 0.2049818 best: 0.2049818 (58) total: 15.1s remaining: 10.2s 59: learn: 0.2061294 test: 0.2044216 best: 0.2044216 (59) total: 15.3s remaining: 9.96s 60: learn: 0.2057825 test: 0.2043599 best: 0.2043599 (60) total: 15.4s remaining: 9.61s 61: learn: 0.2017572 test: 0.2042124 best: 0.2042124 (61) total: 15.6s remaining: 9.32s 62: learn: 0.2015516 test: 0.2040679 best: 0.2040679 (62) total: 15.8s remaining: 9.01s 63: learn: 0.2012510 test: 0.2038899 best: 0.2038899 (63) total: 16s remaining: 8.73s 64: learn: 0.2009891 test: 0.2038627 best: 0.2038627 (64) 
total: 16.2s remaining: 8.45s 65: learn: 0.2008531 test: 0.2038426 best: 0.2038426 (65) total: 16.3s remaining: 8.14s 66: learn: 0.2006111 test: 0.2036864 best: 0.2036864 (66) total: 16.4s remaining: 7.85s 67: learn: 0.1998067 test: 0.2040426 best: 0.2036864 (66) total: 16.8s remaining: 7.64s 68: learn: 0.1994644 test: 0.2039283 best: 0.2036864 (66) total: 16.9s remaining: 7.37s 69: learn: 0.1953788 test: 0.2177117 best: 0.2036864 (66) total: 17.1s remaining: 7.08s 70: learn: 0.1952175 test: 0.2177094 best: 0.2036864 (66) total: 17.2s remaining: 6.79s 71: learn: 0.1950323 test: 0.2177878 best: 0.2036864 (66) total: 17.4s remaining: 6.53s 72: learn: 0.1943601 test: 0.2176556 best: 0.2036864 (66) total: 17.7s remaining: 6.3s 73: learn: 0.1900835 test: 0.2173638 best: 0.2036864 (66) total: 17.9s remaining: 6.03s 74: learn: 0.1898030 test: 0.2173201 best: 0.2036864 (66) total: 18s remaining: 5.77s 75: learn: 0.1895718 test: 0.2173558 best: 0.2036864 (66) total: 18.2s remaining: 5.5s 76: learn: 0.1894081 test: 0.2173681 best: 0.2036864 (66) total: 18.3s remaining: 5.22s 77: learn: 0.1890684 test: 0.2172154 best: 0.2036864 (66) total: 18.5s remaining: 4.98s 78: learn: 0.1859871 test: 0.2357092 best: 0.2036864 (66) total: 18.8s remaining: 4.75s 79: learn: 0.1854757 test: 0.2354702 best: 0.2036864 (66) total: 19s remaining: 4.51s 80: learn: 0.1853132 test: 0.2355093 best: 0.2036864 (66) total: 19.2s remaining: 4.26s 81: learn: 0.1851533 test: 0.2354972 best: 0.2036864 (66) total: 19.3s remaining: 4.01s 82: learn: 0.1850697 test: 0.2355247 best: 0.2036864 (66) total: 19.5s remaining: 3.75s 83: learn: 0.1850087 test: 0.2355099 best: 0.2036864 (66) total: 19.6s remaining: 3.49s 84: learn: 0.1848790 test: 0.2354098 best: 0.2036864 (66) total: 19.7s remaining: 3.24s 85: learn: 0.1847650 test: 0.2354111 best: 0.2036864 (66) total: 19.8s remaining: 2.99s 86: learn: 0.1847050 test: 0.2354334 best: 0.2036864 (66) total: 19.9s remaining: 2.74s 87: learn: 0.1846330 test: 0.2354132 
best: 0.2036864 (66) total: 20s remaining: 2.5s 88: learn: 0.1842444 test: 0.2352619 best: 0.2036864 (66) total: 20.2s remaining: 2.27s 89: learn: 0.1817923 test: 0.2352696 best: 0.2036864 (66) total: 20.4s remaining: 2.04s 90: learn: 0.1815726 test: 0.2352203 best: 0.2036864 (66) total: 20.6s remaining: 1.81s 91: learn: 0.1814065 test: 0.2352587 best: 0.2036864 (66) total: 20.8s remaining: 1.58s 92: learn: 0.1813361 test: 0.2352845 best: 0.2036864 (66) total: 20.9s remaining: 1.34s 93: learn: 0.1811682 test: 0.2353072 best: 0.2036864 (66) total: 21s remaining: 1.12s 94: learn: 0.1811175 test: 0.2353071 best: 0.2036864 (66) total: 21.1s remaining: 889ms 95: learn: 0.1804059 test: 0.2351949 best: 0.2036864 (66) total: 21.3s remaining: 667ms 96: learn: 0.1803331 test: 0.2352119 best: 0.2036864 (66) total: 21.4s remaining: 442ms 97: learn: 0.1801860 test: 0.2351448 best: 0.2036864 (66) total: 21.6s remaining: 220ms 98: learn: 0.1801199 test: 0.2351566 best: 0.2036864 (66) total: 21.7s remaining: 0us bestTest = 0.2036863672 bestIteration = 66 Shrink model to first 67 iterations.
Optimization Progress: 90%|######### | 90/100 [2:48:14<17:31, 105.14s/it]
Trial 89, Fold 5: Log loss = 0.20299273275865473, Average precision = 0.9739603927737867, ROC-AUC = 0.9704795347713803, Elapsed Time = 21.790719999997236 seconds Trial 90, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371 Trial 90, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913 0: learn: 0.6247489 test: 0.6244435 best: 0.6244435 (0) total: 17.3ms remaining: 1.54s 1: learn: 0.5811202 test: 0.5813624 best: 0.5813624 (1) total: 35.9ms remaining: 1.58s 2: learn: 0.5408880 test: 0.5404131 best: 0.5404131 (2) total: 56ms remaining: 1.62s 3: learn: 0.5149893 test: 0.5148705 best: 0.5148705 (3) total: 74.7ms remaining: 1.61s 4: learn: 0.4664361 test: 0.4662511 best: 0.4662511 (4) total: 94.6ms remaining: 1.61s 5: learn: 0.4316747 test: 0.4313231 best: 0.4313231 (5) total: 113ms remaining: 1.58s 6: learn: 0.4064880 test: 0.4062864 best: 0.4062864 (6) total: 133ms remaining: 1.58s 7: learn: 0.3832688 test: 0.3836813 best: 0.3836813 (7) total: 153ms remaining: 1.57s 8: learn: 0.3654575 test: 0.3655824 best: 0.3655824 (8) total: 173ms remaining: 1.55s 9: learn: 0.3478145 test: 0.3479852 best: 0.3479852 (9) total: 193ms remaining: 1.54s 10: learn: 0.3408212 test: 0.3412308 best: 0.3412308 (10) total: 212ms remaining: 1.52s 11: learn: 0.3355507 test: 0.3368377 best: 0.3368377 (11) total: 232ms remaining: 1.51s 12: learn: 0.3314862 test: 0.3332820 best: 0.3332820 (12) total: 252ms remaining: 1.49s 13: learn: 0.3205896 test: 0.3222857 best: 0.3222857 (13) total: 273ms remaining: 1.48s 14: learn: 0.3142242 test: 0.3161270 best: 0.3161270 (14) total: 293ms remaining: 1.46s 15: learn: 0.3050912 test: 0.3070651 best: 0.3070651 (15) total: 312ms remaining: 1.44s 16: learn: 0.2996017 test: 0.3014288 best: 0.3014288 (16) total: 331ms remaining: 1.42s 17: learn: 0.2960549 test: 0.2978850 best: 0.2978850 (17) total: 350ms remaining: 1.4s 18: learn: 0.2881522 test: 0.2895031 best: 0.2895031 (18) total: 368ms remaining: 1.38s 
19: learn: 0.2867001 test: 0.2881867 best: 0.2881867 (19) total: 387ms remaining: 1.35s 20: learn: 0.2853900 test: 0.2870354 best: 0.2870354 (20) total: 406ms remaining: 1.33s 21: learn: 0.2832058 test: 0.2851181 best: 0.2851181 (21) total: 425ms remaining: 1.31s 22: learn: 0.2787846 test: 0.2805845 best: 0.2805845 (22) total: 446ms remaining: 1.3s 23: learn: 0.2754354 test: 0.2772135 best: 0.2772135 (23) total: 466ms remaining: 1.28s 24: learn: 0.2726101 test: 0.2745752 best: 0.2745752 (24) total: 486ms remaining: 1.26s 25: learn: 0.2711674 test: 0.2733237 best: 0.2733237 (25) total: 505ms remaining: 1.24s 26: learn: 0.2689617 test: 0.2710637 best: 0.2710637 (26) total: 524ms remaining: 1.22s 27: learn: 0.2671073 test: 0.2694499 best: 0.2694499 (27) total: 543ms remaining: 1.2s 28: learn: 0.2647088 test: 0.2669256 best: 0.2669256 (28) total: 562ms remaining: 1.18s 29: learn: 0.2618829 test: 0.2639574 best: 0.2639574 (29) total: 581ms remaining: 1.16s 30: learn: 0.2607306 test: 0.2628519 best: 0.2628519 (30) total: 600ms remaining: 1.14s 31: learn: 0.2595382 test: 0.2617706 best: 0.2617706 (31) total: 620ms remaining: 1.12s 32: learn: 0.2586208 test: 0.2609498 best: 0.2609498 (32) total: 639ms remaining: 1.1s 33: learn: 0.2579574 test: 0.2605012 best: 0.2605012 (33) total: 660ms remaining: 1.09s 34: learn: 0.2569184 test: 0.2596047 best: 0.2596047 (34) total: 681ms remaining: 1.07s 35: learn: 0.2555498 test: 0.2583189 best: 0.2583189 (35) total: 701ms remaining: 1.05s 36: learn: 0.2539694 test: 0.2565942 best: 0.2565942 (36) total: 722ms remaining: 1.03s 37: learn: 0.2524447 test: 0.2551432 best: 0.2551432 (37) total: 744ms remaining: 1.02s 38: learn: 0.2494896 test: 0.2524008 best: 0.2524008 (38) total: 766ms remaining: 1s 39: learn: 0.2489214 test: 0.2520950 best: 0.2520950 (39) total: 790ms remaining: 987ms 40: learn: 0.2483334 test: 0.2516763 best: 0.2516763 (40) total: 812ms remaining: 970ms 41: learn: 0.2470021 test: 0.2504068 best: 0.2504068 (41) total: 
834ms remaining: 953ms 42: learn: 0.2466658 test: 0.2501085 best: 0.2501085 (42) total: 855ms remaining: 935ms 43: learn: 0.2457805 test: 0.2491303 best: 0.2491303 (43) total: 878ms remaining: 918ms 44: learn: 0.2450618 test: 0.2484016 best: 0.2484016 (44) total: 904ms remaining: 904ms 45: learn: 0.2440075 test: 0.2476951 best: 0.2476951 (45) total: 930ms remaining: 890ms 46: learn: 0.2434328 test: 0.2472124 best: 0.2472124 (46) total: 955ms remaining: 873ms 47: learn: 0.2431573 test: 0.2469434 best: 0.2469434 (47) total: 976ms remaining: 854ms 48: learn: 0.2425347 test: 0.2462656 best: 0.2462656 (48) total: 998ms remaining: 835ms 49: learn: 0.2423511 test: 0.2461828 best: 0.2461828 (49) total: 1.02s remaining: 815ms 50: learn: 0.2418264 test: 0.2456491 best: 0.2456491 (50) total: 1.04s remaining: 797ms 51: learn: 0.2416074 test: 0.2454691 best: 0.2454691 (51) total: 1.06s remaining: 778ms 52: learn: 0.2413359 test: 0.2453675 best: 0.2453675 (52) total: 1.09s remaining: 761ms 53: learn: 0.2396347 test: 0.2437581 best: 0.2437581 (53) total: 1.11s remaining: 742ms 54: learn: 0.2393511 test: 0.2434489 best: 0.2434489 (54) total: 1.14s remaining: 723ms 55: learn: 0.2380881 test: 0.2421213 best: 0.2421213 (55) total: 1.16s remaining: 703ms 56: learn: 0.2379314 test: 0.2419461 best: 0.2419461 (56) total: 1.18s remaining: 683ms 57: learn: 0.2373840 test: 0.2415437 best: 0.2415437 (57) total: 1.2s remaining: 664ms 58: learn: 0.2369789 test: 0.2412001 best: 0.2412001 (58) total: 1.23s remaining: 646ms 59: learn: 0.2365146 test: 0.2405791 best: 0.2405791 (59) total: 1.25s remaining: 627ms 60: learn: 0.2360233 test: 0.2400899 best: 0.2400899 (60) total: 1.28s remaining: 607ms 61: learn: 0.2357678 test: 0.2399205 best: 0.2399205 (61) total: 1.3s remaining: 587ms 62: learn: 0.2356122 test: 0.2398102 best: 0.2398102 (62) total: 1.32s remaining: 567ms 63: learn: 0.2351587 test: 0.2394009 best: 0.2394009 (63) total: 1.35s remaining: 547ms 64: learn: 0.2348189 test: 0.2391908 best: 
0.2391908 (64) total: 1.37s remaining: 527ms 65: learn: 0.2333629 test: 0.2375536 best: 0.2375536 (65) total: 1.39s remaining: 506ms 66: learn: 0.2332457 test: 0.2374104 best: 0.2374104 (66) total: 1.41s remaining: 486ms 67: learn: 0.2329893 test: 0.2372711 best: 0.2372711 (67) total: 1.44s remaining: 464ms 68: learn: 0.2327236 test: 0.2371231 best: 0.2371231 (68) total: 1.46s remaining: 444ms 69: learn: 0.2318186 test: 0.2362045 best: 0.2362045 (69) total: 1.48s remaining: 423ms 70: learn: 0.2314399 test: 0.2358667 best: 0.2358667 (70) total: 1.5s remaining: 403ms 71: learn: 0.2310509 test: 0.2353323 best: 0.2353323 (71) total: 1.53s remaining: 382ms 72: learn: 0.2300947 test: 0.2344318 best: 0.2344318 (72) total: 1.55s remaining: 362ms 73: learn: 0.2299178 test: 0.2343663 best: 0.2343663 (73) total: 1.58s remaining: 341ms 74: learn: 0.2296662 test: 0.2341321 best: 0.2341321 (74) total: 1.6s remaining: 320ms 75: learn: 0.2292612 test: 0.2337993 best: 0.2337993 (75) total: 1.62s remaining: 299ms 76: learn: 0.2290994 test: 0.2336921 best: 0.2336921 (76) total: 1.64s remaining: 278ms 77: learn: 0.2287507 test: 0.2334476 best: 0.2334476 (77) total: 1.67s remaining: 256ms 78: learn: 0.2286147 test: 0.2332946 best: 0.2332946 (78) total: 1.69s remaining: 235ms 79: learn: 0.2285569 test: 0.2332905 best: 0.2332905 (79) total: 1.71s remaining: 214ms 80: learn: 0.2282373 test: 0.2330223 best: 0.2330223 (80) total: 1.74s remaining: 193ms 81: learn: 0.2278406 test: 0.2327103 best: 0.2327103 (81) total: 1.76s remaining: 171ms 82: learn: 0.2277421 test: 0.2325916 best: 0.2325916 (82) total: 1.78s remaining: 150ms 83: learn: 0.2274594 test: 0.2323823 best: 0.2323823 (83) total: 1.8s remaining: 129ms 84: learn: 0.2272687 test: 0.2322481 best: 0.2322481 (84) total: 1.83s remaining: 108ms 85: learn: 0.2271280 test: 0.2321224 best: 0.2321224 (85) total: 1.85s remaining: 86.2ms 86: learn: 0.2270428 test: 0.2320497 best: 0.2320497 (86) total: 1.88s remaining: 64.7ms 87: learn: 
0.2263605 test: 0.2312574 best: 0.2312574 (87) total: 1.9s remaining: 43.1ms 88: learn: 0.2262897 test: 0.2311661 best: 0.2311661 (88) total: 1.92s remaining: 21.6ms 89: learn: 0.2262096 test: 0.2310983 best: 0.2310983 (89) total: 1.94s remaining: 0us bestTest = 0.2310983104 bestIteration = 89 Trial 90, Fold 1: Log loss = 0.23051339982515368, Average precision = 0.9680075702390486, ROC-AUC = 0.9627265226576427, Elapsed Time = 2.0497087000039755 seconds Trial 90, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396 Trial 90, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986 0: learn: 0.6384403 test: 0.6390188 best: 0.6390188 (0) total: 18.8ms remaining: 1.67s 1: learn: 0.5967559 test: 0.5973580 best: 0.5973580 (1) total: 40.7ms remaining: 1.79s 2: learn: 0.5564713 test: 0.5574446 best: 0.5574446 (2) total: 62.4ms remaining: 1.81s 3: learn: 0.5328588 test: 0.5337732 best: 0.5337732 (3) total: 82.9ms remaining: 1.78s 4: learn: 0.5071891 test: 0.5082826 best: 0.5082826 (4) total: 103ms remaining: 1.75s 5: learn: 0.4741770 test: 0.4753042 best: 0.4753042 (5) total: 123ms remaining: 1.73s 6: learn: 0.4577768 test: 0.4588545 best: 0.4588545 (6) total: 144ms remaining: 1.71s 7: learn: 0.4342737 test: 0.4356310 best: 0.4356310 (7) total: 165ms remaining: 1.69s 8: learn: 0.4256402 test: 0.4269031 best: 0.4269031 (8) total: 185ms remaining: 1.67s 9: learn: 0.4116009 test: 0.4129618 best: 0.4129618 (9) total: 206ms remaining: 1.65s 10: learn: 0.3847291 test: 0.3861395 best: 0.3861395 (10) total: 228ms remaining: 1.64s 11: learn: 0.3644623 test: 0.3655553 best: 0.3655553 (11) total: 250ms remaining: 1.63s 12: learn: 0.3461488 test: 0.3473324 best: 0.3473324 (12) total: 272ms remaining: 1.61s 13: learn: 0.3415077 test: 0.3425290 best: 0.3425290 (13) total: 293ms remaining: 1.59s 14: learn: 0.3335305 test: 0.3344141 best: 0.3344141 (14) total: 314ms remaining: 1.57s 15: learn: 0.3242212 test: 0.3254233 best: 0.3254233 
(15) total: 335ms remaining: 1.55s 16: learn: 0.3194534 test: 0.3206363 best: 0.3206363 (16) total: 357ms remaining: 1.53s 17: learn: 0.3115678 test: 0.3128613 best: 0.3128613 (17) total: 378ms remaining: 1.51s 18: learn: 0.3039220 test: 0.3050015 best: 0.3050015 (18) total: 400ms remaining: 1.5s 19: learn: 0.2988138 test: 0.2995235 best: 0.2995235 (19) total: 422ms remaining: 1.48s 20: learn: 0.2938887 test: 0.2946711 best: 0.2946711 (20) total: 443ms remaining: 1.46s 21: learn: 0.2926636 test: 0.2934852 best: 0.2934852 (21) total: 466ms remaining: 1.44s 22: learn: 0.2894611 test: 0.2903045 best: 0.2903045 (22) total: 489ms remaining: 1.42s 23: learn: 0.2834065 test: 0.2841649 best: 0.2841649 (23) total: 511ms remaining: 1.41s 24: learn: 0.2810966 test: 0.2818730 best: 0.2818730 (24) total: 533ms remaining: 1.39s 25: learn: 0.2791007 test: 0.2797093 best: 0.2797093 (25) total: 555ms remaining: 1.37s 26: learn: 0.2744749 test: 0.2752566 best: 0.2752566 (26) total: 577ms remaining: 1.35s 27: learn: 0.2709264 test: 0.2716432 best: 0.2716432 (27) total: 599ms remaining: 1.33s 28: learn: 0.2671783 test: 0.2679699 best: 0.2679699 (28) total: 621ms remaining: 1.31s 29: learn: 0.2645036 test: 0.2652210 best: 0.2652210 (29) total: 643ms remaining: 1.28s 30: learn: 0.2616178 test: 0.2623499 best: 0.2623499 (30) total: 665ms remaining: 1.27s 31: learn: 0.2595330 test: 0.2602573 best: 0.2602573 (31) total: 691ms remaining: 1.25s 32: learn: 0.2587088 test: 0.2594629 best: 0.2594629 (32) total: 715ms remaining: 1.23s 33: learn: 0.2578410 test: 0.2586643 best: 0.2586643 (33) total: 737ms remaining: 1.21s 34: learn: 0.2565331 test: 0.2571120 best: 0.2571120 (34) total: 759ms remaining: 1.19s 35: learn: 0.2550745 test: 0.2556642 best: 0.2556642 (35) total: 781ms remaining: 1.17s 36: learn: 0.2546429 test: 0.2551764 best: 0.2551764 (36) total: 803ms remaining: 1.15s 37: learn: 0.2538097 test: 0.2543456 best: 0.2543456 (37) total: 826ms remaining: 1.13s 38: learn: 0.2525451 test: 
0.2529840 best: 0.2529840 (38) total: 848ms remaining: 1.11s 39: learn: 0.2509293 test: 0.2512898 best: 0.2512898 (39) total: 872ms remaining: 1.09s 40: learn: 0.2489713 test: 0.2495112 best: 0.2495112 (40) total: 895ms remaining: 1.07s 41: learn: 0.2484733 test: 0.2490600 best: 0.2490600 (41) total: 920ms remaining: 1.05s 42: learn: 0.2476631 test: 0.2483069 best: 0.2483069 (42) total: 944ms remaining: 1.03s 43: learn: 0.2463604 test: 0.2471128 best: 0.2471128 (43) total: 965ms remaining: 1.01s 44: learn: 0.2452917 test: 0.2460038 best: 0.2460038 (44) total: 987ms remaining: 987ms 45: learn: 0.2450813 test: 0.2458213 best: 0.2458213 (45) total: 1.01s remaining: 964ms 46: learn: 0.2445222 test: 0.2451144 best: 0.2451144 (46) total: 1.03s remaining: 942ms 47: learn: 0.2421903 test: 0.2426318 best: 0.2426318 (47) total: 1.05s remaining: 922ms 48: learn: 0.2410389 test: 0.2415418 best: 0.2415418 (48) total: 1.08s remaining: 900ms 49: learn: 0.2407677 test: 0.2413369 best: 0.2413369 (49) total: 1.1s remaining: 879ms 50: learn: 0.2397365 test: 0.2404829 best: 0.2404829 (50) total: 1.12s remaining: 858ms 51: learn: 0.2393026 test: 0.2399810 best: 0.2399810 (51) total: 1.14s remaining: 837ms 52: learn: 0.2384567 test: 0.2390515 best: 0.2390515 (52) total: 1.17s remaining: 815ms 53: learn: 0.2365358 test: 0.2368907 best: 0.2368907 (53) total: 1.19s remaining: 793ms 54: learn: 0.2362342 test: 0.2365794 best: 0.2365794 (54) total: 1.21s remaining: 771ms 55: learn: 0.2359786 test: 0.2363706 best: 0.2363706 (55) total: 1.24s remaining: 750ms 56: learn: 0.2355979 test: 0.2360010 best: 0.2360010 (56) total: 1.26s remaining: 730ms 57: learn: 0.2352487 test: 0.2355907 best: 0.2355907 (57) total: 1.28s remaining: 707ms 58: learn: 0.2350366 test: 0.2353763 best: 0.2353763 (58) total: 1.3s remaining: 685ms 59: learn: 0.2347124 test: 0.2350551 best: 0.2350551 (59) total: 1.33s remaining: 664ms 60: learn: 0.2346125 test: 0.2349666 best: 0.2349666 (60) total: 1.35s remaining: 642ms 61: 
learn: 0.2338949 test: 0.2343283 best: 0.2343283 (61) total: 1.37s remaining: 620ms 62: learn: 0.2332308 test: 0.2336579 best: 0.2336579 (62) total: 1.4s remaining: 599ms 63: learn: 0.2321274 test: 0.2323939 best: 0.2323939 (63) total: 1.42s remaining: 578ms 64: learn: 0.2313449 test: 0.2316025 best: 0.2316025 (64) total: 1.45s remaining: 556ms 65: learn: 0.2310438 test: 0.2312820 best: 0.2312820 (65) total: 1.47s remaining: 534ms 66: learn: 0.2308008 test: 0.2310568 best: 0.2310568 (66) total: 1.49s remaining: 512ms 67: learn: 0.2307827 test: 0.2310614 best: 0.2310568 (66) total: 1.51s remaining: 490ms 68: learn: 0.2305324 test: 0.2307327 best: 0.2307327 (68) total: 1.54s remaining: 468ms 69: learn: 0.2300279 test: 0.2302138 best: 0.2302138 (69) total: 1.56s remaining: 447ms 70: learn: 0.2298539 test: 0.2300284 best: 0.2300284 (70) total: 1.59s remaining: 424ms 71: learn: 0.2297675 test: 0.2299461 best: 0.2299461 (71) total: 1.61s remaining: 402ms 72: learn: 0.2294898 test: 0.2297267 best: 0.2297267 (72) total: 1.63s remaining: 380ms 73: learn: 0.2291593 test: 0.2293836 best: 0.2293836 (73) total: 1.66s remaining: 358ms 74: learn: 0.2286183 test: 0.2289114 best: 0.2289114 (74) total: 1.68s remaining: 336ms 75: learn: 0.2278205 test: 0.2280548 best: 0.2280548 (75) total: 1.7s remaining: 314ms 76: learn: 0.2277160 test: 0.2279788 best: 0.2279788 (76) total: 1.73s remaining: 291ms 77: learn: 0.2275734 test: 0.2278536 best: 0.2278536 (77) total: 1.75s remaining: 269ms 78: learn: 0.2273419 test: 0.2276402 best: 0.2276402 (78) total: 1.77s remaining: 247ms 79: learn: 0.2271748 test: 0.2274472 best: 0.2274472 (79) total: 1.8s remaining: 225ms 80: learn: 0.2262761 test: 0.2265067 best: 0.2265067 (80) total: 1.82s remaining: 202ms 81: learn: 0.2260692 test: 0.2263400 best: 0.2263400 (81) total: 1.84s remaining: 180ms 82: learn: 0.2258774 test: 0.2261254 best: 0.2261254 (82) total: 1.87s remaining: 157ms 83: learn: 0.2257298 test: 0.2259173 best: 0.2259173 (83) total: 1.89s 
remaining: 135ms 84: learn: 0.2254176 test: 0.2256640 best: 0.2256640 (84) total: 1.91s remaining: 112ms 85: learn: 0.2253614 test: 0.2256235 best: 0.2256235 (85) total: 1.94s remaining: 90.1ms 86: learn: 0.2250795 test: 0.2252626 best: 0.2252626 (86) total: 1.96s remaining: 67.6ms 87: learn: 0.2249655 test: 0.2251748 best: 0.2251748 (87) total: 1.98s remaining: 45.1ms 88: learn: 0.2248209 test: 0.2250409 best: 0.2250409 (88) total: 2s remaining: 22.5ms 89: learn: 0.2247448 test: 0.2249847 best: 0.2249847 (89) total: 2.03s remaining: 0us bestTest = 0.2249846571 bestIteration = 89 Trial 90, Fold 2: Log loss = 0.22462144683921156, Average precision = 0.9684789005706683, ROC-AUC = 0.9639012504321018, Elapsed Time = 2.1495166000022436 seconds Trial 90, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 90, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 0: learn: 0.6395254 test: 0.6379497 best: 0.6379497 (0) total: 19.4ms remaining: 1.73s 1: learn: 0.5910708 test: 0.5885594 best: 0.5885594 (1) total: 40.1ms remaining: 1.76s 2: learn: 0.5503866 test: 0.5480361 best: 0.5480361 (2) total: 62.4ms remaining: 1.81s 3: learn: 0.5209579 test: 0.5179553 best: 0.5179553 (3) total: 83.1ms remaining: 1.79s 4: learn: 0.4827813 test: 0.4791759 best: 0.4791759 (4) total: 104ms remaining: 1.77s 5: learn: 0.4665479 test: 0.4623330 best: 0.4623330 (5) total: 124ms remaining: 1.74s 6: learn: 0.4469451 test: 0.4428578 best: 0.4428578 (6) total: 145ms remaining: 1.72s 7: learn: 0.4354546 test: 0.4308937 best: 0.4308937 (7) total: 165ms remaining: 1.7s 8: learn: 0.4054294 test: 0.4011959 best: 0.4011959 (8) total: 187ms remaining: 1.69s 9: learn: 0.3898110 test: 0.3853078 best: 0.3853078 (9) total: 208ms remaining: 1.67s 10: learn: 0.3683214 test: 0.3640117 best: 0.3640117 (10) total: 230ms remaining: 1.65s 11: learn: 0.3628035 test: 0.3580295 best: 0.3580295 (11) total: 251ms remaining: 1.63s 12: learn: 0.3526158 test: 
[CatBoost per-iteration training log omitted]
bestTest = 0.2244593587, bestIteration = 89
Trial 90, Fold 3: Log loss = 0.22415890753308929, Average precision = 0.9699822864344301, ROC-AUC = 0.9653471596873766, Elapsed Time = 2.163629700000456 seconds
Trial 90, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 90, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[CatBoost per-iteration training log omitted]
bestTest = 0.2274734577, bestIteration = 89
Trial 90, Fold 4: Log loss = 0.2270119405669334, Average precision = 0.9691139041471833, ROC-AUC = 0.9630855365279657, Elapsed Time = 2.0362184999976307 seconds
Trial 90, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 90, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[CatBoost per-iteration training log omitted]
bestTest = 0.2316751919, bestIteration = 89
Trial 90, Fold 5: Log loss = 0.23093247191285196, Average precision = 0.9647195552947815, ROC-AUC = 0.9601831187410588, Elapsed Time = 2.099057699997502 seconds
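The per-fold numbers reported above (log loss, average precision, ROC-AUC on each validation split) follow a standard cross-validated evaluation pattern. The sketch below is a minimal, self-contained illustration of that pattern only: it substitutes `LogisticRegression`, `StratifiedKFold`, and synthetic data as stand-ins for the notebook's `CatBoostClassifier`, grouped splitter, and loan-level features, and all variable names are illustrative, not taken from the notebook.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import average_precision_score, log_loss, roc_auc_score
from sklearn.model_selection import StratifiedKFold

# Synthetic, roughly balanced data standing in for the real feature matrix.
X, y = make_classification(n_samples=2000, n_features=10, random_state=0)

fold_metrics = []  # one (log_loss, average_precision, roc_auc) tuple per fold
skf = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
for fold, (tr_idx, va_idx) in enumerate(skf.split(X, y), start=1):
    # Stand-in model; the notebook fits a CatBoostClassifier at this step.
    clf = LogisticRegression(max_iter=1000).fit(X[tr_idx], y[tr_idx])
    proba = clf.predict_proba(X[va_idx])[:, 1]  # P(class = 1) on validation
    metrics = (
        log_loss(y[va_idx], proba),
        average_precision_score(y[va_idx], proba),
        roc_auc_score(y[va_idx], proba),
    )
    fold_metrics.append(metrics)
    print(f"Fold {fold}: Log loss = {metrics[0]:.4f}, "
          f"Average precision = {metrics[1]:.4f}, ROC-AUC = {metrics[2]:.4f}")
```

All three metrics are computed from predicted probabilities rather than hard labels, which is why log loss can keep improving after accuracy has plateaued.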
Optimization Progress: 91%|#########1| 91/100 [2:48:33<11:52, 79.17s/it]
Trial 91, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 91, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[CatBoost per-iteration training log omitted]
bestTest = 0.1964488876, bestIteration = 88
Trial 91, Fold 1: Log loss = 0.1961050109510587, Average precision = 0.975616059910034, ROC-AUC = 0.9717072214813859, Elapsed Time = 25.408131299998786 seconds
Trial 91, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 91, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[CatBoost per-iteration training log truncated]
learn: 0.1297075 test: 0.1799366 best: 0.1799366 (87) total: 24.6s remaining: 279ms 88: learn: 0.1292356 test: 0.1796815 best: 0.1796815 (88) total: 24.8s remaining: 0us bestTest = 0.1796815343 bestIteration = 88 Trial 91, Fold 2: Log loss = 0.1794548550293013, Average precision = 0.977897147942451, ROC-AUC = 0.9755842365614438, Elapsed Time = 24.99169129999791 seconds Trial 91, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 91, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 0: learn: 0.5892725 test: 0.5900181 best: 0.5900181 (0) total: 322ms remaining: 28.4s 1: learn: 0.5081207 test: 0.5094452 best: 0.5094452 (1) total: 641ms remaining: 27.9s 2: learn: 0.4450656 test: 0.4468568 best: 0.4468568 (2) total: 950ms remaining: 27.2s 3: learn: 0.3954783 test: 0.3984119 best: 0.3984119 (3) total: 1.25s remaining: 26.6s 4: learn: 0.3574737 test: 0.3615553 best: 0.3615553 (4) total: 1.54s remaining: 25.8s 5: learn: 0.3266224 test: 0.3311498 best: 0.3311498 (5) total: 1.85s remaining: 25.6s 6: learn: 0.3036675 test: 0.3090210 best: 0.3090210 (6) total: 2.16s remaining: 25.3s 7: learn: 0.2842687 test: 0.2906173 best: 0.2906173 (7) total: 2.46s remaining: 24.9s 8: learn: 0.2696219 test: 0.2771066 best: 0.2771066 (8) total: 2.76s remaining: 24.5s 9: learn: 0.2555498 test: 0.2636943 best: 0.2636943 (9) total: 3.06s remaining: 24.2s 10: learn: 0.2453384 test: 0.2540197 best: 0.2540197 (10) total: 3.38s remaining: 24s 11: learn: 0.2361408 test: 0.2458989 best: 0.2458989 (11) total: 3.67s remaining: 23.5s 12: learn: 0.2285167 test: 0.2392888 best: 0.2392888 (12) total: 3.98s remaining: 23.2s 13: learn: 0.2211627 test: 0.2327965 best: 0.2327965 (13) total: 4.27s remaining: 22.9s 14: learn: 0.2150306 test: 0.2274454 best: 0.2274454 (14) total: 4.57s remaining: 22.5s 15: learn: 0.2101211 test: 0.2234373 best: 0.2234373 (15) total: 4.87s remaining: 22.2s 16: learn: 0.2049220 test: 0.2193429 best: 0.2193429 (16) 
total: 5.17s remaining: 21.9s 17: learn: 0.2006960 test: 0.2160904 best: 0.2160904 (17) total: 5.46s remaining: 21.5s 18: learn: 0.1967001 test: 0.2128223 best: 0.2128223 (18) total: 5.74s remaining: 21.1s 19: learn: 0.1933329 test: 0.2097921 best: 0.2097921 (19) total: 6.03s remaining: 20.8s 20: learn: 0.1904581 test: 0.2075607 best: 0.2075607 (20) total: 6.3s remaining: 20.4s 21: learn: 0.1875471 test: 0.2054278 best: 0.2054278 (21) total: 6.59s remaining: 20.1s 22: learn: 0.1854715 test: 0.2039763 best: 0.2039763 (22) total: 6.87s remaining: 19.7s 23: learn: 0.1832023 test: 0.2027093 best: 0.2027093 (23) total: 7.14s remaining: 19.4s 24: learn: 0.1809363 test: 0.2010713 best: 0.2010713 (24) total: 7.45s remaining: 19.1s 25: learn: 0.1789486 test: 0.1995088 best: 0.1995088 (25) total: 7.72s remaining: 18.7s 26: learn: 0.1771929 test: 0.1985900 best: 0.1985900 (26) total: 8s remaining: 18.4s 27: learn: 0.1754023 test: 0.1976176 best: 0.1976176 (27) total: 8.28s remaining: 18s 28: learn: 0.1737324 test: 0.1967183 best: 0.1967183 (28) total: 8.58s remaining: 17.7s 29: learn: 0.1720326 test: 0.1958587 best: 0.1958587 (29) total: 8.87s remaining: 17.4s 30: learn: 0.1707470 test: 0.1951656 best: 0.1951656 (30) total: 9.14s remaining: 17.1s 31: learn: 0.1692251 test: 0.1944073 best: 0.1944073 (31) total: 9.42s remaining: 16.8s 32: learn: 0.1677318 test: 0.1937116 best: 0.1937116 (32) total: 9.71s remaining: 16.5s 33: learn: 0.1665654 test: 0.1930474 best: 0.1930474 (33) total: 9.98s remaining: 16.1s 34: learn: 0.1651826 test: 0.1920503 best: 0.1920503 (34) total: 10.3s remaining: 15.8s 35: learn: 0.1640567 test: 0.1916393 best: 0.1916393 (35) total: 10.5s remaining: 15.5s 36: learn: 0.1628817 test: 0.1913461 best: 0.1913461 (36) total: 10.8s remaining: 15.2s 37: learn: 0.1621618 test: 0.1912324 best: 0.1912324 (37) total: 11.1s remaining: 14.9s 38: learn: 0.1611529 test: 0.1907469 best: 0.1907469 (38) total: 11.4s remaining: 14.6s 39: learn: 0.1600217 test: 0.1903121 
best: 0.1903121 (39) total: 11.7s remaining: 14.3s 40: learn: 0.1590662 test: 0.1901574 best: 0.1901574 (40) total: 11.9s remaining: 14s 41: learn: 0.1578983 test: 0.1896451 best: 0.1896451 (41) total: 12.2s remaining: 13.6s 42: learn: 0.1568539 test: 0.1892732 best: 0.1892732 (42) total: 12.5s remaining: 13.3s 43: learn: 0.1557139 test: 0.1891207 best: 0.1891207 (43) total: 12.7s remaining: 13s 44: learn: 0.1547206 test: 0.1891842 best: 0.1891207 (43) total: 13s remaining: 12.7s 45: learn: 0.1537615 test: 0.1888899 best: 0.1888899 (45) total: 13.3s remaining: 12.4s 46: learn: 0.1530150 test: 0.1887675 best: 0.1887675 (46) total: 13.6s remaining: 12.1s 47: learn: 0.1520842 test: 0.1886008 best: 0.1886008 (47) total: 13.8s remaining: 11.8s 48: learn: 0.1511893 test: 0.1885780 best: 0.1885780 (48) total: 14.1s remaining: 11.5s 49: learn: 0.1502823 test: 0.1881428 best: 0.1881428 (49) total: 14.4s remaining: 11.2s 50: learn: 0.1493236 test: 0.1878151 best: 0.1878151 (50) total: 14.7s remaining: 10.9s 51: learn: 0.1483760 test: 0.1876093 best: 0.1876093 (51) total: 14.9s remaining: 10.6s 52: learn: 0.1477744 test: 0.1874974 best: 0.1874974 (52) total: 15.2s remaining: 10.3s 53: learn: 0.1469112 test: 0.1874418 best: 0.1874418 (53) total: 15.5s remaining: 10s 54: learn: 0.1464061 test: 0.1872032 best: 0.1872032 (54) total: 15.7s remaining: 9.72s 55: learn: 0.1457332 test: 0.1871740 best: 0.1871740 (55) total: 16s remaining: 9.42s 56: learn: 0.1449405 test: 0.1870159 best: 0.1870159 (56) total: 16.3s remaining: 9.13s 57: learn: 0.1443887 test: 0.1867632 best: 0.1867632 (57) total: 16.5s remaining: 8.84s 58: learn: 0.1438465 test: 0.1866612 best: 0.1866612 (58) total: 16.8s remaining: 8.54s 59: learn: 0.1433579 test: 0.1866703 best: 0.1866612 (58) total: 17.1s remaining: 8.25s 60: learn: 0.1428393 test: 0.1865768 best: 0.1865768 (60) total: 17.3s remaining: 7.95s 61: learn: 0.1422294 test: 0.1863678 best: 0.1863678 (61) total: 17.6s remaining: 7.67s 62: learn: 0.1415152 
test: 0.1864910 best: 0.1863678 (61) total: 18s remaining: 7.41s 63: learn: 0.1407709 test: 0.1864072 best: 0.1863678 (61) total: 18.2s remaining: 7.13s 64: learn: 0.1401912 test: 0.1863987 best: 0.1863678 (61) total: 18.5s remaining: 6.83s 65: learn: 0.1395526 test: 0.1861317 best: 0.1861317 (65) total: 18.8s remaining: 6.55s 66: learn: 0.1388986 test: 0.1861545 best: 0.1861317 (65) total: 19s remaining: 6.25s 67: learn: 0.1384564 test: 0.1860146 best: 0.1860146 (67) total: 19.3s remaining: 5.97s 68: learn: 0.1377688 test: 0.1859528 best: 0.1859528 (68) total: 19.6s remaining: 5.68s 69: learn: 0.1371046 test: 0.1859822 best: 0.1859528 (68) total: 19.9s remaining: 5.39s 70: learn: 0.1365663 test: 0.1858137 best: 0.1858137 (70) total: 20.1s remaining: 5.1s 71: learn: 0.1361412 test: 0.1857865 best: 0.1857865 (71) total: 20.4s remaining: 4.81s 72: learn: 0.1356928 test: 0.1858828 best: 0.1857865 (71) total: 20.6s remaining: 4.52s 73: learn: 0.1351089 test: 0.1859232 best: 0.1857865 (71) total: 20.9s remaining: 4.24s 74: learn: 0.1345031 test: 0.1860410 best: 0.1857865 (71) total: 21.2s remaining: 3.95s 75: learn: 0.1340391 test: 0.1858454 best: 0.1857865 (71) total: 21.4s remaining: 3.67s 76: learn: 0.1333212 test: 0.1859953 best: 0.1857865 (71) total: 21.7s remaining: 3.38s 77: learn: 0.1327735 test: 0.1861797 best: 0.1857865 (71) total: 22s remaining: 3.1s 78: learn: 0.1322846 test: 0.1862750 best: 0.1857865 (71) total: 22.2s remaining: 2.81s 79: learn: 0.1317275 test: 0.1863582 best: 0.1857865 (71) total: 22.5s remaining: 2.53s 80: learn: 0.1313023 test: 0.1861968 best: 0.1857865 (71) total: 22.7s remaining: 2.25s 81: learn: 0.1309181 test: 0.1860843 best: 0.1857865 (71) total: 23s remaining: 1.96s 82: learn: 0.1301590 test: 0.1862413 best: 0.1857865 (71) total: 23.3s remaining: 1.68s 83: learn: 0.1297592 test: 0.1861068 best: 0.1857865 (71) total: 23.5s remaining: 1.4s 84: learn: 0.1293246 test: 0.1860841 best: 0.1857865 (71) total: 23.8s remaining: 1.12s 85: 
learn: 0.1289072 test: 0.1860847 best: 0.1857865 (71) total: 24.1s remaining: 839ms 86: learn: 0.1284405 test: 0.1858796 best: 0.1857865 (71) total: 24.3s remaining: 559ms 87: learn: 0.1279580 test: 0.1859814 best: 0.1857865 (71) total: 24.6s remaining: 279ms 88: learn: 0.1275617 test: 0.1861427 best: 0.1857865 (71) total: 24.8s remaining: 0us bestTest = 0.1857864827 bestIteration = 71 Shrink model to first 72 iterations. Trial 91, Fold 3: Log loss = 0.18561336490114735, Average precision = 0.9770555590377952, ROC-AUC = 0.9741588043310765, Elapsed Time = 24.98324109999521 seconds Trial 91, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 91, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 0: learn: 0.5890804 test: 0.5904647 best: 0.5904647 (0) total: 300ms remaining: 26.4s 1: learn: 0.5080049 test: 0.5104645 best: 0.5104645 (1) total: 616ms remaining: 26.8s 2: learn: 0.4445021 test: 0.4477090 best: 0.4477090 (2) total: 917ms remaining: 26.3s 3: learn: 0.3952452 test: 0.3998256 best: 0.3998256 (3) total: 1.23s remaining: 26.2s 4: learn: 0.3562410 test: 0.3620580 best: 0.3620580 (4) total: 1.54s remaining: 25.8s 5: learn: 0.3263296 test: 0.3333527 best: 0.3333527 (5) total: 1.85s remaining: 25.7s 6: learn: 0.3028677 test: 0.3109525 best: 0.3109525 (6) total: 2.15s remaining: 25.2s 7: learn: 0.2831838 test: 0.2924570 best: 0.2924570 (7) total: 2.44s remaining: 24.7s 8: learn: 0.2677758 test: 0.2782908 best: 0.2782908 (8) total: 2.73s remaining: 24.2s 9: learn: 0.2537956 test: 0.2648872 best: 0.2648872 (9) total: 3.03s remaining: 23.9s 10: learn: 0.2430753 test: 0.2551850 best: 0.2551850 (10) total: 3.34s remaining: 23.7s 11: learn: 0.2344034 test: 0.2471275 best: 0.2471275 (11) total: 3.67s remaining: 23.5s 12: learn: 0.2269764 test: 0.2404596 best: 0.2404596 (12) total: 3.98s remaining: 23.3s 13: learn: 0.2195358 test: 0.2341135 best: 0.2341135 (13) total: 4.26s remaining: 22.8s 14: learn: 
0.2137461 test: 0.2291709 best: 0.2291709 (14) total: 4.55s remaining: 22.5s 15: learn: 0.2087719 test: 0.2246915 best: 0.2246915 (15) total: 4.86s remaining: 22.2s 16: learn: 0.2046142 test: 0.2214529 best: 0.2214529 (16) total: 5.15s remaining: 21.8s 17: learn: 0.2005254 test: 0.2182253 best: 0.2182253 (17) total: 5.43s remaining: 21.4s 18: learn: 0.1965484 test: 0.2148287 best: 0.2148287 (18) total: 5.71s remaining: 21.1s 19: learn: 0.1932305 test: 0.2122129 best: 0.2122129 (19) total: 6s remaining: 20.7s 20: learn: 0.1899507 test: 0.2100887 best: 0.2100887 (20) total: 6.28s remaining: 20.3s 21: learn: 0.1874380 test: 0.2083914 best: 0.2083914 (21) total: 6.58s remaining: 20s 22: learn: 0.1847284 test: 0.2066360 best: 0.2066360 (22) total: 6.85s remaining: 19.7s 23: learn: 0.1823486 test: 0.2053260 best: 0.2053260 (23) total: 7.13s remaining: 19.3s 24: learn: 0.1799912 test: 0.2036675 best: 0.2036675 (24) total: 7.42s remaining: 19s 25: learn: 0.1780083 test: 0.2023046 best: 0.2023046 (25) total: 7.68s remaining: 18.6s 26: learn: 0.1761493 test: 0.2013358 best: 0.2013358 (26) total: 7.95s remaining: 18.3s 27: learn: 0.1744365 test: 0.2007353 best: 0.2007353 (27) total: 8.23s remaining: 17.9s 28: learn: 0.1726973 test: 0.1998250 best: 0.1998250 (28) total: 8.49s remaining: 17.6s 29: learn: 0.1711582 test: 0.1993953 best: 0.1993953 (29) total: 8.77s remaining: 17.2s 30: learn: 0.1696617 test: 0.1985334 best: 0.1985334 (30) total: 9.05s remaining: 16.9s 31: learn: 0.1682865 test: 0.1981621 best: 0.1981621 (31) total: 9.31s remaining: 16.6s 32: learn: 0.1669410 test: 0.1978673 best: 0.1978673 (32) total: 9.6s remaining: 16.3s 33: learn: 0.1657639 test: 0.1973245 best: 0.1973245 (33) total: 9.88s remaining: 16s 34: learn: 0.1644976 test: 0.1967841 best: 0.1967841 (34) total: 10.1s remaining: 15.6s 35: learn: 0.1632871 test: 0.1964517 best: 0.1964517 (35) total: 10.4s remaining: 15.3s 36: learn: 0.1621146 test: 0.1959986 best: 0.1959986 (36) total: 10.7s remaining: 
15s 37: learn: 0.1610961 test: 0.1954609 best: 0.1954609 (37) total: 10.9s remaining: 14.7s 38: learn: 0.1601493 test: 0.1950473 best: 0.1950473 (38) total: 11.2s remaining: 14.4s 39: learn: 0.1589736 test: 0.1947909 best: 0.1947909 (39) total: 11.5s remaining: 14s 40: learn: 0.1580586 test: 0.1943732 best: 0.1943732 (40) total: 11.7s remaining: 13.8s 41: learn: 0.1572672 test: 0.1942218 best: 0.1942218 (41) total: 12s remaining: 13.4s 42: learn: 0.1564791 test: 0.1940734 best: 0.1940734 (42) total: 12.3s remaining: 13.1s 43: learn: 0.1557685 test: 0.1939111 best: 0.1939111 (43) total: 12.6s remaining: 12.8s 44: learn: 0.1548651 test: 0.1935294 best: 0.1935294 (44) total: 12.8s remaining: 12.5s 45: learn: 0.1539105 test: 0.1934179 best: 0.1934179 (45) total: 13.1s remaining: 12.3s 46: learn: 0.1531158 test: 0.1932113 best: 0.1932113 (46) total: 13.4s remaining: 12s 47: learn: 0.1523259 test: 0.1931344 best: 0.1931344 (47) total: 13.7s remaining: 11.7s 48: learn: 0.1517669 test: 0.1928717 best: 0.1928717 (48) total: 14s remaining: 11.4s 49: learn: 0.1510464 test: 0.1925527 best: 0.1925527 (49) total: 14.3s remaining: 11.1s 50: learn: 0.1502026 test: 0.1924497 best: 0.1924497 (50) total: 14.6s remaining: 10.9s 51: learn: 0.1495295 test: 0.1922965 best: 0.1922965 (51) total: 14.9s remaining: 10.6s 52: learn: 0.1486913 test: 0.1921701 best: 0.1921701 (52) total: 15.2s remaining: 10.3s 53: learn: 0.1479163 test: 0.1921263 best: 0.1921263 (53) total: 15.5s remaining: 10.1s 54: learn: 0.1474498 test: 0.1920508 best: 0.1920508 (54) total: 16s remaining: 9.87s 55: learn: 0.1466924 test: 0.1917201 best: 0.1917201 (55) total: 16.3s remaining: 9.59s 56: learn: 0.1459805 test: 0.1918035 best: 0.1917201 (55) total: 16.6s remaining: 9.3s 57: learn: 0.1451822 test: 0.1912561 best: 0.1912561 (57) total: 16.9s remaining: 9.02s 58: learn: 0.1444374 test: 0.1910894 best: 0.1910894 (58) total: 17.2s remaining: 8.72s 59: learn: 0.1438160 test: 0.1908761 best: 0.1908761 (59) total: 17.4s 
remaining: 8.43s 60: learn: 0.1431343 test: 0.1909409 best: 0.1908761 (59) total: 17.7s remaining: 8.14s 61: learn: 0.1426616 test: 0.1909737 best: 0.1908761 (59) total: 18s remaining: 7.86s 62: learn: 0.1420993 test: 0.1907591 best: 0.1907591 (62) total: 18.3s remaining: 7.57s 63: learn: 0.1414820 test: 0.1905561 best: 0.1905561 (63) total: 18.6s remaining: 7.28s 64: learn: 0.1406949 test: 0.1903283 best: 0.1903283 (64) total: 18.9s remaining: 6.98s 65: learn: 0.1401209 test: 0.1903554 best: 0.1903283 (64) total: 19.2s remaining: 6.69s 66: learn: 0.1397244 test: 0.1902049 best: 0.1902049 (66) total: 19.5s remaining: 6.39s 67: learn: 0.1390260 test: 0.1906630 best: 0.1902049 (66) total: 19.8s remaining: 6.1s 68: learn: 0.1383954 test: 0.1907329 best: 0.1902049 (66) total: 20s remaining: 5.81s 69: learn: 0.1378571 test: 0.1907836 best: 0.1902049 (66) total: 20.3s remaining: 5.51s 70: learn: 0.1372155 test: 0.1909054 best: 0.1902049 (66) total: 20.6s remaining: 5.23s 71: learn: 0.1365955 test: 0.1910854 best: 0.1902049 (66) total: 20.9s remaining: 4.93s 72: learn: 0.1361190 test: 0.1909881 best: 0.1902049 (66) total: 21.1s remaining: 4.63s 73: learn: 0.1354210 test: 0.1910532 best: 0.1902049 (66) total: 21.4s remaining: 4.34s 74: learn: 0.1347987 test: 0.1909982 best: 0.1902049 (66) total: 21.7s remaining: 4.06s 75: learn: 0.1342772 test: 0.1907988 best: 0.1902049 (66) total: 22s remaining: 3.76s 76: learn: 0.1338722 test: 0.1907475 best: 0.1902049 (66) total: 22.3s remaining: 3.47s 77: learn: 0.1332539 test: 0.1908867 best: 0.1902049 (66) total: 22.6s remaining: 3.18s 78: learn: 0.1327115 test: 0.1906436 best: 0.1902049 (66) total: 22.8s remaining: 2.89s 79: learn: 0.1321652 test: 0.1907227 best: 0.1902049 (66) total: 23.1s remaining: 2.6s 80: learn: 0.1316142 test: 0.1905422 best: 0.1902049 (66) total: 23.4s remaining: 2.31s 81: learn: 0.1310750 test: 0.1905031 best: 0.1902049 (66) total: 23.6s remaining: 2.02s 82: learn: 0.1304846 test: 0.1904162 best: 0.1902049 
(66) total: 23.9s remaining: 1.73s 83: learn: 0.1298653 test: 0.1904046 best: 0.1902049 (66) total: 24.1s remaining: 1.44s 84: learn: 0.1295551 test: 0.1902365 best: 0.1902049 (66) total: 24.4s remaining: 1.15s 85: learn: 0.1291643 test: 0.1901823 best: 0.1901823 (85) total: 24.7s remaining: 862ms 86: learn: 0.1286904 test: 0.1902194 best: 0.1901823 (85) total: 25s remaining: 574ms 87: learn: 0.1283029 test: 0.1903262 best: 0.1901823 (85) total: 25.3s remaining: 287ms 88: learn: 0.1277280 test: 0.1903904 best: 0.1901823 (85) total: 25.5s remaining: 0us bestTest = 0.1901823167 bestIteration = 85 Shrink model to first 86 iterations. Trial 91, Fold 4: Log loss = 0.18991550218277667, Average precision = 0.976480947163987, ROC-AUC = 0.9723979979303323, Elapsed Time = 25.658579900002223 seconds Trial 91, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 91, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0: learn: 0.5882392 test: 0.5906055 best: 0.5906055 (0) total: 301ms remaining: 26.5s 1: learn: 0.5071399 test: 0.5115621 best: 0.5115621 (1) total: 592ms remaining: 25.8s 2: learn: 0.4434272 test: 0.4497720 best: 0.4497720 (2) total: 905ms remaining: 25.9s 3: learn: 0.3937875 test: 0.4019268 best: 0.4019268 (3) total: 1.19s remaining: 25.4s 4: learn: 0.3567716 test: 0.3665522 best: 0.3665522 (4) total: 1.49s remaining: 25.1s 5: learn: 0.3260631 test: 0.3375056 best: 0.3375056 (5) total: 1.79s remaining: 24.8s 6: learn: 0.3016418 test: 0.3145295 best: 0.3145295 (6) total: 2.1s remaining: 24.6s 7: learn: 0.2813397 test: 0.2957537 best: 0.2957537 (7) total: 2.41s remaining: 24.4s 8: learn: 0.2666188 test: 0.2823311 best: 0.2823311 (8) total: 2.68s remaining: 23.8s 9: learn: 0.2546098 test: 0.2714840 best: 0.2714840 (9) total: 2.96s remaining: 23.4s 10: learn: 0.2427158 test: 0.2609265 best: 0.2609265 (10) total: 3.26s remaining: 23.1s 11: learn: 0.2337769 test: 0.2528990 best: 0.2528990 (11) total: 3.55s 
remaining: 22.8s 12: learn: 0.2249279 test: 0.2451397 best: 0.2451397 (12) total: 3.87s remaining: 22.6s 13: learn: 0.2188126 test: 0.2403381 best: 0.2403381 (13) total: 4.18s remaining: 22.4s 14: learn: 0.2122852 test: 0.2351253 best: 0.2351253 (14) total: 4.49s remaining: 22.1s 15: learn: 0.2075565 test: 0.2316795 best: 0.2316795 (15) total: 4.79s remaining: 21.9s 16: learn: 0.2027837 test: 0.2279230 best: 0.2279230 (16) total: 5.07s remaining: 21.5s 17: learn: 0.1984606 test: 0.2243873 best: 0.2243873 (17) total: 5.36s remaining: 21.1s 18: learn: 0.1944613 test: 0.2214394 best: 0.2214394 (18) total: 5.67s remaining: 20.9s 19: learn: 0.1911317 test: 0.2193585 best: 0.2193585 (19) total: 5.95s remaining: 20.5s 20: learn: 0.1883681 test: 0.2176779 best: 0.2176779 (20) total: 6.24s remaining: 20.2s 21: learn: 0.1855720 test: 0.2159012 best: 0.2159012 (21) total: 6.51s remaining: 19.8s 22: learn: 0.1829645 test: 0.2143648 best: 0.2143648 (22) total: 6.8s remaining: 19.5s 23: learn: 0.1805752 test: 0.2127136 best: 0.2127136 (23) total: 7.08s remaining: 19.2s 24: learn: 0.1785025 test: 0.2112508 best: 0.2112508 (24) total: 7.37s remaining: 18.9s 25: learn: 0.1763612 test: 0.2098343 best: 0.2098343 (25) total: 7.67s remaining: 18.6s 26: learn: 0.1744023 test: 0.2085009 best: 0.2085009 (26) total: 7.94s remaining: 18.2s 27: learn: 0.1727089 test: 0.2075053 best: 0.2075053 (27) total: 8.23s remaining: 17.9s 28: learn: 0.1712633 test: 0.2068621 best: 0.2068621 (28) total: 8.49s remaining: 17.6s 29: learn: 0.1696952 test: 0.2059073 best: 0.2059073 (29) total: 8.78s remaining: 17.3s 30: learn: 0.1682477 test: 0.2054248 best: 0.2054248 (30) total: 9.05s remaining: 16.9s 31: learn: 0.1668708 test: 0.2048648 best: 0.2048648 (31) total: 9.33s remaining: 16.6s 32: learn: 0.1654648 test: 0.2043469 best: 0.2043469 (32) total: 9.6s remaining: 16.3s 33: learn: 0.1639699 test: 0.2039024 best: 0.2039024 (33) total: 9.9s remaining: 16s 34: learn: 0.1627684 test: 0.2033338 best: 
0.2033338 (34) total: 10.2s remaining: 15.7s 35: learn: 0.1616500 test: 0.2025552 best: 0.2025552 (35) total: 10.4s remaining: 15.4s 36: learn: 0.1606237 test: 0.2024425 best: 0.2024425 (36) total: 10.7s remaining: 15.1s 37: learn: 0.1596083 test: 0.2022954 best: 0.2022954 (37) total: 11s remaining: 14.7s 38: learn: 0.1587627 test: 0.2017944 best: 0.2017944 (38) total: 11.2s remaining: 14.4s 39: learn: 0.1577505 test: 0.2014961 best: 0.2014961 (39) total: 11.5s remaining: 14.1s 40: learn: 0.1566945 test: 0.2009453 best: 0.2009453 (40) total: 11.8s remaining: 13.8s 41: learn: 0.1556690 test: 0.2006901 best: 0.2006901 (41) total: 12.1s remaining: 13.5s 42: learn: 0.1545310 test: 0.2004491 best: 0.2004491 (42) total: 12.3s remaining: 13.2s 43: learn: 0.1536084 test: 0.2001934 best: 0.2001934 (43) total: 12.6s remaining: 12.9s 44: learn: 0.1527506 test: 0.1998964 best: 0.1998964 (44) total: 12.9s remaining: 12.6s 45: learn: 0.1519940 test: 0.1996821 best: 0.1996821 (45) total: 13.2s remaining: 12.3s 46: learn: 0.1510465 test: 0.1989156 best: 0.1989156 (46) total: 13.5s remaining: 12s 47: learn: 0.1500795 test: 0.1986783 best: 0.1986783 (47) total: 13.7s remaining: 11.7s 48: learn: 0.1492843 test: 0.1983294 best: 0.1983294 (48) total: 14s remaining: 11.4s 49: learn: 0.1485364 test: 0.1980455 best: 0.1980455 (49) total: 14.3s remaining: 11.1s 50: learn: 0.1476811 test: 0.1976657 best: 0.1976657 (50) total: 14.5s remaining: 10.8s 51: learn: 0.1471880 test: 0.1975790 best: 0.1975790 (51) total: 14.8s remaining: 10.5s 52: learn: 0.1465610 test: 0.1973386 best: 0.1973386 (52) total: 15.1s remaining: 10.2s 53: learn: 0.1458922 test: 0.1971105 best: 0.1971105 (53) total: 15.3s remaining: 9.94s 54: learn: 0.1452289 test: 0.1970721 best: 0.1970721 (54) total: 15.6s remaining: 9.64s 55: learn: 0.1444512 test: 0.1970485 best: 0.1970485 (55) total: 15.9s remaining: 9.35s 56: learn: 0.1437931 test: 0.1967688 best: 0.1967688 (56) total: 16.1s remaining: 9.04s 57: learn: 0.1429885 
test: 0.1964325 best: 0.1964325 (57) total: 16.4s remaining: 8.76s 58: learn: 0.1424096 test: 0.1962148 best: 0.1962148 (58) total: 16.7s remaining: 8.47s 59: learn: 0.1416954 test: 0.1960722 best: 0.1960722 (59) total: 16.9s remaining: 8.19s 60: learn: 0.1411646 test: 0.1959872 best: 0.1959872 (60) total: 17.2s remaining: 7.9s 61: learn: 0.1405831 test: 0.1960536 best: 0.1959872 (60) total: 17.5s remaining: 7.61s 62: learn: 0.1398885 test: 0.1959579 best: 0.1959579 (62) total: 17.7s remaining: 7.32s 63: learn: 0.1393294 test: 0.1960085 best: 0.1959579 (62) total: 18s remaining: 7.03s 64: learn: 0.1387667 test: 0.1961448 best: 0.1959579 (62) total: 18.3s remaining: 6.76s 65: learn: 0.1379643 test: 0.1958633 best: 0.1958633 (65) total: 18.5s remaining: 6.46s 66: learn: 0.1375356 test: 0.1958765 best: 0.1958633 (65) total: 18.8s remaining: 6.18s 67: learn: 0.1370724 test: 0.1957045 best: 0.1957045 (67) total: 19.1s remaining: 5.9s 68: learn: 0.1365243 test: 0.1955841 best: 0.1955841 (68) total: 19.3s remaining: 5.61s 69: learn: 0.1359501 test: 0.1953991 best: 0.1953991 (69) total: 19.6s remaining: 5.32s 70: learn: 0.1352494 test: 0.1954919 best: 0.1953991 (69) total: 19.9s remaining: 5.04s 71: learn: 0.1347234 test: 0.1954319 best: 0.1953991 (69) total: 20.2s remaining: 4.76s 72: learn: 0.1343224 test: 0.1955032 best: 0.1953991 (69) total: 20.4s remaining: 4.47s 73: learn: 0.1337187 test: 0.1954186 best: 0.1953991 (69) total: 20.7s remaining: 4.19s 74: learn: 0.1333560 test: 0.1951261 best: 0.1951261 (74) total: 20.9s remaining: 3.91s 75: learn: 0.1328789 test: 0.1952856 best: 0.1951261 (74) total: 21.2s remaining: 3.63s 76: learn: 0.1325055 test: 0.1949439 best: 0.1949439 (76) total: 21.5s remaining: 3.35s 77: learn: 0.1318983 test: 0.1948100 best: 0.1948100 (77) total: 21.7s remaining: 3.06s 78: learn: 0.1314374 test: 0.1948068 best: 0.1948068 (78) total: 22s remaining: 2.78s 79: learn: 0.1311102 test: 0.1947361 best: 0.1947361 (79) total: 22.3s remaining: 2.5s 80: 
learn: 0.1308752 test: 0.1946666 best: 0.1946666 (80) total: 22.5s remaining: 2.22s 81: learn: 0.1304217 test: 0.1946569 best: 0.1946569 (81) total: 22.8s remaining: 1.94s 82: learn: 0.1296909 test: 0.1948092 best: 0.1946569 (81) total: 23s remaining: 1.67s 83: learn: 0.1291560 test: 0.1946930 best: 0.1946569 (81) total: 23.3s remaining: 1.39s 84: learn: 0.1285601 test: 0.1947175 best: 0.1946569 (81) total: 23.6s remaining: 1.11s 85: learn: 0.1280163 test: 0.1945230 best: 0.1945230 (85) total: 23.9s remaining: 832ms 86: learn: 0.1274504 test: 0.1945086 best: 0.1945086 (86) total: 24.1s remaining: 554ms 87: learn: 0.1270267 test: 0.1943604 best: 0.1943604 (87) total: 24.4s remaining: 277ms 88: learn: 0.1266924 test: 0.1943095 best: 0.1943095 (88) total: 24.7s remaining: 0us bestTest = 0.1943094949 bestIteration = 88 Trial 91, Fold 5: Log loss = 0.19393170180505884, Average precision = 0.9757090510226494, ROC-AUC = 0.972612154655073, Elapsed Time = 24.828430500005197 seconds
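The per-fold summary lines above (log loss, average precision, ROC-AUC, elapsed time, and class-balance ratios) can be reproduced with the sklearn metrics already imported in this notebook. A minimal, self-contained sketch — it substitutes a synthetic dataset and `LogisticRegression` for the actual `clean_df`/CatBoost pipeline, and a plain `StratifiedKFold` for the grouped split used in this notebook:

```python
import time
from collections import Counter

from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score
from sklearn.model_selection import StratifiedKFold

# Synthetic stand-in for the loan-level data (assumption for illustration)
X, y = make_classification(n_samples=2000, n_features=10, random_state=42)

for fold, (tr, va) in enumerate(
    StratifiedKFold(n_splits=5, shuffle=True, random_state=0).split(X, y), start=1
):
    t0 = time.perf_counter()
    cnt = Counter(y[tr])
    print(f"Fold {fold}: Train size = {len(tr)} where "
          f"0 = {cnt[0]}, 1 = {cnt[1]}, 0/1 = {cnt[0] / cnt[1]}")

    # LogisticRegression stands in for the CatBoost model tuned here
    model = LogisticRegression(max_iter=1000).fit(X[tr], y[tr])
    proba = model.predict_proba(X[va])

    print(f"Fold {fold}: Log loss = {log_loss(y[va], proba)}, "
          f"Average precision = {average_precision_score(y[va], proba[:, 1])}, "
          f"ROC-AUC = {roc_auc_score(y[va], proba[:, 1])}, "
          f"Elapsed Time = {time.perf_counter() - t0} seconds")
```

The real runs additionally pass validation data to CatBoost's `eval_set`, which is what produces the `bestTest`/`bestIteration` lines and the "Shrink model to first N iterations" messages when the best iteration is not the last one.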
Optimization Progress: 92%|#########2| 92/100 [2:50:46<12:43, 95.48s/it]
Trial 92, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 92, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[… per-iteration CatBoost training log truncated …]
bestTest = 0.1961268191, bestIteration = 55
Trial 92, Fold 1: Log loss = 0.19556131489577427, Average precision = 0.9764224469448858, ROC-AUC = 0.9721163244480769, Elapsed Time = 6.934846899996046 seconds
Trial 92, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 92, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[… per-iteration CatBoost training log truncated …]
total: 6.19s remaining: 476ms 52: learn: 0.1765163 test: 0.1899463 best: 0.1899463 (52) total: 6.3s remaining: 357ms 53: learn: 0.1758807 test: 0.1896040 best: 0.1896040 (53) total: 6.42s remaining: 238ms 54: learn: 0.1753024 test: 0.1894535 best: 0.1894535 (54) total: 6.54s remaining: 119ms 55: learn: 0.1748359 test: 0.1890643 best: 0.1890643 (55) total: 6.66s remaining: 0us bestTest = 0.1890642783 bestIteration = 55 Trial 92, Fold 2: Log loss = 0.18868938554398013, Average precision = 0.9770155439451153, ROC-AUC = 0.9744132747748746, Elapsed Time = 6.808789899994736 seconds Trial 92, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 92, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 0: learn: 0.5959971 test: 0.5957627 best: 0.5957627 (0) total: 94.6ms remaining: 5.21s 1: learn: 0.5188507 test: 0.5181431 best: 0.5181431 (1) total: 200ms remaining: 5.4s 2: learn: 0.4586978 test: 0.4579242 best: 0.4579242 (2) total: 308ms remaining: 5.43s 3: learn: 0.4115131 test: 0.4105292 best: 0.4105292 (3) total: 535ms remaining: 6.96s 4: learn: 0.3744221 test: 0.3734777 best: 0.3734777 (4) total: 649ms remaining: 6.62s 5: learn: 0.3456696 test: 0.3446873 best: 0.3446873 (5) total: 770ms remaining: 6.42s 6: learn: 0.3217520 test: 0.3208486 best: 0.3208486 (6) total: 885ms remaining: 6.19s 7: learn: 0.3036111 test: 0.3025436 best: 0.3025436 (7) total: 998ms remaining: 5.99s 8: learn: 0.2878198 test: 0.2870160 best: 0.2870160 (8) total: 1.12s remaining: 5.83s 9: learn: 0.2751863 test: 0.2744372 best: 0.2744372 (9) total: 1.24s remaining: 5.69s 10: learn: 0.2644255 test: 0.2638303 best: 0.2638303 (10) total: 1.36s remaining: 5.55s 11: learn: 0.2552322 test: 0.2548508 best: 0.2548508 (11) total: 1.48s remaining: 5.44s 12: learn: 0.2476837 test: 0.2474662 best: 0.2474662 (12) total: 1.6s remaining: 5.3s 13: learn: 0.2409194 test: 0.2408900 best: 0.2408900 (13) total: 1.73s remaining: 5.18s 14: learn: 0.2353047 
test: 0.2356064 best: 0.2356064 (14) total: 1.84s remaining: 5.04s 15: learn: 0.2302989 test: 0.2312242 best: 0.2312242 (15) total: 1.96s remaining: 4.91s 16: learn: 0.2261437 test: 0.2275657 best: 0.2275657 (16) total: 2.09s remaining: 4.79s 17: learn: 0.2224506 test: 0.2239489 best: 0.2239489 (17) total: 2.21s remaining: 4.67s 18: learn: 0.2186301 test: 0.2206609 best: 0.2206609 (18) total: 2.33s remaining: 4.54s 19: learn: 0.2154826 test: 0.2177294 best: 0.2177294 (19) total: 2.46s remaining: 4.42s 20: learn: 0.2130860 test: 0.2155005 best: 0.2155005 (20) total: 2.57s remaining: 4.29s 21: learn: 0.2109918 test: 0.2136048 best: 0.2136048 (21) total: 2.69s remaining: 4.16s 22: learn: 0.2087821 test: 0.2120522 best: 0.2120522 (22) total: 2.81s remaining: 4.03s 23: learn: 0.2065966 test: 0.2102412 best: 0.2102412 (23) total: 2.94s remaining: 3.91s 24: learn: 0.2049629 test: 0.2087661 best: 0.2087661 (24) total: 3.05s remaining: 3.78s 25: learn: 0.2027333 test: 0.2067113 best: 0.2067113 (25) total: 3.17s remaining: 3.66s 26: learn: 0.2010355 test: 0.2054036 best: 0.2054036 (26) total: 3.3s remaining: 3.54s 27: learn: 0.1994764 test: 0.2043337 best: 0.2043337 (27) total: 3.42s remaining: 3.42s 28: learn: 0.1982334 test: 0.2037061 best: 0.2037061 (28) total: 3.54s remaining: 3.29s 29: learn: 0.1968731 test: 0.2024686 best: 0.2024686 (29) total: 3.65s remaining: 3.17s 30: learn: 0.1957894 test: 0.2017530 best: 0.2017530 (30) total: 3.77s remaining: 3.04s 31: learn: 0.1945551 test: 0.2008328 best: 0.2008328 (31) total: 3.9s remaining: 2.92s 32: learn: 0.1934866 test: 0.2000206 best: 0.2000206 (32) total: 4.02s remaining: 2.8s 33: learn: 0.1923348 test: 0.1992259 best: 0.1992259 (33) total: 4.14s remaining: 2.68s 34: learn: 0.1912942 test: 0.1987199 best: 0.1987199 (34) total: 4.27s remaining: 2.56s 35: learn: 0.1899574 test: 0.1978547 best: 0.1978547 (35) total: 4.4s remaining: 2.44s 36: learn: 0.1889733 test: 0.1973340 best: 0.1973340 (36) total: 4.52s remaining: 2.32s 
37: learn: 0.1879773 test: 0.1966172 best: 0.1966172 (37) total: 4.64s remaining: 2.2s 38: learn: 0.1870490 test: 0.1961271 best: 0.1961271 (38) total: 4.76s remaining: 2.08s 39: learn: 0.1861487 test: 0.1954748 best: 0.1954748 (39) total: 4.88s remaining: 1.95s 40: learn: 0.1853113 test: 0.1949433 best: 0.1949433 (40) total: 5s remaining: 1.83s 41: learn: 0.1844939 test: 0.1944387 best: 0.1944387 (41) total: 5.13s remaining: 1.71s 42: learn: 0.1835517 test: 0.1940477 best: 0.1940477 (42) total: 5.25s remaining: 1.59s 43: learn: 0.1826734 test: 0.1936828 best: 0.1936828 (43) total: 5.37s remaining: 1.46s 44: learn: 0.1817734 test: 0.1931483 best: 0.1931483 (44) total: 5.49s remaining: 1.34s 45: learn: 0.1810120 test: 0.1929643 best: 0.1929643 (45) total: 5.62s remaining: 1.22s 46: learn: 0.1803708 test: 0.1925569 best: 0.1925569 (46) total: 5.74s remaining: 1.1s 47: learn: 0.1797342 test: 0.1923955 best: 0.1923955 (47) total: 5.85s remaining: 975ms 48: learn: 0.1791832 test: 0.1923709 best: 0.1923709 (48) total: 5.97s remaining: 853ms 49: learn: 0.1786737 test: 0.1918943 best: 0.1918943 (49) total: 6.08s remaining: 729ms 50: learn: 0.1781390 test: 0.1915384 best: 0.1915384 (50) total: 6.19s remaining: 607ms 51: learn: 0.1775685 test: 0.1914385 best: 0.1914385 (51) total: 6.31s remaining: 485ms 52: learn: 0.1769999 test: 0.1911639 best: 0.1911639 (52) total: 6.43s remaining: 364ms 53: learn: 0.1764002 test: 0.1910262 best: 0.1910262 (53) total: 6.55s remaining: 243ms 54: learn: 0.1758641 test: 0.1907972 best: 0.1907972 (54) total: 6.66s remaining: 121ms 55: learn: 0.1753564 test: 0.1904493 best: 0.1904493 (55) total: 6.79s remaining: 0us bestTest = 0.1904492758 bestIteration = 55 Trial 92, Fold 3: Log loss = 0.19015322927809597, Average precision = 0.9758680395724453, ROC-AUC = 0.9732675573768913, Elapsed Time = 6.929672999998729 seconds Trial 92, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 92, Fold 4: Validation size = 5182 
where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 0: learn: 0.5956446 test: 0.5956901 best: 0.5956901 (0) total: 93.3ms remaining: 5.13s 1: learn: 0.5195863 test: 0.5196381 best: 0.5196381 (1) total: 201ms remaining: 5.42s 2: learn: 0.4595773 test: 0.4594265 best: 0.4594265 (2) total: 313ms remaining: 5.53s 3: learn: 0.4117250 test: 0.4113480 best: 0.4113480 (3) total: 423ms remaining: 5.5s 4: learn: 0.3749985 test: 0.3747277 best: 0.3747277 (4) total: 527ms remaining: 5.37s 5: learn: 0.3453747 test: 0.3454005 best: 0.3454005 (5) total: 637ms remaining: 5.3s 6: learn: 0.3221247 test: 0.3224230 best: 0.3224230 (6) total: 746ms remaining: 5.22s 7: learn: 0.3027358 test: 0.3035132 best: 0.3035132 (7) total: 858ms remaining: 5.14s 8: learn: 0.2864028 test: 0.2878888 best: 0.2878888 (8) total: 969ms remaining: 5.06s 9: learn: 0.2735630 test: 0.2755206 best: 0.2755206 (9) total: 1.08s remaining: 4.97s 10: learn: 0.2633408 test: 0.2660345 best: 0.2660345 (10) total: 1.19s remaining: 4.87s 11: learn: 0.2545190 test: 0.2577438 best: 0.2577438 (11) total: 1.3s remaining: 4.78s 12: learn: 0.2473435 test: 0.2508899 best: 0.2508899 (12) total: 1.42s remaining: 4.69s 13: learn: 0.2408504 test: 0.2444296 best: 0.2444296 (13) total: 1.54s remaining: 4.61s 14: learn: 0.2345420 test: 0.2384422 best: 0.2384422 (14) total: 1.66s remaining: 4.54s 15: learn: 0.2291183 test: 0.2332551 best: 0.2332551 (15) total: 1.78s remaining: 4.45s 16: learn: 0.2247985 test: 0.2289149 best: 0.2289149 (16) total: 1.9s remaining: 4.36s 17: learn: 0.2210522 test: 0.2254225 best: 0.2254225 (17) total: 2.02s remaining: 4.27s 18: learn: 0.2179030 test: 0.2229609 best: 0.2229609 (18) total: 2.15s remaining: 4.18s 19: learn: 0.2146550 test: 0.2199819 best: 0.2199819 (19) total: 2.27s remaining: 4.08s 20: learn: 0.2118856 test: 0.2174866 best: 0.2174866 (20) total: 2.39s remaining: 3.98s 21: learn: 0.2097462 test: 0.2154761 best: 0.2154761 (21) total: 2.5s remaining: 3.87s 22: learn: 0.2072506 test: 0.2133869 
best: 0.2133869 (22) total: 2.62s remaining: 3.76s 23: learn: 0.2049740 test: 0.2114403 best: 0.2114403 (23) total: 2.75s remaining: 3.67s 24: learn: 0.2031315 test: 0.2102292 best: 0.2102292 (24) total: 2.88s remaining: 3.57s 25: learn: 0.2013180 test: 0.2087806 best: 0.2087806 (25) total: 2.99s remaining: 3.46s 26: learn: 0.2003046 test: 0.2080839 best: 0.2080839 (26) total: 3.12s remaining: 3.35s 27: learn: 0.1987396 test: 0.2068999 best: 0.2068999 (27) total: 3.23s remaining: 3.23s 28: learn: 0.1972964 test: 0.2058658 best: 0.2058658 (28) total: 3.36s remaining: 3.12s 29: learn: 0.1959978 test: 0.2050468 best: 0.2050468 (29) total: 3.48s remaining: 3.01s 30: learn: 0.1945879 test: 0.2039392 best: 0.2039392 (30) total: 3.6s remaining: 2.9s 31: learn: 0.1935792 test: 0.2032956 best: 0.2032956 (31) total: 3.71s remaining: 2.78s 32: learn: 0.1923377 test: 0.2028016 best: 0.2028016 (32) total: 3.84s remaining: 2.67s 33: learn: 0.1910040 test: 0.2017495 best: 0.2017495 (33) total: 3.96s remaining: 2.56s 34: learn: 0.1900653 test: 0.2012336 best: 0.2012336 (34) total: 4.08s remaining: 2.45s 35: learn: 0.1887961 test: 0.2002595 best: 0.2002595 (35) total: 4.19s remaining: 2.33s 36: learn: 0.1878957 test: 0.1998513 best: 0.1998513 (36) total: 4.31s remaining: 2.21s 37: learn: 0.1867523 test: 0.1988883 best: 0.1988883 (37) total: 4.43s remaining: 2.1s 38: learn: 0.1859599 test: 0.1982259 best: 0.1982259 (38) total: 4.54s remaining: 1.98s 39: learn: 0.1852276 test: 0.1979733 best: 0.1979733 (39) total: 4.66s remaining: 1.86s 40: learn: 0.1844296 test: 0.1974120 best: 0.1974120 (40) total: 4.78s remaining: 1.75s 41: learn: 0.1835486 test: 0.1972346 best: 0.1972346 (41) total: 4.9s remaining: 1.63s 42: learn: 0.1827428 test: 0.1968408 best: 0.1968408 (42) total: 5.02s remaining: 1.52s 43: learn: 0.1820271 test: 0.1967897 best: 0.1967897 (43) total: 5.14s remaining: 1.4s 44: learn: 0.1811485 test: 0.1963410 best: 0.1963410 (44) total: 5.26s remaining: 1.29s 45: learn: 
0.1805351 test: 0.1958132 best: 0.1958132 (45) total: 5.38s remaining: 1.17s 46: learn: 0.1796954 test: 0.1958026 best: 0.1958026 (46) total: 5.5s remaining: 1.05s 47: learn: 0.1789833 test: 0.1953949 best: 0.1953949 (47) total: 5.61s remaining: 936ms 48: learn: 0.1783632 test: 0.1953015 best: 0.1953015 (48) total: 5.73s remaining: 819ms 49: learn: 0.1777565 test: 0.1945383 best: 0.1945383 (49) total: 5.85s remaining: 702ms 50: learn: 0.1770784 test: 0.1942896 best: 0.1942896 (50) total: 5.96s remaining: 585ms 51: learn: 0.1763217 test: 0.1941437 best: 0.1941437 (51) total: 6.08s remaining: 468ms 52: learn: 0.1758629 test: 0.1939091 best: 0.1939091 (52) total: 6.2s remaining: 351ms 53: learn: 0.1751233 test: 0.1938569 best: 0.1938569 (53) total: 6.32s remaining: 234ms 54: learn: 0.1746234 test: 0.1935569 best: 0.1935569 (54) total: 6.44s remaining: 117ms 55: learn: 0.1739218 test: 0.1933340 best: 0.1933340 (55) total: 6.56s remaining: 0us bestTest = 0.1933339555 bestIteration = 55 Trial 92, Fold 4: Log loss = 0.1928947959492357, Average precision = 0.9764457522976637, ROC-AUC = 0.9724084297230986, Elapsed Time = 6.705496800001129 seconds Trial 92, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 92, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0: learn: 0.5952907 test: 0.5970351 best: 0.5970351 (0) total: 94ms remaining: 5.17s 1: learn: 0.5178889 test: 0.5208483 best: 0.5208483 (1) total: 205ms remaining: 5.54s 2: learn: 0.4567117 test: 0.4612822 best: 0.4612822 (2) total: 326ms remaining: 5.75s 3: learn: 0.4090908 test: 0.4149374 best: 0.4149374 (3) total: 431ms remaining: 5.61s 4: learn: 0.3717366 test: 0.3785830 best: 0.3785830 (4) total: 538ms remaining: 5.49s 5: learn: 0.3414930 test: 0.3495358 best: 0.3495358 (5) total: 646ms remaining: 5.38s 6: learn: 0.3182762 test: 0.3271378 best: 0.3271378 (6) total: 754ms remaining: 5.28s 7: learn: 0.2990159 test: 0.3087194 best: 0.3087194 (7) 
total: 866ms remaining: 5.2s 8: learn: 0.2838257 test: 0.2940066 best: 0.2940066 (8) total: 972ms remaining: 5.07s 9: learn: 0.2707619 test: 0.2812879 best: 0.2812879 (9) total: 1.08s remaining: 4.99s 10: learn: 0.2605813 test: 0.2715422 best: 0.2715422 (10) total: 1.19s remaining: 4.89s 11: learn: 0.2515551 test: 0.2633475 best: 0.2633475 (11) total: 1.32s remaining: 4.83s 12: learn: 0.2439925 test: 0.2562126 best: 0.2562126 (12) total: 1.44s remaining: 4.75s 13: learn: 0.2372647 test: 0.2504459 best: 0.2504459 (13) total: 1.56s remaining: 4.67s 14: learn: 0.2317843 test: 0.2454581 best: 0.2454581 (14) total: 1.68s remaining: 4.59s 15: learn: 0.2267310 test: 0.2406487 best: 0.2406487 (15) total: 1.8s remaining: 4.49s 16: learn: 0.2224668 test: 0.2370467 best: 0.2370467 (16) total: 1.92s remaining: 4.41s 17: learn: 0.2186460 test: 0.2335476 best: 0.2335476 (17) total: 2.04s remaining: 4.31s 18: learn: 0.2152016 test: 0.2304742 best: 0.2304742 (18) total: 2.16s remaining: 4.21s 19: learn: 0.2120369 test: 0.2276782 best: 0.2276782 (19) total: 2.29s remaining: 4.12s 20: learn: 0.2092037 test: 0.2251953 best: 0.2251953 (20) total: 2.41s remaining: 4.02s 21: learn: 0.2066907 test: 0.2231433 best: 0.2231433 (21) total: 2.53s remaining: 3.91s 22: learn: 0.2043572 test: 0.2213729 best: 0.2213729 (22) total: 2.65s remaining: 3.81s 23: learn: 0.2024106 test: 0.2194675 best: 0.2194675 (23) total: 2.77s remaining: 3.7s 24: learn: 0.2005077 test: 0.2178793 best: 0.2178793 (24) total: 2.89s remaining: 3.59s 25: learn: 0.1988325 test: 0.2165551 best: 0.2165551 (25) total: 3.02s remaining: 3.48s 26: learn: 0.1973624 test: 0.2156198 best: 0.2156198 (26) total: 3.14s remaining: 3.38s 27: learn: 0.1956764 test: 0.2142496 best: 0.2142496 (27) total: 3.26s remaining: 3.26s 28: learn: 0.1941601 test: 0.2133838 best: 0.2133838 (28) total: 3.39s remaining: 3.15s 29: learn: 0.1930919 test: 0.2125383 best: 0.2125383 (29) total: 3.5s remaining: 3.03s 30: learn: 0.1918151 test: 0.2116022 
best: 0.2116022 (30) total: 3.62s remaining: 2.92s 31: learn: 0.1907617 test: 0.2109533 best: 0.2109533 (31) total: 3.74s remaining: 2.81s 32: learn: 0.1897248 test: 0.2102319 best: 0.2102319 (32) total: 3.87s remaining: 2.7s 33: learn: 0.1887862 test: 0.2098113 best: 0.2098113 (33) total: 4s remaining: 2.59s 34: learn: 0.1876552 test: 0.2092438 best: 0.2092438 (34) total: 4.12s remaining: 2.47s 35: learn: 0.1867598 test: 0.2087065 best: 0.2087065 (35) total: 4.24s remaining: 2.36s 36: learn: 0.1860511 test: 0.2081436 best: 0.2081436 (36) total: 4.36s remaining: 2.24s 37: learn: 0.1850674 test: 0.2075549 best: 0.2075549 (37) total: 4.48s remaining: 2.12s 38: learn: 0.1842037 test: 0.2070240 best: 0.2070240 (38) total: 4.61s remaining: 2.01s 39: learn: 0.1831735 test: 0.2061105 best: 0.2061105 (39) total: 4.72s remaining: 1.89s 40: learn: 0.1825109 test: 0.2055835 best: 0.2055835 (40) total: 4.84s remaining: 1.77s 41: learn: 0.1816801 test: 0.2052421 best: 0.2052421 (41) total: 4.96s remaining: 1.65s 42: learn: 0.1809991 test: 0.2047606 best: 0.2047606 (42) total: 5.07s remaining: 1.53s 43: learn: 0.1804081 test: 0.2045793 best: 0.2045793 (43) total: 5.18s remaining: 1.41s 44: learn: 0.1795730 test: 0.2039303 best: 0.2039303 (44) total: 5.3s remaining: 1.3s 45: learn: 0.1788158 test: 0.2037196 best: 0.2037196 (45) total: 5.43s remaining: 1.18s 46: learn: 0.1780676 test: 0.2035498 best: 0.2035498 (46) total: 5.55s remaining: 1.06s 47: learn: 0.1774550 test: 0.2032620 best: 0.2032620 (47) total: 5.66s remaining: 943ms 48: learn: 0.1769434 test: 0.2028413 best: 0.2028413 (48) total: 5.77s remaining: 824ms 49: learn: 0.1762399 test: 0.2022676 best: 0.2022676 (49) total: 5.89s remaining: 707ms 50: learn: 0.1754730 test: 0.2023151 best: 0.2022676 (49) total: 6.02s remaining: 591ms 51: learn: 0.1746735 test: 0.2021834 best: 0.2021834 (51) total: 6.14s remaining: 473ms 52: learn: 0.1740948 test: 0.2021310 best: 0.2021310 (52) total: 6.26s remaining: 355ms 53: learn: 
0.1732007 test: 0.2023191 best: 0.2021310 (52) total: 6.39s remaining: 237ms 54: learn: 0.1726018 test: 0.2019789 best: 0.2019789 (54) total: 6.51s remaining: 118ms 55: learn: 0.1721205 test: 0.2017962 best: 0.2017962 (55) total: 6.63s remaining: 0us bestTest = 0.2017961921 bestIteration = 55 Trial 92, Fold 5: Log loss = 0.20111419071654832, Average precision = 0.9742303516876331, ROC-AUC = 0.9709538301469631, Elapsed Time = 6.771970400004648 seconds
Optimization Progress: 93%|#########3| 93/100 [2:51:28<09:16, 79.48s/it]
[Per-iteration CatBoost learn/test log-loss output omitted.]

Trial 93, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 93, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
Trial 93, Fold 1: bestTest = 0.2069677297, bestIteration = 85
Trial 93, Fold 1: Log loss = 0.20636956049698438, Average precision = 0.9728329234990032, ROC-AUC = 0.9695648452846961, Elapsed Time = 6.222817199995916 seconds
Trial 93, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 93, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
Trial 93, Fold 2: bestTest = 0.2016200922, bestIteration = 85
Trial 93, Fold 2: Log loss = 0.201206877637942, Average precision = 0.9739785718672523, ROC-AUC = 0.9707472758996292, Elapsed Time = 6.19267260000197 seconds
Trial 93, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 93, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1
= 1.0235478806907379 0: learn: 0.6060574 test: 0.6052425 best: 0.6052425 (0) total: 73.8ms remaining: 6.27s 1: learn: 0.5334207 test: 0.5320508 best: 0.5320508 (1) total: 150ms remaining: 6.3s 2: learn: 0.4808201 test: 0.4790674 best: 0.4790674 (2) total: 223ms remaining: 6.17s 3: learn: 0.4330542 test: 0.4308826 best: 0.4308826 (3) total: 300ms remaining: 6.15s 4: learn: 0.4024408 test: 0.3997043 best: 0.3997043 (4) total: 372ms remaining: 6.03s 5: learn: 0.3716657 test: 0.3690705 best: 0.3690705 (5) total: 450ms remaining: 6s 6: learn: 0.3456022 test: 0.3431125 best: 0.3431125 (6) total: 536ms remaining: 6.05s 7: learn: 0.3282788 test: 0.3253367 best: 0.3253367 (7) total: 614ms remaining: 5.99s 8: learn: 0.3119142 test: 0.3090017 best: 0.3090017 (8) total: 693ms remaining: 5.93s 9: learn: 0.2998311 test: 0.2967208 best: 0.2967208 (9) total: 769ms remaining: 5.84s 10: learn: 0.2872606 test: 0.2843251 best: 0.2843251 (10) total: 847ms remaining: 5.78s 11: learn: 0.2777399 test: 0.2748441 best: 0.2748441 (11) total: 925ms remaining: 5.71s 12: learn: 0.2703548 test: 0.2676978 best: 0.2676978 (12) total: 1s remaining: 5.64s 13: learn: 0.2622670 test: 0.2599410 best: 0.2599410 (13) total: 1.08s remaining: 5.57s 14: learn: 0.2569267 test: 0.2547679 best: 0.2547679 (14) total: 1.15s remaining: 5.47s 15: learn: 0.2515697 test: 0.2494023 best: 0.2494023 (15) total: 1.23s remaining: 5.38s 16: learn: 0.2486035 test: 0.2464822 best: 0.2464822 (16) total: 1.31s remaining: 5.3s 17: learn: 0.2437333 test: 0.2414288 best: 0.2414288 (17) total: 1.38s remaining: 5.22s 18: learn: 0.2408822 test: 0.2385572 best: 0.2385572 (18) total: 1.45s remaining: 5.13s 19: learn: 0.2379067 test: 0.2359837 best: 0.2359837 (19) total: 1.54s remaining: 5.07s 20: learn: 0.2352020 test: 0.2334060 best: 0.2334060 (20) total: 1.62s remaining: 5s 21: learn: 0.2329776 test: 0.2312916 best: 0.2312916 (21) total: 1.69s remaining: 4.92s 22: learn: 0.2307702 test: 0.2289547 best: 0.2289547 (22) total: 1.77s 
remaining: 4.84s 23: learn: 0.2286440 test: 0.2268782 best: 0.2268782 (23) total: 1.83s remaining: 4.74s 24: learn: 0.2276434 test: 0.2260732 best: 0.2260732 (24) total: 1.91s remaining: 4.66s 25: learn: 0.2263219 test: 0.2249162 best: 0.2249162 (25) total: 1.99s remaining: 4.58s 26: learn: 0.2250616 test: 0.2239474 best: 0.2239474 (26) total: 2.06s remaining: 4.49s 27: learn: 0.2244050 test: 0.2233511 best: 0.2233511 (27) total: 2.13s remaining: 4.41s 28: learn: 0.2236208 test: 0.2224462 best: 0.2224462 (28) total: 2.19s remaining: 4.31s 29: learn: 0.2227408 test: 0.2216073 best: 0.2216073 (29) total: 2.27s remaining: 4.23s 30: learn: 0.2211352 test: 0.2202930 best: 0.2202930 (30) total: 2.35s remaining: 4.16s 31: learn: 0.2200759 test: 0.2193599 best: 0.2193599 (31) total: 2.41s remaining: 4.07s 32: learn: 0.2192547 test: 0.2188249 best: 0.2188249 (32) total: 2.49s remaining: 3.99s 33: learn: 0.2184955 test: 0.2180199 best: 0.2180199 (33) total: 2.56s remaining: 3.91s 34: learn: 0.2182008 test: 0.2179376 best: 0.2179376 (34) total: 2.63s remaining: 3.83s 35: learn: 0.2178314 test: 0.2176095 best: 0.2176095 (35) total: 2.71s remaining: 3.76s 36: learn: 0.2167050 test: 0.2163971 best: 0.2163971 (36) total: 2.78s remaining: 3.68s 37: learn: 0.2164147 test: 0.2162242 best: 0.2162242 (37) total: 2.85s remaining: 3.59s 38: learn: 0.2153562 test: 0.2152515 best: 0.2152515 (38) total: 2.91s remaining: 3.51s 39: learn: 0.2141124 test: 0.2141284 best: 0.2141284 (39) total: 2.99s remaining: 3.44s 40: learn: 0.2136352 test: 0.2137286 best: 0.2137286 (40) total: 3.06s remaining: 3.36s 41: learn: 0.2125435 test: 0.2126258 best: 0.2126258 (41) total: 3.13s remaining: 3.28s 42: learn: 0.2118795 test: 0.2120213 best: 0.2120213 (42) total: 3.2s remaining: 3.2s 43: learn: 0.2115349 test: 0.2118677 best: 0.2118677 (43) total: 3.27s remaining: 3.12s 44: learn: 0.2112822 test: 0.2117381 best: 0.2117381 (44) total: 3.33s remaining: 3.04s 45: learn: 0.2110824 test: 0.2115696 best: 
0.2115696 (45) total: 3.4s remaining: 2.96s 46: learn: 0.2099157 test: 0.2104093 best: 0.2104093 (46) total: 3.47s remaining: 2.88s 47: learn: 0.2095917 test: 0.2101131 best: 0.2101131 (47) total: 3.55s remaining: 2.81s 48: learn: 0.2088187 test: 0.2095592 best: 0.2095592 (48) total: 3.62s remaining: 2.73s 49: learn: 0.2087642 test: 0.2095447 best: 0.2095447 (49) total: 3.68s remaining: 2.65s 50: learn: 0.2087320 test: 0.2095504 best: 0.2095447 (49) total: 3.74s remaining: 2.57s 51: learn: 0.2086661 test: 0.2095442 best: 0.2095442 (51) total: 3.81s remaining: 2.49s 52: learn: 0.2084605 test: 0.2093920 best: 0.2093920 (52) total: 3.87s remaining: 2.41s 53: learn: 0.2081108 test: 0.2089514 best: 0.2089514 (53) total: 3.94s remaining: 2.33s 54: learn: 0.2076553 test: 0.2086630 best: 0.2086630 (54) total: 4.01s remaining: 2.26s 55: learn: 0.2075129 test: 0.2086290 best: 0.2086290 (55) total: 4.09s remaining: 2.19s 56: learn: 0.2073428 test: 0.2084160 best: 0.2084160 (56) total: 4.16s remaining: 2.11s 57: learn: 0.2071092 test: 0.2081948 best: 0.2081948 (57) total: 4.22s remaining: 2.04s 58: learn: 0.2067424 test: 0.2078405 best: 0.2078405 (58) total: 4.29s remaining: 1.97s 59: learn: 0.2066324 test: 0.2077314 best: 0.2077314 (59) total: 4.36s remaining: 1.89s 60: learn: 0.2056576 test: 0.2068826 best: 0.2068826 (60) total: 4.43s remaining: 1.81s 61: learn: 0.2050585 test: 0.2063529 best: 0.2063529 (61) total: 4.51s remaining: 1.74s 62: learn: 0.2048231 test: 0.2060558 best: 0.2060558 (62) total: 4.57s remaining: 1.67s 63: learn: 0.2046750 test: 0.2059410 best: 0.2059410 (63) total: 4.63s remaining: 1.59s 64: learn: 0.2044338 test: 0.2056916 best: 0.2056916 (64) total: 4.7s remaining: 1.52s 65: learn: 0.2042809 test: 0.2055743 best: 0.2055743 (65) total: 4.76s remaining: 1.44s 66: learn: 0.2038484 test: 0.2052193 best: 0.2052193 (66) total: 4.83s remaining: 1.37s 67: learn: 0.2036785 test: 0.2050162 best: 0.2050162 (67) total: 4.9s remaining: 1.3s 68: learn: 0.2032504 
test: 0.2045470 best: 0.2045470 (68) total: 4.97s remaining: 1.22s 69: learn: 0.2031772 test: 0.2045454 best: 0.2045454 (69) total: 5.04s remaining: 1.15s 70: learn: 0.2029084 test: 0.2043865 best: 0.2043865 (70) total: 5.11s remaining: 1.08s 71: learn: 0.2026336 test: 0.2042812 best: 0.2042812 (71) total: 5.17s remaining: 1.01s 72: learn: 0.2023233 test: 0.2040038 best: 0.2040038 (72) total: 5.25s remaining: 934ms 73: learn: 0.2021734 test: 0.2039301 best: 0.2039301 (73) total: 5.31s remaining: 862ms 74: learn: 0.2019325 test: 0.2037429 best: 0.2037429 (74) total: 5.38s remaining: 789ms 75: learn: 0.2016840 test: 0.2036139 best: 0.2036139 (75) total: 5.45s remaining: 717ms 76: learn: 0.2013464 test: 0.2033661 best: 0.2033661 (76) total: 5.51s remaining: 644ms 77: learn: 0.2012789 test: 0.2033533 best: 0.2033533 (77) total: 5.57s remaining: 572ms 78: learn: 0.2010590 test: 0.2031227 best: 0.2031227 (78) total: 5.64s remaining: 500ms 79: learn: 0.2008253 test: 0.2029675 best: 0.2029675 (79) total: 5.7s remaining: 428ms 80: learn: 0.2006364 test: 0.2028444 best: 0.2028444 (80) total: 5.77s remaining: 356ms 81: learn: 0.2001839 test: 0.2022917 best: 0.2022917 (81) total: 5.84s remaining: 285ms 82: learn: 0.2001247 test: 0.2022916 best: 0.2022916 (82) total: 5.9s remaining: 213ms 83: learn: 0.1999737 test: 0.2022101 best: 0.2022101 (83) total: 5.97s remaining: 142ms 84: learn: 0.1996405 test: 0.2018349 best: 0.2018349 (84) total: 6.04s remaining: 71.1ms 85: learn: 0.1991676 test: 0.2014304 best: 0.2014304 (85) total: 6.11s remaining: 0us bestTest = 0.201430364 bestIteration = 85 Trial 93, Fold 3: Log loss = 0.20110905395662873, Average precision = 0.9740746789266266, ROC-AUC = 0.9705698141691788, Elapsed Time = 6.227032500006317 seconds Trial 93, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 93, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 0: learn: 0.5999119 test: 0.6007997 best: 0.6007997 
(0) total: 74.3ms remaining: 6.32s 1: learn: 0.5328433 test: 0.5344729 best: 0.5344729 (1) total: 146ms remaining: 6.15s 2: learn: 0.4831135 test: 0.4852226 best: 0.4852226 (2) total: 215ms remaining: 5.95s 3: learn: 0.4348151 test: 0.4366198 best: 0.4366198 (3) total: 290ms remaining: 5.95s 4: learn: 0.3994636 test: 0.4010351 best: 0.4010351 (4) total: 368ms remaining: 5.97s 5: learn: 0.3746728 test: 0.3755088 best: 0.3755088 (5) total: 440ms remaining: 5.87s 6: learn: 0.3489200 test: 0.3495633 best: 0.3495633 (6) total: 517ms remaining: 5.84s 7: learn: 0.3294423 test: 0.3301896 best: 0.3301896 (7) total: 602ms remaining: 5.87s 8: learn: 0.3161840 test: 0.3167286 best: 0.3167286 (8) total: 676ms remaining: 5.78s 9: learn: 0.3057371 test: 0.3068035 best: 0.3068035 (9) total: 754ms remaining: 5.73s 10: learn: 0.2931953 test: 0.2942327 best: 0.2942327 (10) total: 837ms remaining: 5.71s 11: learn: 0.2815767 test: 0.2825940 best: 0.2825940 (11) total: 918ms remaining: 5.66s 12: learn: 0.2723469 test: 0.2735771 best: 0.2735771 (12) total: 1000ms remaining: 5.61s 13: learn: 0.2644380 test: 0.2657097 best: 0.2657097 (13) total: 1.08s remaining: 5.54s 14: learn: 0.2577742 test: 0.2588750 best: 0.2588750 (14) total: 1.15s remaining: 5.46s 15: learn: 0.2538503 test: 0.2555446 best: 0.2555446 (15) total: 1.23s remaining: 5.39s 16: learn: 0.2503502 test: 0.2521644 best: 0.2521644 (16) total: 1.31s remaining: 5.33s 17: learn: 0.2455574 test: 0.2476695 best: 0.2476695 (17) total: 1.39s remaining: 5.25s 18: learn: 0.2423358 test: 0.2443346 best: 0.2443346 (18) total: 1.47s remaining: 5.19s 19: learn: 0.2376325 test: 0.2403537 best: 0.2403537 (19) total: 1.55s remaining: 5.1s 20: learn: 0.2346739 test: 0.2374509 best: 0.2374509 (20) total: 1.62s remaining: 5.02s 21: learn: 0.2323565 test: 0.2352578 best: 0.2352578 (21) total: 1.71s remaining: 4.96s 22: learn: 0.2307589 test: 0.2336675 best: 0.2336675 (22) total: 1.78s remaining: 4.88s 23: learn: 0.2286541 test: 0.2316298 best: 
0.2316298 (23) total: 1.86s remaining: 4.8s 24: learn: 0.2269414 test: 0.2300977 best: 0.2300977 (24) total: 1.93s remaining: 4.71s 25: learn: 0.2251787 test: 0.2284508 best: 0.2284508 (25) total: 2s remaining: 4.63s 26: learn: 0.2238308 test: 0.2272658 best: 0.2272658 (26) total: 2.07s remaining: 4.53s 27: learn: 0.2225421 test: 0.2260914 best: 0.2260914 (27) total: 2.14s remaining: 4.44s 28: learn: 0.2214238 test: 0.2250375 best: 0.2250375 (28) total: 2.22s remaining: 4.36s 29: learn: 0.2203971 test: 0.2240701 best: 0.2240701 (29) total: 2.29s remaining: 4.27s 30: learn: 0.2191267 test: 0.2228986 best: 0.2228986 (30) total: 2.36s remaining: 4.18s 31: learn: 0.2185535 test: 0.2224626 best: 0.2224626 (31) total: 2.42s remaining: 4.09s 32: learn: 0.2176605 test: 0.2215982 best: 0.2215982 (32) total: 2.49s remaining: 4.01s 33: learn: 0.2171116 test: 0.2210497 best: 0.2210497 (33) total: 2.57s remaining: 3.92s 34: learn: 0.2165344 test: 0.2205896 best: 0.2205896 (34) total: 2.63s remaining: 3.84s 35: learn: 0.2146975 test: 0.2187750 best: 0.2187750 (35) total: 2.72s remaining: 3.77s 36: learn: 0.2145493 test: 0.2186279 best: 0.2186279 (36) total: 2.78s remaining: 3.68s 37: learn: 0.2138612 test: 0.2181254 best: 0.2181254 (37) total: 2.85s remaining: 3.6s 38: learn: 0.2136284 test: 0.2179142 best: 0.2179142 (38) total: 2.92s remaining: 3.51s 39: learn: 0.2126898 test: 0.2170793 best: 0.2170793 (39) total: 2.98s remaining: 3.43s 40: learn: 0.2120336 test: 0.2163231 best: 0.2163231 (40) total: 3.05s remaining: 3.35s 41: learn: 0.2113771 test: 0.2159107 best: 0.2159107 (41) total: 3.12s remaining: 3.27s 42: learn: 0.2108503 test: 0.2154271 best: 0.2154271 (42) total: 3.19s remaining: 3.19s 43: learn: 0.2105222 test: 0.2151931 best: 0.2151931 (43) total: 3.26s remaining: 3.11s 44: learn: 0.2099910 test: 0.2149135 best: 0.2149135 (44) total: 3.33s remaining: 3.03s 45: learn: 0.2095712 test: 0.2145856 best: 0.2145856 (45) total: 3.4s remaining: 2.96s 46: learn: 0.2094052 
test: 0.2145656 best: 0.2145656 (46) total: 3.47s remaining: 2.88s 47: learn: 0.2088028 test: 0.2142247 best: 0.2142247 (47) total: 3.53s remaining: 2.8s 48: learn: 0.2086417 test: 0.2141102 best: 0.2141102 (48) total: 3.6s remaining: 2.71s 49: learn: 0.2077313 test: 0.2133725 best: 0.2133725 (49) total: 3.67s remaining: 2.64s 50: learn: 0.2076341 test: 0.2132757 best: 0.2132757 (50) total: 3.73s remaining: 2.56s 51: learn: 0.2072559 test: 0.2130559 best: 0.2130559 (51) total: 3.81s remaining: 2.49s 52: learn: 0.2069676 test: 0.2126888 best: 0.2126888 (52) total: 3.88s remaining: 2.42s 53: learn: 0.2068715 test: 0.2125671 best: 0.2125671 (53) total: 3.95s remaining: 2.34s 54: learn: 0.2066919 test: 0.2124454 best: 0.2124454 (54) total: 4.01s remaining: 2.26s 55: learn: 0.2062972 test: 0.2121751 best: 0.2121751 (55) total: 4.08s remaining: 2.19s 56: learn: 0.2061933 test: 0.2121466 best: 0.2121466 (56) total: 4.14s remaining: 2.11s 57: learn: 0.2056841 test: 0.2117883 best: 0.2117883 (57) total: 4.21s remaining: 2.03s 58: learn: 0.2055852 test: 0.2117417 best: 0.2117417 (58) total: 4.28s remaining: 1.96s 59: learn: 0.2046801 test: 0.2108169 best: 0.2108169 (59) total: 4.35s remaining: 1.88s 60: learn: 0.2043161 test: 0.2104072 best: 0.2104072 (60) total: 4.41s remaining: 1.81s 61: learn: 0.2040639 test: 0.2102621 best: 0.2102621 (61) total: 4.48s remaining: 1.73s 62: learn: 0.2038891 test: 0.2100943 best: 0.2100943 (62) total: 4.54s remaining: 1.66s 63: learn: 0.2036397 test: 0.2100724 best: 0.2100724 (63) total: 4.61s remaining: 1.58s 64: learn: 0.2035939 test: 0.2100567 best: 0.2100567 (64) total: 4.67s remaining: 1.51s 65: learn: 0.2030232 test: 0.2095987 best: 0.2095987 (65) total: 4.75s remaining: 1.44s 66: learn: 0.2024134 test: 0.2089743 best: 0.2089743 (66) total: 4.82s remaining: 1.37s 67: learn: 0.2022635 test: 0.2088483 best: 0.2088483 (67) total: 4.89s remaining: 1.29s 68: learn: 0.2022350 test: 0.2088610 best: 0.2088483 (67) total: 4.99s remaining: 
1.23s 69: learn: 0.2021699 test: 0.2088774 best: 0.2088483 (67) total: 5.09s remaining: 1.16s 70: learn: 0.2021435 test: 0.2088864 best: 0.2088483 (67) total: 5.15s remaining: 1.09s 71: learn: 0.2020572 test: 0.2088532 best: 0.2088483 (67) total: 5.21s remaining: 1.01s 72: learn: 0.2017865 test: 0.2086053 best: 0.2086053 (72) total: 5.28s remaining: 941ms 73: learn: 0.2011903 test: 0.2081225 best: 0.2081225 (73) total: 5.36s remaining: 869ms 74: learn: 0.2011168 test: 0.2081660 best: 0.2081225 (73) total: 5.42s remaining: 795ms 75: learn: 0.2010715 test: 0.2081556 best: 0.2081225 (73) total: 5.48s remaining: 721ms 76: learn: 0.2007685 test: 0.2079018 best: 0.2079018 (76) total: 5.54s remaining: 648ms 77: learn: 0.2005758 test: 0.2078932 best: 0.2078932 (77) total: 5.61s remaining: 576ms 78: learn: 0.2004053 test: 0.2078144 best: 0.2078144 (78) total: 5.68s remaining: 504ms 79: learn: 0.2003604 test: 0.2078174 best: 0.2078144 (78) total: 5.75s remaining: 431ms 80: learn: 0.2002131 test: 0.2077160 best: 0.2077160 (80) total: 5.82s remaining: 359ms 81: learn: 0.2001161 test: 0.2076862 best: 0.2076862 (81) total: 5.88s remaining: 287ms 82: learn: 0.2000928 test: 0.2076772 best: 0.2076772 (82) total: 5.94s remaining: 215ms 83: learn: 0.1999360 test: 0.2075237 best: 0.2075237 (83) total: 6s remaining: 143ms 84: learn: 0.1997241 test: 0.2073884 best: 0.2073884 (84) total: 6.07s remaining: 71.4ms 85: learn: 0.1996900 test: 0.2073762 best: 0.2073762 (85) total: 6.14s remaining: 0us bestTest = 0.207376201 bestIteration = 85 Trial 93, Fold 4: Log loss = 0.20686190202264296, Average precision = 0.9735082620764466, ROC-AUC = 0.9684091784277677, Elapsed Time = 6.252805199997965 seconds Trial 93, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 93, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0: learn: 0.5984615 test: 0.6002872 best: 0.6002872 (0) total: 72.6ms remaining: 6.17s 1: learn: 0.5286174 test: 
0.5313127 best: 0.5313127 (1) total: 147ms remaining: 6.16s 2: learn: 0.4694816 test: 0.4732579 best: 0.4732579 (2) total: 219ms remaining: 6.07s 3: learn: 0.4283849 test: 0.4333346 best: 0.4333346 (3) total: 296ms remaining: 6.07s 4: learn: 0.3958676 test: 0.4016262 best: 0.4016262 (4) total: 374ms remaining: 6.05s 5: learn: 0.3660435 test: 0.3722578 best: 0.3722578 (5) total: 454ms remaining: 6.05s 6: learn: 0.3454750 test: 0.3523793 best: 0.3523793 (6) total: 530ms remaining: 5.98s 7: learn: 0.3258670 test: 0.3334155 best: 0.3334155 (7) total: 614ms remaining: 5.99s 8: learn: 0.3084734 test: 0.3165378 best: 0.3165378 (8) total: 693ms remaining: 5.93s 9: learn: 0.2939251 test: 0.3027970 best: 0.3027970 (9) total: 779ms remaining: 5.92s 10: learn: 0.2832358 test: 0.2925249 best: 0.2925249 (10) total: 854ms remaining: 5.82s 11: learn: 0.2755135 test: 0.2851969 best: 0.2851969 (11) total: 931ms remaining: 5.74s 12: learn: 0.2664883 test: 0.2770344 best: 0.2770344 (12) total: 1.02s remaining: 5.74s 13: learn: 0.2590383 test: 0.2700120 best: 0.2700120 (13) total: 1.1s remaining: 5.68s 14: learn: 0.2522707 test: 0.2638131 best: 0.2638131 (14) total: 1.19s remaining: 5.62s 15: learn: 0.2472117 test: 0.2590036 best: 0.2590036 (15) total: 1.27s remaining: 5.54s 16: learn: 0.2426371 test: 0.2548147 best: 0.2548147 (16) total: 1.34s remaining: 5.46s 17: learn: 0.2398024 test: 0.2524005 best: 0.2524005 (17) total: 1.42s remaining: 5.36s 18: learn: 0.2354020 test: 0.2482296 best: 0.2482296 (18) total: 1.5s remaining: 5.3s 19: learn: 0.2327010 test: 0.2456902 best: 0.2456902 (19) total: 1.58s remaining: 5.21s 20: learn: 0.2306880 test: 0.2440998 best: 0.2440998 (20) total: 1.65s remaining: 5.11s 21: learn: 0.2283646 test: 0.2418266 best: 0.2418266 (21) total: 1.72s remaining: 5.01s 22: learn: 0.2255401 test: 0.2394841 best: 0.2394841 (22) total: 1.81s remaining: 4.95s 23: learn: 0.2239887 test: 0.2382263 best: 0.2382263 (23) total: 1.88s remaining: 4.86s 24: learn: 0.2221916 
test: 0.2365788 best: 0.2365788 (24) total: 1.95s remaining: 4.76s 25: learn: 0.2204657 test: 0.2347604 best: 0.2347604 (25) total: 2.04s remaining: 4.7s 26: learn: 0.2186545 test: 0.2331749 best: 0.2331749 (26) total: 2.11s remaining: 4.62s 27: learn: 0.2175779 test: 0.2320453 best: 0.2320453 (27) total: 2.19s remaining: 4.53s 28: learn: 0.2160860 test: 0.2304317 best: 0.2304317 (28) total: 2.26s remaining: 4.44s 29: learn: 0.2148365 test: 0.2294235 best: 0.2294235 (29) total: 2.33s remaining: 4.34s 30: learn: 0.2135786 test: 0.2284166 best: 0.2284166 (30) total: 2.4s remaining: 4.25s 31: learn: 0.2123938 test: 0.2273846 best: 0.2273846 (31) total: 2.47s remaining: 4.17s 32: learn: 0.2115351 test: 0.2266468 best: 0.2266468 (32) total: 2.55s remaining: 4.09s 33: learn: 0.2101615 test: 0.2257561 best: 0.2257561 (33) total: 2.62s remaining: 4.01s 34: learn: 0.2094313 test: 0.2252327 best: 0.2252327 (34) total: 2.7s remaining: 3.94s 35: learn: 0.2087140 test: 0.2245619 best: 0.2245619 (35) total: 2.77s remaining: 3.85s 36: learn: 0.2083650 test: 0.2242176 best: 0.2242176 (36) total: 2.84s remaining: 3.76s 37: learn: 0.2073704 test: 0.2232413 best: 0.2232413 (37) total: 2.92s remaining: 3.69s 38: learn: 0.2058778 test: 0.2218508 best: 0.2218508 (38) total: 3.01s remaining: 3.62s 39: learn: 0.2054894 test: 0.2214048 best: 0.2214048 (39) total: 3.07s remaining: 3.53s 40: learn: 0.2048089 test: 0.2209861 best: 0.2209861 (40) total: 3.14s remaining: 3.45s 41: learn: 0.2040859 test: 0.2204763 best: 0.2204763 (41) total: 3.21s remaining: 3.37s 42: learn: 0.2035282 test: 0.2200895 best: 0.2200895 (42) total: 3.28s remaining: 3.28s 43: learn: 0.2033391 test: 0.2200429 best: 0.2200429 (43) total: 3.36s remaining: 3.2s 44: learn: 0.2027050 test: 0.2194502 best: 0.2194502 (44) total: 3.43s remaining: 3.13s 45: learn: 0.2024958 test: 0.2192383 best: 0.2192383 (45) total: 3.5s remaining: 3.04s 46: learn: 0.2021205 test: 0.2190059 best: 0.2190059 (46) total: 3.57s remaining: 2.96s 
47: learn: 0.2016909 test: 0.2184754 best: 0.2184754 (47) total: 3.64s remaining: 2.88s 48: learn: 0.2011722 test: 0.2180014 best: 0.2180014 (48) total: 3.7s remaining: 2.8s 49: learn: 0.2005539 test: 0.2176039 best: 0.2176039 (49) total: 3.79s remaining: 2.73s 50: learn: 0.2004931 test: 0.2175283 best: 0.2175283 (50) total: 3.85s remaining: 2.64s 51: learn: 0.1997591 test: 0.2169882 best: 0.2169882 (51) total: 3.92s remaining: 2.56s 52: learn: 0.1992642 test: 0.2166121 best: 0.2166121 (52) total: 4s remaining: 2.49s 53: learn: 0.1990943 test: 0.2164473 best: 0.2164473 (53) total: 4.07s remaining: 2.41s 54: learn: 0.1988532 test: 0.2162253 best: 0.2162253 (54) total: 4.13s remaining: 2.33s 55: learn: 0.1988019 test: 0.2161676 best: 0.2161676 (55) total: 4.2s remaining: 2.25s 56: learn: 0.1984506 test: 0.2158131 best: 0.2158131 (56) total: 4.27s remaining: 2.17s 57: learn: 0.1978443 test: 0.2150912 best: 0.2150912 (57) total: 4.34s remaining: 2.09s 58: learn: 0.1977185 test: 0.2150740 best: 0.2150740 (58) total: 4.4s remaining: 2.02s 59: learn: 0.1974068 test: 0.2147472 best: 0.2147472 (59) total: 4.47s remaining: 1.94s 60: learn: 0.1973167 test: 0.2146604 best: 0.2146604 (60) total: 4.54s remaining: 1.86s 61: learn: 0.1971103 test: 0.2145720 best: 0.2145720 (61) total: 4.6s remaining: 1.78s 62: learn: 0.1968893 test: 0.2144033 best: 0.2144033 (62) total: 4.67s remaining: 1.7s 63: learn: 0.1967404 test: 0.2143825 best: 0.2143825 (63) total: 4.74s remaining: 1.63s 64: learn: 0.1964107 test: 0.2142678 best: 0.2142678 (64) total: 4.81s remaining: 1.55s 65: learn: 0.1961281 test: 0.2139799 best: 0.2139799 (65) total: 4.88s remaining: 1.48s 66: learn: 0.1960836 test: 0.2139510 best: 0.2139510 (66) total: 4.94s remaining: 1.4s 67: learn: 0.1959630 test: 0.2138532 best: 0.2138532 (67) total: 5.01s remaining: 1.33s 68: learn: 0.1957802 test: 0.2137943 best: 0.2137943 (68) total: 5.09s remaining: 1.25s 69: learn: 0.1954582 test: 0.2135591 best: 0.2135591 (69) total: 5.17s 
remaining: 1.18s 70: learn: 0.1952472 test: 0.2134779 best: 0.2134779 (70) total: 5.23s remaining: 1.1s 71: learn: 0.1950779 test: 0.2134253 best: 0.2134253 (71) total: 5.29s remaining: 1.03s 72: learn: 0.1950498 test: 0.2133835 best: 0.2133835 (72) total: 5.36s remaining: 955ms 73: learn: 0.1948877 test: 0.2133089 best: 0.2133089 (73) total: 5.43s remaining: 880ms 74: learn: 0.1947003 test: 0.2132292 best: 0.2132292 (74) total: 5.49s remaining: 806ms 75: learn: 0.1945292 test: 0.2130865 best: 0.2130865 (75) total: 5.56s remaining: 731ms 76: learn: 0.1944069 test: 0.2130796 best: 0.2130796 (76) total: 5.63s remaining: 658ms 77: learn: 0.1942291 test: 0.2129342 best: 0.2129342 (77) total: 5.69s remaining: 583ms 78: learn: 0.1938183 test: 0.2125368 best: 0.2125368 (78) total: 5.76s remaining: 510ms 79: learn: 0.1935775 test: 0.2123376 best: 0.2123376 (79) total: 5.83s remaining: 437ms 80: learn: 0.1934145 test: 0.2122430 best: 0.2122430 (80) total: 5.9s remaining: 364ms 81: learn: 0.1933354 test: 0.2122197 best: 0.2122197 (81) total: 5.96s remaining: 291ms 82: learn: 0.1932580 test: 0.2121912 best: 0.2121912 (82) total: 6.03s remaining: 218ms 83: learn: 0.1929867 test: 0.2120767 best: 0.2120767 (83) total: 6.11s remaining: 146ms 84: learn: 0.1927560 test: 0.2118282 best: 0.2118282 (84) total: 6.19s remaining: 72.8ms 85: learn: 0.1922354 test: 0.2113304 best: 0.2113304 (85) total: 6.25s remaining: 0us bestTest = 0.2113303999 bestIteration = 85 Trial 93, Fold 5: Log loss = 0.2105990331303193, Average precision = 0.9715499741051221, ROC-AUC = 0.9682608550248036, Elapsed Time = 6.37430100000347 seconds
Optimization Progress: 94%|#########3| 94/100 [2:52:08<06:44, 67.39s/it]
[CatBoost per-iteration training log omitted; per-fold summaries retained]
Trial 94, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371
Trial 94, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
bestTest = 0.212784861 bestIteration = 45
Trial 94, Fold 1: Log loss = 0.21252819744697407, Average precision = 0.974217960654209, ROC-AUC = 0.9709315090836094, Elapsed Time = 14.636794699996244 seconds
Trial 94, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 94, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
bestTest = 0.2035451156 bestIteration = 45
Trial 94, Fold 2: Log loss = 0.20336771407503848, Average precision = 0.9753278846963165, ROC-AUC = 0.9732267588402947, Elapsed Time = 13.444962800000212 seconds
Trial 94, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 94, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
total: 3.29s remaining: 9.31s 12: learn: 0.3112677 test: 0.3209282 best: 0.3209282 (12) total: 3.55s remaining: 9.01s 13: learn: 0.2995912 test: 0.3098647 best: 0.3098647 (13) total: 3.84s remaining: 8.78s 14: learn: 0.2889180 test: 0.2999257 best: 0.2999257 (14) total: 4.14s remaining: 8.56s 15: learn: 0.2796498 test: 0.2914585 best: 0.2914585 (15) total: 4.41s remaining: 8.28s 16: learn: 0.2708715 test: 0.2834071 best: 0.2834071 (16) total: 4.67s remaining: 7.96s 17: learn: 0.2627588 test: 0.2761052 best: 0.2761052 (17) total: 4.99s remaining: 7.76s 18: learn: 0.2554081 test: 0.2696229 best: 0.2696229 (18) total: 5.26s remaining: 7.48s 19: learn: 0.2486404 test: 0.2637235 best: 0.2637235 (19) total: 5.58s remaining: 7.25s 20: learn: 0.2430779 test: 0.2587933 best: 0.2587933 (20) total: 5.86s remaining: 6.98s 21: learn: 0.2375314 test: 0.2541581 best: 0.2541581 (21) total: 6.15s remaining: 6.71s 22: learn: 0.2322249 test: 0.2496860 best: 0.2496860 (22) total: 6.46s remaining: 6.46s 23: learn: 0.2272309 test: 0.2454701 best: 0.2454701 (23) total: 6.77s remaining: 6.21s 24: learn: 0.2225077 test: 0.2416044 best: 0.2416044 (24) total: 7.08s remaining: 5.95s 25: learn: 0.2183750 test: 0.2381906 best: 0.2381906 (25) total: 7.35s remaining: 5.66s 26: learn: 0.2144440 test: 0.2351904 best: 0.2351904 (26) total: 7.64s remaining: 5.38s 27: learn: 0.2111208 test: 0.2323461 best: 0.2323461 (27) total: 7.94s remaining: 5.1s 28: learn: 0.2082995 test: 0.2300834 best: 0.2300834 (28) total: 8.22s remaining: 4.82s 29: learn: 0.2049655 test: 0.2274763 best: 0.2274763 (29) total: 8.53s remaining: 4.55s 30: learn: 0.2017700 test: 0.2250271 best: 0.2250271 (30) total: 8.84s remaining: 4.28s 31: learn: 0.1993558 test: 0.2231235 best: 0.2231235 (31) total: 9.11s remaining: 3.98s 32: learn: 0.1966483 test: 0.2210074 best: 0.2210074 (32) total: 9.41s remaining: 3.71s 33: learn: 0.1938348 test: 0.2189417 best: 0.2189417 (33) total: 9.72s remaining: 3.43s 34: learn: 0.1915713 test: 
0.2170042 best: 0.2170042 (34) total: 9.98s remaining: 3.14s 35: learn: 0.1895156 test: 0.2154928 best: 0.2154928 (35) total: 10.2s remaining: 2.85s 36: learn: 0.1877999 test: 0.2141650 best: 0.2141650 (36) total: 10.5s remaining: 2.55s 37: learn: 0.1856674 test: 0.2125714 best: 0.2125714 (37) total: 10.8s remaining: 2.27s 38: learn: 0.1836816 test: 0.2111144 best: 0.2111144 (38) total: 11.1s remaining: 1.99s 39: learn: 0.1819892 test: 0.2099267 best: 0.2099267 (39) total: 11.4s remaining: 1.71s 40: learn: 0.1804165 test: 0.2088309 best: 0.2088309 (40) total: 11.6s remaining: 1.42s 41: learn: 0.1787422 test: 0.2076416 best: 0.2076416 (41) total: 12s remaining: 1.14s 42: learn: 0.1770495 test: 0.2066743 best: 0.2066743 (42) total: 12.3s remaining: 861ms 43: learn: 0.1756790 test: 0.2057967 best: 0.2057967 (43) total: 12.6s remaining: 573ms 44: learn: 0.1743221 test: 0.2048611 best: 0.2048611 (44) total: 12.9s remaining: 286ms 45: learn: 0.1728689 test: 0.2040439 best: 0.2040439 (45) total: 13.2s remaining: 0us bestTest = 0.2040438513 bestIteration = 45 Trial 94, Fold 3: Log loss = 0.2039125345162296, Average precision = 0.9768937951634301, ROC-AUC = 0.9732710185301114, Elapsed Time = 13.332860399998026 seconds Trial 94, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 94, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 0: learn: 0.6377034 test: 0.6387764 best: 0.6387764 (0) total: 266ms remaining: 12s 1: learn: 0.5888450 test: 0.5907178 best: 0.5907178 (1) total: 511ms remaining: 11.2s 2: learn: 0.5455360 test: 0.5481811 best: 0.5481811 (2) total: 774ms remaining: 11.1s 3: learn: 0.5075927 test: 0.5112164 best: 0.5112164 (3) total: 1.06s remaining: 11.1s 4: learn: 0.4734769 test: 0.4777740 best: 0.4777740 (4) total: 1.33s remaining: 10.9s 5: learn: 0.4435133 test: 0.4489226 best: 0.4489226 (5) total: 1.64s remaining: 10.9s 6: learn: 0.4172800 test: 0.4234896 best: 0.4234896 (6) total: 1.9s 
remaining: 10.6s 7: learn: 0.3938127 test: 0.4011187 best: 0.4011187 (7) total: 2.22s remaining: 10.6s 8: learn: 0.3729450 test: 0.3810529 best: 0.3810529 (8) total: 2.5s remaining: 10.3s 9: learn: 0.3541240 test: 0.3628072 best: 0.3628072 (9) total: 2.77s remaining: 9.96s 10: learn: 0.3388183 test: 0.3481408 best: 0.3481408 (10) total: 3.07s remaining: 9.77s 11: learn: 0.3239743 test: 0.3341090 best: 0.3341090 (11) total: 3.35s remaining: 9.49s 12: learn: 0.3116891 test: 0.3222758 best: 0.3222758 (12) total: 3.6s remaining: 9.15s 13: learn: 0.2997435 test: 0.3113251 best: 0.3113251 (13) total: 3.88s remaining: 8.87s 14: learn: 0.2891847 test: 0.3017902 best: 0.3017902 (14) total: 4.21s remaining: 8.69s 15: learn: 0.2796951 test: 0.2927587 best: 0.2927587 (15) total: 4.46s remaining: 8.36s 16: learn: 0.2713652 test: 0.2847050 best: 0.2847050 (16) total: 4.73s remaining: 8.07s 17: learn: 0.2630213 test: 0.2774672 best: 0.2774672 (17) total: 5.04s remaining: 7.85s 18: learn: 0.2556569 test: 0.2707972 best: 0.2707972 (18) total: 5.33s remaining: 7.58s 19: learn: 0.2494531 test: 0.2653096 best: 0.2653096 (19) total: 5.6s remaining: 7.28s 20: learn: 0.2435962 test: 0.2599961 best: 0.2599961 (20) total: 5.88s remaining: 7s 21: learn: 0.2381202 test: 0.2553707 best: 0.2553707 (21) total: 6.19s remaining: 6.75s 22: learn: 0.2329456 test: 0.2507888 best: 0.2507888 (22) total: 6.45s remaining: 6.45s 23: learn: 0.2279518 test: 0.2466218 best: 0.2466218 (23) total: 6.73s remaining: 6.17s 24: learn: 0.2232544 test: 0.2429098 best: 0.2429098 (24) total: 7.06s remaining: 5.93s 25: learn: 0.2191444 test: 0.2393696 best: 0.2393696 (25) total: 7.36s remaining: 5.66s 26: learn: 0.2148599 test: 0.2357544 best: 0.2357544 (26) total: 7.66s remaining: 5.39s 27: learn: 0.2112076 test: 0.2328174 best: 0.2328174 (27) total: 7.95s remaining: 5.11s 28: learn: 0.2075176 test: 0.2300714 best: 0.2300714 (28) total: 8.25s remaining: 4.84s 29: learn: 0.2045342 test: 0.2276296 best: 0.2276296 (29) 
total: 8.53s remaining: 4.55s 30: learn: 0.2015407 test: 0.2252050 best: 0.2252050 (30) total: 8.83s remaining: 4.27s 31: learn: 0.1983594 test: 0.2227115 best: 0.2227115 (31) total: 9.13s remaining: 3.99s 32: learn: 0.1961655 test: 0.2208738 best: 0.2208738 (32) total: 9.37s remaining: 3.69s 33: learn: 0.1941645 test: 0.2191744 best: 0.2191744 (33) total: 9.62s remaining: 3.4s 34: learn: 0.1918496 test: 0.2174560 best: 0.2174560 (34) total: 9.89s remaining: 3.11s 35: learn: 0.1896796 test: 0.2159683 best: 0.2159683 (35) total: 10.2s remaining: 2.83s 36: learn: 0.1875833 test: 0.2143209 best: 0.2143209 (36) total: 10.5s remaining: 2.54s 37: learn: 0.1855823 test: 0.2127931 best: 0.2127931 (37) total: 10.7s remaining: 2.25s 38: learn: 0.1838957 test: 0.2117682 best: 0.2117682 (38) total: 11s remaining: 1.97s 39: learn: 0.1820986 test: 0.2104551 best: 0.2104551 (39) total: 11.3s remaining: 1.69s 40: learn: 0.1806989 test: 0.2094100 best: 0.2094100 (40) total: 11.5s remaining: 1.4s 41: learn: 0.1791737 test: 0.2082330 best: 0.2082330 (41) total: 11.8s remaining: 1.12s 42: learn: 0.1775020 test: 0.2069686 best: 0.2069686 (42) total: 12s remaining: 838ms 43: learn: 0.1760642 test: 0.2058805 best: 0.2058805 (43) total: 12.3s remaining: 557ms 44: learn: 0.1744906 test: 0.2047745 best: 0.2047745 (44) total: 12.5s remaining: 279ms 45: learn: 0.1730480 test: 0.2038025 best: 0.2038025 (45) total: 12.8s remaining: 0us bestTest = 0.2038025047 bestIteration = 45 Trial 94, Fold 4: Log loss = 0.20360199853009206, Average precision = 0.9772784588469647, ROC-AUC = 0.9738545742517126, Elapsed Time = 12.94422840000334 seconds Trial 94, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 94, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0: learn: 0.6377750 test: 0.6396367 best: 0.6396367 (0) total: 296ms remaining: 13.3s 1: learn: 0.5886315 test: 0.5921352 best: 0.5921352 (1) total: 552ms remaining: 12.1s 2: learn: 
0.5448371 test: 0.5499350 best: 0.5499350 (2) total: 815ms remaining: 11.7s 3: learn: 0.5063041 test: 0.5128082 best: 0.5128082 (3) total: 1.07s remaining: 11.2s 4: learn: 0.4721258 test: 0.4802964 best: 0.4802964 (4) total: 1.34s remaining: 11s 5: learn: 0.4421518 test: 0.4517627 best: 0.4517627 (5) total: 1.65s remaining: 11s 6: learn: 0.4156068 test: 0.4265917 best: 0.4265917 (6) total: 1.92s remaining: 10.7s 7: learn: 0.3919919 test: 0.4041110 best: 0.4041110 (7) total: 2.18s remaining: 10.3s 8: learn: 0.3711357 test: 0.3841629 best: 0.3841629 (8) total: 2.45s remaining: 10.1s 9: learn: 0.3522460 test: 0.3665733 best: 0.3665733 (9) total: 2.76s remaining: 9.93s 10: learn: 0.3354763 test: 0.3511934 best: 0.3511934 (10) total: 3.04s remaining: 9.69s 11: learn: 0.3203715 test: 0.3372535 best: 0.3372535 (11) total: 3.33s remaining: 9.43s 12: learn: 0.3072184 test: 0.3249341 best: 0.3249341 (12) total: 3.61s remaining: 9.16s 13: learn: 0.2955117 test: 0.3141712 best: 0.3141712 (13) total: 3.88s remaining: 8.88s 14: learn: 0.2853252 test: 0.3047933 best: 0.3047933 (14) total: 4.17s remaining: 8.63s 15: learn: 0.2762218 test: 0.2965135 best: 0.2965135 (15) total: 4.45s remaining: 8.34s 16: learn: 0.2673091 test: 0.2887050 best: 0.2887050 (16) total: 4.78s remaining: 8.16s 17: learn: 0.2595481 test: 0.2819809 best: 0.2819809 (17) total: 5.1s remaining: 7.94s 18: learn: 0.2520932 test: 0.2755090 best: 0.2755090 (18) total: 5.38s remaining: 7.65s 19: learn: 0.2458195 test: 0.2698868 best: 0.2698868 (19) total: 5.68s remaining: 7.38s 20: learn: 0.2394545 test: 0.2642400 best: 0.2642400 (20) total: 5.97s remaining: 7.11s 21: learn: 0.2338792 test: 0.2594413 best: 0.2594413 (21) total: 6.26s remaining: 6.83s 22: learn: 0.2283794 test: 0.2548036 best: 0.2548036 (22) total: 6.56s remaining: 6.56s 23: learn: 0.2232624 test: 0.2508125 best: 0.2508125 (23) total: 6.85s remaining: 6.28s 24: learn: 0.2190066 test: 0.2473137 best: 0.2473137 (24) total: 7.13s remaining: 5.99s 25: 
learn: 0.2147932 test: 0.2441985 best: 0.2441985 (25) total: 7.43s remaining: 5.71s 26: learn: 0.2110715 test: 0.2412307 best: 0.2412307 (26) total: 7.73s remaining: 5.44s 27: learn: 0.2076236 test: 0.2382453 best: 0.2382453 (27) total: 7.98s remaining: 5.13s 28: learn: 0.2041195 test: 0.2353716 best: 0.2353716 (28) total: 8.26s remaining: 4.84s 29: learn: 0.2011389 test: 0.2329789 best: 0.2329789 (29) total: 8.52s remaining: 4.55s 30: learn: 0.1987091 test: 0.2312409 best: 0.2312409 (30) total: 8.78s remaining: 4.25s 31: learn: 0.1956835 test: 0.2291101 best: 0.2291101 (31) total: 9.11s remaining: 3.99s 32: learn: 0.1932006 test: 0.2272387 best: 0.2272387 (32) total: 9.39s remaining: 3.7s 33: learn: 0.1908697 test: 0.2255625 best: 0.2255625 (33) total: 9.68s remaining: 3.42s 34: learn: 0.1889020 test: 0.2240499 best: 0.2240499 (34) total: 9.94s remaining: 3.12s 35: learn: 0.1866952 test: 0.2225281 best: 0.2225281 (35) total: 10.3s remaining: 2.85s 36: learn: 0.1844323 test: 0.2210485 best: 0.2210485 (36) total: 10.5s remaining: 2.56s 37: learn: 0.1824609 test: 0.2197585 best: 0.2197585 (37) total: 10.8s remaining: 2.28s 38: learn: 0.1803135 test: 0.2184354 best: 0.2184354 (38) total: 11.1s remaining: 2s 39: learn: 0.1783056 test: 0.2172987 best: 0.2172987 (39) total: 11.4s remaining: 1.71s 40: learn: 0.1764729 test: 0.2163364 best: 0.2163364 (40) total: 11.7s remaining: 1.43s 41: learn: 0.1746945 test: 0.2152694 best: 0.2152694 (41) total: 12s remaining: 1.15s 42: learn: 0.1729067 test: 0.2140879 best: 0.2140879 (42) total: 12.3s remaining: 860ms 43: learn: 0.1716243 test: 0.2132589 best: 0.2132589 (43) total: 12.6s remaining: 572ms 44: learn: 0.1702388 test: 0.2126299 best: 0.2126299 (44) total: 12.9s remaining: 286ms 45: learn: 0.1689392 test: 0.2119678 best: 0.2119678 (45) total: 13.1s remaining: 0us bestTest = 0.2119678358 bestIteration = 45 Trial 94, Fold 5: Log loss = 0.21166354035556453, Average precision = 0.9743657194657328, ROC-AUC = 0.9717300782193486, 
Elapsed Time = 13.27798199999961 seconds
Optimization Progress: 95%|#########5| 95/100 [2:53:24<05:50, 70.11s/it]
[Trial 95: CatBoost per-iteration training log (learn/test log loss, iterations 0-67 in each fold) omitted; per-fold class balances and summary metrics retained below.]

Trial 95, Fold 1: Train size = 20663 (0 = 10533, 1 = 10130, 0/1 = 1.0398); Validation size = 5175 (0 = 2592, 1 = 2583, 0/1 = 1.0035); bestTest = 0.304962 (iteration 67); Log loss = 0.304887, Average precision = 0.968962, ROC-AUC = 0.961955, Elapsed Time = 3.21 s
Trial 95, Fold 2: Train size = 20701 (0 = 10471, 1 = 10230, 0/1 = 1.0236); Validation size = 5137 (0 = 2654, 1 = 2483, 0/1 = 1.0689); bestTest = 0.308544 (iteration 67); Log loss = 0.308509, Average precision = 0.967418, ROC-AUC = 0.963110, Elapsed Time = 3.36 s
Trial 95, Fold 3: Train size = 20682 (0 = 10517, 1 = 10165, 0/1 = 1.0346); Validation size = 5156 (0 = 2608, 1 = 2548, 0/1 = 1.0235); bestTest = 0.300652 (iteration 67); Log loss = 0.300751, Average precision = 0.970253, ROC-AUC = 0.964893, Elapsed Time = 3.28 s
Trial 95, Fold 4: Train size = 20656 (0 = 10479, 1 = 10177, 0/1 = 1.0297); Validation size = 5182 (0 = 2646, 1 = 2536, 0/1 = 1.0434); [per-iteration log truncated in this section]
remaining: 2.57s 9: learn: 0.5706665 test: 0.5714426 best: 0.5714426 (9) total: 436ms remaining: 2.53s 10: learn: 0.5611388 test: 0.5625654 best: 0.5625654 (10) total: 487ms remaining: 2.52s 11: learn: 0.5510202 test: 0.5523728 best: 0.5523728 (11) total: 539ms remaining: 2.52s 12: learn: 0.5405563 test: 0.5419320 best: 0.5419320 (12) total: 599ms remaining: 2.54s 13: learn: 0.5321455 test: 0.5340345 best: 0.5340345 (13) total: 639ms remaining: 2.46s 14: learn: 0.5242993 test: 0.5265202 best: 0.5265202 (14) total: 686ms remaining: 2.42s 15: learn: 0.5155107 test: 0.5178476 best: 0.5178476 (15) total: 726ms remaining: 2.36s 16: learn: 0.5072107 test: 0.5095361 best: 0.5095361 (16) total: 777ms remaining: 2.33s 17: learn: 0.5000516 test: 0.5027895 best: 0.5027895 (17) total: 823ms remaining: 2.29s 18: learn: 0.4922947 test: 0.4949716 best: 0.4949716 (18) total: 871ms remaining: 2.25s 19: learn: 0.4846671 test: 0.4873718 best: 0.4873718 (19) total: 921ms remaining: 2.21s 20: learn: 0.4766731 test: 0.4793304 best: 0.4793304 (20) total: 972ms remaining: 2.18s 21: learn: 0.4701289 test: 0.4730140 best: 0.4730140 (21) total: 1.02s remaining: 2.14s 22: learn: 0.4631706 test: 0.4660111 best: 0.4660111 (22) total: 1.06s remaining: 2.07s 23: learn: 0.4568586 test: 0.4595935 best: 0.4595935 (23) total: 1.1s remaining: 2.02s 24: learn: 0.4511526 test: 0.4542362 best: 0.4542362 (24) total: 1.15s remaining: 1.98s 25: learn: 0.4443998 test: 0.4474968 best: 0.4474968 (25) total: 1.2s remaining: 1.95s 26: learn: 0.4395185 test: 0.4426081 best: 0.4426081 (26) total: 1.25s remaining: 1.9s 27: learn: 0.4346867 test: 0.4380364 best: 0.4380364 (27) total: 1.3s remaining: 1.86s 28: learn: 0.4296409 test: 0.4332924 best: 0.4332924 (28) total: 1.35s remaining: 1.82s 29: learn: 0.4253313 test: 0.4291653 best: 0.4291653 (29) total: 1.4s remaining: 1.77s 30: learn: 0.4204059 test: 0.4246803 best: 0.4246803 (30) total: 1.46s remaining: 1.74s 31: learn: 0.4145238 test: 0.4186496 best: 0.4186496 
(31) total: 1.5s remaining: 1.69s 32: learn: 0.4092337 test: 0.4132266 best: 0.4132266 (32) total: 1.55s remaining: 1.64s 33: learn: 0.4042291 test: 0.4082093 best: 0.4082093 (33) total: 1.6s remaining: 1.6s 34: learn: 0.3993601 test: 0.4032629 best: 0.4032629 (34) total: 1.64s remaining: 1.55s 35: learn: 0.3943653 test: 0.3982036 best: 0.3982036 (35) total: 1.7s remaining: 1.51s 36: learn: 0.3905586 test: 0.3946405 best: 0.3946405 (36) total: 1.75s remaining: 1.46s 37: learn: 0.3867558 test: 0.3909160 best: 0.3909160 (37) total: 1.79s remaining: 1.42s 38: learn: 0.3830184 test: 0.3874803 best: 0.3874803 (38) total: 1.84s remaining: 1.37s 39: learn: 0.3798900 test: 0.3845106 best: 0.3845106 (39) total: 1.88s remaining: 1.32s 40: learn: 0.3755675 test: 0.3800844 best: 0.3800844 (40) total: 1.94s remaining: 1.28s 41: learn: 0.3724283 test: 0.3772198 best: 0.3772198 (41) total: 1.99s remaining: 1.23s 42: learn: 0.3689514 test: 0.3737575 best: 0.3737575 (42) total: 2.04s remaining: 1.18s 43: learn: 0.3664588 test: 0.3713688 best: 0.3713688 (43) total: 2.08s remaining: 1.13s 44: learn: 0.3633097 test: 0.3681882 best: 0.3681882 (44) total: 2.12s remaining: 1.08s 45: learn: 0.3591699 test: 0.3638961 best: 0.3638961 (45) total: 2.16s remaining: 1.03s 46: learn: 0.3554180 test: 0.3600380 best: 0.3600380 (46) total: 2.21s remaining: 987ms 47: learn: 0.3521220 test: 0.3567485 best: 0.3567485 (47) total: 2.26s remaining: 941ms 48: learn: 0.3492694 test: 0.3541490 best: 0.3541490 (48) total: 2.31s remaining: 895ms 49: learn: 0.3466741 test: 0.3517528 best: 0.3517528 (49) total: 2.36s remaining: 849ms 50: learn: 0.3438293 test: 0.3488919 best: 0.3488919 (50) total: 2.41s remaining: 802ms 51: learn: 0.3409512 test: 0.3458995 best: 0.3458995 (51) total: 2.45s remaining: 755ms 52: learn: 0.3382639 test: 0.3431421 best: 0.3431421 (52) total: 2.5s remaining: 708ms 53: learn: 0.3349347 test: 0.3397342 best: 0.3397342 (53) total: 2.55s remaining: 662ms 54: learn: 0.3325840 test: 
0.3375971 best: 0.3375971 (54) total: 2.6s remaining: 615ms 55: learn: 0.3299609 test: 0.3349511 best: 0.3349511 (55) total: 2.65s remaining: 568ms 56: learn: 0.3274122 test: 0.3323648 best: 0.3323648 (56) total: 2.7s remaining: 521ms 57: learn: 0.3244213 test: 0.3293023 best: 0.3293023 (57) total: 2.75s remaining: 473ms 58: learn: 0.3225059 test: 0.3274849 best: 0.3274849 (58) total: 2.79s remaining: 425ms 59: learn: 0.3204712 test: 0.3255720 best: 0.3255720 (59) total: 2.84s remaining: 378ms 60: learn: 0.3182302 test: 0.3233158 best: 0.3233158 (60) total: 2.88s remaining: 331ms 61: learn: 0.3162891 test: 0.3214306 best: 0.3214306 (61) total: 2.94s remaining: 285ms 62: learn: 0.3144732 test: 0.3198163 best: 0.3198163 (62) total: 2.98s remaining: 237ms 63: learn: 0.3125707 test: 0.3179617 best: 0.3179617 (63) total: 3.03s remaining: 189ms 64: learn: 0.3109220 test: 0.3164184 best: 0.3164184 (64) total: 3.08s remaining: 142ms 65: learn: 0.3086698 test: 0.3140866 best: 0.3140866 (65) total: 3.13s remaining: 94.7ms 66: learn: 0.3069303 test: 0.3124961 best: 0.3124961 (66) total: 3.17s remaining: 47.4ms 67: learn: 0.3048198 test: 0.3103390 best: 0.3103390 (67) total: 3.22s remaining: 0us bestTest = 0.3103390211 bestIteration = 67 Trial 95, Fold 4: Log loss = 0.3102611696645102, Average precision = 0.9689651838747172, ROC-AUC = 0.9625386870486012, Elapsed Time = 3.3399668999991263 seconds Trial 95, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 95, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0: learn: 0.6813682 test: 0.6815511 best: 0.6815511 (0) total: 34.3ms remaining: 2.3s 1: learn: 0.6687240 test: 0.6697314 best: 0.6697314 (1) total: 75.8ms remaining: 2.5s 2: learn: 0.6564256 test: 0.6583029 best: 0.6583029 (2) total: 121ms remaining: 2.62s 3: learn: 0.6438151 test: 0.6468273 best: 0.6468273 (3) total: 159ms remaining: 2.55s 4: learn: 0.6323606 test: 0.6363883 best: 0.6363883 (4) total: 
197ms remaining: 2.48s 5: learn: 0.6201348 test: 0.6241753 best: 0.6241753 (5) total: 241ms remaining: 2.49s 6: learn: 0.6077253 test: 0.6117999 best: 0.6117999 (6) total: 288ms remaining: 2.51s 7: learn: 0.5952302 test: 0.5994477 best: 0.5994477 (7) total: 337ms remaining: 2.53s 8: learn: 0.5845900 test: 0.5891882 best: 0.5891882 (8) total: 380ms remaining: 2.49s 9: learn: 0.5738358 test: 0.5784604 best: 0.5784604 (9) total: 424ms remaining: 2.46s 10: learn: 0.5642279 test: 0.5695430 best: 0.5695430 (10) total: 469ms remaining: 2.43s 11: learn: 0.5560897 test: 0.5618632 best: 0.5618632 (11) total: 511ms remaining: 2.39s 12: learn: 0.5477646 test: 0.5538991 best: 0.5538991 (12) total: 559ms remaining: 2.37s 13: learn: 0.5391628 test: 0.5458121 best: 0.5458121 (13) total: 606ms remaining: 2.34s 14: learn: 0.5297379 test: 0.5365240 best: 0.5365240 (14) total: 655ms remaining: 2.31s 15: learn: 0.5204829 test: 0.5274008 best: 0.5274008 (15) total: 702ms remaining: 2.28s 16: learn: 0.5120572 test: 0.5190238 best: 0.5190238 (16) total: 753ms remaining: 2.26s 17: learn: 0.5036092 test: 0.5106775 best: 0.5106775 (17) total: 800ms remaining: 2.22s 18: learn: 0.4962696 test: 0.5039997 best: 0.5039997 (18) total: 845ms remaining: 2.18s 19: learn: 0.4877737 test: 0.4954346 best: 0.4954346 (19) total: 894ms remaining: 2.15s 20: learn: 0.4812388 test: 0.4892247 best: 0.4892247 (20) total: 938ms remaining: 2.1s 21: learn: 0.4755042 test: 0.4836822 best: 0.4836822 (21) total: 982ms remaining: 2.05s 22: learn: 0.4694203 test: 0.4776958 best: 0.4776958 (22) total: 1.03s remaining: 2.02s 23: learn: 0.4621913 test: 0.4705452 best: 0.4705452 (23) total: 1.09s remaining: 2s 24: learn: 0.4557149 test: 0.4641205 best: 0.4641205 (24) total: 1.14s remaining: 1.96s 25: learn: 0.4491115 test: 0.4576094 best: 0.4576094 (25) total: 1.19s remaining: 1.92s 26: learn: 0.4429883 test: 0.4514262 best: 0.4514262 (26) total: 1.24s remaining: 1.89s 27: learn: 0.4375649 test: 0.4463248 best: 0.4463248 
(27) total: 1.29s remaining: 1.84s 28: learn: 0.4312972 test: 0.4401532 best: 0.4401532 (28) total: 1.34s remaining: 1.8s 29: learn: 0.4250380 test: 0.4338826 best: 0.4338826 (29) total: 1.39s remaining: 1.75s 30: learn: 0.4199468 test: 0.4287068 best: 0.4287068 (30) total: 1.43s remaining: 1.71s 31: learn: 0.4150240 test: 0.4239663 best: 0.4239663 (31) total: 1.47s remaining: 1.66s 32: learn: 0.4102694 test: 0.4197230 best: 0.4197230 (32) total: 1.52s remaining: 1.61s 33: learn: 0.4046105 test: 0.4139797 best: 0.4139797 (33) total: 1.57s remaining: 1.57s 34: learn: 0.3993363 test: 0.4086503 best: 0.4086503 (34) total: 1.61s remaining: 1.52s 35: learn: 0.3943815 test: 0.4036721 best: 0.4036721 (35) total: 1.66s remaining: 1.47s 36: learn: 0.3903835 test: 0.3999014 best: 0.3999014 (36) total: 1.7s remaining: 1.43s 37: learn: 0.3869134 test: 0.3965667 best: 0.3965667 (37) total: 1.75s remaining: 1.38s 38: learn: 0.3829815 test: 0.3927216 best: 0.3927216 (38) total: 1.79s remaining: 1.33s 39: learn: 0.3790169 test: 0.3886690 best: 0.3886690 (39) total: 1.83s remaining: 1.28s 40: learn: 0.3753529 test: 0.3852388 best: 0.3852388 (40) total: 1.88s remaining: 1.24s 41: learn: 0.3719215 test: 0.3819317 best: 0.3819317 (41) total: 1.92s remaining: 1.19s 42: learn: 0.3680800 test: 0.3780540 best: 0.3780540 (42) total: 1.97s remaining: 1.14s 43: learn: 0.3639919 test: 0.3739339 best: 0.3739339 (43) total: 2.02s remaining: 1.1s 44: learn: 0.3612885 test: 0.3713772 best: 0.3713772 (44) total: 2.08s remaining: 1.06s 45: learn: 0.3575894 test: 0.3676301 best: 0.3676301 (45) total: 2.13s remaining: 1.02s 46: learn: 0.3548934 test: 0.3652935 best: 0.3652935 (46) total: 2.18s remaining: 974ms 47: learn: 0.3515211 test: 0.3619061 best: 0.3619061 (47) total: 2.22s remaining: 926ms 48: learn: 0.3484642 test: 0.3588291 best: 0.3588291 (48) total: 2.27s remaining: 881ms 49: learn: 0.3453657 test: 0.3560634 best: 0.3560634 (49) total: 2.32s remaining: 834ms 50: learn: 0.3423001 test: 
0.3530037 best: 0.3530037 (50) total: 2.37s remaining: 788ms 51: learn: 0.3389399 test: 0.3495911 best: 0.3495911 (51) total: 2.41s remaining: 742ms 52: learn: 0.3361214 test: 0.3465500 best: 0.3465500 (52) total: 2.46s remaining: 697ms 53: learn: 0.3333862 test: 0.3437192 best: 0.3437192 (53) total: 2.51s remaining: 652ms 54: learn: 0.3306872 test: 0.3412171 best: 0.3412171 (54) total: 2.56s remaining: 606ms 55: learn: 0.3276169 test: 0.3381185 best: 0.3381185 (55) total: 2.61s remaining: 560ms 56: learn: 0.3254146 test: 0.3360262 best: 0.3360262 (56) total: 2.66s remaining: 513ms 57: learn: 0.3225408 test: 0.3331141 best: 0.3331141 (57) total: 2.7s remaining: 466ms 58: learn: 0.3204940 test: 0.3311853 best: 0.3311853 (58) total: 2.75s remaining: 420ms 59: learn: 0.3181974 test: 0.3288705 best: 0.3288705 (59) total: 2.79s remaining: 373ms 60: learn: 0.3162493 test: 0.3270199 best: 0.3270199 (60) total: 2.84s remaining: 326ms 61: learn: 0.3140870 test: 0.3247441 best: 0.3247441 (61) total: 2.89s remaining: 280ms 62: learn: 0.3117351 test: 0.3224042 best: 0.3224042 (62) total: 2.93s remaining: 233ms 63: learn: 0.3098173 test: 0.3205013 best: 0.3205013 (63) total: 2.98s remaining: 187ms 64: learn: 0.3081825 test: 0.3191062 best: 0.3191062 (64) total: 3.03s remaining: 140ms 65: learn: 0.3064345 test: 0.3174107 best: 0.3174107 (65) total: 3.07s remaining: 93.1ms 66: learn: 0.3048205 test: 0.3160237 best: 0.3160237 (66) total: 3.13s remaining: 46.7ms 67: learn: 0.3024380 test: 0.3136068 best: 0.3136068 (67) total: 3.17s remaining: 0us bestTest = 0.3136068054 bestIteration = 67 Trial 95, Fold 5: Log loss = 0.31342355489084844, Average precision = 0.9655505776422212, ROC-AUC = 0.9594478197053304, Elapsed Time = 3.287348599995312 seconds
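The per-fold summary lines report log loss, average precision, and ROC-AUC on the validation predictions. A minimal sketch of how such metrics can be computed with scikit-learn (toy labels and probabilities for illustration, not the notebook's actual data):

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

# Toy validation labels and predicted probabilities of the positive class
# (illustrative only; the notebook scores CatBoost fold predictions this way).
y_val = np.array([0, 1, 1, 0, 1, 0, 1, 1])
p_val = np.array([0.1, 0.8, 0.7, 0.3, 0.9, 0.2, 0.6, 0.85])

metrics = {
    "log_loss": log_loss(y_val, p_val),                  # lower is better
    "average_precision": average_precision_score(y_val, p_val),
    "roc_auc": roc_auc_score(y_val, p_val),
}
for name, value in metrics.items():
    print(f"{name} = {value:.4f}")
```

Note that log loss rewards well-calibrated probabilities, while average precision and ROC-AUC only depend on the ranking of the scores, which is why the three metrics can diverge across trials.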
Optimization Progress: 96%|#########6| 96/100 [2:53:48<03:45, 56.39s/it]
[CatBoost per-iteration training log elided. Trial 96 folds ran 86 boosting iterations, with the best validation score at the final iteration (85).]

Trial 96, Fold 1: Train size = 20663 (0 = 10533, 1 = 10130, 0/1 = 1.0398), Validation size = 5175 (0 = 2592, 1 = 2583, 0/1 = 1.0035)
Trial 96, Fold 1: bestTest = 0.2038675 (iteration 85) | Log loss = 0.2038675, Average precision = 0.9733697, ROC-AUC = 0.9685632, Elapsed Time = 6.41 s
Trial 96, Fold 2: Train size = 20701 (0 = 10471, 1 = 10230, 0/1 = 1.0236), Validation size = 5137 (0 = 2654, 1 = 2483, 0/1 = 1.0689)
Trial 96, Fold 2: bestTest = 0.1963117 (iteration 85) | Log loss = 0.1963117, Average precision = 0.9740710, ROC-AUC = 0.9711710, Elapsed Time = 6.34 s
Trial 96, Fold 3: Train size = 20682 (0 = 10517, 1 = 10165, 0/1 = 1.0346), Validation size = 5156 (0 = 2608, 1 = 2548, 0/1 = 1.0235)
[Trial 96, Fold 3 per-iteration log elided through iteration 67.]
remaining: 1.28s 68: learn: 0.1914516 test: 0.1998439 best: 0.1998439 (68) total: 4.92s remaining: 1.21s 69: learn: 0.1912018 test: 0.1997182 best: 0.1997182 (69) total: 4.99s remaining: 1.14s 70: learn: 0.1910048 test: 0.1995774 best: 0.1995774 (70) total: 5.07s remaining: 1.07s 71: learn: 0.1904660 test: 0.1989021 best: 0.1989021 (71) total: 5.14s remaining: 1000ms 72: learn: 0.1902867 test: 0.1987112 best: 0.1987112 (72) total: 5.2s remaining: 926ms 73: learn: 0.1901612 test: 0.1986277 best: 0.1986277 (73) total: 5.25s remaining: 851ms 74: learn: 0.1897263 test: 0.1983760 best: 0.1983760 (74) total: 5.32s remaining: 780ms 75: learn: 0.1893881 test: 0.1982577 best: 0.1982577 (75) total: 5.39s remaining: 709ms 76: learn: 0.1889329 test: 0.1982248 best: 0.1982248 (76) total: 5.46s remaining: 639ms 77: learn: 0.1881982 test: 0.1977513 best: 0.1977513 (77) total: 5.54s remaining: 568ms 78: learn: 0.1876954 test: 0.1973330 best: 0.1973330 (78) total: 5.62s remaining: 498ms 79: learn: 0.1871970 test: 0.1970746 best: 0.1970746 (79) total: 5.7s remaining: 428ms 80: learn: 0.1867214 test: 0.1968306 best: 0.1968306 (80) total: 5.79s remaining: 357ms 81: learn: 0.1866608 test: 0.1967727 best: 0.1967727 (81) total: 5.87s remaining: 286ms 82: learn: 0.1865457 test: 0.1967283 best: 0.1967283 (82) total: 5.94s remaining: 215ms 83: learn: 0.1859901 test: 0.1964380 best: 0.1964380 (83) total: 6.04s remaining: 144ms 84: learn: 0.1851966 test: 0.1962080 best: 0.1962080 (84) total: 6.12s remaining: 72ms 85: learn: 0.1851646 test: 0.1961880 best: 0.1961880 (85) total: 6.18s remaining: 0us bestTest = 0.1961880349 bestIteration = 85 Trial 96, Fold 3: Log loss = 0.19618803491667963, Average precision = 0.9748705927722668, ROC-AUC = 0.9713855176922113, Elapsed Time = 6.298042399997939 seconds Trial 96, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 96, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 0: learn: 
0.6185156 test: 0.6185551 best: 0.6185551 (0) total: 43.1ms remaining: 3.67s 1: learn: 0.5565949 test: 0.5571257 best: 0.5571257 (1) total: 120ms remaining: 5.04s 2: learn: 0.5109719 test: 0.5117847 best: 0.5117847 (2) total: 196ms remaining: 5.42s 3: learn: 0.4634355 test: 0.4648080 best: 0.4648080 (3) total: 271ms remaining: 5.55s 4: learn: 0.4207647 test: 0.4221785 best: 0.4221785 (4) total: 358ms remaining: 5.8s 5: learn: 0.3923487 test: 0.3935506 best: 0.3935506 (5) total: 452ms remaining: 6.03s 6: learn: 0.3691179 test: 0.3706770 best: 0.3706770 (6) total: 542ms remaining: 6.11s 7: learn: 0.3479795 test: 0.3498225 best: 0.3498225 (7) total: 629ms remaining: 6.13s 8: learn: 0.3317606 test: 0.3339255 best: 0.3339255 (8) total: 719ms remaining: 6.15s 9: learn: 0.3166100 test: 0.3190168 best: 0.3190168 (9) total: 809ms remaining: 6.14s 10: learn: 0.3021165 test: 0.3044017 best: 0.3044017 (10) total: 900ms remaining: 6.13s 11: learn: 0.2929172 test: 0.2949581 best: 0.2949581 (11) total: 1s remaining: 6.18s 12: learn: 0.2829435 test: 0.2852789 best: 0.2852789 (12) total: 1.1s remaining: 6.17s 13: learn: 0.2752477 test: 0.2782232 best: 0.2782232 (13) total: 1.19s remaining: 6.14s 14: learn: 0.2661079 test: 0.2687365 best: 0.2687365 (14) total: 1.3s remaining: 6.16s 15: learn: 0.2585176 test: 0.2609075 best: 0.2609075 (15) total: 1.42s remaining: 6.2s 16: learn: 0.2535691 test: 0.2560429 best: 0.2560429 (16) total: 1.51s remaining: 6.12s 17: learn: 0.2483101 test: 0.2511973 best: 0.2511973 (17) total: 1.6s remaining: 6.05s 18: learn: 0.2445013 test: 0.2472019 best: 0.2472019 (18) total: 1.69s remaining: 5.96s 19: learn: 0.2412903 test: 0.2438025 best: 0.2438025 (19) total: 1.78s remaining: 5.88s 20: learn: 0.2375121 test: 0.2401671 best: 0.2401671 (20) total: 1.87s remaining: 5.79s 21: learn: 0.2347055 test: 0.2374720 best: 0.2374720 (21) total: 1.95s remaining: 5.68s 22: learn: 0.2314358 test: 0.2343905 best: 0.2343905 (22) total: 2.04s remaining: 5.6s 23: learn: 
0.2302699 test: 0.2334381 best: 0.2334381 (23) total: 2.1s remaining: 5.43s 24: learn: 0.2281823 test: 0.2316216 best: 0.2316216 (24) total: 2.2s remaining: 5.36s 25: learn: 0.2267399 test: 0.2304706 best: 0.2304706 (25) total: 2.3s remaining: 5.32s 26: learn: 0.2251785 test: 0.2292737 best: 0.2292737 (26) total: 2.4s remaining: 5.24s 27: learn: 0.2231874 test: 0.2276575 best: 0.2276575 (27) total: 2.48s remaining: 5.13s 28: learn: 0.2215326 test: 0.2262885 best: 0.2262885 (28) total: 2.56s remaining: 5.04s 29: learn: 0.2197683 test: 0.2246194 best: 0.2246194 (29) total: 2.65s remaining: 4.94s 30: learn: 0.2173259 test: 0.2222077 best: 0.2222077 (30) total: 2.82s remaining: 5.01s 31: learn: 0.2153021 test: 0.2205124 best: 0.2205124 (31) total: 2.92s remaining: 4.93s 32: learn: 0.2134550 test: 0.2187812 best: 0.2187812 (32) total: 3.01s remaining: 4.84s 33: learn: 0.2119166 test: 0.2175970 best: 0.2175970 (33) total: 3.1s remaining: 4.74s 34: learn: 0.2108207 test: 0.2168381 best: 0.2168381 (34) total: 3.19s remaining: 4.65s 35: learn: 0.2100983 test: 0.2163371 best: 0.2163371 (35) total: 3.28s remaining: 4.56s 36: learn: 0.2089176 test: 0.2153339 best: 0.2153339 (36) total: 3.36s remaining: 4.45s 37: learn: 0.2078561 test: 0.2145010 best: 0.2145010 (37) total: 3.44s remaining: 4.35s 38: learn: 0.2077697 test: 0.2144566 best: 0.2144566 (38) total: 3.49s remaining: 4.21s 39: learn: 0.2065347 test: 0.2132553 best: 0.2132553 (39) total: 3.57s remaining: 4.1s 40: learn: 0.2051080 test: 0.2120555 best: 0.2120555 (40) total: 3.65s remaining: 4s 41: learn: 0.2041266 test: 0.2112730 best: 0.2112730 (41) total: 3.73s remaining: 3.91s 42: learn: 0.2036988 test: 0.2110451 best: 0.2110451 (42) total: 3.81s remaining: 3.81s 43: learn: 0.2029747 test: 0.2104913 best: 0.2104913 (43) total: 3.9s remaining: 3.72s 44: learn: 0.2025209 test: 0.2100588 best: 0.2100588 (44) total: 3.98s remaining: 3.62s 45: learn: 0.2021373 test: 0.2097806 best: 0.2097806 (45) total: 4.05s remaining: 
3.53s 46: learn: 0.2014061 test: 0.2093952 best: 0.2093952 (46) total: 4.14s remaining: 3.43s 47: learn: 0.2007171 test: 0.2088136 best: 0.2088136 (47) total: 4.22s remaining: 3.34s 48: learn: 0.2001683 test: 0.2087248 best: 0.2087248 (48) total: 4.3s remaining: 3.25s 49: learn: 0.1998631 test: 0.2082665 best: 0.2082665 (49) total: 4.36s remaining: 3.14s 50: learn: 0.1994922 test: 0.2082560 best: 0.2082560 (50) total: 4.45s remaining: 3.05s 51: learn: 0.1985019 test: 0.2077601 best: 0.2077601 (51) total: 4.53s remaining: 2.96s 52: learn: 0.1981092 test: 0.2074149 best: 0.2074149 (52) total: 4.62s remaining: 2.88s 53: learn: 0.1978473 test: 0.2072049 best: 0.2072049 (53) total: 4.7s remaining: 2.79s 54: learn: 0.1969516 test: 0.2068279 best: 0.2068279 (54) total: 4.79s remaining: 2.7s 55: learn: 0.1963526 test: 0.2063553 best: 0.2063553 (55) total: 4.88s remaining: 2.61s 56: learn: 0.1958439 test: 0.2061235 best: 0.2061235 (56) total: 4.97s remaining: 2.53s 57: learn: 0.1957814 test: 0.2061384 best: 0.2061235 (56) total: 5.01s remaining: 2.42s 58: learn: 0.1953600 test: 0.2059313 best: 0.2059313 (58) total: 5.09s remaining: 2.33s 59: learn: 0.1948363 test: 0.2056184 best: 0.2056184 (59) total: 5.18s remaining: 2.24s 60: learn: 0.1941929 test: 0.2050832 best: 0.2050832 (60) total: 5.26s remaining: 2.15s 61: learn: 0.1937013 test: 0.2048752 best: 0.2048752 (61) total: 5.35s remaining: 2.07s 62: learn: 0.1929017 test: 0.2047419 best: 0.2047419 (62) total: 5.43s remaining: 1.98s 63: learn: 0.1923543 test: 0.2043188 best: 0.2043188 (63) total: 5.52s remaining: 1.9s 64: learn: 0.1920301 test: 0.2041550 best: 0.2041550 (64) total: 5.6s remaining: 1.81s 65: learn: 0.1919489 test: 0.2040799 best: 0.2040799 (65) total: 5.65s remaining: 1.71s 66: learn: 0.1914809 test: 0.2038612 best: 0.2038612 (66) total: 5.72s remaining: 1.62s 67: learn: 0.1911932 test: 0.2039513 best: 0.2038612 (66) total: 5.81s remaining: 1.54s 68: learn: 0.1904883 test: 0.2034127 best: 0.2034127 (68) 
total: 5.89s remaining: 1.45s 69: learn: 0.1899233 test: 0.2029839 best: 0.2029839 (69) total: 5.97s remaining: 1.36s 70: learn: 0.1895403 test: 0.2028165 best: 0.2028165 (70) total: 6.04s remaining: 1.28s 71: learn: 0.1891002 test: 0.2026976 best: 0.2026976 (71) total: 6.13s remaining: 1.19s 72: learn: 0.1883483 test: 0.2024458 best: 0.2024458 (72) total: 6.22s remaining: 1.11s 73: learn: 0.1881279 test: 0.2022878 best: 0.2022878 (73) total: 6.3s remaining: 1.02s 74: learn: 0.1873485 test: 0.2017774 best: 0.2017774 (74) total: 6.38s remaining: 936ms 75: learn: 0.1868966 test: 0.2015070 best: 0.2015070 (75) total: 6.46s remaining: 850ms 76: learn: 0.1868517 test: 0.2014701 best: 0.2014701 (76) total: 6.5s remaining: 760ms 77: learn: 0.1862306 test: 0.2009478 best: 0.2009478 (77) total: 6.58s remaining: 675ms 78: learn: 0.1857461 test: 0.2006798 best: 0.2006798 (78) total: 6.66s remaining: 590ms 79: learn: 0.1852169 test: 0.2006765 best: 0.2006765 (79) total: 6.74s remaining: 505ms 80: learn: 0.1845227 test: 0.2003868 best: 0.2003868 (80) total: 6.82s remaining: 421ms 81: learn: 0.1844260 test: 0.2004759 best: 0.2003868 (80) total: 6.89s remaining: 336ms 82: learn: 0.1841591 test: 0.2003321 best: 0.2003321 (82) total: 6.97s remaining: 252ms 83: learn: 0.1836666 test: 0.2004532 best: 0.2003321 (82) total: 7.05s remaining: 168ms 84: learn: 0.1832965 test: 0.2002027 best: 0.2002027 (84) total: 7.13s remaining: 83.9ms 85: learn: 0.1830048 test: 0.2000830 best: 0.2000830 (85) total: 7.21s remaining: 0us bestTest = 0.2000829843 bestIteration = 85 Trial 96, Fold 4: Log loss = 0.2000829843369374, Average precision = 0.9743039508856786, ROC-AUC = 0.9693250153198327, Elapsed Time = 7.345657800004119 seconds Trial 96, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 96, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0: learn: 0.6136591 test: 0.6156909 best: 0.6156909 (0) total: 65.7ms remaining: 5.59s 1: 
learn: 0.5464123 test: 0.5493478 best: 0.5493478 (1) total: 139ms remaining: 5.86s 2: learn: 0.4884498 test: 0.4926981 best: 0.4926981 (2) total: 205ms remaining: 5.67s 3: learn: 0.4410509 test: 0.4457874 best: 0.4457874 (3) total: 264ms remaining: 5.41s 4: learn: 0.3986203 test: 0.4040598 best: 0.4040598 (4) total: 339ms remaining: 5.49s 5: learn: 0.3750653 test: 0.3809843 best: 0.3809843 (5) total: 383ms remaining: 5.1s 6: learn: 0.3539471 test: 0.3605275 best: 0.3605275 (6) total: 438ms remaining: 4.94s 7: learn: 0.3369067 test: 0.3439251 best: 0.3439251 (7) total: 508ms remaining: 4.95s 8: learn: 0.3170192 test: 0.3253355 best: 0.3253355 (8) total: 579ms remaining: 4.96s 9: learn: 0.3008885 test: 0.3099211 best: 0.3099211 (9) total: 649ms remaining: 4.93s 10: learn: 0.2878629 test: 0.2975414 best: 0.2975414 (10) total: 720ms remaining: 4.91s 11: learn: 0.2804532 test: 0.2899574 best: 0.2899574 (11) total: 792ms remaining: 4.88s 12: learn: 0.2717219 test: 0.2814659 best: 0.2814659 (12) total: 865ms remaining: 4.86s 13: learn: 0.2637026 test: 0.2735733 best: 0.2735733 (13) total: 937ms remaining: 4.82s 14: learn: 0.2569579 test: 0.2671260 best: 0.2671260 (14) total: 1.01s remaining: 4.77s 15: learn: 0.2522560 test: 0.2626285 best: 0.2626285 (15) total: 1.06s remaining: 4.64s 16: learn: 0.2463760 test: 0.2573618 best: 0.2573618 (16) total: 1.13s remaining: 4.59s 17: learn: 0.2420966 test: 0.2533611 best: 0.2533611 (17) total: 1.21s remaining: 4.55s 18: learn: 0.2392937 test: 0.2508396 best: 0.2508396 (18) total: 1.28s remaining: 4.53s 19: learn: 0.2367690 test: 0.2486882 best: 0.2486882 (19) total: 1.36s remaining: 4.5s 20: learn: 0.2350783 test: 0.2470880 best: 0.2470880 (20) total: 1.44s remaining: 4.46s 21: learn: 0.2306296 test: 0.2437308 best: 0.2437308 (21) total: 1.53s remaining: 4.46s 22: learn: 0.2280425 test: 0.2412473 best: 0.2412473 (22) total: 1.61s remaining: 4.41s 23: learn: 0.2258890 test: 0.2393883 best: 0.2393883 (23) total: 1.68s remaining: 
4.35s 24: learn: 0.2238820 test: 0.2375787 best: 0.2375787 (24) total: 1.75s remaining: 4.28s 25: learn: 0.2223662 test: 0.2362170 best: 0.2362170 (25) total: 1.83s remaining: 4.22s 26: learn: 0.2202592 test: 0.2342141 best: 0.2342141 (26) total: 1.9s remaining: 4.16s 27: learn: 0.2193613 test: 0.2334684 best: 0.2334684 (27) total: 1.96s remaining: 4.05s 28: learn: 0.2176315 test: 0.2318391 best: 0.2318391 (28) total: 2.03s remaining: 3.99s 29: learn: 0.2162355 test: 0.2305151 best: 0.2305151 (29) total: 2.1s remaining: 3.92s 30: learn: 0.2146253 test: 0.2288846 best: 0.2288846 (30) total: 2.17s remaining: 3.86s 31: learn: 0.2122014 test: 0.2269549 best: 0.2269549 (31) total: 2.25s remaining: 3.79s 32: learn: 0.2109367 test: 0.2261423 best: 0.2261423 (32) total: 2.32s remaining: 3.72s 33: learn: 0.2097972 test: 0.2252630 best: 0.2252630 (33) total: 2.39s remaining: 3.65s 34: learn: 0.2091251 test: 0.2244662 best: 0.2244662 (34) total: 2.46s remaining: 3.58s 35: learn: 0.2080766 test: 0.2238556 best: 0.2238556 (35) total: 2.53s remaining: 3.51s 36: learn: 0.2071224 test: 0.2230219 best: 0.2230219 (36) total: 2.6s remaining: 3.44s 37: learn: 0.2062190 test: 0.2223302 best: 0.2223302 (37) total: 2.67s remaining: 3.38s 38: learn: 0.2053184 test: 0.2216371 best: 0.2216371 (38) total: 2.74s remaining: 3.3s 39: learn: 0.2040328 test: 0.2205413 best: 0.2205413 (39) total: 2.82s remaining: 3.24s 40: learn: 0.2037262 test: 0.2203446 best: 0.2203446 (40) total: 2.86s remaining: 3.14s 41: learn: 0.2027216 test: 0.2194914 best: 0.2194914 (41) total: 2.92s remaining: 3.06s 42: learn: 0.2018970 test: 0.2188577 best: 0.2188577 (42) total: 3s remaining: 3s 43: learn: 0.2009822 test: 0.2183486 best: 0.2183486 (43) total: 3.07s remaining: 2.94s 44: learn: 0.2000918 test: 0.2175973 best: 0.2175973 (44) total: 3.15s remaining: 2.87s 45: learn: 0.1997871 test: 0.2173321 best: 0.2173321 (45) total: 3.2s remaining: 2.78s 46: learn: 0.1991222 test: 0.2168226 best: 0.2168226 (46) total: 
3.27s remaining: 2.72s 47: learn: 0.1984722 test: 0.2163681 best: 0.2163681 (47) total: 3.35s remaining: 2.65s 48: learn: 0.1979508 test: 0.2159630 best: 0.2159630 (48) total: 3.42s remaining: 2.58s 49: learn: 0.1973984 test: 0.2155711 best: 0.2155711 (49) total: 3.49s remaining: 2.51s 50: learn: 0.1969213 test: 0.2154064 best: 0.2154064 (50) total: 3.56s remaining: 2.44s 51: learn: 0.1963904 test: 0.2150989 best: 0.2150989 (51) total: 3.63s remaining: 2.38s 52: learn: 0.1962762 test: 0.2150208 best: 0.2150208 (52) total: 3.67s remaining: 2.28s 53: learn: 0.1955557 test: 0.2145149 best: 0.2145149 (53) total: 3.74s remaining: 2.22s 54: learn: 0.1952949 test: 0.2144391 best: 0.2144391 (54) total: 3.82s remaining: 2.15s 55: learn: 0.1949752 test: 0.2142957 best: 0.2142957 (55) total: 3.89s remaining: 2.08s 56: learn: 0.1943398 test: 0.2139079 best: 0.2139079 (56) total: 3.96s remaining: 2.01s 57: learn: 0.1939676 test: 0.2138536 best: 0.2138536 (57) total: 4.04s remaining: 1.95s 58: learn: 0.1935621 test: 0.2134578 best: 0.2134578 (58) total: 4.11s remaining: 1.88s 59: learn: 0.1924758 test: 0.2125876 best: 0.2125876 (59) total: 4.18s remaining: 1.81s 60: learn: 0.1916854 test: 0.2122834 best: 0.2122834 (60) total: 4.26s remaining: 1.75s 61: learn: 0.1915044 test: 0.2122892 best: 0.2122834 (60) total: 4.32s remaining: 1.67s 62: learn: 0.1910578 test: 0.2120934 best: 0.2120934 (62) total: 4.39s remaining: 1.6s 63: learn: 0.1907439 test: 0.2120256 best: 0.2120256 (63) total: 4.47s remaining: 1.54s 64: learn: 0.1906746 test: 0.2119037 best: 0.2119037 (64) total: 4.5s remaining: 1.46s 65: learn: 0.1899217 test: 0.2113880 best: 0.2113880 (65) total: 4.57s remaining: 1.39s 66: learn: 0.1896519 test: 0.2113001 best: 0.2113001 (66) total: 4.64s remaining: 1.32s 67: learn: 0.1890033 test: 0.2109392 best: 0.2109392 (67) total: 4.72s remaining: 1.25s 68: learn: 0.1884623 test: 0.2109842 best: 0.2109392 (67) total: 4.79s remaining: 1.18s 69: learn: 0.1883098 test: 0.2108555 best: 
0.2108555 (69) total: 4.86s remaining: 1.11s 70: learn: 0.1879329 test: 0.2106382 best: 0.2106382 (70) total: 4.94s remaining: 1.04s 71: learn: 0.1877834 test: 0.2106118 best: 0.2106118 (71) total: 5.01s remaining: 974ms 72: learn: 0.1875240 test: 0.2106179 best: 0.2106118 (71) total: 5.08s remaining: 905ms 73: learn: 0.1870424 test: 0.2104008 best: 0.2104008 (73) total: 5.15s remaining: 836ms 74: learn: 0.1864984 test: 0.2099027 best: 0.2099027 (74) total: 5.22s remaining: 766ms 75: learn: 0.1858857 test: 0.2095619 best: 0.2095619 (75) total: 5.29s remaining: 696ms 76: learn: 0.1854965 test: 0.2093075 best: 0.2093075 (76) total: 5.36s remaining: 627ms 77: learn: 0.1852165 test: 0.2092143 best: 0.2092143 (77) total: 5.43s remaining: 557ms 78: learn: 0.1849627 test: 0.2090338 best: 0.2090338 (78) total: 5.5s remaining: 488ms 79: learn: 0.1847409 test: 0.2090166 best: 0.2090166 (79) total: 5.58s remaining: 418ms 80: learn: 0.1845241 test: 0.2089737 best: 0.2089737 (80) total: 5.65s remaining: 349ms 81: learn: 0.1842195 test: 0.2088971 best: 0.2088971 (81) total: 5.72s remaining: 279ms 82: learn: 0.1839125 test: 0.2086203 best: 0.2086203 (82) total: 5.79s remaining: 209ms 83: learn: 0.1837536 test: 0.2085180 best: 0.2085180 (83) total: 5.86s remaining: 140ms 84: learn: 0.1832151 test: 0.2082678 best: 0.2082678 (84) total: 5.94s remaining: 69.8ms 85: learn: 0.1827516 test: 0.2081256 best: 0.2081256 (85) total: 6.01s remaining: 0us bestTest = 0.2081256289 bestIteration = 85 Trial 96, Fold 5: Log loss = 0.208125628933512, Average precision = 0.9717449732708364, ROC-AUC = 0.9684660461141148, Elapsed Time = 6.138437400004477 seconds
Optimization Progress: 97%|#########7| 97/100 [2:54:29<02:35, 51.70s/it]
Trial 97, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371 Trial 97, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913 0: learn: 0.6242062 test: 0.6253154 best: 0.6253154 (0) total: 217ms remaining: 19.1s 1: learn: 0.5671448 test: 0.5694278 best: 0.5694278 (1) total: 429ms remaining: 18.6s 2: learn: 0.5163842 test: 0.5199639 best: 0.5199639 (2) total: 659ms remaining: 18.9s 3: learn: 0.4740568 test: 0.4779965 best: 0.4779965 (3) total: 888ms remaining: 18.9s 4: learn: 0.4378443 test: 0.4425196 best: 0.4425196 (4) total: 1.12s remaining: 18.8s 5: learn: 0.4060623 test: 0.4114563 best: 0.4114563 (5) total: 1.35s remaining: 18.7s 6: learn: 0.3805406 test: 0.3864833 best: 0.3864833 (6) total: 1.58s remaining: 18.5s 7: learn: 0.3584780 test: 0.3648486 best: 0.3648486 (7) total: 1.81s remaining: 18.3s 8: learn: 0.3384992 test: 0.3455346 best: 0.3455346 (8) total: 2.04s remaining: 18.1s 9: learn: 0.3220730 test: 0.3297077 best: 0.3297077 (9) total: 2.29s remaining: 18.1s 10: learn: 0.3086627 test: 0.3170717 best: 0.3170717 (10) total: 2.54s remaining: 18s 11: learn: 0.2959096 test: 0.3047721 best: 0.3047721 (11) total: 2.81s remaining: 18s 12: learn: 0.2849314 test: 0.2946258 best: 0.2946258 (12) total: 3.07s remaining: 18s 13: learn: 0.2752805 test: 0.2857239 best: 0.2857239 (13) total: 3.34s remaining: 17.9s 14: learn: 0.2679768 test: 0.2794892 best: 0.2794892 (14) total: 3.61s remaining: 17.8s 15: learn: 0.2608106 test: 0.2728574 best: 0.2728574 (15) total: 3.88s remaining: 17.7s 16: learn: 0.2543859 test: 0.2675784 best: 0.2675784 (16) total: 4.14s remaining: 17.6s 17: learn: 0.2495587 test: 0.2635074 best: 0.2635074 (17) total: 4.42s remaining: 17.4s 18: learn: 0.2436825 test: 0.2583816 best: 0.2583816 (18) total: 4.7s remaining: 17.3s 19: learn: 0.2390075 test: 0.2542355 best: 0.2542355 (19) total: 4.95s remaining: 17.1s 20: learn: 0.2348032 test: 0.2502397 best: 0.2502397 (20) total: 5.21s remaining: 
16.9s 21: learn: 0.2324161 test: 0.2481824 best: 0.2481824 (21) total: 5.46s remaining: 16.6s 22: learn: 0.2290584 test: 0.2452608 best: 0.2452608 (22) total: 5.74s remaining: 16.5s 23: learn: 0.2257596 test: 0.2423784 best: 0.2423784 (23) total: 6.03s remaining: 16.3s 24: learn: 0.2228562 test: 0.2398357 best: 0.2398357 (24) total: 6.3s remaining: 16.1s 25: learn: 0.2203129 test: 0.2375370 best: 0.2375370 (25) total: 6.56s remaining: 15.9s 26: learn: 0.2180720 test: 0.2361718 best: 0.2361718 (26) total: 6.83s remaining: 15.7s 27: learn: 0.2157748 test: 0.2343780 best: 0.2343780 (27) total: 7.09s remaining: 15.5s 28: learn: 0.2141390 test: 0.2328619 best: 0.2328619 (28) total: 7.35s remaining: 15.2s 29: learn: 0.2125422 test: 0.2317722 best: 0.2317722 (29) total: 7.6s remaining: 15s 30: learn: 0.2106072 test: 0.2299068 best: 0.2299068 (30) total: 7.87s remaining: 14.7s 31: learn: 0.2085843 test: 0.2281493 best: 0.2281493 (31) total: 8.13s remaining: 14.5s 32: learn: 0.2070166 test: 0.2269586 best: 0.2269586 (32) total: 8.38s remaining: 14.2s 33: learn: 0.2058215 test: 0.2258609 best: 0.2258609 (33) total: 8.63s remaining: 14s 34: learn: 0.2049206 test: 0.2249597 best: 0.2249597 (34) total: 8.88s remaining: 13.7s 35: learn: 0.2043629 test: 0.2245816 best: 0.2245816 (35) total: 9.13s remaining: 13.4s 36: learn: 0.2033601 test: 0.2240581 best: 0.2240581 (36) total: 9.39s remaining: 13.2s 37: learn: 0.2022582 test: 0.2238995 best: 0.2238995 (37) total: 9.65s remaining: 12.9s 38: learn: 0.2012815 test: 0.2232463 best: 0.2232463 (38) total: 9.9s remaining: 12.7s 39: learn: 0.2008220 test: 0.2227878 best: 0.2227878 (39) total: 9.95s remaining: 12.2s 40: learn: 0.2001349 test: 0.2222828 best: 0.2222828 (40) total: 10.2s remaining: 11.9s 41: learn: 0.1989723 test: 0.2216240 best: 0.2216240 (41) total: 10.5s remaining: 11.7s 42: learn: 0.1971614 test: 0.2204266 best: 0.2204266 (42) total: 10.7s remaining: 11.5s 43: learn: 0.1971602 test: 0.2204284 best: 0.2204266 (42) total: 
10.8s remaining: 11s 44: learn: 0.1964244 test: 0.2200735 best: 0.2200735 (44) total: 11s remaining: 10.8s 45: learn: 0.1959679 test: 0.2199129 best: 0.2199129 (45) total: 11.3s remaining: 10.5s 46: learn: 0.1951479 test: 0.2193705 best: 0.2193705 (46) total: 11.5s remaining: 10.3s 47: learn: 0.1944683 test: 0.2188802 best: 0.2188802 (47) total: 11.8s remaining: 10.1s 48: learn: 0.1943765 test: 0.2187890 best: 0.2187890 (48) total: 11.8s remaining: 9.64s 49: learn: 0.1933604 test: 0.2182976 best: 0.2182976 (49) total: 12.1s remaining: 9.41s 50: learn: 0.1925041 test: 0.2178857 best: 0.2178857 (50) total: 12.3s remaining: 9.18s 51: learn: 0.1918871 test: 0.2174449 best: 0.2174449 (51) total: 12.6s remaining: 8.95s 52: learn: 0.1912652 test: 0.2173047 best: 0.2173047 (52) total: 12.8s remaining: 8.73s 53: learn: 0.1912641 test: 0.2173025 best: 0.2173025 (53) total: 12.9s remaining: 8.35s 54: learn: 0.1905756 test: 0.2172911 best: 0.2172911 (54) total: 13.1s remaining: 8.13s 55: learn: 0.1903093 test: 0.2170755 best: 0.2170755 (55) total: 13.4s remaining: 7.89s 56: learn: 0.1898951 test: 0.2169643 best: 0.2169643 (56) total: 13.6s remaining: 7.66s 57: learn: 0.1886837 test: 0.2167509 best: 0.2167509 (57) total: 13.9s remaining: 7.44s 58: learn: 0.1880797 test: 0.2165140 best: 0.2165140 (58) total: 14.2s remaining: 7.21s 59: learn: 0.1880713 test: 0.2165027 best: 0.2165027 (59) total: 14.2s remaining: 6.87s 60: learn: 0.1868597 test: 0.2164143 best: 0.2164143 (60) total: 14.5s remaining: 6.64s 61: learn: 0.1860281 test: 0.2156824 best: 0.2156824 (61) total: 14.7s remaining: 6.41s 62: learn: 0.1856636 test: 0.2153766 best: 0.2153766 (62) total: 15s remaining: 6.18s 63: learn: 0.1852998 test: 0.2152143 best: 0.2152143 (63) total: 15.2s remaining: 5.95s 64: learn: 0.1847636 test: 0.2146244 best: 0.2146244 (64) total: 15.5s remaining: 5.71s 65: learn: 0.1841160 test: 0.2141422 best: 0.2141422 (65) total: 15.7s remaining: 5.48s 66: learn: 0.1841023 test: 0.2141419 best: 
0.2141419 (66) total: 15.8s remaining: 5.18s 67: learn: 0.1834743 test: 0.2135441 best: 0.2135441 (67) total: 16s remaining: 4.95s 68: learn: 0.1824747 test: 0.2133194 best: 0.2133194 (68) total: 16.3s remaining: 4.72s 69: learn: 0.1822878 test: 0.2132299 best: 0.2132299 (69) total: 16.5s remaining: 4.49s 70: learn: 0.1818526 test: 0.2131287 best: 0.2131287 (70) total: 16.8s remaining: 4.25s 71: learn: 0.1818497 test: 0.2131224 best: 0.2131224 (71) total: 16.8s remaining: 3.97s 72: learn: 0.1810649 test: 0.2131258 best: 0.2131224 (71) total: 17.1s remaining: 3.74s 73: learn: 0.1806613 test: 0.2130607 best: 0.2130607 (73) total: 17.3s remaining: 3.51s 74: learn: 0.1797661 test: 0.2127543 best: 0.2127543 (74) total: 17.6s remaining: 3.28s 75: learn: 0.1795759 test: 0.2126368 best: 0.2126368 (75) total: 17.8s remaining: 3.05s 76: learn: 0.1787905 test: 0.2126844 best: 0.2126368 (75) total: 18.1s remaining: 2.82s 77: learn: 0.1781388 test: 0.2122521 best: 0.2122521 (77) total: 18.4s remaining: 2.59s 78: learn: 0.1778257 test: 0.2121034 best: 0.2121034 (78) total: 18.6s remaining: 2.35s 79: learn: 0.1778250 test: 0.2120998 best: 0.2120998 (79) total: 18.6s remaining: 2.1s 80: learn: 0.1773806 test: 0.2120601 best: 0.2120601 (80) total: 18.9s remaining: 1.87s 81: learn: 0.1767622 test: 0.2116690 best: 0.2116690 (81) total: 19.2s remaining: 1.64s 82: learn: 0.1765371 test: 0.2115220 best: 0.2115220 (82) total: 19.4s remaining: 1.4s 83: learn: 0.1761051 test: 0.2113049 best: 0.2113049 (83) total: 19.7s remaining: 1.17s 84: learn: 0.1756087 test: 0.2112098 best: 0.2112098 (84) total: 19.9s remaining: 938ms 85: learn: 0.1755840 test: 0.2111842 best: 0.2111842 (85) total: 20s remaining: 697ms 86: learn: 0.1751622 test: 0.2112076 best: 0.2111842 (85) total: 20.2s remaining: 465ms 87: learn: 0.1750480 test: 0.2111571 best: 0.2111571 (87) total: 20.5s remaining: 233ms 88: learn: 0.1749526 test: 0.2111600 best: 0.2111571 (87) total: 20.7s remaining: 0us bestTest = 0.2111571232 
bestIteration = 87
Shrink model to first 88 iterations.
Trial 97, Fold 1: Log loss = 0.21082753340035829, Average precision = 0.9728964449793732, ROC-AUC = 0.9681904893343465, Elapsed Time = 20.86236479999934 seconds
Trial 97, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 97, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[CatBoost per-iteration output trimmed: iterations 0-88, learn 0.6236 -> 0.1767, test 0.6249 -> 0.2041]
bestTest = 0.2040587624
bestIteration = 88
Trial 97, Fold 2: Log loss = 0.20386114733480118, Average precision = 0.9730014843998919, ROC-AUC = 0.9699471098268527, Elapsed Time = 21.46011730000464 seconds
Trial 97, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876
Trial 97, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379
[CatBoost per-iteration output trimmed: iterations 0-88, learn 0.6241 -> 0.1708, test 0.6248 -> 0.2050]
bestTest = 0.2050370016
bestIteration = 88
Trial 97, Fold 3: Log loss = 0.20484532578439052, Average precision = 0.9731240040616084, ROC-AUC = 0.9694139394785759, Elapsed Time = 22.550649700002396 seconds
Trial 97, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592
Trial 97, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665
[CatBoost per-iteration output trimmed: iterations 0-88, learn 0.6242 -> 0.1713, test 0.6248 -> 0.2058]
bestTest = 0.2058242623
bestIteration = 88
Trial 97, Fold 4: Log loss = 0.20557429699161273, Average precision = 0.9736747205772149, ROC-AUC = 0.968906551404298, Elapsed Time = 21.568858400001773 seconds
Trial 97, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897
Trial 97, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054
[CatBoost per-iteration output trimmed: iterations 0-88, learn 0.6224 -> 0.1725, test 0.6240 -> 0.2135]
bestTest = 0.2135325497
bestIteration = 88
Trial 97, Fold 5: Log loss = 0.2131723402127683, Average precision = 0.9711635361806826, ROC-AUC = 0.9676526392062873, Elapsed Time = 23.49738530000468 seconds
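The five fold-level scores above are what the Optuna objective would collapse into a single trial value. A minimal sketch of that aggregation, using the Trial 97 numbers reported here (the unweighted-mean scheme is an assumption; the notebook's actual objective may aggregate folds differently):

```python
from statistics import mean

# Per-fold validation metrics reported for Trial 97 above (rounded).
fold_log_loss = [0.21082753, 0.20386115, 0.20484533, 0.20557430, 0.21317234]
fold_roc_auc = [0.96819049, 0.96994711, 0.96941394, 0.96890655, 0.96765264]

# Assumed aggregation: unweighted mean over the 5 CV folds.
cv_log_loss = mean(fold_log_loss)
cv_roc_auc = mean(fold_roc_auc)
print(f"Trial 97 CV log loss = {cv_log_loss:.5f}, CV ROC-AUC = {cv_roc_auc:.5f}")
# -> Trial 97 CV log loss = 0.20766, CV ROC-AUC = 0.96882
```

The low spread across folds (log loss 0.204-0.213) suggests the grouped splits are behaving consistently for this parameter set.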
Optimization Progress: 98%|#########8| 98/100 [2:56:27<02:23, 71.62s/it]
Trial 98, Fold 1: Train size = 20663 where 0 = 10533, 1 = 10130, 0/1 = 1.0397828232971371 Trial 98, Fold 1: Validation size = 5175 where 0 = 2592, 1 = 2583, 0/1 = 1.0034843205574913
[CatBoost per-iteration output trimmed: learn and test fell normally through iteration 11 (test 0.2586); learn then diverged (1.45 at iteration 12, 94.8 at iteration 16) and test followed at iteration 16 (109.0)]
bestTest = 0.2269289229
bestIteration = 15
Shrink model to first 16 iterations.
Training has stopped (degenerate solution on iteration 17, probably too small l2-regularization, try to increase it)
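CatBoost's hint above points at its `l2_leaf_reg` parameter: trials that sample a near-zero value can produce the degenerate, exploding-loss runs seen here. One way to avoid wasting trials on that region is to floor the sampled value before building the model. A minimal sketch (the helper and the bound of 3.0 are hypothetical, not the notebook's actual search space):

```python
def constrain_l2(params, min_l2=3.0):
    """Return a copy of a CatBoost param dict with l2_leaf_reg floored at min_l2.

    Hypothetical guard: keeps Optuna-sampled trials out of the near-zero
    l2_leaf_reg region that triggers CatBoost's "degenerate solution" abort.
    """
    out = dict(params)
    out["l2_leaf_reg"] = max(out.get("l2_leaf_reg", 0.0), min_l2)
    return out

# Hypothetical trial parameters, as Optuna might sample them.
trial_params = {"learning_rate": 0.3, "depth": 8, "l2_leaf_reg": 0.02}
safe_params = constrain_l2(trial_params)
print(safe_params["l2_leaf_reg"])  # -> 3.0
```

Equivalently, the lower bound of the `l2_leaf_reg` suggestion range in the Optuna objective could simply be raised, so the sampler never proposes such values in the first place.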
Trial 98, Fold 1: Log loss = 0.22696060308168534, Average precision = 0.9719228885972324, ROC-AUC = 0.9674070847851337, Elapsed Time = 2.6826937000005273 seconds
Trial 98, Fold 2: Train size = 20701 where 0 = 10471, 1 = 10230, 0/1 = 1.0235581622678396
Trial 98, Fold 2: Validation size = 5137 where 0 = 2654, 1 = 2483, 0/1 = 1.0688683044703986
[CatBoost per-iteration output trimmed: learn and test fell normally through iteration 13 (test 0.2322450), then both losses diverged (learn 1.43 at iteration 14, climbing past 3056 by iteration 30)]
Training has stopped (degenerate solution on iteration 31, probably too small l2-regularization, try to increase it)
bestTest = 0.2322450297 bestIteration = 13 Shrink model to first 14 iterations. Trial 98, Fold 2: Log loss = 0.23230951948803025, Average precision = 0.9733879690908728, ROC-AUC = 0.9699329972828041, Elapsed Time = 4.96705149999616 seconds Trial 98, Fold 3: Train size = 20682 where 0 = 10517, 1 = 10165, 0/1 = 1.034628627643876 Trial 98, Fold 3: Validation size = 5156 where 0 = 2608, 1 = 2548, 0/1 = 1.0235478806907379 0: learn: 0.5942542 test: 0.5969691 best: 0.5969691 (0) total: 271ms remaining: 20.6s 1: learn: 0.5055235 test: 0.5084790 best: 0.5084790 (1) total: 541ms remaining: 20.3s 2: learn: 0.4493551 test: 0.4527923 best: 0.4527923 (2) total: 740ms remaining: 18.3s 3: learn: 0.4061785 test: 0.4111673 best: 0.4111673 (3) total: 947ms remaining: 17.3s 4: learn: 0.3703365 test: 0.3757627 best: 0.3757627 (4) total: 1.14s remaining: 16.4s 5: learn: 0.3398797 test: 0.3456551 best: 0.3456551 (5) total: 1.32s remaining: 15.7s 6: learn: 0.3170382 test: 0.3236042 best: 0.3236042 (6) total: 1.5s remaining: 15s 7: learn: 0.2968404 test: 0.3044536 best: 0.3044536 (7) total: 1.68s remaining: 14.5s 8: learn: 0.2805288 test: 0.2885937 best: 0.2885937 (8) total: 1.89s remaining: 14.3s 9: learn: 0.2664177 test: 0.2746376 best: 0.2746376 (9) total: 2.07s remaining: 13.9s 10: learn: 0.2529465 test: 0.2620822 best: 0.2620822 (10) total: 2.25s remaining: 13.5s 11: learn: 0.2416425 test: 0.2527405 best: 0.2527405 (11) total: 2.42s remaining: 13.1s 12: learn: 0.2332899 test: 0.2458020 best: 0.2458020 (12) total: 2.57s remaining: 12.7s 13: learn: 1.2335756 test: 0.2384403 best: 0.2384403 (13) total: 2.74s remaining: 12.3s 14: learn: 1.2259719 test: 0.2320762 best: 0.2320762 (14) total: 2.94s remaining: 12.1s 15: learn: 1.0069789 test: 0.2256273 best: 0.2256273 (15) total: 3.1s remaining: 11.8s 16: learn: 0.9997849 test: 0.2198269 best: 0.2198269 (16) total: 3.27s remaining: 11.6s 17: learn: 2.2121965 test: 0.2146997 best: 0.2146997 (17) total: 3.45s remaining: 11.3s 18: learn: 
2.2073049 test: 0.2109466 best: 0.2109466 (18) total: 3.62s remaining: 11s 19: learn: 3.0078039 test: 0.2089574 best: 0.2089574 (19) total: 3.78s remaining: 10.8s 20: learn: 36.4268463 test: 20.4843156 best: 0.2089574 (19) total: 3.94s remaining: 10.5s 21: learn: 35.7446589 test: 20.0914923 best: 0.2089574 (19) total: 4.09s remaining: 10.2s 22: learn: 37.6268162 test: 18.0897885 best: 0.2089574 (19) total: 4.24s remaining: 9.96s 23: learn: 106.1007271 test: 103.7938452 best: 0.2089574 (19) total: 4.39s remaining: 9.71s 24: learn: 1979.0993002 test: 2245.7357907 best: 0.2089574 (19) total: 4.55s remaining: 9.47s 25: learn: 1173.5200037 test: 647.4769983 best: 0.2089574 (19) total: 4.69s remaining: 9.21s 26: learn: 1530.6013296 test: 960.4686493 best: 0.2089574 (19) total: 4.86s remaining: 9s 27: learn: 2081.5896091 test: 1496.1813577 best: 0.2089574 (19) total: 5s remaining: 8.75s 28: learn: 3208.8884857 test: 1926.0695910 best: 0.2089574 (19) total: 5.15s remaining: 8.53s 29: learn: 7742.9858344 test: 7342.1096130 best: 0.2089574 (19) total: 5.29s remaining: 8.29s
Training has stopped (degenerate solution on iteration 30, probably too small l2-regularization, try to increase it)
bestTest = 0.208957374 bestIteration = 19 Shrink model to first 20 iterations. Trial 98, Fold 3: Log loss = 0.20909505132557127, Average precision = 0.9720119165738083, ROC-AUC = 0.970647765359093, Elapsed Time = 5.5795787000024575 seconds Trial 98, Fold 4: Train size = 20656 where 0 = 10479, 1 = 10177, 0/1 = 1.0296747568045592 Trial 98, Fold 4: Validation size = 5182 where 0 = 2646, 1 = 2536, 0/1 = 1.0433753943217665 0: learn: 0.5982989 test: 0.5996583 best: 0.5996583 (0) total: 154ms remaining: 11.7s 1: learn: 0.5398699 test: 0.5431375 best: 0.5431375 (1) total: 283ms remaining: 10.6s 2: learn: 0.4729225 test: 0.4762672 best: 0.4762672 (2) total: 445ms remaining: 11s 3: learn: 0.4153591 test: 0.4215758 best: 0.4215758 (3) total: 607ms remaining: 11.1s 4: learn: 0.3798469 test: 0.3864927 best: 0.3864927 (4) total: 782ms remaining: 11.3s 5: learn: 0.3502816 test: 0.3574635 best: 0.3574635 (5) total: 916ms remaining: 10.8s 6: learn: 0.3246987 test: 0.3320685 best: 0.3320685 (6) total: 1.07s remaining: 10.7s 7: learn: 0.3064033 test: 0.3155896 best: 0.3155896 (7) total: 1.25s remaining: 10.8s 8: learn: 0.2896150 test: 0.2994228 best: 0.2994228 (8) total: 1.41s remaining: 10.7s 9: learn: 0.2734332 test: 0.2841585 best: 0.2841585 (9) total: 1.57s remaining: 10.5s 10: learn: 0.2584194 test: 0.2698601 best: 0.2698601 (10) total: 1.74s remaining: 10.4s 11: learn: 0.2476752 test: 0.2599972 best: 0.2599972 (11) total: 1.9s remaining: 10.3s 12: learn: 0.2378626 test: 0.2508359 best: 0.2508359 (12) total: 2.04s remaining: 10.1s 13: learn: 0.2286058 test: 0.2433132 best: 0.2433132 (13) total: 2.21s remaining: 9.96s 14: learn: 0.2198438 test: 0.2357272 best: 0.2357272 (14) total: 2.37s remaining: 9.8s 15: learn: 0.2138281 test: 0.2305535 best: 0.2305535 (15) total: 2.52s remaining: 9.59s 16: learn: 0.2075804 test: 0.2253769 best: 0.2253769 (16) total: 2.68s remaining: 9.46s 17: learn: 0.2010164 test: 0.2208682 best: 0.2208682 (17) total: 2.84s remaining: 9.3s 18: learn: 
1.2165518 test: 0.2153574 best: 0.2153574 (18) total: 3s remaining: 9.15s 19: learn: 1.2124901 test: 0.2123962 best: 0.2123962 (19) total: 3.13s remaining: 8.91s 20: learn: 2.4057314 test: 4.9869625 best: 0.2123962 (19) total: 3.29s remaining: 8.78s 21: learn: 2.1916496 test: 4.9845805 best: 0.2123962 (19) total: 3.43s remaining: 8.57s 22: learn: 9.4926252 test: 36.5692928 best: 0.2123962 (19) total: 3.58s remaining: 8.4s 23: learn: 194.8643802 test: 263.9515660 best: 0.2123962 (19) total: 3.72s remaining: 8.21s 24: learn: 681.4881060 test: 891.8477545 best: 0.2123962 (19) total: 3.86s remaining: 8.02s 25: learn: 1312.8094138 test: 1760.5341177 best: 0.2123962 (19) total: 4.01s remaining: 7.87s 26: learn: 2655.9003149 test: 3285.5872254 best: 0.2123962 (19) total: 4.16s remaining: 7.71s 27: learn: 3489.7889882 test: 4301.8698840 best: 0.2123962 (19) total: 4.31s remaining: 7.54s 28: learn: 4245.1091760 test: 5288.8966329 best: 0.2123962 (19) total: 4.44s remaining: 7.34s 29: learn: 5059.6217207 test: 6357.6695161 best: 0.2123962 (19) total: 4.57s remaining: 7.16s 30: learn: 5924.8992738 test: 7406.9601978 best: 0.2123962 (19) total: 4.71s remaining: 6.98s 31: learn: 6257.6974106 test: 7789.3104141 best: 0.2123962 (19) total: 4.84s remaining: 6.8s 32: learn: 7963.6232091 test: 9291.5857935 best: 0.2123962 (19) total: 4.97s remaining: 6.62s 33: learn: 11206.3073787 test: 12192.1746278 best: 0.2123962 (19) total: 5.09s remaining: 6.44s
Training has stopped (degenerate solution on iteration 34, probably too small l2-regularization, try to increase it)
bestTest = 0.2123961827 bestIteration = 19 Shrink model to first 20 iterations. Trial 98, Fold 4: Log loss = 0.21227717522735867, Average precision = 0.9746143472649325, ROC-AUC = 0.9699343810429886, Elapsed Time = 5.330942900000082 seconds Trial 98, Fold 5: Train size = 20650 where 0 = 10500, 1 = 10150, 0/1 = 1.0344827586206897 Trial 98, Fold 5: Validation size = 5188 where 0 = 2625, 1 = 2563, 0/1 = 1.0241904018728054 0: learn: 0.5897678 test: 0.5942515 best: 0.5942515 (0) total: 154ms remaining: 11.7s 1: learn: 0.5068883 test: 0.5158148 best: 0.5158148 (1) total: 301ms remaining: 11.3s 2: learn: 0.4397798 test: 0.4490221 best: 0.4490221 (2) total: 455ms remaining: 11.2s 3: learn: 0.3985913 test: 0.4078524 best: 0.4078524 (3) total: 600ms remaining: 11s 4: learn: 0.3634282 test: 0.3738948 best: 0.3738948 (4) total: 740ms remaining: 10.7s 5: learn: 0.3379700 test: 0.3495524 best: 0.3495524 (5) total: 886ms remaining: 10.5s 6: learn: 0.3144078 test: 0.3275431 best: 0.3275431 (6) total: 1.02s remaining: 10.2s 7: learn: 0.2910424 test: 0.3045074 best: 0.3045074 (7) total: 1.15s remaining: 9.94s 8: learn: 0.2705054 test: 0.2859577 best: 0.2859577 (8) total: 1.31s remaining: 9.87s 9: learn: 0.2556943 test: 0.2717641 best: 0.2717641 (9) total: 1.44s remaining: 9.63s 10: learn: 0.2412884 test: 0.2594253 best: 0.2594253 (10) total: 1.59s remaining: 9.56s 11: learn: 0.2310294 test: 0.2504305 best: 0.2504305 (11) total: 1.73s remaining: 9.37s 12: learn: 0.2216996 test: 0.2418969 best: 0.2418969 (12) total: 1.87s remaining: 9.19s 13: learn: 0.2138015 test: 0.2351822 best: 0.2351822 (13) total: 2.01s remaining: 9.04s 14: learn: 0.2070210 test: 0.2300535 best: 0.2300535 (14) total: 2.15s remaining: 8.88s 15: learn: 0.2017849 test: 0.2254741 best: 0.2254741 (15) total: 2.28s remaining: 8.69s 16: learn: 0.1973102 test: 0.2226232 best: 0.2226232 (16) total: 2.41s remaining: 8.52s 17: learn: 0.1924263 test: 0.2191224 best: 0.2191224 (17) total: 2.55s remaining: 8.36s 18: learn: 
0.1871417 test: 0.2158849 best: 0.2158849 (18) total: 2.71s remaining: 8.28s 19: learn: 1.8148490 test: 6.7084264 best: 0.2158849 (18) total: 2.86s remaining: 8.16s 20: learn: 1.8115287 test: 6.7065501 best: 0.2158849 (18) total: 3.02s remaining: 8.05s 21: learn: 1.8082604 test: 6.7052524 best: 0.2158849 (18) total: 3.15s remaining: 7.88s 22: learn: 5.0660699 test: 13.1932189 best: 0.2158849 (18) total: 3.29s remaining: 7.73s 23: learn: 7.7172163 test: 13.1911397 best: 0.2158849 (18) total: 3.42s remaining: 7.56s 24: learn: 7.7144906 test: 13.1894918 best: 0.2158849 (18) total: 3.55s remaining: 7.38s 25: learn: 7.7118112 test: 13.1883134 best: 0.2158849 (18) total: 3.69s remaining: 7.24s 26: learn: 7.7090136 test: 13.1870298 best: 0.2158849 (18) total: 3.84s remaining: 7.11s 27: learn: 9.2326318 test: 14.7048453 best: 0.2158849 (18) total: 3.97s remaining: 6.95s 28: learn: 14.1997910 test: 18.6640683 best: 0.2158849 (18) total: 4.1s remaining: 6.78s 29: learn: 14.9950483 test: 18.6635939 best: 0.2158849 (18) total: 4.24s remaining: 6.64s 30: learn: 14.5698483 test: 18.6633189 best: 0.2158849 (18) total: 4.38s remaining: 6.5s 31: learn: 14.5680957 test: 18.6625665 best: 0.2158849 (18) total: 4.5s remaining: 6.32s
Training has stopped (degenerate solution on iteration 32, probably too small l2-regularization, try to increase it)
bestTest = 0.2158848539 bestIteration = 18 Shrink model to first 19 iterations. Trial 98, Fold 5: Log loss = 0.21565013467815503, Average precision = 0.9731286815912488, ROC-AUC = 0.9690805046169852, Elapsed Time = 4.74505119999958 seconds
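The five fold-level scores above are what the optimizer sees for a trial; how they are reduced to a single trial objective is not shown in this excerpt, but a plain mean is the usual choice. A minimal sketch using the fold log-losses reported for Trial 98 (values copied from the log):

```python
# Fold-level validation log losses for Trial 98, as reported in the log above.
fold_log_losses = [
    0.22696060308168534,
    0.23230951948803025,
    0.20909505132557127,
    0.21227717522735867,
    0.21565013467815503,
]

# Typical cross-validated objective: mean validation log loss across folds.
mean_log_loss = sum(fold_log_losses) / len(fold_log_losses)
print(round(mean_log_loss, 4))  # → 0.2193
```

Whether this notebook averages the folds or applies another reduction (e.g. worst fold) is an assumption here; the mean is simply the most common convention.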
Optimization Progress: 99%|#########9| 99/100 [2:56:59<00:59, 59.68s/it]
Trial 99, Fold 1: Train size = 20663 (0 = 10533, 1 = 10130, 0/1 = 1.0398); Validation size = 5175 (0 = 2592, 1 = 2583, 0/1 = 1.0035)
[Per-iteration CatBoost logs for Trial 99 condensed: in every fold the learn/test loss decreased monotonically across all 45 iterations, with no divergence.]
bestTest = 0.3162340849, bestIteration = 44
Trial 99, Fold 1: Log loss = 0.31628, Average precision = 0.96894, ROC-AUC = 0.96457, Elapsed Time = 29.31 s

Trial 99, Fold 2: Train size = 20701 (0 = 10471, 1 = 10230, 0/1 = 1.0236); Validation size = 5137 (0 = 2654, 1 = 2483, 0/1 = 1.0689)
bestTest = 0.3168082874, bestIteration = 44
Trial 99, Fold 2: Log loss = 0.31685, Average precision = 0.97047, ROC-AUC = 0.96752, Elapsed Time = 29.37 s

Trial 99, Fold 3: Train size = 20682 (0 = 10517, 1 = 10165, 0/1 = 1.0346); Validation size = 5156 (0 = 2608, 1 = 2548, 0/1 = 1.0235)
bestTest = 0.313736158, bestIteration = 44
Trial 99, Fold 3: Log loss = 0.31393, Average precision = 0.96945, ROC-AUC = 0.96773, Elapsed Time = 30.01 s

Trial 99, Fold 4: Train size = 20656 (0 = 10479, 1 = 10177, 0/1 = 1.0297); Validation size = 5182 (0 = 2646, 1 = 2536, 0/1 = 1.0434)
bestTest = 0.3150991339, bestIteration = 44
Trial 99, Fold 4: Log loss = 0.31517, Average precision = 0.97089, ROC-AUC = 0.96627, Elapsed Time = 29.27 s

Trial 99, Fold 5: Train size = 20650 (0 = 10500, 1 = 10150, 0/1 = 1.0345); Validation size = 5188 (0 = 2625, 1 = 2563, 0/1 = 1.0242)
bestTest = 0.3197955685, bestIteration = 44
Trial 99, Fold 5: Log loss = 0.31978, Average precision = 0.96915, ROC-AUC = 0.96452, Elapsed Time = 29.37 s
Optimization Progress: 100%|##########| 100/100 [2:59:34<00:00, 107.75s/it]
Optuna Optimization Elapsed Time: 10774.411937399997 seconds
Training with Best Trial 91
Full_params: {'objective': 'Logloss', 'eval_metric': 'Logloss', 'custom_metric': ['AUC', 'PRAUC'], 'random_seed': 42, 'thread_count': -1, 'verbose': True, 'allow_writing_files': False, 'bootstrap_type': 'Bernoulli', 'subsample': 0.8903206126906871, 'grow_policy': 'Lossguide', 'posterior_sampling': False, 'model_shrink_mode': 'Constant', 'iterations': 89, 'learning_rate': 0.08799821484684099, 'l2_leaf_reg': 0.26486908504555773, 'random_strength': 0.8218257282282394, 'depth': 11, 'min_data_in_leaf': 46, 'has_time': False, 'rsm': 0.8824980018648074, 'leaf_estimation_method': 'Gradient', 'leaf_estimation_backtracking': 'No', 'fold_len_multiplier': 3.6584072693303096, 'auto_class_weights': 'SqrtBalanced', 'boost_from_average': False, 'allow_const_label': True, 'score_function': 'L2', 'border_count': 165, 'max_leaves': 48}
[... CatBoost verbose training log truncated: 89 iterations, learn log loss falling from 0.5885022 (iteration 0) to 0.1335008 (iteration 88), ≈0.3s per iteration ...]
Training Elapsed Time: 27.940681599997333 seconds
Log loss: (Train) 0.18900408697386856 vs (Test) 0.19319317338089453
PR-AUC: (Train) 0.9765517530153833 vs (Test) 0.9757429181240718
ROC-AUC: (Train) 0.9732920829918623 vs (Test) 0.9722518803451436
save_results(clf_name = "CatBoostClassifier",
best_trials = best_trials_cbc,
exec_time = exec_time_cbc,
lloss_auc_train = lloss_auc_train_cbc,
lloss_auc_test = lloss_auc_test_cbc,
df_metrics = df_metrics_cbc,
cm_final = cm_final_cbc,
cm_all = cm_cbc_all,
cm_labels = cm_labels_cbc_all)
Comparison (best trials)¶
- Parameters
- Multi-objectives
- Classification report
- Confusion matrices
- Elapsed times
# List of classifiers for downstream processing
clfs = ["LGBM", "HistGradientBoostingClassifier", "CatBoostClassifier", "XGBClassifier"]
Parameters¶
# List of JSON files and classifier names
json_files = [{"file": "best_trials_LGBM.json", "clf": "LGBMClassifier"},
{"file": "best_trials_HistGradientBoostingClassifier.json", "clf": "HistGradientBoostingClassifier"},
{"file": "best_trials_CatBoostClassifier.json", "clf": "CatBoostClassifier"},
{"file": "best_trials_XGBClassifier.json", "clf": "XGBClassifier"}
]
# Dictionary to store the combined data
all_records = {}
# Read and process JSON files
for item in json_files:
    file_path = os.path.join(temp_dir, item["file"])
    clf_name = item["clf"]
    with open(file_path, "r") as f:
        data = json.load(f)
    # Store parameters for each model and trial in the dictionary
    for trial_num, params in data.items():
        all_records[(clf_name, trial_num)] = params
# Show all rows without truncation
pd.set_option("display.max_rows", None)
display(pd.DataFrame(all_records))
# Default setting
pd.reset_option("display.max_rows")
| Parameter | LGBMClassifier (Trial 68) | HistGradientBoostingClassifier (Trial 69) | CatBoostClassifier (Trial 91) | XGBClassifier (Trial 65) | XGBClassifier (Trial 76) |
|---|---|---|---|---|---|
| objective | binary | NaN | Logloss | binary:logistic | binary:logistic |
| metric | [binary_logloss, auc, average_precision] | NaN | NaN | NaN | NaN |
| device_type | cpu | NaN | NaN | NaN | NaN |
| verbosity | 2 | NaN | NaN | 1 | 1 |
| random_state | 42 | 42 | NaN | NaN | NaN |
| deterministic | True | NaN | NaN | NaN | NaN |
| bagging_seed | 42 | NaN | NaN | NaN | NaN |
| feature_fraction_seed | 42 | NaN | NaN | NaN | NaN |
| drop_seed | 42 | NaN | NaN | NaN | NaN |
| force_col_wise | True | NaN | NaN | NaN | NaN |
| num_threads | -1 | NaN | NaN | NaN | NaN |
| max_depth | -1 | None | NaN | 0 | 0 |
| boosting_type | gbdt | NaN | NaN | NaN | NaN |
| lambda_l1 | 0.001799 | NaN | NaN | NaN | NaN |
| lambda_l2 | 0.0 | NaN | NaN | NaN | NaN |
| num_leaves | 81 | NaN | NaN | NaN | NaN |
| feature_fraction | 0.852907 | NaN | NaN | NaN | NaN |
| bagging_fraction | 0.723172 | NaN | NaN | NaN | NaN |
| bagging_freq | 6 | NaN | NaN | NaN | NaN |
| min_child_samples | 28 | NaN | NaN | NaN | NaN |
| learning_rate | 0.074913 | 0.096154 | 0.087998 | NaN | NaN |
| is_unbalance | True | NaN | NaN | NaN | NaN |
| max_bin | 184 | NaN | NaN | 47 | 92 |
| min_sum_hessian_in_leaf | 0.0052 | NaN | NaN | NaN | NaN |
| max_delta_step | 40.700039 | NaN | NaN | 65.541538 | 37.673488 |
| feature_fraction_bynode | 0.952448 | NaN | NaN | NaN | NaN |
| num_boost_round | 88 | NaN | NaN | NaN | NaN |
| loss | NaN | log_loss | NaN | NaN | NaN |
| verbose | NaN | 2 | True | NaN | NaN |
| categorical_features | NaN | from_dtype | NaN | NaN | NaN |
| max_iter | NaN | 64 | NaN | NaN | NaN |
| max_leaf_nodes | NaN | 81 | NaN | NaN | NaN |
| min_samples_leaf | NaN | 47 | NaN | NaN | NaN |
| l2_regularization | NaN | 0.000012 | NaN | NaN | NaN |
| class_weight | NaN | None | NaN | NaN | NaN |
| max_features | NaN | 0.77363 | NaN | NaN | NaN |
| max_bins | NaN | 217 | NaN | NaN | NaN |
| interaction_cst | NaN | pairwise | NaN | NaN | NaN |
| warm_start | NaN | False | NaN | NaN | NaN |
| eval_metric | NaN | NaN | Logloss | [logloss, auc, aucpr] | [logloss, auc, aucpr] |
| custom_metric | NaN | NaN | [AUC, PRAUC] | NaN | NaN |
| random_seed | NaN | NaN | 42 | NaN | NaN |
| thread_count | NaN | NaN | -1 | NaN | NaN |
| allow_writing_files | NaN | NaN | False | NaN | NaN |
| bootstrap_type | NaN | NaN | Bernoulli | NaN | NaN |
| subsample | NaN | NaN | 0.890321 | 0.56143 | 0.894273 |
| grow_policy | NaN | NaN | Lossguide | depthwise | depthwise |
| posterior_sampling | NaN | NaN | False | NaN | NaN |
| model_shrink_mode | NaN | NaN | Constant | NaN | NaN |
| iterations | NaN | NaN | 89 | NaN | NaN |
| l2_leaf_reg | NaN | NaN | 0.264869 | NaN | NaN |
| random_strength | NaN | NaN | 0.821826 | NaN | NaN |
| depth | NaN | NaN | 11 | NaN | NaN |
| min_data_in_leaf | NaN | NaN | 46 | NaN | NaN |
| has_time | NaN | NaN | False | NaN | NaN |
| rsm | NaN | NaN | 0.882498 | NaN | NaN |
| leaf_estimation_method | NaN | NaN | Gradient | NaN | NaN |
| leaf_estimation_backtracking | NaN | NaN | No | NaN | NaN |
| fold_len_multiplier | NaN | NaN | 3.658407 | NaN | NaN |
| auto_class_weights | NaN | NaN | SqrtBalanced | NaN | NaN |
| boost_from_average | NaN | NaN | False | NaN | NaN |
| allow_const_label | NaN | NaN | True | NaN | NaN |
| score_function | NaN | NaN | L2 | NaN | NaN |
| border_count | NaN | NaN | 165 | NaN | NaN |
| max_leaves | NaN | NaN | 48 | 128 | 196 |
| device | NaN | NaN | NaN | cpu | cpu |
| validate_parameters | NaN | NaN | NaN | True | True |
| seed | NaN | NaN | NaN | 42 | 42 |
| sampling_method | NaN | NaN | NaN | uniform | uniform |
| num_parallel_tree | NaN | NaN | NaN | 1 | 1 |
| booster | NaN | NaN | NaN | gbtree | gbtree |
| eta | NaN | NaN | NaN | 0.087594 | 0.064891 |
| gamma | NaN | NaN | NaN | 0.000005 | 1.733278 |
| min_child_weight | NaN | NaN | NaN | 0.000026 | 0.001577 |
| colsample_bytree | NaN | NaN | NaN | 0.812988 | 0.570634 |
| colsample_bylevel | NaN | NaN | NaN | 0.918444 | 0.83765 |
| colsample_bynode | NaN | NaN | NaN | 0.625508 | 0.916695 |
| lambda | NaN | NaN | NaN | 7.025645 | 0.005622 |
| alpha | NaN | NaN | NaN | 0.037279 | 0.0 |
| tree_method | NaN | NaN | NaN | auto | hist |
| scale_pos_weight | NaN | NaN | NaN | 1 | 1 |
Multi-objectives¶
- log-loss, PR-AUC, ROC-AUC
# Combine training/test performance metrics CSVs
lloss_auc_train_df = pd.concat([pd.read_csv(f'{temp_dir}/lloss_auc_train_{clf}.csv') for clf in clfs], ignore_index = True)
lloss_auc_test_df = pd.concat([pd.read_csv(f'{temp_dir}/lloss_auc_test_{clf}.csv') for clf in clfs], ignore_index = True)
lloss_auc_df = pd.concat([lloss_auc_train_df, lloss_auc_test_df], ignore_index = True).sort_values(by = ["Classifier", "Best Trial", "Set"],
ascending=[False, True, False]).reset_index(drop = True)
# Reshape to long format
df_long = lloss_auc_df.melt(id_vars = ["Classifier", "Best Trial", "Set"],
value_vars = ["Log loss", "PR-AUC", "ROC-AUC"],
var_name = "Metric",
value_name = "Score")
# Combine Classifier+Trial label for x-axis
df_long["Classifier_Trial"] = df_long["Classifier"] + " (Trial " + df_long["Best Trial"].astype(str) + ")"
fig = px.bar(df_long, x = "Classifier_Trial", y = "Score",
color = "Set", barmode = "group", facet_col = "Metric",
text = "Score",
color_discrete_map = {"Training": "#56B4E9", "Test": "#009E73"},
facet_col_spacing = 0.04 # Increase facet spacing
)
# Format numbers on bars
fig.update_traces(texttemplate = "%{text:.3f}", textposition = "outside")
# Independent y-axes for each facet
fig.update_yaxes(matches = None, showticklabels = True, showgrid = True)
# Remove "Classifier_Trial" from x-axis title
fig.update_xaxes(title_text = None)
# Remove all y-axis titles, then add only for the first facet
fig.update_yaxes(title_text = None)
fig.layout["yaxis"].title.text = "Score"
# Title and layout tweaks
fig.update_layout(height = 600, width = 1600, title = dict(text = "Classifier Performance (Train vs Test Across Metrics and Best Trials)",
x = 0.5, xanchor = "center", yanchor = "top",
font = dict(size = 20, family = "Arial", color = "black", weight = "bold")),
legend_title = dict(text = "Set", side = "top"),
legend = dict(orientation = "v", x = 1.04, xanchor = "center", y = 0.9, yanchor = "bottom"),
margin = dict(t = 80, b = 40)
)
# Remove "Metric=" from facet titles
fig.for_each_annotation(lambda a: a.update(text = a.text.split("=")[-1]))
# Custom intervals for y-axis
fig.layout["yaxis1"].range = [0.18, 0.23] # Log loss facet
fig.layout["yaxis2"].range = [0.96, 0.98] # PR-AUC facet
fig.layout["yaxis3"].range = [0.96, 0.98] # ROC-AUC facet
fig.show();
display(lloss_auc_df)
del lloss_auc_train_df, lloss_auc_test_df, lloss_auc_df, df_long, fig;
| Classifier | Best Trial | Set | Log loss | PR-AUC | ROC-AUC | |
|---|---|---|---|---|---|---|
| 0 | XGBClassifier | 65 | Training | 0.187296 | 0.976407 | 0.973166 |
| 1 | XGBClassifier | 65 | Test | 0.192752 | 0.975963 | 0.972528 |
| 2 | XGBClassifier | 76 | Training | 0.188652 | 0.976516 | 0.973094 |
| 3 | XGBClassifier | 76 | Test | 0.194125 | 0.976004 | 0.972184 |
| 4 | LGBM | 68 | Training | 0.186913 | 0.976984 | 0.973562 |
| 5 | LGBM | 68 | Test | 0.193374 | 0.975140 | 0.971766 |
| 6 | HistGradientBoostingClassifier | 69 | Training | 0.211290 | 0.971433 | 0.966883 |
| 7 | HistGradientBoostingClassifier | 69 | Test | 0.221352 | 0.967821 | 0.962714 |
| 8 | CatBoostClassifier | 91 | Training | 0.189004 | 0.976552 | 0.973292 |
| 9 | CatBoostClassifier | 91 | Test | 0.193193 | 0.975743 | 0.972252 |
The grouped bar chart above compares the training and test performance of five best-trial models drawn from four tree-based classifier families (two XGBoost variants, LightGBM, HistGradientBoostingClassifier and CatBoostClassifier), each selected from the multi-objective 5-fold, 100-trial Optuna optimization. The metrics reported include log-loss as a calibration measure, and PR-AUC and ROC-AUC as discriminative measures.
A lower log-loss indicates that the model’s predicted probabilities align more closely with the actual outcomes, reflecting both well-calibrated probability estimates and improved predictive accuracy. In contrast, PR-AUC and ROC-AUC assess the model’s ability to distinguish between safe and risky loans across decision thresholds, with higher values indicating stronger discriminative power. A smaller train–test gap across these metrics suggests better generalization to unseen data and reduced risk of overfitting.
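The calibration idea described above can be checked directly: group predictions into score bins and compare each bin's mean predicted probability with the observed fraction of risky outcomes. A minimal sketch on simulated, deliberately well-calibrated scores (not the notebook's models):

```python
import numpy as np
from sklearn.calibration import calibration_curve

# Simulate near-perfectly calibrated scores: each outcome is drawn with
# probability equal to its score, so loans scored around p should be
# "risky" about a fraction p of the time.
rng = np.random.default_rng(42)
probs = rng.uniform(0, 1, 20000)
labels = (rng.uniform(0, 1, 20000) < probs).astype(int)

# Bin the scores and compare observed risky fraction vs mean predicted score.
frac_pos, mean_pred = calibration_curve(labels, probs, n_bins=10)
print(np.round(np.abs(frac_pos - mean_pred), 3))  # small gaps => well calibrated
```

A poorly calibrated model would show large gaps between `frac_pos` and `mean_pred` in some bins, even if its ranking metrics (PR-AUC, ROC-AUC) remained high.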
Looking first at the two XGBoost models, Trial 65 and Trial 76, both perform at nearly identical levels, with very strong discriminative power (PR-AUC $\approx 0.976$, ROC-AUC $\approx 0.972$) and competitive calibration (test log-loss $\approx 0.193 - 0.194$). Their train–test gaps are exceptionally small: $<0.001$ for both PR-AUC and ROC-AUC, and $\approx 0.005 - 0.006$ in log-loss. Between the two, XGBoost Trial 65 has a slight edge, with marginally lower test log-loss (0.1928 vs. 0.1941) and higher ROC-AUC (0.9725 vs. 0.9722), making it the more robust choice.
CatBoostClassifier and LightGBM also achieve strong results, clearly outperforming HistGradientBoostingClassifier. CatBoostClassifier, in particular, records one of the lowest test log-loss values ($\approx 0.1932$) and maintains excellent AUCs (PR-AUC $\approx 0.9757$, ROC-AUC $\approx 0.9723$); its log-loss train–test gap is even smaller than XGBoost Trial 65's, though its AUC gaps are slightly larger. LightGBM is similarly competitive, with strong calibration (test log-loss $\approx 0.1934$) and high discriminative power (PR-AUC $\approx 0.9751$, ROC-AUC $\approx 0.9718$), though its train–test gaps are larger still.
In contrast, the HistGradientBoostingClassifier performs noticeably worse. Its test log-loss ($\approx 0.2214$) is substantially higher, and both PR-AUC ($\approx 0.9678$) and ROC-AUC ($\approx 0.9627$) are lower than the other models. This highlights its weaker calibration and discrimination.
Taken together, XGBoost Trial 65 demonstrates the best overall balance of calibration, discriminative performance and generalization, with CatBoostClassifier and LightGBM as strong alternatives that also clearly outperform HistGradientBoostingClassifier.
So when these numbers are put in context for XGBoost Trial 65, the test log-loss of $\approx 0.193$ is the average negative log-likelihood of the true labels: it is not itself a probability, but a value this low indicates that the model's predicted probabilities sit close to the actual outcomes, i.e. loans it scores as highly risky usually turn out risky and loans it scores as safe usually turn out safe. The test PR-AUC of $\approx 0.976$ shows the model achieves $\approx 97.6\%$ of the best possible precision–recall trade-off: it is very good at finding risky loans while making few mistakes in wrongly flagging safe loans as risky, and it balances this well across different probability thresholds. Finally, the test ROC-AUC of $\approx 0.973$ means that if we randomly select one safe and one risky loan, the model will give the risky loan the higher risk score $\approx 97.3\%$ of the time, regardless of where the decision threshold is set.
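These three metrics, and the pairwise-ranking reading of ROC-AUC in particular, can be reproduced on synthetic scores. A hedged sketch (simulated labels and scores, not the loan data):

```python
import numpy as np
from sklearn.metrics import log_loss, average_precision_score, roc_auc_score

# Simulate informative but noisy risk scores: risky loans (y=1) tend to
# receive higher scores than safe loans (y=0).
rng = np.random.default_rng(0)
y = rng.integers(0, 2, 5000)
scores = np.clip(0.5 * y + 0.25 + rng.normal(0, 0.15, 5000), 1e-6, 1 - 1e-6)

print(f"Log loss: {log_loss(y, scores):.3f}")
print(f"PR-AUC:   {average_precision_score(y, scores):.3f}")
print(f"ROC-AUC:  {roc_auc_score(y, scores):.3f}")

# ROC-AUC equals the probability that a randomly chosen risky loan outranks
# a randomly chosen safe loan; verify with a direct pairwise estimate.
pos, neg = scores[y == 1], scores[y == 0]
pairwise = (pos[:, None] > neg[None, :]).mean()
print(f"Pairwise P(risky > safe): {pairwise:.3f}")
```

The pairwise estimate matches `roc_auc_score` to within sampling noise, which is exactly the interpretation used in the paragraph above.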
Classification reports¶
# Combine classification report CSVs
metrics_df = pd.concat([pd.read_csv(f'{temp_dir}/df_metrics_{clf}.csv') for clf in clfs], ignore_index = True)
# Custom order of metrics
ordered_metrics = ["Accuracy",
"Precision (Safe)", "Precision (Risky)", "Precision (Macro avg)", "Precision (Weighted avg)",
"Recall (Safe)", "Recall (Risky)", "Recall (Macro avg)", "Recall (Weighted avg)",
"F1-score (Safe)", "F1-score (Risky)", "F1-score (Macro avg)", "F1-score (Weighted avg)"]
metrics_df["Classifier_Trial"] = metrics_df["Classifier"] + " (Trial " + metrics_df["Trial"].astype(str) + ")"
df_long = metrics_df.melt(id_vars=["Classifier_Trial"], value_vars = ordered_metrics,
var_name = "Metric", value_name = "Score")
fig = px.bar(df_long, x = "Metric", y = "Score",
color = "Classifier_Trial", barmode = "group",
category_orders = {"Metric": ordered_metrics},
)
fig.update_layout(height = 600, width = 1600,
title = dict(text = "Classifier Comparison Across Metrics (Best Trials)",
x = 0.5, xanchor = "center", yanchor = "top",
font = dict(size = 20, family = "Arial", color = "black", weight = "bold")),
legend_title = None,
xaxis_tickangle = -45, template = "plotly_white",
yaxis = dict(range=[0.84, 0.98])
)
fig.show();
display(metrics_df.drop(columns = "Classifier_Trial"))
del metrics_df, ordered_metrics, df_long, fig;
| Classifier | Trial | Accuracy | Precision (Safe) | Recall (Safe) | F1-score (Safe) | Precision (Risky) | Recall (Risky) | F1-score (Risky) | Precision (Macro avg) | Recall (Macro avg) | F1-score (Macro avg) | Precision (Weighted avg) | Recall (Weighted avg) | F1-score (Weighted avg) | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | LGBM | 68 | 0.919215 | 0.879867 | 0.973122 | 0.924148 | 0.969155 | 0.864062 | 0.913597 | 0.924511 | 0.918592 | 0.918872 | 0.924001 | 0.919215 | 0.918933 |
| 1 | HistGradientBoostingClassifier | 69 | 0.913346 | 0.869317 | 0.975260 | 0.919246 | 0.971082 | 0.850000 | 0.906516 | 0.920199 | 0.912630 | 0.912881 | 0.919618 | 0.913346 | 0.912953 |
| 2 | CatBoostClassifier | 91 | 0.921841 | 0.880848 | 0.977703 | 0.926752 | 0.974296 | 0.864688 | 0.916225 | 0.927572 | 0.921195 | 0.921488 | 0.927038 | 0.921841 | 0.921549 |
| 3 | XGBClassifier | 65 | 0.919370 | 0.878855 | 0.974954 | 0.924414 | 0.971147 | 0.862500 | 0.913605 | 0.925001 | 0.918727 | 0.919009 | 0.924473 | 0.919370 | 0.919071 |
| 4 | XGBClassifier | 76 | 0.918134 | 0.878378 | 0.972816 | 0.923188 | 0.968750 | 0.862187 | 0.912368 | 0.923564 | 0.917502 | 0.917778 | 0.923048 | 0.918134 | 0.917840 |
At first glance, CatBoostClassifier stands out on the grouped bar chart because it shows the strongest classification report metrics overall i.e. accuracy, precision, recall, F1-score and both macro and weighted averages.
However, the most important question here is how well a model can catch risky loans, because missing them means real financial loss. On this point, CatBoostClassifier reports a recall on risky loans of $\approx 0.8647$, very slightly higher than LightGBM's $\approx 0.8641$, though the difference is only in the fourth decimal place and essentially negligible in practice. In other words, both models correctly flag about 86 out of every 100 risky loans when using the standard 0.5 decision threshold.
It's worth noting that both CatBoostClassifier and LightGBM used class reweighting. LightGBM was tuned with is_unbalance = True, which applies a relatively strong weighting scheme by directly up-weighting risky loans. This often improves recall on the minority class but can also slightly reduce precision because the model becomes more aggressive in labeling loans as risky. CatBoostClassifier, in contrast, was tuned with auto_class_weights = SqrtBalanced, which applies a softer adjustment (inverse square-root of class frequencies). This is less aggressive than LightGBM's scheme and tends to yield a more balanced trade-off between precision and recall.
Importantly, the choice of these weighting parameters reflects Optuna’s multi-objective optimization rather than true dataset imbalance. Even though the folds were nearly 1:1 balanced, Optuna favoured configurations with weighting because they slightly improved log-loss through better probability calibration and PR-AUC by enhancing sensitivity to risky loans. In this setting, class-weighting acted less as an imbalance correction and more as a fine-tuning mechanism that interacted with other hyperparameters to achieve marginal but meaningful gains in the combined objectives.
Despite this, CatBoostClassifier still edges out LightGBM overall, slightly exceeding it on risky recall ($\approx 0.8647$ vs $\approx 0.8641$) while clearly dominating on other key metrics such as accuracy, precision, F1-score and both macro and weighted averages. This suggests that CatBoostClassifier benefits from weighting but does so in a way that preserves stability and avoids overcompensating. Its macro and weighted recall scores are also strong at $\approx 0.9212$ and $\approx 0.9218$, showing that it treats both safe and risky loans fairly well. Moreover, the model shows the same level of consistency between training and test sets on the optimization metrics (log-loss, PR-AUC and ROC-AUC), suggesting its predictions are better calibrated and more likely to generalize well in practice.
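The two weighting schemes can be contrasted numerically. A hedged sketch assuming the documented formulas (is_unbalance effectively up-weights the minority class by the raw count ratio, while SqrtBalanced uses the square root of that ratio); the counts below are illustrative, not the actual fold sizes:

```python
import numpy as np

# Illustrative class counts (not the actual fold sizes).
n_safe, n_risky = 6000, 4000

# Assumed formulas: raw count ratio vs its square root.
lgbm_style_weight = n_safe / n_risky              # aggressive up-weighting
sqrt_balanced_weight = np.sqrt(n_safe / n_risky)  # softer, CatBoost-style weight

print(f"is_unbalance-style weight on risky:  {lgbm_style_weight:.3f}")
print(f"SqrtBalanced-style weight on risky:  {sqrt_balanced_weight:.3f}")
```

Because the square root compresses the ratio toward 1, the SqrtBalanced-style weight is always the milder of the two, which is the "softer adjustment" referred to above.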
Breaking down the numbers:
- Accuracy ($\approx 92.2\%$): Out of every 100 loans, CatBoostClassifier correctly flags $\approx 92$ as risky or safe.
- Safe loans:
- Precision ($\approx 88.1\%$): When CatBoostClassifier says a loan is safe, it’s right ($\approx 88$) times out of 100.
- Recall ($\approx 97.8\%$): Out of 100 genuinely safe loans, CatBoostClassifier correctly classifies ($\approx 98$) and misclassifies ($\approx 2$).
- F1-score ($\approx 92.7\%$): Strong balance, showing it captures the majority of safe loans while avoiding unnecessary misclassifications.
- Risky loans:
- Precision ($\approx 97.4\%$): When the model classifies a loan as risky, ($\approx 97$) in 100 are truly risky. Very few safe loans are mislabeled as risky.
- Recall ($\approx 86.5\%$): Out of 100 truly risky loans, it catches ($\approx 87$), with ($\approx 13$) slipping through misclassified as safe.
- F1-score ($\approx 91.6\%$): Shows a solid balance between catching risky loans and maintaining precision.
Looking at the averages:
- Macro precision ($\approx 92.8\%$) | Macro recall ($\approx 92.1\%$): Strong fairness, treating safe and risky loans equally well.
- Weighted precision ($\approx 92.7\%$) | Weighted recall ($\approx 92.2\%$): Weighted by this dataset's class frequencies, CatBoostClassifier sustains the same level of performance; since the folds here are nearly balanced, these figures closely track the macro averages, and they would need re-checking on real-world portfolios where safe loans usually dominate.
Summary:
- Both CatBoostClassifier and LightGBM use class weighting, but CatBoostClassifier applies a softer scheme (SqrtBalanced) that promotes balance rather than aggressiveness.
- CatBoostClassifier correctly classifies most safe loans and most risky loans, with particularly strong reliability when labeling risky loans (precision $\approx 97.4\%$).
- Its recall for risky loans ($\approx 86.5\%$) means it can identify the majority of high-risk loans, reducing the chance that risky loans are misclassified as safe.
- Its macro and weighted averages ($\approx 92\% - 93\%$) highlight that it treats both loan classes fairly.
- Its optimization consistency (log-loss, PR-AUC, ROC-AUC) suggests well-calibrated predictions and better generalization in practice.
- Optuna’s selection of weighting parameters, despite near-balanced folds, shows that subtle reweighting can still improve calibration and discrimination, reinforcing why CatBoostClassifier is the strongest overall choice.
Confusion matrices¶
# Load confusion matrices and their labels
grouped_matrices = []
grouped_labels = []
for clf in clfs:
    with open(f'{temp_dir}/cm_{clf}_all.pkl', "rb") as f:
        cm_all = pickle.load(f)
    with open(f'{temp_dir}/cm_labels_{clf}_all.pkl', "rb") as f:
        cm_labels = pickle.load(f)
    grouped_matrices.append(cm_all)
    grouped_labels.append(cm_labels)
# Plot confusion matrices
# Determine grid size: one row per classifier, columns = maximum number of matrices among classifiers
num_classifiers = len(grouped_matrices)
max_cols = max(len(cm_list) for cm_list in grouped_matrices)
# Subplots grid (rows = classifiers, cols = matrices) with size scaled for readability
fig, axes = plt.subplots(num_classifiers, max_cols, figsize = (4 * max_cols, 2 * num_classifiers))
# Ensure axes is a 2D array
if num_classifiers == 1:
    axes = [axes]
if max_cols == 1:
    axes = [[ax] for ax in axes]
# For each classifier group, plot all its confusion matrices along one row
for row_idx, (cm_list, label_list, row_axes) in enumerate(zip(grouped_matrices, grouped_labels, axes)):
    for col_idx in range(max_cols):
        if col_idx < len(label_list) and col_idx < len(cm_list):
            ax = row_axes[col_idx]
            sns.heatmap(cm_list[col_idx], annot = True, fmt = "d", cmap = "Blues",
                        ax = ax, cbar = False, annot_kws = {"size": 10})
            # Update title and axis labels (introducing newline for clarity)
            ax.set_title(label_list[col_idx].replace("for Best Trial", "\nfor Best Trial"), fontsize = 10)
            ax.set_xlabel("Predicted Labels", fontsize = 10)
            ax.set_ylabel("True Labels", fontsize = 10)
            ax.set_xticklabels(["Safe", "Risky"], fontsize = 10)
            ax.set_yticklabels(["Safe", "Risky"], fontsize = 10)
        else:
            row_axes[col_idx].axis('off')  # Hide empty subplot
plt.tight_layout()
plt.show()
del grouped_matrices, grouped_labels, num_classifiers, max_cols, fig, axes;
When looking at the confusion matrices of the best trials, it’s clear that in loan risk models, false negatives are especially costly. They represent risky loans wrongly classified as safe, which can directly translate into financial losses. Using the standard decision threshold of 0.5, CatBoostClassifier reports the lowest number of false negatives (433), slightly better than LightGBM (435) and ahead of both XGBoost (440–441) and HistGradientBoostingClassifier (480).
At the same time, CatBoostClassifier also produces the fewest false positives (73), meaning it avoids unnecessarily misclassifying safe loans as risky. By contrast, LightGBM makes 88 false positives, XGBoost 82–89 and HistGradientBoostingClassifier 81. This combination of strong recall on risky loans with minimal false alarms gives CatBoostClassifier the clearest edge among the models.
HistGradientBoostingClassifier performs notably worse at this threshold, allowing nearly 480 risky loans through incorrectly. XGBoost is stronger, but it still lags behind CatBoostClassifier by permitting more risky loans to slip through and by flagging more safe loans incorrectly.
Overall, CatBoostClassifier provides the best trade-off at the conventional 0.5 threshold. It captures the majority of risky loans while keeping false positives very low, which both protects the bank from defaults and preserves customer relationships by not unfairly labeling safe loans as risky. Importantly, Optuna’s optimization tuned CatBoostClassifier with auto_class_weights = SqrtBalanced and LightGBM with is_unbalance = True. While both involve class reweighting, CatBoostClassifier’s softer square-root scheme achieves a more balanced outcome, suggesting its results are not only competitive but also more stable and better calibrated in practice.
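The false-negative/false-positive trade-off at different thresholds can be illustrated on synthetic scores. A hedged sketch (the notebook itself uses the fitted models' predicted probabilities, not these simulated ones):

```python
import numpy as np
from sklearn.metrics import confusion_matrix

# Simulate risk scores where risky loans (y=1) tend to score higher.
rng = np.random.default_rng(7)
y = rng.integers(0, 2, 10000)
scores = np.clip(0.5 * y + 0.25 + rng.normal(0, 0.18, 10000), 0, 1)

# Lowering the threshold converts false negatives (costly missed risky
# loans) into false positives (safe loans flagged unnecessarily).
results = {}
for threshold in (0.3, 0.5, 0.7):
    pred = (scores >= threshold).astype(int)
    tn, fp, fn, tp = confusion_matrix(y, pred).ravel()
    results[threshold] = (fn, fp)
    print(f"threshold={threshold}: FN={fn}, FP={fp}")
```

In a lending context, the threshold would ultimately be set by weighing the financial cost of a missed risky loan against the relationship cost of wrongly flagging a safe one, rather than defaulting to 0.5.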
Elapsed times¶
# Combine execution time CSVs
exec_time_df = pd.concat([pd.read_csv(f'{temp_dir}/exec_time_{clf}.csv') for clf in clfs], ignore_index = True)
display(exec_time_df);
del exec_time_df;
| Classifier | Best Trial | Optimization Elapsed Time (s) | Training Elapsed Time (s) | |
|---|---|---|---|---|
| 0 | LGBM | 68 | 1416.186557 | 3.886080 |
| 1 | HistGradientBoostingClassifier | 69 | 1254.363137 | 3.126119 |
| 2 | CatBoostClassifier | 91 | 10774.411937 | 27.940682 |
| 3 | XGBClassifier | 65 | 14523.852434 | 2.708413 |
| 4 | XGBClassifier | 76 | 14523.852434 | 3.324781 |
In my setup, Optuna optimization and model training were run only on the CPU of my personal laptop. With these limited resources, models such as LightGBM and HistGradientBoosting were very efficient, completing training in just a few seconds and finishing optimization in $\approx 20$ minutes ($\approx 1,250 – 1,400$ seconds). CatBoostClassifier, on the other hand, required noticeably more time per training run, close to half a minute, and its full optimization stretched into several hours ($> 10,000$ seconds). XGBoost’s individual training runs were as fast as LightGBM and HistGradientBoostingClassifier, but the overall optimization process was by far the slowest, taking $> 4$ hours ($>14,000$ seconds). This slowdown was mainly due to Optuna exploring a larger and more complex hyperparameter space, which caused the optimization process to balloon far beyond the actual training time. On this kind of hardware, LightGBM appeared to be the most practical choice for speed.
These results, however, reflect the limits of personal hardware more than the true capabilities of the algorithms. Large organizations, especially banks, rely on advanced IT infrastructure with powerful servers, far more memory, and distributed frameworks that handle demanding machine learning workloads efficiently. In such environments, runtime differences between LightGBM, CatBoostClassifier and XGBoost become much smaller because the infrastructure is built for scale and speed.
That’s why I believe the focus should be on predictive performance rather than raw training time. In financial risk modeling, accuracy is what prevents losses. CatBoostClassifier has consistently shown stronger and more balanced predictive performance without the need for artificial weighting adjustments. While it ran more slowly on my laptop, in an enterprise IT setting the runtime disadvantage would no longer be a major concern. What ultimately matters is CatBoostClassifier’s ability to classify safe and risky loans accurately, which makes it the better choice.
Overall Review¶
Context and Model Choice¶
When I tested models on my personal laptop, some models such as LightGBM and HistGradientBoostingClassifier ran very fast while others like CatBoostClassifier and XGBoost felt slower. This difference was really more about the limits of my laptop than the models themselves.
In a bank or corporate setting, models are normally trained on much more powerful machines with far greater processing power. On that type of setup, the slower models also run fast enough to be practical.
During optimization, LightGBM produced repeated warnings such as "No further splits with positive gain" until the later trials. This is normal behavior when the algorithm cannot find additional useful splits. The warnings diminished toward the end and did not affect overall performance, as shown by the small train–test gap in log-loss, PR-AUC and ROC-AUC.
For the modelling phase, the binary target is the loan’s current status, derived from a nominal feature and encoded so one class represents safe loans and the other represents risky loans. The important point here is that in this dataset the ratio of safe to risky loans is roughly equal, about one to one. Because of this balance, accuracy is a meaningful metric when comparing models. At the same time recall for risky loans is still especially important from a business perspective because missing those loans has greater financial consequences.
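To make the balance point concrete, here is a minimal sketch of why accuracy carries meaning only when the trivial majority-class baseline is weak. The arrays below are toy data, not the loan dataset: with a roughly 1:1 target the baseline scores only 50%, whereas under a 9:1 imbalance it already scores 90% while catching no risky loans.

```python
import numpy as np

def majority_baseline_accuracy(y):
    """Accuracy of always predicting the most common class."""
    y = np.asarray(y)
    return max(np.mean(y == 1), np.mean(y == 0))

balanced = np.array([0, 1] * 500)             # ~1:1, as in this dataset
imbalanced = np.array([0] * 900 + [1] * 100)  # 9:1, shown for contrast

# Balanced target: trivial baseline scores 0.5, so accuracy gains are real.
# Imbalanced target: baseline scores 0.9 without finding a single risky loan.
```

This is why recall for the risky class still needs separate monitoring even though accuracy is a fair comparison metric here.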
So for me, the real question is not about speed but about which model provides the most reliable results while also recognizing that banks and regulators may require other measures for compliance and specific best practices, which go beyond my current knowledge. A good model should catch risky loans, treat safe loans fairly, produce risk scores that can be trusted, and be easy to explain when someone asks why a particular decision was made. Across these points, CatBoostClassifier stands out as the best overall choice.
Strengths and Limitations of CatBoostClassifier¶
Strengths
- Finds risky loans effectively without being overly harsh on safe ones
- Handles different kinds of data naturally such as numbers, categories, and yes or no answers
- Produces probabilities that are usually well-aligned with real outcomes
- Includes tools such as feature importance and SHAP to help explain what drives predictions
- Runs efficiently on enterprise‑level systems
Limitations
- Appears slower when run only on a small personal machine
- Requires subject matter expert (SME) input to make sure the data used makes sense in a business and compliance setting
- Needs periodic updates to stay reliable, such as tuning thresholds or retraining
- Can become too complex if too many engineered features are added without proper discipline
- Like any model, accuracy will decline over time if data drift occurs, for example when borrower behavior, inputs or their relationships change
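The strength about well-aligned probabilities can be spot-checked with a simple reliability table: bin the predicted probabilities and compare the mean prediction in each bin against the observed risky-loan rate. This is a sketch only; the toy arrays below are illustrative, not CatBoostClassifier output.

```python
import numpy as np

def calibration_check(y_true, y_prob, bins=4):
    """Per-bin (mean predicted probability, observed risky rate) pairs.

    For a well-calibrated model the two values in each pair sit close
    together; large gaps suggest the scores need recalibration.
    """
    y_true = np.asarray(y_true, dtype=float)
    y_prob = np.asarray(y_prob, dtype=float)
    edges = np.linspace(0.0, 1.0, bins + 1)
    idx = np.clip(np.digitize(y_prob, edges) - 1, 0, bins - 1)
    rows = []
    for b in range(bins):
        mask = idx == b
        if mask.any():
            rows.append((y_prob[mask].mean(), y_true[mask].mean()))
    return rows

# Toy labels and scores, purely for illustration.
y_true = [0, 0, 0, 1, 0, 1, 1, 1]
y_prob = [0.1, 0.2, 0.3, 0.4, 0.6, 0.7, 0.8, 0.9]
table = calibration_check(y_true, y_prob)
```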
Data and Feature Engineering¶
Adding richer and more varied data can improve the model’s ability to detect risk. Examples include loan usage patterns, broader economic conditions, ratios such as debt to income or balance to limit, and inputs from other trusted external sources.
It's important to work with SMEs so that chosen features are not only predictive but also reasonable, explainable and compliant with business rules and regulatory expectations. Proper tracking and clear documentation of why each feature and data source is included make the model easier to explain and ensure traceability for audits.
Adjusting the Decision Threshold¶
By default, models apply a 50% cutoff: anything above is labeled risky and anything below is labeled safe. In lending, though, missing a risky loan (a false negative) is generally much more costly than incorrectly flagging a safe loan as risky (a false positive).
So keeping the threshold fixed at 50% may not be the best option. For example, lowering it to 40% would capture more risky loans but would incorrectly flag more safe ones. Raising it would cut down on false positives but would let more risky loans slip through.
The best threshold is about finding the right balance for the business and aligning it with its appetite for financial risk. This should be reviewed regularly.
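A hedged sketch of what cost-based threshold selection could look like. The cost ratio (a missed risky loan assumed 5x as costly as a wrongly flagged safe one) and the toy probabilities are assumptions for illustration, not figures from this analysis; in practice the ratio would come from the business's risk appetite.

```python
import numpy as np

def pick_threshold(y_true, y_prob, cost_fn=5.0, cost_fp=1.0):
    """Return the cutoff that minimizes expected misclassification cost.

    cost_fn / cost_fp are illustrative business costs: a false negative
    (missed risky loan) is assumed 5x as costly as a false positive.
    """
    y_true = np.asarray(y_true)
    y_prob = np.asarray(y_prob)
    thresholds = np.linspace(0.05, 0.95, 91)
    costs = []
    for t in thresholds:
        pred = (y_prob >= t).astype(int)
        fn = np.sum((y_true == 1) & (pred == 0))  # risky loans missed
        fp = np.sum((y_true == 0) & (pred == 1))  # safe loans flagged
        costs.append(cost_fn * fn + cost_fp * fp)
    return float(thresholds[int(np.argmin(costs))])

# Toy labels and scores; with costly false negatives the chosen cutoff
# lands below the default 50%.
y_true = [0, 0, 0, 0, 1, 1, 1, 1]
y_prob = [0.1, 0.2, 0.35, 0.45, 0.4, 0.6, 0.8, 0.9]
best_t = pick_threshold(y_true, y_prob)
```

On real data the same sweep would be run on a validation set, and the chosen cutoff revisited whenever the cost assumptions or the portfolio change.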
Monitoring and Data Drift¶
Over time, borrower behavior, financial status, loan types and market conditions change. These shifts in the underlying data are known as data drift, and because of them models do not stay accurate forever.
To handle this we should track changes in input data and monitor how predictions behave. Alerts should be set up if recall or accuracy drops below an agreed level. The model should be retrained with recent data every few months, or faster if the market shifts quickly, such as during a downturn. Documentation on how drift is tracked and addressed should always be maintained.
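One common way to track input changes is the population stability index (PSI), comparing a feature's recent distribution against its training baseline. This is a sketch under assumptions: the synthetic normal samples are placeholders for real features, and the alert thresholds (< 0.10 stable, 0.10–0.25 moderate, > 0.25 significant) are a widely used rule of thumb, not a regulatory standard.

```python
import numpy as np

def population_stability_index(expected, actual, bins=10):
    """PSI between a baseline (training) sample and a recent sample.

    Bins come from the baseline's quantiles; a small floor avoids
    log(0) when a bin is empty in one of the samples.
    """
    expected = np.asarray(expected, dtype=float)
    actual = np.asarray(actual, dtype=float)
    edges = np.quantile(expected, np.linspace(0, 1, bins + 1))
    edges[0], edges[-1] = -np.inf, np.inf
    e_frac = np.histogram(expected, bins=edges)[0] / len(expected)
    a_frac = np.histogram(actual, bins=edges)[0] / len(actual)
    e_frac = np.clip(e_frac, 1e-6, None)
    a_frac = np.clip(a_frac, 1e-6, None)
    return float(np.sum((a_frac - e_frac) * np.log(a_frac / e_frac)))

# Synthetic illustration: identical distribution vs a mean shift.
rng = np.random.default_rng(0)
baseline = rng.normal(0, 1, 5000)
same = rng.normal(0, 1, 5000)       # no drift: PSI near zero
shifted = rng.normal(0.5, 1, 5000)  # drifted inputs: PSI clearly elevated
```

In production, a PSI breach on key features would trigger the alerting and retraining steps described above.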
Fairness and Bias¶
The model must treat different borrower groups fairly, across loan types, incomes or regions. This can be checked by comparing performance across these groups and ensuring no group is consistently disadvantaged.
Borrowers should also receive a clear, plain-language explanation when their loans are no longer in good standing. Fairness is therefore not just a technical matter, but also about how decisions are communicated back to customers.
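The group comparison above can be sketched as a per-segment recall check. The `region` column and the toy frame are hypothetical stand-ins for whatever borrower segments the business defines; a large recall gap between groups would warrant investigation before deployment.

```python
import pandas as pd

def recall_by_group(df, group_col, y_true_col="y_true", y_pred_col="y_pred"):
    """Recall for the risky class (label 1) within each borrower segment."""
    out = {}
    for name, g in df.groupby(group_col):
        positives = g[g[y_true_col] == 1]
        out[name] = ((positives[y_pred_col] == 1).mean()
                     if len(positives) else float("nan"))
    return pd.Series(out)

# Hypothetical example: segment "A" has half its risky loans missed.
df = pd.DataFrame({
    "region": ["A", "A", "A", "B", "B", "B"],
    "y_true": [1, 1, 0, 1, 1, 0],
    "y_pred": [1, 0, 0, 1, 1, 0],
})
per_group = recall_by_group(df, "region")
```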
Interpretability and Business Narratives¶
It's not enough to simply say the model makes a prediction. The reasoning needs to be explained in plain words.
For example, instead of giving a math‑heavy explanation, one could say "Debt to income ratio was a key factor because it reflects how comfortably someone can repay".
This kind of everyday explanation makes the model easier to defend with both internal teams and external reviewers.
Operational Integration¶
The model should work smoothly in the loan lifecycle. Sometimes a quick decision is required, while in other cases checks can be run overnight in batches. What matters most is that the system works reliably, runs efficiently and always uses correct data so that decisions are not delayed.
Before going live, the model should also be reviewed by an independent validation team, so that checks are not done only by the same people who built it.
Data Quality and Governance¶
Even the most advanced model will fail if the inputs are messy. Data quality checks for missing values, odd values, incorrect fields or format changes are necessary. Changes should be logged and a record kept so the data can be reviewed later if questions arise. This provides a clear audit trail for transparency and regulatory trust.
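A minimal sketch of such input checks, assuming illustrative column names and ranges (the `apr` and `loan_amount` fields and their bounds below are hypothetical, not taken from the cleaned dataset):

```python
import pandas as pd

def basic_quality_report(df, expected_cols):
    """Minimal input checks: presence, missingness, and out-of-range values.

    expected_cols maps each required column to an (lo, hi) valid range.
    Returns a list of human-readable issues for logging / audit trails.
    """
    issues = []
    for col, (lo, hi) in expected_cols.items():
        if col not in df.columns:
            issues.append(f"missing column: {col}")
            continue
        n_null = df[col].isna().sum()
        if n_null:
            issues.append(f"{col}: {n_null} missing values")
        out_of_range = df[(df[col] < lo) | (df[col] > hi)]
        if len(out_of_range):
            issues.append(f"{col}: {len(out_of_range)} values outside [{lo}, {hi}]")
    return issues

# Hypothetical batch with one odd APR and one missing loan amount.
df = pd.DataFrame({"apr": [0.1, 0.3, 9.9], "loan_amount": [500, None, 2000]})
report = basic_quality_report(df, {"apr": (0.0, 1.0), "loan_amount": (0, 10000)})
```

Logging each reported issue alongside the data version provides the audit trail the paragraph above calls for.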
Stress Testing and Segment Stability¶
The model should also be tested under tough conditions such as higher unemployment or interest rate hikes. Stress testing shows if the model still provides useful guidance when the economy is unstable.
It's also important to check how well the model performs across different loan products such as mortgages and credit cards, and across borrower groups. A reliable model should perform consistently and not favour one group over another.
Human in the Loop¶
The model shouldn’t decide everything on its own. Loan officers need to review predictions and override them when needed. If overrides happen often, it may signal weaknesses in the model.
Retraining Policy¶
A decision should be made in advance about how often the model will be retrained. One option is retraining every quarter which is simpler to manage. Another option is continuous updates with new data which is more responsive but harder to govern. Whatever the approach, it should be clearly documented and approved.
Privacy and Security¶
When using new external data sources, it's important to ensure the data is handled safely and kept private. Data should only be used in ways that customers and regulators would view as fair and responsible.
Recommendations¶
- Adopt CatBoostClassifier as the main model
- Use LightGBM as a backup or in an ensemble
- Benchmark CatBoostClassifier against other algorithms to validate that it remains the best option
- Work with SMEs to confirm features are meaningful and compliant
- Adjust decision thresholds regularly to balance business needs and risk appetite
- Put in place monitoring for drift and fairness
- Carry out stress testing and stability checks across products and groups
- Define override rules for credit officers
- Create a clear retraining policy and data handling guidelines
- Keep documentation transparent for both internal and external review
Next Steps¶
- Work with SMEs to fine-tune which data the model uses
- Test how well the model’s thresholds work and adjust if needed
- Set up a monitoring system to track fairness, data drift and how the model performs in tough situations
- Decide how often the model should be updated, every few months or more often if needed
- Agree on whether the model will make predictions instantly (real-time) or in batches (like overnight runs)
- Pilot the model on a smaller group before deploying it widely
- Explore and benchmark alternative algorithms to confirm CatBoostClassifier remains the best choice
- Use benchmarking results to validate CatBoostClassifier’s performance and meet regulatory expectations
Closing Message¶
In this case, CatBoostClassifier is a strong choice for telling apart safe loans from risky ones. But the algorithm by itself isn’t enough. The real value comes from managing the model carefully throughout its entire life.
That means working with SMEs to choose the right data, fine-tuning thresholds, keeping an eye on changes over time, checking that the model is fair, testing how it performs under pressure, retraining it when needed, making sure it fits smoothly into everyday processes and letting people step in to review tricky cases.
Handled this way, the bank ends up with a model that is not only accurate but also fair, easy to explain, reliable and built to last. This protects both the business and its customers over the long run.
Nevertheless, the finer details of regulation and governance would need SME and legal guidance to ensure full compliance with regulators, meet business needs and maintain customer trust.
Session Information¶
Log the full session environment including OS, CPU, Python version and loaded modules to support reproducibility and assist in debugging environment-specific issues.
import importlib
importlib.metadata.version("markupsafe")
'3.0.2'
display(Markdown(f"<span style = 'font-size: 18px; font-weight: bold;'> Session Information </span>"))
# https://pypi.org/project/session-info/
session_info.show(na = True, os = True, cpu = True, jupyter = True, dependencies = True,
std_lib = True, private = True, write_req_file = False, req_file_name = None, html = None
)
Session Information
C:\Users\grace\AppData\Local\Programs\Python\Python311\Lib\site-packages\session_info\main.py:213: UserWarning:
The '__version__' attribute is deprecated and will be removed in MarkupSafe 3.1. Use feature detection, or `importlib.metadata.version("markupsafe")`, instead.
Modules
----- __main__ NA catboost 1.2.7 collections NA gc NA importlib NA io NA json 2.0.9 lightgbm 4.6.0 logging 0.5.1.2 matplotlib 3.10.0 numpy 1.26.4 optuna 4.1.0 optunahub 0.2.0 optunahub_registry NA os NA pandas 2.2.3 pickle NA platform 1.0.8 plotly 5.24.1 seaborn 0.13.2 session_info v1.0.1 sklearn 1.6.1 time NA tqdm 4.67.1 xgboost 2.0.3 -----
Modules imported as dependencies
PIL 10.4.0 __future__ NA __mp_main__ NA _abc NA _ast NA _asyncio NA _bisect NA _blake2 NA _bz2 NA _catboost NA _cffi_backend 1.17.1 _codecs NA _collections NA _collections_abc NA _compat_pickle NA _compression NA _contextvars NA _csparsetools NA _csv 1.0 _ctypes 1.1.0 _cython_0_29_37 NA _cython_3_0_10 NA _cython_3_0_11 NA _cython_3_0_8 NA _cython_3_1_0a0 NA _datetime NA _decimal 1.70 _distutils_hack NA _elementtree NA _frozen_importlib NA _frozen_importlib_external NA _functools NA _hashlib NA _heapq NA _imp NA _io NA _json NA _locale NA _loss NA _lsprof NA _lzma NA _moduleTNC NA _multibytecodec NA _multiprocessing NA _ni_label NA _opcode NA _openssl NA _operator NA _overlapped NA _pickle NA _plotly_utils NA _pydev_bundle NA _pydev_runfiles NA _pydevd_bundle NA _pydevd_frame_eval NA _pydevd_sys_monitoring NA _queue NA _random NA _sha512 NA _signal NA _sitebuiltins NA _socket NA _sodium NA _sqlite3 2.6.0 _sre NA _ssl NA _stat NA _statistics NA _string NA _strptime NA _struct NA _thread NA _typing NA _uuid NA _warnings NA _weakref NA _weakrefset NA _win32sysloader NA _winapi NA _zoneinfo NA abc NA anyio NA argparse 1.1 array NA arrow 1.3.0 ast NA asttokens NA astunparse 1.6.3 asyncio NA atexit NA attr 24.3.0 attrs 24.3.0 babel 2.16.0 base64 NA bdb NA binascii NA bisect NA bz2 NA cProfile NA calendar NA certifi 2024.12.14 cffi 1.17.1 charset_normalizer 3.4.1 cloudpickle 3.1.1 cmaes 0.11.1 cmath NA cmd NA code NA codecs NA codeop NA colorama 0.4.6 colorlog NA colorsys NA comm 0.2.2 concurrent NA contextlib NA contextvars NA copy NA copyreg NA cryptography 44.0.0 csv 1.0 ctypes 1.1.0 cycler 0.12.1 cython_runtime NA dataclasses NA datetime NA dateutil 2.9.0.post0 debugpy 1.8.11 decimal 1.70 decorator 5.1.1 defusedxml 0.7.1 deprecated 1.2.18 difflib NA dis NA email NA encodings NA enum NA errno NA executing 2.1.0 fastjsonschema NA faulthandler NA filecmp NA fnmatch NA fqdn NA fractions NA functools NA ga4mp NA genericpath NA getopt NA getpass NA gettext NA github NA glob 
NA google NA graphviz 0.20.3 gzip NA hashlib NA heapq NA hmac NA html NA http NA idna 3.10 inspect NA ipaddress 1.0 ipykernel 6.29.5 ipywidgets 8.1.5 isoduration NA itertools NA jedi 0.19.2 jinja2 3.1.5 joblib 1.4.2 json5 0.10.0 jsonpointer 3.0.0 jsonschema 4.23.0 jsonschema_specifications NA jupyter_events 0.11.0 jupyter_server 2.15.0 jupyterlab_server 2.27.3 jwt 2.10.1 kaleido 0.2.1 keyword NA kiwisolver 1.4.7 linecache NA locale NA lzma NA markupsafe 3.0.2 marshal 4 math NA matplotlib_inline 0.1.7 mimetypes NA mmap NA mpl_toolkits NA msvcrt NA multiprocessing NA nacl 1.5.0 nbformat 5.10.4 nt NA ntpath NA nturl2path NA numbers NA numexpr 2.10.2 opcode NA operator NA overrides NA packaging 24.2 parso 0.8.4 pathlib NA patsy 1.0.1 pdb NA pickletools NA pkgutil NA platformdirs 4.3.6 plistlib NA posixpath NA pprint NA profile NA prometheus_client NA prompt_toolkit 3.0.48 pstats NA psutil 6.1.1 pure_eval 0.2.3 pyarrow 18.1.0 pycparser 2.22 pydev_ipython NA pydevconsole NA pydevd 3.2.3 pydevd_file_utils NA pydevd_plugins NA pydevd_tracing NA pydoc NA pydoc_data NA pyexpat NA pygments 2.19.1 pyparsing 3.2.0 pythoncom NA pythonjsonlogger NA pytz 2024.2 pywin32_bootstrap NA pywin32_system32 NA pywintypes NA queue NA quopri NA random NA re 2.2.1 referencing NA reprlib NA requests 2.32.3 rfc3339_validator 0.1.4 rfc3986_validator 0.1.1 rpds NA runpy NA scipy 1.13.1 secrets NA select NA selectors NA send2trash NA shlex NA shutil NA signal NA site NA six 1.17.0 sniffio 1.3.1 socket NA socketserver 0.4 sqlite3 2.6.0 ssl NA stack_data 0.6.3 stat NA statistics NA statsmodels 0.14.4 string NA stringprep NA struct NA subprocess NA sys 3.11.2 (tags/v3.11.2:878ead1, Feb 7 2023, 16:38:35) [MSC v.1934 64 bit (AMD64)] sysconfig NA tarfile 0.9.0 tempfile NA tenacity NA textwrap NA threading NA threadpoolctl 3.5.0 timeit NA token NA tokenize NA torch 2.5.1+cu118 torchgen NA tornado 6.4.2 traceback NA traitlets 5.14.3 types NA typing NA typing_extensions NA unicodedata NA unittest NA 
uri_template NA urllib NA urllib3 2.3.0 uuid NA warnings NA wcwidth 0.2.13 weakref NA webbrowser NA webcolors NA websocket 1.8.0 win32api NA win32com NA win32con NA win32trace NA winerror NA winreg NA wrapt 1.17.2 wsgiref NA xarray 2025.1.2 xml NA xmlrpc NA yaml 6.0.2 zipfile NA zipimport NA zlib 1.0 zmq 26.2.0 zoneinfo NA
----- IPython 8.31.0 jupyter_client 8.6.3 jupyter_core 5.7.2 jupyterlab 4.3.4 notebook 7.3.2 ----- Python 3.11.2 (tags/v3.11.2:878ead1, Feb 7 2023, 16:38:35) [MSC v.1934 64 bit (AMD64)] Windows-10-10.0.22631-SP0 8 logical CPU cores, Intel64 Family 6 Model 140 Stepping 1, GenuineIntel ----- Session information updated at 2025-09-24 01:01